Facebook removes 16,000 groups for trading fake reviews
Facebook removed the groups, which were being used to buy and sell fake reviews of products and services. It also committed to making it harder for people to find groups and profiles that buy and sell fake reviews.
The move comes after a CMA investigation found evidence of misleading content on Facebook – despite the site being warned about this previously.
In January 2020, Facebook committed to better identifying, investigating and removing groups and other pages where fake and misleading reviews were being traded, and to preventing them from reappearing.
Facebook gave a similar pledge in relation to Instagram, which it owns, in May 2020, after the CMA identified similar concerns.
But a follow-up investigation in August 2020 found evidence that the illegal trade in fake reviews was still taking place on both Facebook and Instagram, and the CMA intervened for a second time.
Facebook has now removed 16,000 groups that were dealing in fake and misleading reviews. It has also made further changes to its systems for identifying, removing and preventing such content on its social media platforms to ensure it is fulfilling its previous commitments.
The tech company has committed to suspending or banning users who are repeatedly creating Facebook groups and Instagram profiles that promote, encourage or facilitate fake and misleading reviews.
It will also introduce new automated processes to improve the detection and removal of this type of content.
Facebook will also put in place dedicated processes to make sure that these changes continue to work effectively and stop the problems from reappearing.
Andrea Coscelli, chief executive of the CMA, said: “Never before has online shopping been so important. The pandemic has meant that more and more people are buying online, and millions of us read reviews to enable us to make informed choices when we shop around. That’s why fake and misleading reviews are so damaging – if people lose trust in online reviews, they are less able to shop around with confidence, and will miss out on the best deals. It also means that businesses playing by the rules miss out.
“Facebook has a duty to do all it can to stop the trading of such content on its platforms. After we intervened again, the company made significant changes – but it is disappointing it has taken them over a year to fix these issues.”
The CMA will continue to keep a close eye on Facebook and Instagram, and will take further action if necessary.
The move follows the government’s announcement that a dedicated Digital Markets Unit (DMU) will be set up within the CMA from April 2021. Once the necessary legislation is in place, this will introduce and enforce a new code for governing the behaviour of platforms that currently dominate the market.
Rocio Concha, director of policy and advocacy at Which?, said: “We’ve previously raised the alarm about fake review factories continuing to operate at scale on Facebook, leaving online shoppers at huge risk of being misled. The tech giant failed to meet its earlier commitment to the CMA, so it is positive that the regulator has stepped in and demanded more robust action.
“Facebook must deliver this time round – it has shown it has the sophisticated technology to eradicate these misleading review groups and needs to do so much more swiftly and effectively.
“The CMA and Facebook now need to monitor the situation and if the problems persist the regulator must take stronger measures to ensure that trust in online reviews does not continue to be undermined.
“Online platforms should also have greater legal responsibility for tackling fake and fraudulent content and activity on their sites.”