On the morning of October 29, Beijing time, Facebook's global head of safety told British lawmakers on Thursday that Facebook's algorithm demotes rather than promotes polarizing content, adding that the company welcomes effective government regulation. Governments in Europe and the United States are working to regulate social media platforms and curb the spread of harmful content, especially to young users.
Former employees have criticized Facebook for spreading hateful content and exacerbating conflicts around the world
The UK is taking the lead: social media companies that fail to remove or restrict the spread of illegal content will face fines of up to 10% of their turnover.
If these measures do not work, the government may propose secondary legislation making company directors personally liable.
On Monday, Facebook whistleblower Frances Haugen told the same committee of lawmakers that Facebook's algorithms push extreme and divisive content to users.
Antigone Davis, Facebook's global head of safety, denied the allegations.
"I don't think we are amplifying hate," Davis told the committee on Thursday. "I think we try to ensure that divisive or polarizing content is demoted."
She said she could not guarantee that users would never be recommended hateful content, but that Facebook is using artificial intelligence to reduce its prevalence to 0.05%.
"We have no interest in amplifying hate on our platform. It creates a bad experience for people, and they won't come back," she said. "Our advertisers wouldn't stand for it either."
Davis also noted that Facebook announced on Thursday it will be renamed Meta, and said she hopes regulators will contribute to the safety of social media platforms, for example through research on eating disorders or body image.
"Many of these are social issues, and we hope regulators can play a role," she said, adding that Facebook would welcome a regulator with "appropriate and effective enforcement powers."
"I think the criminalization of directors is a very serious step," she said. "I am not sure we need to take it."