A top executive at Facebook said Sunday that the social media giant is open to giving regulators access to its algorithms and supports “greater transparency,” amid concerns about the company’s impact on children and the spread of harmful misinformation.
Nick Clegg, vice president of global affairs, said that while he believes Facebook has done its utmost to filter out harmful content, and that its paused “Instagram Kids” program will be part of the solution, he supports more oversight and regulation.
“We need greater transparency,” he told CNN’s Dana Bash. “The systems we have … should be held accountable, where necessary, by regulation, so that people can match what our systems say they are supposed to do against what actually happens.”
One suggestion he offered was to amend Section 230 of the Communications Decency Act. The provision, passed in 1996, shields online service providers from liability for content published by their users, as well as for their moderation decisions.
“My suggestion would be to make the protection that is provided to online companies like Facebook conditional on them applying their systems and their policies as they are supposed to,” he said. “And if they fail to do so, they would lose that liability protection. That seems to us to be, at the very least, a reasonable change to consider to Section 230.”
Although Clegg said he supports more oversight, he defended the company’s current handling of user content, telling ABC News’ George Stephanopoulos in a separate interview Sunday that the algorithm is a necessary “spam filter” that ranks content in Facebook and Instagram users’ newsfeeds.
“If you were to, across the board, remove the algorithm, the first thing that would happen is that people would see more, not less, hate speech. More, not less, misinformation. More, not less, harmful content,” he said.
Clegg said Facebook also plans to introduce new tools for parents, including parental supervision controls. Teenagers will also receive automatic prompts to take a break when they have been using the app for a long time, and will be nudged away from content that “may not be good for their well-being.”
Clegg said there is only so much the company can do on its own, and that lawmakers need to do more.

“We cannot make all of these decisions, and provide all of these societal solutions, ourselves. Regulators and lawmakers need to act as well,” he said.
Sen. Amy Klobuchar (D-Minn.), who responded to Clegg’s comments in a later interview with CNN, said she appreciated his willingness to talk about the issue, “but now is the time to act.”
Klobuchar also supported expanding the Children’s Online Privacy Protection Act to include protections for children over the age of 13. The current law, which applies only to children under 13, requires websites to include specific safeguards for children, such as not collecting their personal information. Klobuchar has additionally called for stronger antitrust laws that would prevent companies like Facebook from becoming too powerful.
“I like competition. I believe in capitalism,” she said, calling for a serious review of mergers “not just with Facebook, with all of these technology companies, with pharmaceutical companies.”