NEW YORK, AP – In the aftermath of damning testimony that its platforms harm children, Facebook plans to introduce several features to protect young people.
These would include prompting teens to take a break from the photo-sharing app Instagram, and ‘nudging’ teens who repeatedly view the same content that is not conducive to their well-being.
Facebook is also planning to introduce controls to allow parents or guardians to supervise what their teens are doing online.
The initiatives come after Facebook announced late last month it was pausing work on its Instagram for Kids project. But critics say the plan lacks details and are sceptical the new features will be effective.
The controls were outlined on Sunday by Nick Clegg, Facebook’s vice president for global affairs, who made the rounds on various news shows in the US, where he was grilled about Facebook’s use of algorithms as well as its role in spreading harmful misinformation ahead of the January 6 Capitol riots.
“We are constantly iterating in order to improve our products,” Clegg told Dana Bash on CNN’s State of the Union on Sunday.
“We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use.”
Clegg said Facebook has invested $US13 billion ($A18 billion) over the past few years in keeping the platform safe and that the company has 40,000 people working on user safety.
The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens.
She also accused the company of being dishonest in its public fight against hate and misinformation. Facebook denies the claims.
Josh Golin, executive director of Fairplay, a watchdog for the children and media marketing industry, does not think introducing parental controls would be effective since many teens set up secret accounts anyway.
He was also dubious about how effective it would be to nudge teens to take a break or move away from harmful content.
Facebook needs to show exactly how it would implement the tools and offer research showing they are effective, he said.
“There is tremendous reason to be sceptical,” he said.
When Clegg was questioned about the use of algorithms in amplifying misinformation ahead of the Capitol riots, he responded that if Facebook removed the algorithms people would see more, not less, hate speech and misinformation.
The algorithms serve as “giant spam filters”, he said.
Senator Amy Klobuchar, who chairs the Senate Judiciary Subcommittee on Competition Policy, Antitrust, and Consumer Rights, said it is time to update children’s privacy laws and offer more transparency in the use of algorithms.
“I appreciate that he is willing to talk about things, but I believe the time for conversation is done,” said Klobuchar, referring to Clegg’s plan.
“The time for action is now.”