Facebook is giving you more control over what you see when you use the social media platform — in particular what appears in your News Feed and who’s allowed to comment on your posts. The social network announced Wednesday that it’s introducing a number of changes, all designed to boost transparency and trust in the way its algorithms operate.
Facebook’s vice president of global affairs, Nick Clegg, published a blog post at the same time the tools were announced, in which he took issue with the idea that Facebook’s algorithms fuel polarization and manipulate people in pursuit of profit. Clegg said the newly introduced tools are part of a broader move to show people how their own relationship with the algorithm affects their experience on the social network and to make the company’s workings more transparent.
“These measures are part of a significant shift in the company’s thinking about how it gives people greater understanding of, and control over, how its algorithms rank content, and how it can at the same time utilize content ranking and distribution to ensure the platform has a positive impact on society as a whole,” he said in the blog post.
Facebook, like other social networks, has come under increasing scrutiny from lawmakers, regulators, advocacy groups and celebrities who say the company hasn’t done enough to curb the spread of hate speech, misinformation and other offensive content on its platform. Last week, Facebook CEO Mark Zuckerberg appeared before Congress alongside Twitter CEO Jack Dorsey and Google CEO Sundar Pichai to testify about how they’re handling misinformation. Giving users more control over, and insight into, how content gets ranked in their feeds is one way social networks are attempting to address concerns about harmful content.
Most significant among the new tools is control over what content appears in your News Feed and in what order. You can now choose whether to see posts chronologically as they’re published or, as is currently standard on the platform, to see what Facebook’s algorithm has decided to show you. If neither of those choices offers the experience you’re after, you’ll also be able to filter the feed to show content only from your favorites.
To access these new options, you’ll need to look out for the Feed Filters Bar that should appear at the top of your News Feed. This will allow you to switch between the different feeds. Back in October, Facebook introduced a tool allowing you to choose up to 30 favorite people and pages you most enjoy seeing content from, so make sure you’ve chosen some accounts to view content from before choosing that option in the Feed Filters Bar.
Facebook sometimes places content in your News Feed from people or pages you don’t actively follow, which can be confusing. Even with the new controls, it will continue to do this but will now explain why it’s showing you this content. When these “suggested for you” posts pop up in your feed, you’ll now have the option to tap “Why am I seeing this?” and that should provide you with more context around the decision.
In addition, you’ll also have new powers to control who is able to comment on your own public posts. This can range from anyone who can see the post to only the specific people and pages you tag. The hope, said Facebook in a press release, is that you’ll be able to limit unwanted interactions and engage in conversations that are “meaningful” to you.
Clegg’s remarks weren’t enough to fend off ongoing criticism of the company. On Wednesday, a group of Facebook critics known as the Real Facebook Oversight Board called Clegg’s post a “cynical, breathtaking display of gaslighting.” “He provides a blatantly false, easily disproved take on both Facebook’s political reach and incentives, which are built entirely around maximizing profits at the expense of human life and democracy,” the Real Facebook Oversight Board said in a statement.
In his post, Clegg promised that Facebook will be introducing more changes later this year to help improve transparency around how the News Feed algorithm works. The company will also launch more surveys to try to understand how people feel about their experiences on Facebook and then adjust the algorithm as a result.
Clegg argued that in spite of suggestions to the contrary, Facebook doesn’t have a polarizing effect on society and its algorithms aren’t incentivized to promote sensational and misleading content. “It’s not in Facebook’s interest — financially or reputationally — to continually turn up the temperature and push users towards ever more extreme content,” he said. In an interview published by The Verge, Clegg added that the company also needs to provide more data and evidence about what content is popular on News Feed.
In the post, titled “You and the Algorithm: It Takes Two to Tango,” the Facebook exec encourages people to view themselves as having a relationship with the company’s algorithms, rather than simply being their passive victims. He said that because social media and the machines on which it’s built are here to stay, it’s time to “make our peace with them.”