Facebook is testing pop-up messages telling people to read a link before they share it

Years after opening a Pandora’s box of bad behavior, social media companies are trying to find subtle ways to reshape the way people use their platforms.

Following Twitter's lead, Facebook is testing a new feature that encourages users to read a link before sharing it. The test will reach 6 percent of Facebook's Android users worldwide in a gradual rollout aimed at promoting the “informed sharing” of news articles on the platform.

Users can still easily click through to share a given story, but the idea is that adding friction to the experience might prompt people to rethink their initial impulse to share the kind of inflammatory content that currently dominates the platform.

Starting today, we’re testing a way to promote more informed sharing of news articles. If you go to share a news article link you haven’t opened, we’ll show a prompt encouraging you to open it and read it before sharing it with others. pic.twitter.com/brlMnlg6Qg

– Facebook Newsroom (@fbnewsroom) May 10, 2021

Twitter introduced prompts asking users to read a link before retweeting it last June. The company quickly deemed the test a success and expanded it to more users.

Facebook began experimenting with prompts like this one last year. Last June, the company rolled out pop-up messages warning users before they share content that is more than 90 days old, an effort to cut down on misleading stories being shared out of their original context.

At the time, Facebook said it was exploring other pop-up prompts to reduce certain types of misinformation. A few months later, the company introduced similar pop-up messages noting the date and source of any links users share related to COVID-19.

The strategy reflects Facebook's preference for passive nudges that steer people away from misinformation and toward its own verified resources on hot-button issues like COVID-19 and the 2020 election.

While the jury is still out on how much impact this kind of gentle behavioral shaping can have on the misinformation epidemic, both Twitter and Facebook have also explored prompts that discourage users from posting abusive comments.

Pop-up messages that give users the sense that their bad behavior is being observed might accomplish what automated moderation on social platforms cannot. And while users would likely be better served by social media companies scrapping their misinformation- and abuse-riddled platforms and rebuilding them from scratch in a more thoughtful way, small behavioral nudges will have to suffice.
