Researchers at the University of Washington’s Center for an Informed Public are examining the science behind social media platforms like Facebook and Twitter. (Jeremy Zero Photo via Unsplash)
As misinformation about COVID-19 and vaccines grows, the University of Washington’s Center for an Informed Public (CIP) is stepping up efforts to document, understand, and combat the rampant spread of unsubstantiated claims on social media platforms.
The center announced today that it has received $2.25 million of a $3 million grant from the National Science Foundation. The money will be used to “develop and evaluate rapid response methods for studying and communicating about disinformation,” said Kate Starbird, an associate professor of human centered design and engineering at UW who will lead the project.
The new initiative starts in October and includes support from Stanford University.
The CIP has worked on similar issues before. The UW was part of the Election Integrity Partnership, a cross-university team founded in the summer of 2020 that monitored and reported on misinformation and disinformation about the November elections as it spread across social media in real time.
Launched in 2019, the CIP facilitates collaboration between professors in fields such as engineering, law, biology, and more to explore the powerful role that Facebook, Twitter, and other platforms play in global communication.
Joseph Bak-Coleman, postdoctoral fellow at the University of Washington’s Center for an Informed Public. (UW photo)
In June, CIP’s Joseph Bak-Coleman was the lead author of a paper calling for the study of “collective behavior” – how we gather and share information and make decisions – to be elevated to the urgent status of a “crisis discipline.” The research was a kind of call to arms, drawing attention to the massive challenges posed by misinformation and modern communication networks. It was published in the Proceedings of the National Academy of Sciences.
“We have global problems that require global communication. We won’t fix global warming if we can’t talk to each other. So it’s great that we have these tools to spread information around the world,” he said. “Unfortunately, as they are currently built and used, they don’t seem to be optimized for that. They’re optimized for sales.”
In the face of the pandemic, national elections, the climate crisis, and other events, social media platforms have been breeding grounds for confusion and worse. As vaccination rates lag and the delta variant drives up COVID-19 cases, President Biden last month accused Facebook of “killing people” by failing to effectively contain the spread of falsehoods about vaccines.
Biden later softened his message, but there are widespread calls for the platforms to be regulated more aggressively, as the multibillion-dollar companies have had limited success policing inaccurate content on their own.
“It seems almost unreasonable to suggest that we can just let things run on their own and an invisible hand will lead society into the happiest and healthiest future,” said Bak-Coleman, a postdoctoral fellow at the center.
We spoke with Bak-Coleman about his recent publication. Answers have been edited for clarity and length.
GeekWire: Your background is in biology, and you see parallels between people interacting on social networks and biological systems. Can you explain?
Bak-Coleman: I work on collective behavior, originally in schools of fish, but across species. And one of the things we keep seeing is that animals often do these things that seem magical – flocks of birds deciding where to go, fish evading predators, locusts marching in unison – and it’s all simple local rules, with the network structure allowing [collective behavior] to emerge.
One of the more haunting examples is ants following pheromone trails. The rule they roughly obey is: if I smell a pheromone trail laid by another ant, I follow it, and by following it I lay down more pheromone myself. When the trail loops back on itself, they can get stuck moving in circles until they all starve and die. It’s called an ant mill, or ant death spiral.
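To make the “simple local rules” idea concrete, here is a minimal, hypothetical simulation sketch in Python – an illustration of ours, not the CIP’s research code. Each simulated ant obeys a single rule: step toward the ant directly ahead of it in a closed chain, a stand-in for following a pheromone trail. The parameters (NUM_ANTS, STEP, TICKS) are arbitrary values chosen for illustration.

```python
import math
import random

# Toy "ant mill": each ant obeys one local rule -- step toward the ant
# directly ahead of it in a closed chain. No ant sees the whole group,
# yet the group settles into a rotating ring that slowly spirals inward.

NUM_ANTS = 30   # arbitrary illustration values
STEP = 0.02     # distance each ant moves per tick
TICKS = 501

random.seed(1)
ants = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(NUM_ANTS)]

for t in range(TICKS):
    new_positions = []
    for i, (x, y) in enumerate(ants):
        # Local rule: look only at the next ant in the chain and step toward it.
        tx, ty = ants[(i + 1) % NUM_ANTS]
        dx, dy = tx - x, ty - y
        dist = math.hypot(dx, dy) or 1e-9  # avoid division by zero
        new_positions.append((x + STEP * dx / dist, y + STEP * dy / dist))
    ants = new_positions

    if t % 100 == 0:
        # Crude summary of the emerging mill: mean distance from the group's center.
        cx = sum(x for x, _ in ants) / NUM_ANTS
        cy = sum(y for _, y in ants) / NUM_ANTS
        radius = sum(math.hypot(x - cx, y - cy) for x, y in ants) / NUM_ANTS
        print(f"tick {t}: mean radius around center = {radius:.3f}")
```

Running it prints the group’s mean radius shrinking as the ring tightens; the point is only that no individual ant “decides” to circle – the pattern emerges from the local rule, a rough analogue of the mills Bak-Coleman describes.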
I was learning this around the time of the 2016 election, which was a wake-up call for many Americans that social media was having a real effect. And I happened to be teaching conservation biology at the same time, so it all came together.
GW: What’s the path to more responsible social media?
Bak-Coleman: In the most optimistic part of my brain, I would hope that at some point companies, regulators, and the public will realize, “Wow, this is one thing we can’t just monetize,” and then we’ll have to find sustainable business models.
That’s my most optimistic self. I don’t know that that will happen with every company. Facebook certainly shows that’s not the direction they want to go.
So it could come down to regulators realizing that this kind of chaotic social system is just a nightmare for governance, and that may be part of it. It could be the general public recognizing that they don’t love the idea of big corporations regulating society and our interactions. Or maybe scientists can find clever ways to expose the damage these technologies cause. I think all of these things coming together will hopefully help create transparency. And transparency ideally brings more attention to what’s happening, in a kind of feedback process.
GW: How have social media companies evaded regulation?
Bak-Coleman: One of the things fossil fuel companies have done – and the same with tobacco companies and the Sackler family with opioids – is agnotology, deliberately creating uncertainty. That’s your goal: you create doubt, enough uncertainty to avoid regulation.
And then you put little patches on it. You put a filter on a cigarette and now you can say it’s safe. I think the social media companies are following the same playbook.
If you look at that press release from Facebook [on July 17], it was filled with half-baked statistics. Saying that 85% of Facebook users are interested in vaccines, and all of that, is almost textbook disinformation, trying to create the impression that the company has done only good.
GW: You are calling for evidence-based “stewardship” of communication networks. What does that mean?
Bak-Coleman: All we are arguing is that scientists should start figuring out how the system works and how it is failing. Then the public and regulators can make informed decisions about our collective well-being.
There are some basic things that we as scientists still don’t understand about how to build a healthy, large-scale communication network – ideally one that is also profitable for the businesses involved – and we need to figure them out.
We are not advocating a technocracy or an elite-controlled social media system – quite the contrary. We advocate an understanding that enables society as a whole to make informed decisions about how it wants to design social media systems, so that everyone has a voice and access to information.
GW: What are the broader implications if social media stops being a major source of misinformation?
Bak-Coleman: If we solve this problem, we will solve many other things as well. If we have a good, healthy information ecosystem, it shouldn’t be difficult to elect leaders who advocate basic public health policies. It shouldn’t be difficult to get people to take safe, effective vaccines.
In some ways it’s a harder problem, but a lot of the reason climate change is so hard [to respond to] is that we don’t understand collective behavior – and that understanding is exactly what we’re trying to achieve.
It’s a big problem and it’s urgent, but we can make progress. It might not be full-blown utopia, but it could be tweaking recommendation algorithms so more people get vaccinated, or preventing radicalization and stopping genocides.
We can make real, tangible progress, probably fairly easily on large parts of it, even if the truly healthy information ecosystem is still a little ways off.