Filippo Menczer, Indiana University – The Vulnerability of Online Engagement

Social media algorithms want you to follow the crowd.

Filippo Menczer, professor of informatics and computer science at Indiana University, examines how this crowd-following tendency can be gamed by certain users.

Filippo Menczer is a Luddy distinguished professor of informatics and computer science and the director of the Observatory on Social Media at Indiana University. He holds a Laurea in Physics from the Sapienza University of Rome and a Ph.D. in Computer Science and Cognitive Science from the University of California, San Diego. Dr. Menczer is an ACM Fellow and a board member of the IU Network Science Institute. His research interests span Web and data science, computational social science, science of science, and modeling of complex information networks. In the last ten years, his lab has led efforts to study online misinformation spread and to develop tools to detect and counter social media manipulation.

The Vulnerability of Online Engagement


Social media platforms like Facebook, Instagram, Twitter, YouTube, and TikTok rely on people’s behavior to decide which content you see. This idea is based on the “wisdom of the crowd,” according to which the collective opinions of many people are likely to be “right.” In fact, our brains have evolved simple instincts that rely on this, such as imitation and conformity.

Social media algorithms exploit this idea by watching for content that people like, comment on, and share. They use these engagement signals to predict what we like and rank that content at the top of our news feeds. On the surface this seems reasonable: if lots of people like something, it must be worthy of our attention.
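To make the mechanism concrete, here is a minimal sketch of an engagement-ranked feed in Python. The weights, field names, and example posts are invented for illustration; no platform publishes its actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> int:
    # Hypothetical weights: comments and shares are treated as
    # stronger signals of interest than likes.
    return post.likes + 2 * post.comments + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply puts the most-engaged content on top.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Careful fact-check", likes=12, comments=3, shares=1),
    Post("Outrage bait", likes=90, comments=40, shares=55),
])
for post in feed:
    print(f"{engagement_score(post):6.0f}  {post.title}")
```

Notice that nothing in this ranking looks at whether a post is accurate; popularity is the only input.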

But our research shows that, in general, engagement bias leads to lower overall quality of online content. The reason is that engagement is a noisy signal of quality: an item can become popular early by chance rather than by merit. Once a low-quality item becomes popular, the algorithm keeps amplifying it, crowding out better content.
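Here is a toy simulation of that feedback loop, under assumed dynamics: every user is shown the currently most-engaged item, and engagement probability equals the item’s hidden quality. This is a sketch for intuition, not the actual model from our studies.

```python
import random

def run_feed(n_items=20, n_users=5000, seed=0):
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_items)]  # true quality, invisible to the algorithm
    engagements = [0] * n_items

    for _ in range(n_users):
        # Engagement bias: each user is shown the most-engaged item so far
        # (ties broken at random); quality never enters the ranking.
        top = max(range(n_items), key=lambda i: (engagements[i], rng.random()))
        if rng.random() < quality[top]:  # the user engages with probability = quality
            engagements[top] += 1

    winner = max(range(n_items), key=engagements.__getitem__)
    print(f"best quality available: {max(quality):.2f}")
    print(f"quality of the item the feed amplified: {quality[winner]:.2f}")

run_feed()
```

Because exposure follows popularity, whichever item happens to win the first engagement captures all subsequent attention, so the feed routinely amplifies a mediocre item while better ones never surface.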

People are also tricked by engagement signals. We ran an experiment using Fakey, a game developed by our lab that simulates a social media news feed. We found that players were more likely to share low-credibility content when they could see that many others had shared or liked those articles.

The wisdom of the crowd fails because it assumes that the crowd is made of reasonable and independent actors. Both assumptions are violated on social media. People engage with junk content, and they are far from independent: we tend to be similar to our online friends, and we unfriend people we disagree with.
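The independence assumption can be illustrated with a small voting simulation (a toy model with made-up parameters): voters who judge independently almost always reach the right majority, while voters who mostly copy the running majority often lock in an early mistake.

```python
import random

def crowd_accuracy(n_voters=1001, p_correct=0.6, copy_prob=0.0,
                   trials=200, seed=1):
    """Fraction of trials in which the majority vote is correct."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        right = wrong = 0
        for _ in range(n_voters):
            if right + wrong > 0 and rng.random() < copy_prob:
                vote_right = right >= wrong            # herding: copy the crowd so far
            else:
                vote_right = rng.random() < p_correct  # independent judgment
            right += vote_right
            wrong += not vote_right
        wins += right > wrong
    return wins / trials

print("independent voters:", crowd_accuracy(copy_prob=0.0))
print("herding voters:    ", crowd_accuracy(copy_prob=0.8))
```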

Worse, engagement signals can be gamed. Trolls, bots, and coordinated accounts flood the network with fake likes and shares, creating the appearance of genuine engagement. These strategies hack both social media algorithms and our brains.
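Using the same toy weights as the ranking sketch above, a small coordinated campaign is enough to push junk past organic content; the bot count and posts here are, again, made up.

```python
# Same hypothetical scoring as the earlier sketch.
def engagement_score(post):
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

organic = {"title": "Careful fact-check", "likes": 120, "comments": 30, "shares": 10}
junk = {"title": "Fabricated story", "likes": 5, "comments": 1, "shares": 0}

N_BOTS = 200              # hypothetical coordinated fake accounts
junk["likes"] += N_BOTS   # each bot likes the junk post...
junk["shares"] += N_BOTS  # ...and reshares it

for post in sorted([organic, junk], key=engagement_score, reverse=True):
    print(f'{engagement_score(post):6.0f}  {post["title"]}')
# The fabricated story now outranks the fact-check.
```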

What to do? Perhaps platforms could hide engagement counts and add friction, such as prompting people to pause before resharing, to slow the spread of content and reduce opportunities for manipulation.
