Social media companies need to give their data to independent researchers to better understand how to keep users safe
Social media platforms are where billions of people around the world go to connect with others, get information and make sense of the world. These companies, including Facebook, Twitter, Instagram, TikTok and Reddit, collect vast amounts of data based on every interaction that takes place on their platforms.
And despite the fact that social media has become one of our most important public forums for speech, several of the most important platforms are controlled by a small number of people. Mark Zuckerberg controls 58% of the voting share of Meta, the parent company of both Facebook and Instagram, effectively giving him sole control of two of the largest social platforms. Now that Twitter’s board has accepted Elon Musk’s $44 billion offer to take the company private, that platform will likewise soon be under the control of a single person. All these companies have a history of sharing scant portions of data about their platforms with researchers, preventing us from understanding the impacts of social media on individuals and society. Such singular ownership of the three most powerful social media platforms makes us fear this lockdown on data sharing will continue.
After two decades of little regulation, it is time to require more transparency from social media companies.
In 2020, social media was an important mechanism for the spread of false and misleading claims about the election, and for mobilization by groups that participated in the January 6 Capitol insurrection. We have seen misinformation about COVID-19 spread widely online during the pandemic. And today, social media companies are failing to remove the Russian propaganda about the war in Ukraine that they promised to ban. Social media has become an important conduit for the spread of false information about every issue of concern to society. We don’t know what the next crisis will be, but we do know that false claims about it will circulate on these platforms.
Unfortunately, social media companies are stingy about releasing data and publishing research, especially when the findings might be unwelcome (though notable exceptions exist). The only way to understand what is happening on the platforms is for lawmakers and regulators to require social media companies to release data to independent researchers. In particular, we need access to data on the structures of social media, like platform features and algorithms, so we can better analyze how they shape the spread of information and affect user behavior.
For example, platforms have assured legislators that they are taking steps to counter misinformation and disinformation by flagging content and inserting fact-checks. Are these efforts effective? Again, we would need access to data to know. Without better data, we can’t have a substantive discussion about which interventions are most effective and consistent with our values. We also run the risk of creating new laws and regulations that do not adequately address harms, or of inadvertently making problems worse.
Some of us have consulted with lawmakers in the United States and Europe on potential legislative reforms like these. The conversation around transparency and accountability for social media companies has grown deeper and more substantive, moving from vague generalities to specific proposals. However, the debate still lacks important context. Lawmakers and regulators frequently ask us to better explain why we need access to data, what research it would enable and how that research would help the public and inform regulation of social media platforms.
To address this need, we’ve created this list of questions we could answer if social media companies began to share more of the data they gather about how their services function and how users interact with their systems. We believe such research would help platforms develop better, safer systems, and also inform lawmakers and regulators who seek to hold platforms accountable for the promises they make to the public.
Social media companies ought to welcome the help of independent researchers to better measure online harm and inform policies. Some companies, such as Twitter and Reddit, have been helpful, but we can’t depend on the goodwill of a few companies, whose policies might change at the whim of a new owner. We hope a Musk-led Twitter will be as forthcoming as before, if not more so. In our fast-changing information environment, we should not regulate and legislate by anecdote. We need lawmakers to ensure our access to the data we need to help keep users safe.
This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.
Renée DiResta is the technical research manager at Stanford Internet Observatory. Twitter: @noUpside.
Laura Edelson is a postdoctoral researcher at New York University. Twitter: @LauraEdelson2.
Brendan Nyhan is James O. Freedman Presidential Professor of Government at Dartmouth College. Twitter: @brendannyhan.
Ethan Zuckerman is associate professor of public policy, information and communication at the University of Massachusetts Amherst. Twitter: @ethanz.
© 2022 Scientific American, a Division of Springer Nature America, Inc.
All Rights Reserved.