Landecker Democracy Fellow Yonatan’s project will empower content moderators and other workers who typically labor behind the scenes, under widely varying conditions, for large social media companies, including but not limited to Facebook. These content moderators and other internal support staff are sometimes directly employed, but more often they are hired as contractors on precarious terms. The project’s geographic scope centers on Berlin, though much of the legal and educational material will be applicable more broadly within Germany.
For the past decade, Yonatan has worked to hold large tech companies accountable. As a professional software developer and technologist, he brings a nuanced perspective to the issues surrounding “Big Tech.” He believes in, and has always worked with, a multi-pronged approach to change, combining legislative advocacy, impact litigation, and empowering people with technical alternatives. He is also attentive to the tensions between legitimate interests such as privacy and free speech; his views on those concepts have evolved, particularly since moving from the free-speech-maximalist United States to the more cautious legal environment of Germany, where he now resides.
Given their digital nature, social media platforms are governed by a mosaic of instruments: hard law from the US and specific regions (the GDPR, the EU’s privacy and data-control regulation; the Netzwerkdurchsetzungsgesetz, Germany’s content-moderation law; the DMCA, US copyright law; etc.), softer law (Terms of Service, Codes of Conduct), external pressure (which led to the formation of the Facebook Oversight Board), and finally internal pressure directly from employees, which partly drove Twitter’s and Facebook’s decisions to suspend former US President Donald Trump.
There is no quick fix to the complex and contradictory problems posed by large social media platforms, but Yonatan is concretely interested in soliciting perspectives and levers of accountability from the inside, i.e., from employees and contractors. The problems of social media are frequently discussed in the media and academia, but the conditions and challenges faced by the people who keep the platforms running rarely receive the same level of attention.
By empowering largely precarious and migrant content moderators to learn the full range of tools available to them, from legal action in cases of blatant violations to softer forms of advocacy, we can help ensure that social media platforms remain tools of legitimate discourse rather than rogue authoritarianism.
Clear deliverables will include research insights into the issues affecting content moderators, whistleblower trainings, and published resources. The goal is to disincentivize tech companies from engaging in illegal behavior; in the ideal scenario, workers would have nothing to blow the whistle on, so measuring progress by the number of whistleblower reports would be fallacious. Unfortunately, many things that are morally wrong are perfectly legal, and Yonatan expects to generate public discussion in the press by asking hard questions about where the line should be.
Updated December 2021.