Invisible Workers on the Global Assembly Line: Behind the Screen


In her new book, Behind the Screen: Content Moderation in the Shadows of Social Media, Dr. Sarah T. Roberts reveals the inner workings of the world of content moderation on social media platforms. 
People use computers at an internet cafe in Hefei, Anhui province REUTERS/Stringer

This article was originally posted on Balkinization.

Catherine Powell is Professor of Law, Fordham Law School and Adjunct Senior Fellow at the Council on Foreign Relations. You can reach her by e-mail at [email protected].


Abigail Van Buren is a Paralegal at the Legal Aid Society, Criminal Appeals Bureau. You can reach her by e-mail at [email protected].

During a recent Council on Foreign Relations roundtable we hosted, Dr. Sarah T. Roberts spoke about content moderation, work performed by a global digital workforce that is largely hidden from consumers. Content moderators determine whether user-generated content should remain online or be removed, based on rules established by each digital platform. Because tech companies often outsource content moderation to workers who can work informally from home, at first blush these jobs seem particularly well-suited to women who are caring for young children and other workers seeking flexible schedules. In her newly published book, Behind the Screen: Content Moderation in the Shadows of Social Media, Roberts tells a less sanguine story about the world of content moderation, which increasingly plays a major role in keeping social media firms functioning.

Dr. Roberts outlined two concerns that have emerged in the context of this largely invisible workforce: (1) the harsh conditions inherent in this type of work, and (2) the fact that the relative invisibility of this workforce helps sustain the myth that what we see online is an unmediated space of free speech and open markets. In fact, with the commercialization of the internet, content moderation is part and parcel of maintaining the brand integrity of social media platforms and preserving norms of civility on the web. Essentially, content moderators shield us as users from hateful, violent, and traumatizing content to keep us coming back to digital platforms. The volume of misogynist—and often violently misogynist—material online makes the work of female content moderators particularly traumatizing.

As users, we experience what appears to be a free exchange of ideas, free speech, and unrestricted movement of economic transactions. In fact, “behind the screen,” content moderators face an almost impossible mandate: deciding not only what should be taken off platforms, but also what should be left on. Users of social media can post content from anywhere in the world, at any time. With access to a global workforce spanning multiple time zones (and literate in multiple languages), platforms can enlist content moderators in a range of sites to maintain constant vigilance in screening and taking down offensive content around the clock.

The stress and psychological damage of appraising thousands of the worst images and texts humanity can conjure up creates an ecosystem of harm for those who moderate content. Given what is at stake for platforms, one would expect wages to match. However, this is not the case. The workforce is not only low-waged, but low-valued. As one of us has previously written, while tech sector jobs could provide tremendous opportunities for women, women are often concentrated in low-skilled tech jobs like call centers, or work in isolated environments where they cannot advocate for themselves (or with others) as workers. Companies often outsource content moderation to workers who can work informally from home at whatever hours their work is needed. However, as Roberts remarked, tech companies have intentionally designed an alienating experience for content moderators to prevent organizing and whistle-blowing. Like other workers along the global assembly line one of us has written about previously, content moderators face systemic hurdles to protesting conditions, which makes these workers particularly vulnerable.


It is precisely the invisibility of this workforce that masks the ways in which our digital spaces are mediated. As content generators and users of social media, we trust our experience of “freely” posting content and connecting with one another online, unbound by restrictions and rules. We are able to enjoy this “freedom” precisely because content moderation is hidden from view—not only the rules governing content moderation decisions, but the workers themselves. Although we cannot see it, the work of content moderators goes even beyond our screens as they watch for how content can lead to physical danger in real life. For example, false rumors on Facebook about a Muslim man raping a Buddhist woman triggered deadly riots in Myanmar in 2014—demonstrating that the lack of culturally sensitive content moderators can be fatal. In fact, due to those riots, Facebook came under scrutiny for not having enough Burmese-speaking content moderators to combat hate speech in Myanmar, prompting the company to pledge to hire more.

While content moderators are undervalued, the work they do requires remarkable skill. Since harmful content can be produced by anyone, speaking any language, from any cultural context, content moderators must be diverse both linguistically and culturally. Unsurprisingly, many American tech companies rely on content moderators who live in the Philippines, a country with which the United States has a long-standing colonial relationship, where colloquial American English is taught, and where wages are kept low.

With the global distribution of these workers, Roberts noted, competition for jobs at different sites can easily incentivize a race to the bottom in wages and conditions. She also observed that the global dispersal of content moderators has created racial tensions rather than racial solidarity. With Americans losing jobs—for example, as farms face foreclosure—tech firms are just as likely to look to Iowa as to India for content moderators. One outsourcing firm even used the tagline “Outsource to Iowa, not India” to sell its personnel as superior to Indian content moderators, given that Iowans are ostensibly more likely to understand the nuances of U.S.-centric rules. This kind of marketing plays on racial and nationalist fears, adding another layer of complexity to the discussion about content moderators.

Beyond these challenges, the rise of AI-based content moderation puts automation on the horizon, raising the specter of job displacement. Roberts explained that while human oversight is unlikely to be eliminated from content moderation entirely any time soon, there is a prospect of less human involvement. As seen globally, automation is a threat to workers—especially female workers, with an estimated 11 percent of the female workforce at risk of being displaced by automation in the next two decades.

As we move into a new decade in 2020, Sarah Roberts’ book stands out as one of the most important works to close out this decade. With the digital turn—and the move to automation—we need to grapple more deeply with how to ensure stronger legal protections for these invisible workers on the global assembly line.

This work is licensed under Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.