It’s easy to say that the technology companies can do better. They can, and should. But ultimately, that’s not the problem. The problem is the media ecosystem they have created. The only surprise is that anyone would still be surprised that social media produces this tragic abyss, for this is what social media is supposed to do—what it was designed to do: to spread the images and messages that accelerate interest, without check, and absent concern for their consequences. It’s worth remembering that “viral” spread once referred to contagious disease, not to images and ideas. As long as technology platforms drive the spread of global information, they can’t help but carry it like a plague.
Fox has long been a bane of liberals, but in the past two years many people who watch the network closely, including some Fox alumni, say that it has evolved into something that hasn’t existed before in the United States. Nicole Hemmer, an assistant professor of Presidential studies at the University of Virginia’s Miller Center and the author of “Messengers of the Right,” a history of the conservative media’s impact on American politics, says of Fox, “It’s the closest we’ve come to having state TV.”
Americans need the same resolve in fighting for competition that their corporations have shown in fighting against it.
The challenge, as always, is political. But with US corporations having amassed so much power, there is reason to doubt that the American political system is up to the task of reform. Add to that the globalization of corporate power and the orgy of deregulation and crony capitalism under Trump, and it is clear that Europe will have to take the lead.
At the heart of this conundrum are the tangled vines of transparency and trust. When we interact with algorithms, we know we are dealing with machines. Yet somehow their intelligence and their ability to mimic our own patterns of thought and communication confuse us into viewing them as human. Researchers have observed that when computer users are asked to describe how machines interact with them, they use anthropomorphic terms such as “integrity,” “honesty,” and “cruelty.” I also refer to algorithms’ “behavior” or their “going rogue.” Our language, at least, would suggest that we expect the same degree of trustworthiness, benevolence, and fairness from the computer algorithms we deal with as we do from our human peers.
That people don’t know there are human beings doing this work is, of course, by design. Facebook would rather talk about its advancements in artificial intelligence, and dangle the prospect that its reliance on human moderators will decline over time.
But given the limits of the technology, and the infinite varieties of human speech, such a day appears to be very far away. In the meantime, the call-center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.
To Facebook, it will seem as if they never worked there at all. Technically, they never did.