As I’m writing this, my son is downstairs watching YouTube videos. When we have dinner, he’ll stream music on an iPad, typing song titles into the app’s search box either from memory or from a list we’ve written down (at his insistence) on paper. He’ll eat while stimming, except for when he takes a dance break.
There’s no app, drug, or device that’s going to transform my son or his interactions with others. And that’s just fine. He’s doing great, and anyone who chooses to listen, who chooses to put in a little work, can meet him where he is.
On April 6, Louisiana became one of the first states to release Covid-19 data by race: While making up 33 percent of the population, African-Americans accounted for 70 percent of the dead at that point. Around the same time, other cities and states began to release racial data in the absence of even a whisper from the federal government — where health data of all kinds is routinely categorized by race. Areas with large populations of black people were revealed to have disproportionate, devastating death rates.
That people don’t know there are human beings doing this work is, of course, by design. Facebook would rather talk about its advancements in artificial intelligence, and dangle the prospect that its reliance on human moderators will decline over time.
But given the limits of the technology, and the infinite varieties of human speech, such a day appears to be very far away. In the meantime, the call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.
To Facebook, it will seem as if they never worked there at all. Technically, they never did.
For more than two thousand years, the world’s great minds have argued about the essence of time. Is it finite or infinite? Does it flow like a river or is it granular, proceeding in small bits, like sand trickling through an hourglass? And what is the present? Is now an indivisible instant, a line of vapor between the past and the future? Or is it an instant that can be measured—and, if so, how long is it? And what lies between the instants? “The instant, this strange nature, is something inserted between motion and rest, and it is in no time at all,” Plato remarked in the fourth century B.C.E. “But into it and from it what is moved changes to being at rest, and what is at rest to being moved.”
Most of us are slaves to our chronological age, behaving, as the saying goes, age-appropriately. For example, a young person will often take steps to recover from a minor injury, whereas someone in their 80s may simply accept the pain that comes with it and be less proactive about addressing the problem. “Many people, because of societal expectations, all too often say, ‘Well, what do you expect, as you get older you fall apart,’ ” says Langer. “So, they don’t do the things to make themselves better, and it becomes a self-fulfilling prophecy.”
It’s this perception of one’s age, or subjective age, that interests Antonio Terracciano, a psychologist and gerontologist at Florida State University College of Medicine. Horvath’s work shows that biological age is correlated with disease. Can one say the same thing about subjective age?