Algorithms Need Managers, Too

In Shakespeare’s Julius Caesar, a soothsayer warns Caesar to “beware the ides of March.” The recommendation was perfectly clear: Caesar had better watch out. Yet at the same time it was completely incomprehensible. Watch out for what? Why? Caesar, frustrated with the mysterious message, dismissed the soothsayer, declaring, “He is a dreamer; let us leave him.” Indeed, the ides of March turned out to be a bad day for the ruler. The problem was that the soothsayer provided incomplete information. And there was no clue to what was missing or how important that information was.

Like Shakespeare’s soothsayer, algorithms often can predict the future with great accuracy but tell you neither what will cause an event nor why. An algorithm can read through every New York Times article and tell you which is most likely to be shared on Twitter without necessarily explaining why people will be moved to tweet about it. An algorithm can tell you which employees are most likely to succeed without identifying which attributes are most important for success.

→ Harvard Business Review

Who Funds the Future?

“The biggest outcomes come when you break your previous mental model. The black-swan events of the past forty years—the PC, the router, the Internet, the iPhone—nobody had theses around those. So what’s useful to us is having Dumbo ears.” A great V.C. keeps his ears pricked for a disturbing story with the elements of a fairy tale. This tale begins in another age (which happens to be the future), and features a lowborn hero who knows a secret from his hardscrabble experience. The hero encounters royalty (the V.C.s) who test him, and he harnesses magic (technology) to prevail. The tale ends in heaping treasure chests for all, borne home on the unicorn’s back.

→ The New Yorker

Reign of the Algorithm

Writers, remember: the more we play the algorithmic game, the more the algorithmic game plays us. (All hail the great Algorithm in the Cloud!)

Algorithm-oriented content is becoming ubiquitous. No matter what you read or which topics you search for, a growing percentage of online material is designed from the ground up to acquire ‘Likes’ and court search engines. Food, politics, current affairs, cats, academia — everything. Whatever you are interested in, an army of people is writing about it with the strategic intent of leveraging the algorithmic landscape to their advantage.

→ James Shelley

This Image Of Mark Zuckerberg Says So Much About Our Future

The image above looks like concept art for a new dystopian sci-fi film. A billionaire superman with a rictus grin, striding straight past human drones tethered to machines and blinded to reality by blinking plastic masks. Golden light shines down on the man as he strides past his subjects, cast in gloom, toward a stage where he will accept their adulation. Later that night, he will pore over his vast network and read their praise, heaped upon him in superlatives, as he drives what remains of humanity forward to his singular vision.

→ The Verge

Apple Is Selling You a Phone, Not Civil Liberties

Let’s start with an important fact that Apple elides in its statement: Apple engineered this problem and it did so intentionally. In the wake of the Snowden leaks, Apple specifically decided to encrypt material end-to-end and at rest by default on the devices it manufactures and to not maintain any ability to decrypt material unless users specifically gave it the power to recover that material. It boasted about this decision and used it as a marketing weapon against its competitors. Reasonable people can argue about whether or not Apple did so for good reasons and whether or not doing so was the optimal way for the company to enhance the cybersecurity of its users. But the simple fact remains that Apple used to have the capacity to comply with warrants, and now it cannot without a certain amount of reengineering. And that was a matter of its own choosing made despite repeated warnings from the government that this choice would cause substantial problems for law enforcement, national security investigators, and public safety.

• • •

FBI and Justice Department officials, we think, can be forgiven if they’re a touch cynical about all of Apple’s elaborate legal argumentation and suspect that this all just masks what appears to be Apple’s genuine litigating posture towards the government: You can’t make us do anything, because we are immensely politically powerful, our CEO is on the phone with the President regularly, we are too big and way too cool to fail, and people around the world like us more than they like you. So what about that dead woman in Louisiana? Sorry, but bringing her killer to justice—and preventing his or her future violence—just isn’t as important as the data security of our devices. And about protecting people from ISIS? We’ll help out if it’s not too much trouble, but don’t ask us—ever—to do something that will make us look bad to the ACLU, even if there’s a very good legal argument that you can.

→ Brookings