Christopher Steiner’s new book on algorithms looks interesting. (One nugget: Many companies now use software to analyze the emotional tone of customers calling in for customer service help. Sound emotional, and you’ll get routed to the more empathic call center workers.) It’s part of a growing literature on algorithms both online and off. As we search for reliable information on algorithms, they in turn may well be shaping our very awareness and discussion of them. It’s another way technology shapes values, rather than being influenced or constrained by them. Consider a recent feature on an increasingly algorithm-driven news industry:
Google News-powered results, Google says, are viewed by about 1 billion unique users a week. . . . Which translates, for news outlets overall, to more than 4 billion clicks each month: 1 billion from Google News itself and an additional 3 billion from web search. . . .
Google News’s head of engineering summed up the challenge: “How do I take a story that has 20,000 articles, potentially, and showcase all of its variety and breadth to the user?” . . . . Google [is] symbolic of a broader transition: producers’ own grudging acceptance of a media environment in which they are no longer the primary distributors of their own work. [It] suggests an ecosystem that will find producers and amplifiers working collaboratively, rather than competitively. And working, intentionally or not, toward the earnest end that Schmidt expressed two years ago: “the survival of high-quality journalism.”
When Google News launched in 2002, it’s worth remembering, it did so with the following . . . declaration: “This page was generated entirely by computer algorithms without human editors. No humans were harmed or even used in the creation of this page.” Since then, as news publishers have emphasized to Google how human a process news production actually is, the company’s news platform has — carefully, incrementally, strategically — found ways to balance its core algorithmic approach with more human concerns.
The article frames Google’s approach as a series of magnanimous concessions to squabbling journos—the commodity “paint” (as Lessig christened them a few years ago) artistically arranged by Google into a picture of the world. As in most business news nowadays, commercial concerns about fairly dividing the pie of digital advertising dominate. But I have to wonder about Google News’s public role, and how it could potentially be manipulated. How do stories make it to the top of the Google News front page? How important is the sheer number of mentions of a given story, as opposed to, say, the authority of the news outlets promoting it? (For example, how long should the 47 percent meme dominate presidential news coverage?) And finally, as direct human interventions into the page increase, what are the standards for raising or lowering the prominence of a story? Will news outlets be able to pay for premium placement, as hotels appear to be doing?
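To make the ranking question concrete, here is a purely hypothetical sketch of the trade-off just described: how much should sheer mention volume count against the authority of the outlets doing the mentioning? Google does not publish its formula; every weight, score, and number below is invented for illustration.

```python
import math

def story_score(mentions, outlet_authorities, volume_weight=0.4):
    """Toy ranking score: blend log-scaled mention volume with
    the average authority (0-1) of the outlets covering a story.
    The blend weight is an arbitrary illustrative choice."""
    # Normalize volume against a very large story (the 20,000-article case
    # mentioned by Google News's head of engineering).
    volume = math.log1p(mentions) / math.log1p(20000)
    authority = sum(outlet_authorities) / len(outlet_authorities)
    return volume_weight * volume + (1 - volume_weight) * authority

# A meme echoed by thousands of low-authority outlets...
meme = story_score(5000, [0.2] * 10)
# ...versus an investigation carried by a few high-authority ones.
investigation = story_score(40, [0.9] * 10)
print(meme, investigation)
```

Under these made-up weights the investigation outranks the meme; shift `volume_weight` toward 1 and the meme wins. The editorial judgment lives entirely in that one parameter, which is precisely why it matters who sets it and behind which doors.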
If all those decisions are made behind closed doors at the Googleplex (or the Twitterdome, Facebookistan, or wherever your favorite intermediary is), expect increasingly vertiginous online sense-making. At its best, Google News could be one more indicator of journalists’ sentiment and interest, helping gatekeepers decide what counts as the most important news and helping ordinary readers tame information overload. As personalization continues apace, rival services will develop different “theories of you” to decide what to present, as Eli Pariser explains:
Google’s filtering systems . . . rely heavily on Web history and what you click on (click signals) to infer what you like and dislike. These clicks often happen in an entirely private context: The assumption is that searches for “intestinal gas” and celebrity gossip Web sites are between you and your browser. You might behave differently if you thought other people were going to see your searches. But it’s that behavior that determines what content you see in Google News, what ads Google displays — what determines, in other words, Google’s theory of you.
The basis for Facebook’s personalization is entirely different. While Facebook undoubtedly tracks clicks, its primary way of thinking about your identity is to look at what you share and with whom you interact. That’s a whole different kettle of data from Google’s: There are plenty of prurient, vain, and embarrassing things we click on that we’d be reluctant to share with all of our friends in a status update. And the reverse is true, too. I’ll cop to sometimes sharing links I’ve barely read — the long investigative piece on the reconstruction of Haiti, the bold political headline — because I like the way it makes me appear to others. The Google self and the Facebook self, in other words, are pretty different people. There’s a big difference between “you are what you click” and “you are what you share.”
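Pariser’s contrast can be caricatured in a few lines of code: profile the same user from two different event streams and you get two different people. This is a toy illustration only; the topics and events are invented, and real systems model far more than a top category.

```python
from collections import Counter

# One user, two signal streams. Clicks happen in private; shares perform
# an identity for an audience. (All labels here are hypothetical.)
clicks = ["celebrity gossip", "celebrity gossip", "health symptoms",
          "celebrity gossip", "politics"]
shares = ["investigative journalism", "politics",
          "investigative journalism", "books"]

def top_interest(events):
    """A crude 'theory of you': the most frequent topic in an event stream."""
    return Counter(events).most_common(1)[0][0]

google_self = top_interest(clicks)    # "you are what you click"
facebook_self = top_interest(shares)  # "you are what you share"
print(google_self, facebook_self)
```

The click-derived self and the share-derived self diverge immediately, and each service will personalize news around the self it happens to see.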
As time wears on, the real “news” will be those few items that break through the “filter bubbles” of a critical mass of the populace. In Gary Shteyngart’s novel Super Sad True Love Story, a service called “CrisisNet” provides urgent updates that everyone needs to know. Meanwhile, the New York Lifestyle Times masters the most profitable ways of grabbing the attention of High Net Worth Individuals. Somehow I think a novel like his provides a scenario analysis of the future of news more prescient than most algorithmic predictions of “present and future business models to monetize the newspaper industry.”