
Assessing Algorithmic Authority

Clay Shirky characterizes “algorithmic authority” as “the decision to regard as authoritative an unmanaged process of extracting value from diverse, untrustworthy sources, without any human standing beside the result saying ‘Trust this because you trust me.’” For Shirky, “authority is a social agreement, not a culturally independent fact.” He mentions the poor performance of certain “sources everyone accepts”–for example, “the ratings agencies Moody’s, Standard & Poor’s, and Fitch”–and an error in Encyclopedia Britannica. He implicitly contrasts these older, traditional authorities with “Google’s PageRank algorithm, Twitscoop’s zeitgeist measurement, and Wikipedia’s post hoc peer review,” which are examples of algorithmic authority.
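To make concrete the kind of “unmanaged process” Shirky has in mind, the core idea behind PageRank can be sketched as an iterative redistribution of “authority” over a link graph. The sketch below is a toy illustration under textbook assumptions (a uniform damping factor, a tiny invented graph); Google’s actual production algorithm is secret and vastly more elaborate:

```python
# Toy sketch of a PageRank-style power iteration (illustrative only;
# not Google's actual, trade-secret algorithm).

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal authority
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # each page passes a damped share of its rank to its targets
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A tiny hypothetical link graph: two pages both link to "hub".
graph = {"a": ["hub"], "b": ["hub"], "hub": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "hub" accumulates the most authority
```

The point of the sketch is Shirky’s: no human vouches for the output. The page judged most authoritative is simply whatever the aggregate linking behavior of untrusted sources happens to elevate.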

Both traditional and algorithmic sources face the problem of unreliable inputs. For algorithmic authorities,

[T]he “Garbage In, Garbage Out” problem [is handled] by accepting the garbage as an input, rather than trying to clean the data first; it provides the output to the end user without any human supervisor checking it at the penultimate step; and these processes are eroding the previous institutional monopoly on the kind of authority we are used to in a number of public spheres, including the sphere of news.

Shirky explains algorithmic authority; I want to take a few steps toward worrying about it.

a) Shirky notes in the piece that there are several kinds of knowledge out there unsusceptible to assessments of accuracy. He cleverly calls these “epistemological potholes.” My worry is that the potholes are in fact larger than the road itself, and that we should be particularly concerned about the accumulation of algorithmic authority in news. If we merely relied on journalists for facts, perhaps a Wikipedian directive of objectivity and neutrality could permit algorithmic authorities to separate the wheat from the chaff. But the media is more an engine than a camera, the font of ultimate political reality it pretends merely to mirror.

b) Now the question becomes: are these algorithmic authorities any worse than the corporate goliaths they are displacing? I’m not going to argue that they are, because of a deeper problem: at least one of them (Google) relies on trade-secret-protected algorithms that aren’t open to public inspection (and are likely so dynamic that a snapshot of them would give us little chance of assessing their biases). I can’t imagine how a modern-day Herbert Gans could write an account of “Deciding What’s Google News” (though I’m deeply impressed by Dawn Nunziato’s incisive account of some problems in the service). I’ve earlier worried that algorithmic sorting could allow prejudices to enter spheres of life where once people had to “launder preferences” by giving some explicit reason for action.

c) Algorithmic authority probably has Hayekian and democratic foundations–an idea that the uncoordinated preferences of the mass can coalesce into the “wisdom of crowds” once old elites step out of the way. A power law distribution of attention on the web, like ever-more-extreme polarization of wealth and poverty, has to be legitimated by markets, democracy, or some combination of the two. Such forms of spontaneous coordination are perceived as fair because they are governed by knowable rules: a majority or plurality of votes wins, as does the highest bidder. Yet our markets, elections, and life online are increasingly mediated by institutions that suffer a serious transparency deficit. Black box voting continues to compromise election results. The Fed asserts extraordinary emergency powers to deflect journalistic inquiries about its balance sheets. Compared to these examples, the obscurity at the heart of our “cultural voting machines” (as I call dominant intermediaries) may seem trivial. But when a private entity grows important enough, its own secret laws deserve at least some scrutiny.

I have little faith that such scrutiny will come any time soon. But until it does, we should not forget that the success of algorithmic authorities depends in large part on their owners’ ability to convince us of the importance–not merely the accuracy–of their results. A society that obsesses over the top Google News results has made those results important, and we are ill-advised to assume the reverse (that the results are obsessed over because they are important) without some narrative account of why the algorithm is superior to, say, the “news judgment” of editors at traditional media. (Algorithmic authority may simply be a way of rewarding engineers (rather than media personalities) for amusing ourselves to death.)

Moreover, if personalized search ever evolves to the point where someone can type into their gmail “what job should I look for,” and receives many relevant results, new media literacy demands that the searcher reflect on the fact that his or her very idea of relevance has probably been affected by repeated interactions with the interface and the results themselves. As Nicholas Carr and Jaron Lanier have pointed out (recalling Sherry Turkle and Sven Birkerts), tools aren’t just adapting to better serve us–we are adapting in order to better compete in the environment created by tools. Algorithmic authority can be just as disciplinary as the old forms of cognitive coordination it’s displacing. To paraphrase Foucault: “Responding precisely to the revolt of the [netizens], we find a new mode of investment which presents itself no longer in the form of control by repression but that of control by stimulation” . . . and search engine optimization.

1 thought on “Assessing Algorithmic Authority”

  1. “Now the question becomes: are these algorithmic authorities any worse than the corporate goliaths they are displacing?”

    I would argue that they are for a few reasons:

    1. “Authority” status with them can change as often as daily.

    2. Most “algorithmic authorities” can still be easily and heavily manipulated, meaning those who focus on working the tools can appear to have more authority than someone with true influence who takes a more natural path.

    3. These “algorithmic authorities” (namely Big G) have been known to override these algorithms at will if you don’t act in accordance with the rules of the pseudo Internet police. For example, it’s well-known that they’ll eliminate or decrease your PageRank if you use an advertising model they don’t approve of (specifically because their own algorithm is faulty and couldn’t account for the natural move into paid link advertising). If you use the model without following their own rules, they treat you like a spammer no matter how relevant the ads on your site might be — there’s no differentiation between legitimate and relevant ads that offer value and true spam. I personally saw one of my sites go from a PR 6 to 0 quite a while back when I refused to bend to Google’s whims (and still won’t). Did the actual “authority” of the site decrease from a reader perspective? Did it suddenly have less value than others in the niche? Absolutely not. Google’s also been known to penalize sites in search engine rankings manually. So in fact there CAN be human interference with algorithmic outputs. It’s simply hidden from the average viewer / user.

    I’d be incredibly disappointed to see this particular tool (PageRank) factored into anything authority-related, not only because of Google’s well-documented behavior and biases but because they themselves removed it recently from their webmaster tools, saying they basically didn’t want site owners obsessing about it so much anymore. They’re not even attempting to “convince us of the importance” anymore.

    The extreme inaccuracies of just about every online authority / influence ranking tool or algorithm aren’t new. They’ve been discussed in depth for quite some time following the “best” list craze that relied on them to paint a false picture of influence in the blogosphere.

    The fact that most people might be naive enough to believe something that’s inaccurate just because so-and-so said so doesn’t mean the source has true “authority.” It just means society’s been dumbed down.
