Google NegativeSEO: Case Study in User-Generated Censorship | MIT Center for Civic Media
Chris Peterson works, teaches, and researches at MIT. He works at the intersection of digital strategy, new media, and social change.
In addition to his research affiliation with Civic, he is on the Board of the National Coalition Against Censorship, a Fellow at the National Center for Technology and Dispute Resolution, and the founder, owner, and sole proprietor of BurgerMap.org.
He earned his B.A. in Critical Legal Studies from the University of Massachusetts, Amherst, where he completed his thesis on Facebook privacy and/as contextual integrity advised by Ethan Katsh and Alan Gaitenby. He earned his S.M. in Comparative Media Studies from MIT, where he completed his thesis on user-generated censorship advised by Ian Condry, Ethan Zuckerman, and Nancy Baym.
Google NegativeSEO: Case Study in User-Generated Censorship
The origin myth is as familiar as that of any dominant empire.
Like Romulus and Remus, Larry Page and Sergey Brin met as graduate students at Stanford. Page, casting about for a dissertation topic, settled on the Web as a graph of links. He and Brin began building a program called BackRub which would “crawl” the graph, visiting and counting links between pages, and aggregate the weighted counts into a value called PageRank. This value, they claimed, constituted “an objective measure of its citation importance that corresponds well with people's subjective idea of importance.” BackRub became Google, a prototype search engine which, powered by PageRank, incorporated user input into its ranking algorithm.
At first, Page and Brin were not only confident that they’d built a better search engine, but, as they wrote in a paper explaining PageRank, one “virtually immune to manipulation by commercial interests”:
For a page to get a high PageRank, it must convince an important page, or a lot of non-important pages, to link to it. At worst, you can have manipulation in the form of buying advertisements (links) on important sites. But, this seems well under control since it costs money. This immunity to manipulation is an extremely important property.
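The mechanics behind this claim, and its later failure, can be seen in a toy sketch of PageRank itself. The code below is an illustrative power-iteration implementation on a made-up link graph, not Google's production algorithm; the node names (“hub,” “splog0” and so on) and graph shape are invented for the example, and the damping factor of 0.85 is the value from Page and Brin's paper. It shows how a crowd of otherwise unimportant pages, all linking to one target, can inflate that target's rank — exactly the manipulation Page and Brin thought would stay “well under control.”

```python
# Toy PageRank via power iteration. Illustrative only: the graph,
# node names, and helper are assumptions, not Google's algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Rank flowing into p: each page q that links to p passes
            # along an equal share of its own rank.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

# A "target" page propped up by 20 unimportant pages (splogs) that
# exist only to link to it -- a miniature PageRank greenhouse.
graph = {"target": ["hub"], "hub": ["target"]}
for i in range(20):
    graph[f"splog{i}"] = ["target"]

rank = pagerank(graph)
# The target ends up with far more rank than any individual splog,
# even though no "important" page ever linked to it.
```

Despite every linking page being individually worthless, the target accumulates most of the graph's total rank — which is why splog farms worked, and why Google eventually had to penalize unnatural link profiles rather than trust the link graph alone.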
Of course, in hindsight, Google wasn't immune to manipulation at all. SEO marketers simply started using bots to create lots of "non-important pages" linking to the sites they wanted to push up in Google's results. Kolari et al. found that 88% of all English-language blogs on the Internet are so-called "splogs" meant to game Google. As Professor Finn Brunton writes in his forthcoming book on Internet spam:
Terra's [splog] links to other splogs, which link to still more, forming an insular community on a huge range of sites, a kind of PageRank greenhouse which is not in itself meant to be read by people. Splogs of Terra's type are not meant to interact with humans at all; they are created solely for the benefit of search engine spiders.
In April 2012 Matt Cutts, the head of Google’s antispam team, announced that an important algorithm update would, among other things, “decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” In a video, Cutts explained to webmasters that Google’s antispam algorithms had “gotten better and better at assessing bad links” to the point where the company felt comfortable penalizing people for various kinds of “unnatural links,” including paid links, low-quality syndicated articles, or splogs. It was a strong move: by penalizing manipulative links, Google made the spammers’ favorite tactic backfire.
The thing about a predictable and powerful backfire, though, is that the shooter can simply reverse the weapon. If pointing bad links at your own site is now punishable, some SEO artists reasoned, why not point them at a competitor and let Google’s penalty sink them?
Indeed, even before the update had been fully deployed, two members of the TrafficPlanet forums, posting under the names Pixelgrinder and Jammy, decided to test whether NegativeSEO (as they called it) was possible. As a target they selected SeoFastStart.com, a site run by an SEO consultant they disliked. Pixelgrinder and Jammy began a lightweight NegativeSEO campaign using a simple tool, while an unknown third party simultaneously mounted a much more powerful linking campaign against the same site.
The campaign, which lasted a month, dramatically dropped the target’s rankings for key Google queries.
On the first day Google began notifying webmasters of “unnatural links,” Dan Thies, the consultant targeted by TrafficPlanet, received one. He posted to the Google Webmaster Support forums that he had been warned about thousands of unnatural links pointing to his site, despite never having engaged in “link building” activity himself. The warning coincided with the campaign conducted by TrafficPlanet.
In October 2012 Cutts announced a new “disavow links” tool. “If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue,” wrote Google analyst Jonathan Simon. In a December 2012 video Cutts addressed NegativeSEO directly for the first time. He cautioned that “while a lot of people talk about NegativeSEO, very few try it, and fewer still actually succeed,” and reminded viewers that, if they were worried about the issue, the disavow links tool could be used to “defuse” any action taken against them. At present the extent and efficacy of NegativeSEO as a tactic remain largely unexplored: the next uncertain step in the ongoing dance between Google and those who would game it.