Creating Technology for Social Change

The power of a spicy chicken sandwich, or, beyond good and evil there’s a bug


If you’ve spent any time in the American south — or at the food court in Burlington Mall, ten miles outside Boston — you’ve likely gone weak at the knees at the mention of Chick-fil-A. The waffle fries. The sweet tea. And tops, the spicy chicken sandwich.

And you too may be conflicted about 1) Chick-fil-A’s homophobia vs. 2) how good that spicy chicken sandwich is. (If this sounds like the Kenny Rogers Roasters episode of Seinfeld, it’s not far off.)

So it came as a perversely pleasant surprise to find that, when you feed the language associated with one of history’s great homophobes into Google Translate, it hands the spicy chicken sandwich right back to you as a Chick-fil-A product.

Okay, that’s not really the angle I’m taking. It’s more about how technology, and especially its bugs, can generate its own meaning and persuasive power. I bring my preconceptions about Chick-fil-A, homophobia, and WWII Germans to anything that combines the three, and the only way to get there is through an unintended consequence of a technology, like a bug in Google Translate. It’s trite nowadays (and practically goes without saying within the context of the Center’s ethic) to argue that technology has a moral component, as if that were a fresh debate. Yet it turns out that if one end of the spectrum is “neutral” and the far end is “moral”, the next stop beyond moral is the bug: how you intend to change the world, plus the unintended consequences. Nietzsche’s syphilitic sequel could have been “Beyond Good and Evil 2: A Bug or a Feature?”

Why is it that the intention of Google Translate (literal translational meaning) gets the first two German words (“spicy” and “chicken”) exactly right, but add “sandwich” and you’ve got a properly capitalized Chick-fil-A product? Why does no other language in Google Translate do that? Why is it that, sometimes, an algorithm’s cracked output can have meaning? There’s the intention, and then there’s the meaning-creating bug beyond.

This week we began publicizing LazyTruth, a Chrome extension that scans forwarded emails in Gmail for debunkable content and, right below the email, debunks it. (The media has loved it. Based on their most common shorthand for LazyTruth, Matt Stempeck and the rest of the LazyTruth team probably should have called it the Crazy Uncle Shamer.) On the one hand, it’s a cool experiment in combining semiotics and databases. On the other, it’s a pretty grand test of whether real-time fact-checking affects beliefs and knowledge and, just as much, whether your Crazy Uncle would even install LazyTruth…whether people are comfortable fact-checking themselves. (Nietzsche’s book analyzing LazyTruth: “Come On, It’s Not Like Everybody’s a Jesuit.”)
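To make the mechanics concrete, here’s a minimal sketch of the kind of Gmail content script such an extension might use. To be clear, this is my own illustration, not LazyTruth’s code: the “a3s” selector, the toy DEBUNKED table, and the regex matching are assumptions standing in for a real fact-check database and real claim detection.

```typescript
// Hypothetical content script, for illustration only (not LazyTruth's real code).
// Flow: read the open email, look for known debunkable claims, append a correction.

// A toy stand-in for a maintained fact-check database.
const DEBUNKED: { pattern: RegExp; correction: string }[] = [
  {
    pattern: /congress.*exempt.*health care/i,
    correction: "Members of Congress are not exempt from the health care law.",
  },
];

function annotateEmails(): void {
  // Gmail renders open message bodies in divs with class "a3s"
  // (an assumption; Gmail's markup changes without notice).
  document.querySelectorAll<HTMLElement>("div.a3s").forEach((body) => {
    if (body.dataset.factChecked) return; // don't annotate the same email twice
    body.dataset.factChecked = "true";

    const hits = DEBUNKED.filter(({ pattern }) => pattern.test(body.innerText));
    if (hits.length === 0) return;

    // Append the debunk below the email, in the style the post describes.
    const note = document.createElement("div");
    note.style.cssText =
      "margin-top:12px;padding:8px;border-top:1px solid #ccc;color:#555;";
    note.textContent = hits.map((h) => h.correction).join(" ");
    body.appendChild(note);
  });
}

// Gmail is a single-page app, so re-check whenever the DOM changes.
new MutationObserver(annotateEmails).observe(document.body, {
  childList: true,
  subtree: true,
});
```

The whole trick is in that flow: scan the open message, match known claims, and put the correction where the Crazy Uncle can’t miss it.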

Yet I want to see a broader opt-in intervention that takes more advantage of the jarring Google Translate + German + Chick-fil-A + homophobia effect. A sidebar you set to appear on certain websites that won’t go away until you click a button labeled “True?”. (As in, combining the pause to consider the facts with the guilt that comes from clicking it when you’re not 100% sure.) An ad-blocker that replaces ads with context-based questions you promised yourself you’d consider. A plugin that IDs companies funding causes you find offensive and, when you search Google Maps for the route to Chick-fil-A, reroutes you to another well-reviewed chicken place. A completely un-A.I., un-data-driven, bug-shaped toolbar icon that appears whenever dynamic content, like Google Translate’s output, is loaded, just to remind you that something you don’t control is behind the content…that it might be a bug.
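That last toolbar-bug idea is almost trivially buildable. Here’s a hedged sketch of one way it could work, assuming a Manifest V3 Chrome extension: a content script watches for nodes injected after the page has loaded and tells a background service worker to put a little “bug” badge on the toolbar icon. The message shape and badge text are my inventions; only the MutationObserver and chrome.action plumbing are standard.

```typescript
// Hypothetical sketch of the "bug-shaped toolbar icon" idea: two files of a
// Manifest V3 extension, shown together for brevity.

// --- content-script.ts ---
// After the page's initial load, any node the site injects dynamically
// (translations, ads, personalized anything) counts as content you don't control.
window.addEventListener("load", () => {
  const observer = new MutationObserver((mutations) => {
    const injected = mutations.some((m) => m.addedNodes.length > 0);
    if (injected) {
      // Tell the background worker to surface the reminder.
      chrome.runtime.sendMessage({ type: "dynamic-content" });
      observer.disconnect(); // one reminder per page is enough
    }
  });
  observer.observe(document.body, { childList: true, subtree: true });
});

// --- background.ts (service worker) ---
// Show a small badge on the toolbar icon for the tab that reported dynamic content.
chrome.runtime.onMessage.addListener((message, sender) => {
  const tabId = sender.tab?.id;
  if (message.type === "dynamic-content" && tabId !== undefined) {
    chrome.action.setBadgeText({ tabId, text: "bug" });
    chrome.action.setBadgeBackgroundColor({ tabId, color: "#cc0000" });
  }
});
```

It knows nothing about what the injected content actually is; the point, as with the Translate screenshot, is only the reminder that an algorithm you don’t control just rewrote part of the page.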