Creating Technology for Social Change

Introduction to Forbidden Research

Liveblog by Sam Klein and me

Joi (director of the Media Lab), Opening

Research is forbidden when it won’t get peer reviewed, when you’ll be ridiculed for it, and when your lab won’t attract new students. Academic freedom is diminishing. We’re (mostly) not killed anymore for the things we say and do. But looking at Nobel prizes, people are taking career-risking moves to discover something. Civic ran an event called Freedom to Innovate. Laws written for criminals are being used to stifle innovation. Courage is needed to explore these areas of forbidden research.

So we’re asking ourselves: how does an institution become robust? Laws and the like are put in place to protect the status quo, but academic institutions and society should question the status quo. All the moments in history that we see as moments of social change have to do with doing taboo things. Reid Hoffman has agreed to support a Disobedience Prize ($250k). It’s difficult to award because “what is societally useful disobedience?” ends up being complicated. We don’t have a firm date, but it’s an experiment.

Ethan (director of Center for Civic Media)

This is public and on the record. Robotic cameras for live streaming. Hashtag is #ForbiddenML. In late 2013, Jeremy Rubin put together TidBit: instead of displaying an ad, it mined bitcoin and awarded it to the content creator. The New Jersey attorney general’s office didn’t like this and issued a subpoena. The EFF came to defend him. What do we as MIT do when students and members of our community find themselves in trouble?

This is about Star Simpson and Aaron Swartz as well.

Pushing the limits, and coming up against legal issues when asking questions which are important to ask.

Freedom to Innovate: how do we protect the freedom to take on novel research, and deal with legal barriers? It’s not just the legal side of things; we also need to be able to audit algorithms (which bumps into how we’re supposed to use websites). Research gets forbidden for all sorts of reasons: too big and consequential (who has the right to make this decision?), or icky and uncomfortable.

We are at a particularly dark moment in the US. A wave of violence targeted at people of color. And a wave targeted at police. A gun culture which we cannot study as a public health issue, because since 1996 federal money given to the CDC cannot be used to study gun violence. We know that restrictions on what we can study and research are restrictions on an open society. Turkey is cracking down on academics as well. It’s important that we find ways to be creatively and pro-socially disobedient, while being careful about our ethics.

And on the note of exploration of activism and prosocial disobedience through scifi, here’s Cory!

Cory Doctorow

Academia is characterized by disagreement, by cut and thrust. Unfortunately, there is also denialism, which is manufactured controversy. An example: smoking. A few highly paid people cast doubt on the links to cancer. The AIDS denial movement came next, rivaling the smoking movement’s denial of the cancer link. In South Africa, one charlatan claimed that AIDS was caused by vitamin deficiency rather than HIV, and was best treated with his brand of vitamins. He was friends with the president. 3,000 died, and South Africa’s HIV rate went from 1% to 25%. When people call you out, you have to be able to silence them: they sued the Guardian for publishing the story that said the vitamins were bullshit. AIDS denial begat climate change denial.

Most interesting to the MIT crowd is Turing-completeness denial: people want computers that run every program except the ones that make someone sad. We haven’t built such a computer, not because the nerds aren’t complying with justice, but because that’s not how computers work. Digital Rights Management is one of the key problems coming from Turing-completeness denial. Out of this has also come cryptography denial, which we thought was over in the 1990s, and privacy denial: you have nothing to hide, so you have nothing to fear.

Denialism leads to nihilism. These problems get worse when we deny them. People smoke light cigarettes. People kept having unprotected sex without taking anti-retroviral drugs. Or we insist we can build on floodplains, or emit a lot of carbon, and that it’s going to be totally fine. It’s fine. In the realm of DRM, the claim is that the reason artists aren’t getting paid isn’t bad relationships with those that distribute their work; it’s that people aren’t listening to it in the right way. Forcing people to listen to music in the right way will supposedly put more money in artists’ pockets, rather than better contracts with their producers.

In crypto-denial, we build out infrastructure with known holes that law enforcement can use. But these holes can’t be remediated, because the devices live out in the real world and can’t be patched. See the recent baseband radio exploit: even things with strong crypto are unencrypted on that front. We encourage the formation of businesses based on harvesting our data, with the promise that we can someday use that data for profit. Saying they’re wrong would be unacceptable, so laws are passed instead saying you can’t say anything. The flaws will still be weaponized; it will just be too late for anyone to know. We also create giant terms of service, agreeing that they can take all of our information. [Shows the Pokémon Go ToS.] You’ve agreed to them having access to all your information, which means it’s OK that they’re doing it. It’s too much trouble to even bother with. We’ve been smoking all these years, there’s all this carbon in the air, etc. “I’m going to leak my data no matter what, so I might as well join Facebook and get invited to some parties on the way to the information apocalypse.”

The problem becomes undeniable. The CFAA makes violating ToS a felony, which means we can’t investigate who collects what and how, because we’d be violating ToS. The DMCA makes it illegal to publish the flaws in these very systems. DRM becomes an attractive nuisance: you can sue anyone who breaks it, even for lawful purposes. We have lightbulbs with DRM. A firmware update from Philips made their light sockets reject lightbulbs from other manufacturers. Because of DRM, it was briefly possible to commit a felony by plugging your own lightbulb into a Philips light fixture, until public outrage made them roll the update back.
So who wouldn’t want to wrap their product in DRM? We have it on EVERYTHING. It is potentially lethal for you not to know how these systems work.

At some point we reach peak indifference. This is the moment activist tactics change: we’re no longer telling people the situation is unacceptable, we’re telling them that change is even possible. A year ago, the Office of Personnel Management leaked a list of all the people who had recently applied for security clearance. To get clearance, you told the government everything that could be used to blackmail you. And that was breached, and likely sold to the Chinese government. Of course, now you care about privacy.

It’s not just the government; Ashley Madison is another example. “Ratters” will compromise social media accounts, dig up nude images, and then blackmail their victims into performing live sex acts for them. Often the victims are underage, and ratters can rack up hundreds of victims. People find themselves unable to ignore these problems any more. Their cars are being hijacked. Information is stitched together to replicate housing paperwork and sell your house out from under you. Farmers are up in arms about John Deere locking up their tractors’ diagnostic information and the soil-density data that tells them what to plant where; you don’t have access to that information, so they sell it back to you along with seeds from groups like Monsanto. Farming magazines are now running ads upset about this. The moment when everyone starts caring is when we can avert nihilism. This is when you encourage people to install strong encryption. If you catch that moment, you can help people move from indifference to making a difference.

Dealing with all of this requires principles: a way to defend them from future people like yourself who might waver in their commitment, and a way to keep them up to date. The GNU/Linux licensing regime is a great example of this in our community. Computers should serve rather than enslave people. You should be able to run your code, understand your code, improve your code, and share it with other people. The tactic is the GPL. There are no backsies: if you start your business under the high-minded ideal of opening up computers, then no matter how desperate things become over payroll or acquisitions or investors, you can never take the GPL off your code. And if people know that, they won’t ask you to; they’ll pressure you on different things.

A “Ulysses pact” is named after the mythical hero who lashed himself to the mast so he could hear the Sirens’ song without being able to steer toward them. You’re doing something like throwing out your Oreos on day 1 of your diet, not because you are weak-willed then, but because you are strong-willed enough to know that a day will come when you DON’T have your strong will.

We wanted to build a beautiful thing in the internet, and instead we built the biggest surveillance mechanism ever. No one is the villain of their own story. The founders of the internet each made tiny little compromises along the way that made this possible. We need rules to guard us from ourselves. We as pirates must protect future pirates from our future admiral selves.

Cory’s suggested principles

  1. Computers should always obey their owners.
  2. True facts about computers should always be legal to disclose.

Build these principles into our terms. If the FDA is going to certify an implant, it should require the company never to bring suit under DRM law. We can incorporate them into the definition of open standards.

These are rules for rulebreaking. The werewolf’s sin is not turning into a werewolf, but failing to lock himself up at night before the full moon comes out. The trick is not to stay pure; it’s to guard yourself against the times when you’re a werewolf.

The EFF has filed a brief against Section 1201 of the DMCA as contrary to the First Amendment. Apollo 1201, our new initiative, wants to end all DRM within a decade. This is our opening salvo.


Ethan: Our Goal is 1 lawsuit per panel, so we’re on track so far.