Forbidden Research liveblog: Hacking Culture at MIT

liveblog by Willow Brugh, Natalie Gyenes, and me

Speaker: Liz George, MIT Alum Class of 2008 and MIT Hacker

Liz starts, as any good scientific endeavor begins, by defining her terms.

Hacking, (noun)

  1. A project without a constructive end
  2. An unusual and original solution to a problem
  3. An activity that tests the limits of skill, imagination, and wits

If you can build a model of the system, you can push it to its limit or test a system in a way you’d never otherwise be able to do.

Hacking, (verb)

  1. Investigating a subject for its own sake
  2. Engaging in non-destructive mischief
  3. Doing something out of the ordinary or clandestine
  4. Exploring the inaccessible

You’d want these things to also be qualities of academic research. When research projects are first conceived, do they have a constructive end? Transistors are an example. You’d also hope that researchers are testing the boundaries of their skill and wit.

MIT Hackers also have a code of ethics, which is written in large print in an inaccessible part of campus. It’s a rite of passage as a hacker to be shown this by a veteran hacker. Some of these rules are specific to MIT hacks and some are general; some are very relevant to academic research too. Safety is the #1 priority. And it’s important to share the knowledge you learn with others and to ask experienced people if you don’t know how to do something.

Beyond the ethics is MIT policy, which is in tension with MIT tradition. Liabilities are at stake. The tradition at MIT follows the same kind of promotion of playful mischief in the definition cited above. But MIT Policy reads, “Labeling something as a hack does not change unlawful behavior into lawful behavior, nor is it an excuse or justification for violations of MIT policy. Notwithstanding that they may occur in connection with a hack, violations of MIT policies may still result in disciplinary action.”

On to the most visible part of hacking culture at MIT: the creation of hacks (or, as they are known to the outside world, ‘pranks’). Prominent examples include hacking the Great Dome: putting up a scale model of a Wright Brothers flyer on the 100th anniversary of the flight, a 48-unit weight cracking the Great Dome during finals week, and the famous MIT campus police car.

A hack should tap into MIT culture, inspire the community, and be a feat of engineering.

Liz walks us through a hack she participated in during 2006: putting a Firetruck on the Great Dome. Components:

  1. MIT culture connection: “drinking from a fire hose” is how students are welcomed in September
  2. Inspire community as a good memorial for firefighters on 9/11
  3. Feat of engineering: bigger than the police car

It starts with the Safety Office at MIT, which is responsible for evaluating all MIT hacks and then removing them. Students who want to hack the Great Dome look first at the State Board Building Regulations and Standards, which include a requirement that the structure withstand 90 mph winds. Checking this requires Mech E 101 statics knowledge. So step one is meetings to evaluate the hack’s safety.
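The kind of statics check implied here can be sketched with a back-of-the-envelope wind-load calculation. This is purely illustrative, not the analysis the hackers actually performed: it assumes a flat-plate drag model and hypothetical panel dimensions.

```python
def wind_force_lbf(v_mph, area_ft2, drag_coeff=1.28):
    """Drag force on a flat plate: F = 0.5 * rho * v^2 * Cd * A.

    Uses sea-level air density and a standard flat-plate drag
    coefficient; all dimensions here are assumptions for illustration.
    """
    rho = 0.0765 / 32.174      # air density in slugs/ft^3 (0.0765 lb/ft^3 / g)
    v = v_mph * 5280 / 3600    # mph -> ft/s
    return 0.5 * rho * v**2 * drag_coeff * area_ft2

# Hypothetical 12 ft x 6 ft side panel facing a 90 mph wind:
force = wind_force_lbf(90, 12 * 6)
print(f"{force:.0f} lbf")    # on the order of a ton of lateral force
```

Even this crude estimate makes the point: a dome-sized structure in design-limit wind sees loads measured in thousands of pounds, which is why the anchoring has to be worked out on paper before anything goes up.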

The firetruck hack had a budget of ~$757. Planning time was 3 months, and ~40 people were involved.

Hack planners need to motivate undergrads to work on the project. You have to find ways to make it easy to work on it. The easiest way is to offer dinner, which they have to eat anyway.

How did it work? The team followed the same process as for the police car: building a wooden frame and mounting the exterior of the vehicle on it. They got the truck body from a scrap yard owned by a friend of a friend and built and painted everything else themselves.

Putting a hack on the dome first requires a practice assembly to ensure it can go up quickly, silently, and in the dark. This hack took four practices to get under 30 minutes, assembling 57 big parts, the heaviest weighing 150 pounds and the largest measuring 12 feet long. In the interest of mystery and awe, Liz won’t tell us how they got the thing up 150 feet, but the CMC Rescue catalog was very important.

The hack went off successfully on September 11, 2006. It was so successful that the Firetruck Hack was left up for two days, which is a record for Great Dome hacks. The Safety Office cited that it was so well executed and documented that they could justify leaving it up.

Police and firefighters came from around Boston to see the hack, and the Boston Globe covered it as a fitting tribute to the firefighters of 9/11. The administration received it well because it worked; administrators are happy to take credit for things that go well.

The Good:
MIT students gain engineering skills: ingenuity, persistence, team motivation, project management, creative problem-solving, failure analysis. For many students, a hack is probably the biggest project they have worked on to date.

The Bad: 
Sometimes people get caught. Administrators don’t want to deal with it when things go wrong. Liz once received an email from a hacker after they were caught: “I am, in a word, terrified. I was arrested for hacking this past weekend. The three of us are now looking at felonies for participating in an activity for which MIT has built a physical museum.” The student’s charges were eventually dropped, but they suffered mental trauma and extreme anxiety.

MIT tradition celebrates hacking, but the policy punishes students when things go wrong. The Hacking Ethics say that if you are caught, you should go willingly and cooperate fully. That requires students to trust that the Institute will treat them fairly. You degrade the trust between students and the institution when tradition is abandoned in favor of the safety of executing policy.