erhardt | MIT Center for Civic Media

Recent blog posts by erhardt

Creating Learning Guides for Community Makers

On Saturday, October 26, 2014, Nathan Matias and I co-facilitated a session at Mozilla Festival on creating "Learning Guides for Community Makers" along with Gabriela Rodriguez, Janet Gunter (@JanetGunter), Linda Sandvik, and Vanessa Gennarelli.

The main goal of the session was to help participants create learning guides for other community-focused makers based on initiatives, projects, and workshops they have already organized, hosting them here:

youthcivictech.mit.edu

We are also interested in connecting practitioners together who are working at the intersection of code/data literacy, civic technology, and youth development. The effort was inspired in part by MIT Media Lab alumni projects like Young Activists Network (Leo Burd) and ScratchEd (Karen Brennan).

We kicked off the session by discussing "What do we mean by civic and community-focused making?" This proved an engaging topic, especially as we dug into my own definition and goals. I offered the idea that there are changes we would like to see in the world and that we would like more people to be in the business of making change, so it's important to support the growth of an inclusive Civic Tech movement. We debated whether a sense of membership in some kind of "civic tech movement" was a necessary part of community-focused making. We agreed that community focus was about both working within existing communities and building new communities through collaboration, forming and strengthening relationships with others.

Learning from Political Experiments and Information Cascades on Facebook

This is a liveblog of Lada Adamic's plenary keynote from Political Networks 2014.

When your friends deliver the news
Lada Adamic is a Data Scientist at Facebook and former associate professor at the University of Michigan's School of Information. Her talk is entitled "When your friends deliver the news." Using studies based on Facebook data, she invites us to think about factors of social networks that affect the spread of information.

She opens with a set of questions and concerns raised by her talk's title: what happens when your friends deliver the news through what they share on Facebook...

  • How is your exposure affected? (Your friends are not a random sample of the population nor are they mainstream media journalists.)
  • Does it affect political engagement? (How interested you are in politics and how likely you are to vote.)
  • What do social movements look like? How does success propagate?
  • How does information spread, and is it predictable?
  • Is it reliable?

Lada shows us the top five most shared stories from a year ago (i.e. June 2013):

  1. Drowning doesn't look like drowning (Slate)
  2. Boy's death highlights a hidden danger: Dry drowning (Today Show piece containing substantial misinformation)
  3. 22 Maps That Show How Americans Speak English Totally Differently from One Another (Business Insider maps that were later incorporated into the most-read NYT story of 2013)
  4. Edward Snowden: the whistleblower behind the NSA surveillance revelations (The Guardian)
  5. 8 Foods We Eat in the US That Are Banned in Other Countries (BuzzFeed)

She tells us that women in their 40s read the drowning pieces, whereas men in their 20s read the Snowden piece. There are definitely clusters of people more likely to read certain stories. But Lada asks: Is there a filter bubble? Do we get echo chambers, especially across political lines?

Using information from users' profile pages, Lada and her team rated people's political leanings: very liberal, moderately liberal, moderately conservative, or very conservative. They also coded the different news sites by the political leanings of the users who read them, from ThinkProgress on the far left to Fox News on the far right. They found that shared content skews liberal in aggregate across Facebook even though users' ideology is balanced; there is simply more liberal content shared by all users on the network. Lada cites Duncan Watts' Friend Sense app research and asks: can we understand what the ego network of a conservative user looks like?
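
To make that coding scheme concrete, here is a minimal Python sketch (my own illustration, not the Facebook team's actual pipeline) of how a news source could be assigned an alignment score from the self-reported leanings of the users who share it. The numeric scale, function names, and example data are all assumptions for illustration.

    from collections import defaultdict

    # Assumed mapping of self-reported leanings to a numeric scale,
    # from -2 (very liberal) to +2 (very conservative).
    LEANING_SCORE = {
        "very liberal": -2,
        "moderately liberal": -1,
        "moderately conservative": 1,
        "very conservative": 2,
    }

    def source_alignment(shares):
        """shares: iterable of (source, user_leaning) pairs.
        Returns each source's mean sharer leaning as its alignment score."""
        totals = defaultdict(float)
        counts = defaultdict(int)
        for source, leaning in shares:
            if leaning in LEANING_SCORE:
                totals[source] += LEANING_SCORE[leaning]
                counts[source] += 1
        return {s: totals[s] / counts[s] for s in totals}

    # Hypothetical shares: a source shared mostly by liberal users scores negative.
    shares = [
        ("thinkprogress.org", "very liberal"),
        ("thinkprogress.org", "moderately liberal"),
        ("foxnews.com", "very conservative"),
        ("foxnews.com", "moderately conservative"),
    ]
    print(source_alignment(shares))
    # {'thinkprogress.org': -1.5, 'foxnews.com': 1.5}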

People's friendships aren't limited to others who share their political beliefs. The team found that the distribution of friendships skews toward a user's own political leaning but still retains some balance across the spectrum. They then looked at interaction patterns between users and news across the political spectrum, breaking them down into four buckets of user-news interaction:

  • Potential: all of the news your friends are sharing
  • Exposed: what showed up in your news feed (a balanced diet of liberal and conservative news, even for conservative users)
  • Selected: what was clicked on (no effect: stories were clicked on in proportion to what showed up in the feed)
  • Endorsed: what was liked (this is where the difference appears: conservatives are much less likely to endorse liberal news)

They found that endorsement was the key difference between users of conservative versus liberal ideology. Conservative users were significantly less likely to endorse liberal news, even though they were served and even clicked to read liberal news at similar rates to the liberal users.
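
As a rough illustration of that four-bucket funnel, here is a hedged Python sketch of how one might compute the share of liberal-leaning content a single user encounters at each stage. The bucket contents, alignment scores, and numbers are invented for illustration and are not from the study.

    def liberal_share(sources, alignment):
        """Fraction of stories whose source has a liberal (negative) alignment score."""
        if not sources:
            return 0.0
        liberal = sum(1 for s in sources if alignment.get(s, 0) < 0)
        return liberal / len(sources)

    def funnel_report(buckets, alignment):
        """buckets: dict mapping each funnel stage to a list of source names.
        Returns the liberal share of content at each stage."""
        return {stage: liberal_share(buckets[stage], alignment)
                for stage in ("potential", "exposed", "selected", "endorsed")}

    # Assumed alignment scores for two example sources.
    alignment = {"slate.com": -1.0, "foxnews.com": 1.5}

    # Hypothetical funnel for one conservative user: the liberal share holds
    # steady through exposure and clicks, then drops at the endorsement stage.
    buckets = {
        "potential": ["slate.com", "foxnews.com", "slate.com", "foxnews.com"],
        "exposed":   ["slate.com", "foxnews.com", "slate.com", "foxnews.com"],
        "selected":  ["slate.com", "foxnews.com"],
        "endorsed":  ["foxnews.com"],
    }
    print(funnel_report(buckets, alignment))
    # {'potential': 0.5, 'exposed': 0.5, 'selected': 0.5, 'endorsed': 0.0}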

Research by Solomon Messing and Eytan Bakshy looked at and experimented with activity on Facebook prior to the 2012 election. The treatment involved adding more political news to certain users' news feeds. (Edit: Lada later clarified to me that the treatment involved adding more news content of all kinds to certain users' news feeds, not just political news. - EG, Oct 29 2014) They found that people reported being more interested in politics and government when, unbeknownst to them, they were getting more news in their feed. There was a greater effect for users who don't log in every day, since people who log in every day are more likely to read everything anyway. The treatment group also reported they were more likely to vote, again with a stronger effect among less regular users.
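
To show the shape of the subgroup comparison described above, here is a back-of-the-envelope Python sketch (not the Messing and Bakshy analysis itself) that estimates a treatment effect separately for frequent and infrequent users by comparing mean self-reported outcomes. Every name and number below is a placeholder.

    from statistics import mean

    def subgroup_effect(responses, subgroup):
        """Difference in mean outcome (treatment minus control) within a subgroup.
        responses: dicts with 'group', 'subgroup', and a 0/1 'interested' outcome."""
        treated = [r["interested"] for r in responses
                   if r["subgroup"] == subgroup and r["group"] == "treatment"]
        control = [r["interested"] for r in responses
                   if r["subgroup"] == subgroup and r["group"] == "control"]
        return mean(treated) - mean(control)

    # Invented survey rows, not the experiment's data.
    responses = [
        {"group": "treatment", "subgroup": "infrequent", "interested": 1},
        {"group": "treatment", "subgroup": "infrequent", "interested": 1},
        {"group": "control",   "subgroup": "infrequent", "interested": 0},
        {"group": "control",   "subgroup": "infrequent", "interested": 1},
        {"group": "treatment", "subgroup": "daily",      "interested": 1},
        {"group": "treatment", "subgroup": "daily",      "interested": 1},
        {"group": "control",   "subgroup": "daily",      "interested": 1},
        {"group": "control",   "subgroup": "daily",      "interested": 1},
    ]
    print(subgroup_effect(responses, "infrequent"))  # 0.5: larger effect
    print(subgroup_effect(responses, "daily"))       # 0.0: little to no effect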

Hacking Civics Education with Phillips Andover Students

On Wednesday, May 21, 2014, we hosted the Hacking Andover class, "an experiment in education for the digital age," comprising seniors from Phillips Andover Academy led by their teacher and head of school John Palfrey. We designed a two-hour block connecting creative learning at the MIT Media Lab with civic technologies and civics education.

Alex Anderlik, one of the Andover students, beat us to blogging about this workshop. Do check out his great Google+ story: MIT Media Lab: Hacking Class Field Trip

Civic Fiction: The Real Insidiousness of A Gay Girl in Damascus according to Molly Sauter

At Theorizing the Web this year, MIT Center for Civic Media alum Molly Sauter delivered a powerful paper on the idea of "civic fiction," using the case of A Gay Girl in Damascus (in which a white American man created a compelling fake lesbian Syrian blogger named Amina at the height of the Syrian resistance) to show how a fictional narrative co-constructed by a culturally homogeneous author and audience (in this case Western) can do problematic political work by amplifying an Orientalist narrative. The result is a feedback loop through a media ecosystem that thinks it's functioning as a bridge between narratives but is actually serving as an insidious mirror.

Her concepts of "civic fiction" and the "mirror figure" are important new constructs for civic media to wrestle with. At the Center for Civic Media, our standard "demo" slides feature an image of Mike Daisey holding an iPad with the caption "exaggeration and distortion," which we use as an example of why we need to be skeptical about the ways media is used for civic and activist purposes. In Daisey's case, his source material was actually theater, but it was dropped into a news context, a situation he has reflected on with respect to how truth is negotiated with the audience.* Often we think about these not as fictions but as little distortions that, in some cases, add up to propaganda. What's new about the Amina hoax, in the case Molly presents, is the possibility that we will all be in on it, unwittingly or not, with our biases confirmed. And we won't be able to fact-check our way out of one of these feedback loops, because the truth is inaccessible in a place like Syria. What if This American Life couldn't do the background research and produce a completely separate episode to retract Mike Daisey's "creative" version of the truth?

Below are my notes from Molly's talk, and you can also watch her deliver it thanks to the livestream capture. 
