Creating Technology for Social Change

Mapping Banned Books in Massachusetts: We Asked Every Public School And Library In The State About Banned Books, And Here’s What They Said

TLDR: This post describes the history, process, findings, and caveats of our recently conducted FOIA campaign covering 1,300 public schools and libraries in the state of Massachusetts. You should also check out the companion pieces from the Boston Globe, which gives an overview of the project, and from our research partner MuckRock, which dives more deeply into some of the more interesting results.

Background

As readers of the Civic blog may remember, last year I mapped banned books across America, using data published by the American Library Association. My goal was to learn something about trends and patterns in banned and challenged books. Did red states challenge more books than blue states? Rich more than poor? Had the types of titles or topics changed over time? Could you track the rise and fall of Harry Potter or Captain Underpants through records of reports?

This turned out to be a harder project than I imagined. The data was quite incomplete due to collection constraints, sampling biases, and modeling choices. The ALA has a very difficult (and very important) job: providing first-line support to librarians on every imaginable issue. The data in this archive was collected primarily through found media reports or through report forms sent in to the ALA’s offices. Banned and challenged book records are always underreported, and at variable rates: on the map I made, Texas has lots of records, but that’s in part due to a very active state ACLU, which produces a much higher reporting rate there.

This incompleteness seemed like an interesting problem to try to tackle. Now, all data are always already incomplete, and incomplete in particular ways. You can never have “all the data.” But you might design systems, processes, and procedures which allow you to capture different data. Comparing multiple data sets, modeled in multiple ways, usually provides the most compelling picture.

So here’s what I did. I sat down with Matt Carroll and Brian Mooney, two decorated veterans of the Boston Globe with long careers of journalism in the public interest, to talk about what data we might try to get, and how we might try to get it.

Process

Most libraries have a paper form, known as a “request for reconsideration or removal”, which any patron can submit to petition a library to (re)move a book from its shelves. In theory, any book challenged by patrons should have an associated document. In 2010, Missouri professor Charles Davis (recently appointed to head the UGA journalism school) served every public school district in Missouri with a FOIA request for these records as a class project, with interesting results. The Davis model seemed like an interesting one to try to emulate and extend in Massachusetts.

There was one problem with this plan: someone had to serve all 1,300 public schools and libraries in Massachusetts with a FOIA request (and keep track of the results). I wasn’t a professor with a class of students; I was a student in the middle of taking four classes, trying to write my thesis at night.

As they usually do, Matt and Brian had a lead. They connected me to MuckRock, a journalism startup based at the Globe which makes tools to help keep government accountable. MuckRock assists ordinary people in managing FOIA requests: filing them with the right agency, keeping track of responses, and nudging delinquent responders. Shawn Musgrave, the projects editor at MuckRock, was excited about the idea, and we soon started work.

The first thing we needed to do was tell MuckRock which institutions we wanted to contact and how to contact them. I hired Betsy Riley, a brilliant MIT computer science student and the daughter of a local librarian, as a UROP to help me gather the data. Matt tipped us off to the website of the Massachusetts Board of Library Commissioners, where you can search for publicly listed contact and administrative information for your local library (here’s mine). Betsy wrote a scraper which plowed through the entire directory and extracted all of the information we needed into a spreadsheet, which we then uploaded into MuckRock. We drafted a letter under the Massachusetts Public Records Law and hit send.
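
For the curious, here is a minimal sketch of what that scraping step looks like. This is an illustration rather than our actual code: the directory URL, the HTML structure, and the field names below are assumptions, since the Board of Library Commissioners site organizes its listings in its own way.

```python
# Minimal sketch of the directory scraper (not our production code).
# The URL and the CSS selectors below are illustrative assumptions; the
# real MBLC directory is organized differently and may change over time.
import csv

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://mblc.state.ma.us/directory"  # hypothetical listing URL


def scrape_directory(pages):
    """Walk the paginated directory and yield one contact record per library."""
    for page in range(1, pages + 1):
        resp = requests.get(BASE_URL, params={"page": page}, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Assume each library sits in its own <div class="library-entry"> block.
        for entry in soup.select("div.library-entry"):
            yield {
                "name": entry.select_one(".name").get_text(strip=True),
                "email": entry.select_one(".email").get_text(strip=True),
                "town": entry.select_one(".town").get_text(strip=True),
            }


def write_csv(rows, path="libraries.csv"):
    """Dump the scraped contacts into a spreadsheet we can hand to MuckRock."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "email", "town"])
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    write_csv(scrape_directory(pages=50))
```

A spreadsheet like this, with one row of contact information per institution, is what we uploaded into MuckRock so it could file and track a request with each one.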

Problems

Collecting data is a funny thing. You design and redesign your methods and models as carefully as you can, but as soon as you actually try to use them out there in the world, nothing goes according to plan. Here are some of the surprises we encountered:

  • Librarians who didn’t know what public records laws were
  • Librarians who knew what they were, but didn’t realize that an ordinary person like me could request records under them
  • Librarians who had complex org charts, which meant my request had to bounce between three or four different people / offices before it could be answered
  • Librarians who thought I was some kind of spam bot engaged in the world’s most boring phishing attempt

Mostly, however, our respondent librarians were as you would expect: highly intelligent, eager to help, and full of civic spirit. I fielded more than one phone call from librarians and administrators who were interested in the project, wanted to learn more, and asked if there was anything more they could do to help. Some sent me tips, connected me with additional support communities, and encouraged me onwards.

By far my favorite responses, however, came from the various librarians who told me they had no records to give me. Some of the librarians were frankly (and endearingly) indignant at the thought of removing a book from their shelves under patron pressure. These were the sorts of responses which made me extra proud to be a card-carrying member of my local library.

Findings

At this point our campaign is effectively over. Earlier today Shawn published a summary of the campaign in the Globe. Here’s some more information on our results with links.

We sent 1,277 requests. Of these, the vast majority had no responsive documents, which is to say no records of any challenges. Of those few which had records of challenges, only two had actually acted affirmatively on them: the Dighton-Rehoboth Schools, which voted to remove the Hunger Games from the elementary school library (but to retain them at the middle and high schools), and the Essex Elementary Schools, which voted to replace Girls Life Magazine with “the more age-appropriate Discovery Girls.”

As it happens, one of the libraries which had records of a challenge was my own beloved Watertown Free Public Library. But the substance of the challenge was not what I expected.

Our mental image of those who challenge books is often that of a primitive goon, a stereotype reaffirmed rather well recently by the Randolph County Board of Education when it stupidly decided to remove Ralph Ellison’s Invisible Man from high school shelves because it had “no literary merit.” By comparison, here in Watertown, I am happy to report that even our book banners demonstrate a remarkable level of intellectual and historical sophistication.

As do our librarians, who politely but firmly denied the request.

Caveats

Earlier in this post I mentioned some of the incompleteness in the ALA’s self-reported records. I want to stress that the results of this campaign are incomplete too, and incomplete in particular ways, some of which provide useful insight into both this project and the broader landscape of book censorship.

First, and perhaps most critically, the Massachusetts records retention guidelines for public libraries only require libraries to retain “complaint and censorship records” until the complaint is resolved. This is a bizarre hole, both generally as a matter of public policy and specifically for the purposes of this research. Any records we received were, by definition, above and beyond what the state requires public libraries to retain and make available. If there is one policy recommendation I have after this campaign, it is to have Secretary Galvin change this and require libraries to retain such records for a period of years for research purposes.

Second, as in the ALA self-reported data, our response rate was imperfect. 192 libraries did not respond to our request, and although they are required to by law, we don’t have the resources to litigate the cases. And even in those cases where we did get responses, we are essentially trusting the librarians to have responded accurately. I tend to trust librarians with this sort of information, but others may be more skeptical, and our data must be interpreted with that in mind.

Third, our campaign relied on written public records, which means we’re blind to incidents which were never recorded. As one librarian reminded me, censorship “can be more subtle than a written request to ban a book. Very often, a patron concern about a book never gets beyond a verbal discussion. Libraries develop written policies to point to why a particular title is in the collection.” And in our initial conversations, Brian Mooney was interested less in what was challenged than in what was never added to the library in the first place. We didn’t – and probably can’t – capture the discussions, standards, and procedures used to compose the library’s collection, and that’s where 99% of the filtering happens, as opposed to the 1% of patron challenges.

And I certainly don’t believe that the low rate of successful challenges in Massachusetts is any reason to stop being vigilant. Indeed, it is quite probable that the vigilance of Massachusetts bibliophiles is precisely why the rate is so low. As my colleague Barbara Jones, Director of the ALA Office for Intellectual Freedom, told the Globe, it is often the potential for bad press which keeps books on shelves (or, as in the recent case of North Carolina, restores them rapidly once word gets out).

Conclusion

Still, even with these caveats, I’m very proud of this project. As far as I know, our small team of people, with zero funding, just conducted the largest survey of book challenges in the history of anywhere, and perhaps one of the largest FOIA campaigns in the history of anything. Through this project we learned a little bit about book challenges in Massachusetts, but we learned even more about the structure of the system we had to navigate, from idiosyncratic public records retention requirements to bureaucratic school district organizational charts.

Our findings are incomplete, but all findings always are. What’s important is piecing together all the incomplete results, like scattered pieces of a torn treasure map, to get a better picture of the terrain you’re trying to navigate. And, with the help of a lot of terrific people at Civic, the Globe, and libraries across the commonwealth, I think we did that here.