Data for Black Lives (D4BL) is “a group of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people.” This is a liveblog from the opening panel for the D4BL 2017 Inaugural Conference. Liveblogging contributed by Rahul Bhargava and Catherine D’Ignazio. They apologize for any errors or omissions.
Yeshimabeit opens with a reminder that data and technologies have far too often been weaponized against black communities.
Yeshimabeit Milner – Moderator and co-founder of Data for Black Lives.
Cathy O’Neil – Data scientist and author of the book Weapons of Math Destruction.
Malika Saada Saar – Senior Counsel on Human Rights for Google.
Dr. Atyia Martin – Chief Resilience Officer for the City of Boston.
Purvi Shah – Co-Founder of Law for Black Lives.
Data is presented to us as facts. Cathy found in finance that data had been weaponized (during the last financial crisis). She left for data science, where she saw the same thing happening. Algorithms are opinions embedded in code.
Algorithms are predictions. They use data as input, but you've chosen which data to use and ignored other data. Then you train it for success (and you have to define success). She uses cooking as an analogy. When she decides what to cook for her kids she uses various criteria. For example, she disregards ramen noodles because she doesn't think they are food (her kids would disagree). She's in charge, so she gets to define success (i.e., whether the kids ate vegetables).
The point is: whoever’s in power gets to choose what is relevant data and what success looks like. You optimize for success over time. The definition of success matters a great deal. Behind the algorithmic math are people who define success, but don’t “share in the suffering of failure”.
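Cathy's cooking analogy can be sketched in a few lines of code (our illustration, not hers; the meals and their scores are invented): the same data, run through the same optimizer, yields a different "best" answer depending on whose definition of success is plugged in.

```python
# A toy sketch of "whoever defines success picks the outcome".
# All meals and scores are invented for illustration.
meals = {
    "ramen":    {"kids_happy": 0.9, "ate_vegetables": 0.1},
    "stir_fry": {"kids_happy": 0.5, "ate_vegetables": 0.9},
    "pizza":    {"kids_happy": 0.8, "ate_vegetables": 0.2},
}

def best_meal(success_metric):
    """Optimize over the same data; only the success metric changes."""
    return max(meals, key=lambda m: meals[m][success_metric])

print(best_meal("kids_happy"))       # the kids' definition of success
print(best_meal("ate_vegetables"))   # the parent's definition of success
```

Same inputs, different "optimal" dinner: the power sits entirely in the choice of metric.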
We have had a long history of racist police policy based on broken windows policing. The explicit goal was to arrest people on small charges, which would theoretically reduce worse crimes in the future. But what that looks like is nonviolent crime charges concentrated in black neighborhoods. We do not have crime data. I want to make that point again. We do not have crime data. We have arrest data. We have an enormous amount of missing data on white crime. Black people get arrested five times as often as white people for marijuana possession, whereas usage rates are the same.
We’re not predicting crime – we’re predicting policing.
A report by Human Rights Data Analysis Group shows that Predictive Policing Reinforces Police bias. This explores the difference between an actual crime versus an arrest as a data point. Predictive policing is perpetuating the cycle of broken windows policing even for places where they say they are not pursuing that strategy any more.
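That arrest-versus-crime distinction can be illustrated with a toy simulation (our sketch, not HRDAG's model; all numbers are invented): two neighborhoods with identical true crime rates, where patrols – and therefore new arrests – are allocated in proportion to past arrests. The initial disparity in the arrest data never washes out, even though the underlying crime is equal.

```python
# Toy feedback-loop sketch: equal true crime, but patrols follow past
# arrests, and new arrests only happen where patrols are sent.
true_crime_rate = 1.0            # identical in both neighborhoods
arrests = {"A": 60, "B": 40}     # an initial disparity in *arrest* data

for _ in range(10):
    total = arrests["A"] + arrests["B"]
    for hood in arrests:
        patrol_share = arrests[hood] / total   # "predictive" allocation
        # New arrests reflect patrol presence, not the (equal) crime rate.
        arrests[hood] += 100 * patrol_share * true_crime_rate

share_A = arrests["A"] / (arrests["A"] + arrests["B"])
print(round(share_A, 2))  # still 0.6 -- the disparity is perpetuated
```

The model never observes crime, only its own past outputs, so the 60/40 arrest split is locked in forever: predicting policing, not crime.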
Recidivism risk algorithms are being used to get rid of the bail system. This is not simple. They are used in sentencing, parole, and bail. People are assessed on their risk of getting arrested, failing to appear for court, etc. Failing to appear might have legitimate reasons (like picking your kids up from school) – and the risks are higher for some people.
The questionnaires used for these assessments contain questions that are proxies for race and class. "Any prior convictions?", "Do you think the system is rigged against you?", "Did your father go to prison?", "Have you been suspended from school?" – all of these are proxies for asking about race. Many of these questions would be unconstitutional in an open court, but they are used against you in a risk assessment. This creates a pernicious feedback loop of its own: you get a longer sentence if your risk score is higher.
These risk scores were created to fight human bias. Judges are also racist. But unfortunately the idea of objective, scientific risk scores is not working.
Cathy O’Neil’s pernicious feedback loop caused by unjust algorithms.
We need to instill the concept of ethics into data science trainings. Cathy asked the creator of one of these algorithms if he used race. He said no. She asked if he used zip code. He said yes. The data scientists don't feel any responsibility for how their models are used.
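A hypothetical sketch of why "we don't use race" rings hollow (the zip codes, counts, and scores here are all invented): residential segregation makes zip code a near-perfect stand-in, so a score built only on zip code can still split cleanly along racial lines.

```python
# Invented illustration: race never appears in the scoring formula,
# yet the scores separate people by race via the zip-code proxy.
people = [
    {"zip": "10451", "race": "black"},
    {"zip": "10451", "race": "black"},
    {"zip": "10021", "race": "white"},
    {"zip": "10021", "race": "white"},
]

# Historical arrest counts by zip -- themselves a product of biased policing.
arrests_by_zip = {"10451": 90, "10021": 10}

for p in people:
    # The formula only sees zip code, never race.
    p["score"] = arrests_by_zip[p["zip"]] / 100

print([p["score"] for p in people])  # [0.9, 0.9, 0.1, 0.1]
```

Dropping the `race` column changes nothing: the proxy carries the same signal.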
She says we should have a Bill of Rights for Data in this country that would explain how these scores work and are being used against us.
Malika Saada Saar
Recently Malika brought leading women's rights defenders to Google to talk about tech's role in gender violence. While talking about tech's impact on women's lives, one advocate said she felt like part of the "resistance" fighting the Empire, but while doing that the Empire was building another Death Star.
In civil society, government and more we have done the work to demand equality. We might not have succeeded yet, but we know where to go to demand accountability. Tech is a new circle of power; a new ecosystem of abuse, violence, subordination, and exclusion. We have to see this, name it, and hold it accountable.
The other side is that we can use this to advance our rights. BLM started as an online love letter. The moment of naming and shaming violence against women is only happening because of social media. #MeToo is what has allowed for this powerful echo chamber for naming and shaming what has been done to our bodies as women and girls. Social media has done more for representation of black and brown folk than the Motion Picture Association. We have to recognize that we can use this space for purposes of mobilization and the advancement of our rights.
Malika was trained as a human rights lawyer. Being at Google has been an opportunity to do that work. Her training focused on documenting abuses, so the world knows what is happening. At Google she can document in a way she never imagined before.
You have these smartphones. You can use them to bear witness, to document human rights abuses. Whether you are documenting them in Ferguson or Uganda. You take the digital evidence of those abuses and share them on these global platforms. It is absolutely what has changed the conversation around police brutality.
So much of how abuse happens is in the context of isolation and silence. Almost every genocide, every rape, happens in the context of isolation and silence. How do we use these technologies to surmount the silence and isolation? That is the power and promise of these technologies. The proof of it is in what we have seen around police brutality.
Malika saw another manifestation of this with women in Rwanda. They used smartphones to document abuse at the hands of their husbands and then showed the videos to the judges. We can use these technologies for the purposes of taking back power and protecting ourselves.
Google has been thinking about how to use VR to "scale the prison walls". Abuse in prisons can happen because everyone else is outside the walls. They created an immersive experience of solitary confinement. For four minutes you are in it, and it is narrated by people who have been there. Now they are doing it for girls behind bars. Using technology in this way to bear witness is powerful… and there is danger. After four minutes we have the privilege to take the headset off and lay it down.
There are ways that we will use these technologies, reimagine them, to advance our rights. There are also ways that we have to make sure that, not only do we take what others have created for our own purposes, but also we are in the rooms to create the products themselves – for our communities, for our rights, for our safety. But it’s also about how we name and recognize problems. We must hold this new space accountable – from the same place of insistence on equality and dignity that we have done in every other circle of power and privilege.
Dr. Atyia Martin
This conference has the power to impact how we change the struggle. This is an opportunity for us… so there is hard work involved. Dr. Martin is going to connect this to the resilience plan for Boston. The strategy has been launched, and it embeds racial equity, social justice, and social cohesion. We have to ask who benefits and who is harmed by a given policy or approach. How can we address this as an opportunity to work on these social justice domains? We have to see the multiple benefits this strategy can bring.
When we talk about racism we don’t always have the same framing and starting point. Our definition used in our resilience strategy is a strong academic definition but I want to share another one:
A historically rooted system of dehumanizing power structures and behavior based on ideologies that reinforce the superiority of white people and the inferiority of people of color while harming both. It is embedded in all of us. We are conditioned to adopt the behaviors that fuel racism as a continuous process.
Dr. Martin has been working and re-working this academic definition for seven years. The definition is important for understanding the context of the data we work with. Data is biased. It is human, and bias is part of the human condition. We live in a society that constantly presents messages about who people of color are. How do we turn data into information? Data are the pieces that we put together into the story we call information. It starts from collection, then how we collate and synthesize, and then how we leverage that to make decisions that impact millions of people. Not just in terms of policy – if racism is embedded everywhere, then this applies across the board. We often leave this part off: what is our personal responsibility as individuals?
We have to work on ourselves if we want to call ourselves social justice warriors. You can’t walk someone else through a process you haven’t gone through.
Our definition of racism has created two challenges. On the one hand it says that you are a good person if you point out racism. And you’re a bad person if you get that pointed finger and you are a racist. So we miss the complexity. It’s disrespectful to the struggle and to the people who are living it everyday.
In our culture both people of color and white people all “drink the same Kool-aid” – people of color are taking in the same media as everyone else. We have to manage the underlying things as well. In most cases we haven’t done the work to think about where that comes from. What does that mean for how I navigate the world?
The world assumes we're happy to talk about racism every day, because we live it. We don't have the language for that yet, but data can help us do this.
The other piece is white people who have internalized the other side of it. She wants to contribute some framing around the concept of power which we have been talking about. What does power actually mean?
Who gets to make decisions?
Who gets to allocate resources?
Who gets to establish the norms and standards?
Who gets to decide how we paint the picture of what’s happening right now?
Who is to blame and who are the victors?
Who gets to decide the history?
Who has the time to engage, to attend meetings? (Time itself is a form of power. People of color are more likely to live in food deserts, more likely to have long commutes.)
Relationships are one of the biggest ways we perpetuate racism and other forms of social injustice. Who we decide to have over for dinner matters. Everything in our society is based on our personal relationships. How can we leverage data to show this racism, classism, sexism?
When you change the way you look at things, the things you look at change. – Wayne Dyer
Purvi Shah

Purvi comes from the world of law. Law for Black Lives is a sister project of Data for Black Lives. She deferred going to law school because she got engaged in organizing, but the organizers said no – we need you to understand the law.
She had a hard time the first year. She thought she was going to law school to learn about justice, but what she ended up learning about were the rules and regulations of oppression. Law has always codified injustice and oppression. Purvi's first example of this is Johnson v. M'Intosh, the case that created the concept of property rights… a case about a white person taking land from a Native person.
The second story was August 9, 2014, when Michael Brown was shot and his body lay in the street in Ferguson. She was a lawyer for the Center for Constitutional Rights, glued to Twitter and television, watching the protests happening. She marched into her office and asked what they were going to do. At that time, their organization was suing the President, the Pope, and the NYPD, so they were pretty busy. Her boss said they were busy, but why don't you go to Ferguson.
She went there a few days after the killing. In Ferguson she noticed that the media was covering the story as if it was all looters, but she saw parents, veterans, and more marching each day as a ritual. Documenting what was happening online, she remembers seeing a family all in orange shirts. They were in town for a family reunion, and just as she was about to tweet about them, they were all teargassed – babies, grandmas, and more. Purvi reminds us that this is the history of how black people are treated when they protest the killing of a family member. "There is an indifference to my life" is the attitude this builds. That's why it is important to assert that "Black Lives Matter".
This was a turning point for her as a lawyer. There was an execution-style killing of Mike Brown, and then hundreds of people were being arrested. One of the challenges in this situation was that lawyers did not want to stand with the community. They needed hundreds of lawyers and didn't have enough to assist everyone. So what they ended up doing was building the Ferguson Legal Defense Committee. They started connecting to lawyers across the country, started small, did daily calls, and shared briefs.
Post-Ferguson, there was the grand jury announcement in 2014. Then there were rallies the day after that event which sparked national protests. They saw a role for lawyers operating within the system that was creating all these problems. Then in July, 2015, they hosted Law for Black Lives. They wanted to have a turning point for the legal community. They had 1000 people that were interested and hosted 2 days of conversation about the role of the law in creating the world in which black lives matter.
They discussed policing, the environment, co-ops, and more. Law for Black Lives has mobilized lawyers from Charlottesville to Charleston, supporting victims and organizing. They organize lawyers towards solutions as well. At this point they have 3500 members. Over 20 have worked collaboratively to build this over the last 3 years.
Purvi Shah talks about “Movement Lawyering” and asks what are the ways that lawyers partner with the people most impacted by social problems and take an explicit, non-neutral, values-based position in their work.
Purvi wants to share some lessons from her work for people in the room. She argues that, like data, law has a veneer of neutrality; a false neutrality.
Movement lawyering is about connecting law to social movements (building on a long history). This hasn't always been identified as a clear strategy, with supporting theory and practice. What would it mean to be a "movement data" person? For them it meant partnering with the people. This is about creating an atmosphere to support people functioning and moving forward (a quote from Arthur Kinoy).
“It’s not about winning cases, it’s about shifting power,” says Shah.
Solve the problems people want you to solve. The Vision for Black Lives documents this comprehensively. People have tons of ideas for this community to take up – predictive sentencing, homelessness data, food justice. Purvi encourages us to start there.
She offers a couple of rules: 1) There should be no rogue agents, and 2) There should be no savior complexes. People in this room have a lot of expertise. How do you offer your expertise in partnership with existing groups?
“Partnership means going at the speed of trust.”
She calls on this movement to center black leadership – the folks living at the most intersections and the margins. Privilege is complex – this is not about just values and ideas, but also about strategy. People who have the lived experience are the ones most likely to see the solutions.
Emotions >= data. Lawyers, like data people, are very analytical. Both are trained to be neutral. But emotions are a data point; both for the communities we work in and for ourselves. We need to create space for the trauma we feel, and the secondary trauma of experiencing the communities’ trauma.
A Bill of Rights is great but what about a Code of Values? Law for Black Lives has a code of values on their website. They believe in democratizing the law, for example.
We have to: Change ourselves, change our work, shift our institutions and shift our fields.
What is the main goal for Data for Black Lives over the next few years?
Yeshimabeit: One of the things they are focusing on is building out this network and relationships with the people here and nationally.
Cathy: I joined Occupy in 2011 – that was 6 years ago, and we still meet every week. Most people don't know that Occupy still exists, but that's not the point. The people I met then are now working for Senators, integrating themselves into the systems. You build a network, make strong bonds, and then you keep going. It has a certain goal and mission.
Atyia: We need to re-think a lot of things. We need to reframe how we are thinking about the world based on research and information, and remember our critical thinking and bias-awareness skills. We need to re-intellectualize – we have become so fractured around different interest areas rather than big-picture goals. We get stuck on strategies versus outcomes. We need to put a mirror on ourselves on a regular basis. Research shows that the smarter you think you are, the more dangerous you are in terms of your bias. The Northeast region sees a lot of well-meaning folks who haven't done the work on themselves.
Purvi: Center yourself on things that change conditions for people suffering right now. Work on the problem and the underlying thing. You have to keep people alive while working on the system.
Malika: How do we make sure that there is a constant dialogue between Google, Facebook, and Twitter and what you are doing? It's not just an issue of access. How do we unlock these spaces for criminal justice defenders, for example? The other intersection that has to play out: there's a divide between rights defenders and tech. What's the intersection between those groups? If they are not talking about predictive policing, then they are not doing their work. There's a difference in knowledge and language. A lot of it is generational. Ella Baker would have said: it's the younger folks who know how to bring a movement forward.
Cathy spoke about proxies for race and class. Every place has proxies for certain modes of oppression. How can we work towards identifying and exposing these proxies in lots of fields?
Cathy: I loved finance, and I started with Occupy. I thought the weaponization of math was a finance problem. She had previously been deciding who got comparison advertising when they searched for flights on travel sites. Then a venture capitalist came, and his vision for the future of ads was that he would see ads for flights to Aruba and not ads for the University of Phoenix, because that "was not for people like me." She was ticked off, and started looking at predatory online ads from trashy, private higher-education institutions like U of Phoenix. I felt that I was complicit in a technology that was making people suffer – for the second time, after being in finance. I'm creating a system that I personally will not suffer from but others will.
Cathy loves her luxury yarn advertisements, but on the other side it is predatory – payday lenders and for-profit colleges. It is ruining people’s lives and we call it a service.
Malika: Within human rights and civil rights community, we don’t know this. It’s a form of rights abuse that we don’t understand. There’s a real need for folks like you and the civil rights communities to be in dialogue to map these things out.
Google was asked by the civil rights community to pull down payday loan ads. They mobilized to bring Google to the table and explain it. Google made the decision to take them down. Right now there is a conversation about bail bonds. There might be more violations of civil rights.
Atyia: What was just described by Mildred – the idea that policies and practices have disproportionate outcomes for people of color – is old history. This is the historical context. Every issue has a historical story for why we see the problems today.
The Social Security Act of 1935 did this. They wanted to give every citizen access to money. But the “fine print” didn’t allow for agricultural workers and domestic workers (“These are proxies!” chimes in an audience member). These started off intentionally. This is not new, it just comes in new forms.
Purvi: The new piece is that our data is being collected. How you interact on social media is being used to identify you. Everyday people have to understand what’s happening here. Most of this is happening in non-transparent ways. How can we shift the ethics and values about how this is done. We can’t bring this down from the top levels.
Malika: People of color and women have to be in the room as designers and creators.
Cathy: I agree with all that but I want to add something which is that this is about power but it is also happening in an extremely secretive environment. We have no access to weigh those algorithms, test them, see if they are wrong, see how they influence people. It’s not exactly the same thing. It is historically embedded, but the tech has made it possible for people that have power to have even more power.
Atyia: The vehicle is new, but the methodology is ancient. This is important for data scientists, because you don’t need to come up with things from scratch.
Malika: The tech companies talk about being justice-driven. Tech has stood up around things like the travel-ban, the bathroom-ban, DACA and more.
A community organizer from Newport News, VA, who works with youth: Working with the black community you work on lots of issues – mental health, economics, environmental justice, and more. Is there a toolkit to train us and the youth on how to identify what data is important and what we need? The questions we have might not point us to the data we need to find the trend and disrupt the norm.
Cathy: This is important, and hard to answer. Different fields have different forms of evidence gathering. “Big data” is mostly online data. The kids are being surveilled by the big tech companies. They can go “incognito mode” to protect themselves. Videos and documentation are important. The ACLU tool can immediately live stream police interactions.
Purvi: With Law for Black Lives we created chapters. How do you organize yourselves in the local communities and build bridges to people being impacted? How to democratize and build toolkits? There are pieces that are very complex. But how do we take democratization as far as possible? What’s the connectivity point in this room to make that toolkit?