Book Club: Weapons of Math Destruction


Our WiD monthly book club discussed Cathy O’Neil’s Weapons of Math Destruction last week, and the thought that stayed with me most was: “Is this the way AI is going to break us?” For those of you who haven’t read it yet, the titular weapons are algorithms or programs that are opaque and cause unfair disadvantages for many people at once. An algorithm earns the title “weapon of math destruction” through the scale, secrecy, and destructiveness of the harm it inflicts.

So many topics came up during our conversation: ethics, best practices, and commentary on society at large, but what we were most concerned with was: how do we move forward from here? Big data hasn’t been around very long; from O’Neil’s perspective, it’s only in the past couple of decades that flawed data models exerting control over many people at once have been able to inflict their damage.

So what is next? Awareness is key, surely, but beyond that, have we as a society decided how best to handle these data-driven discrepancies? Do we even understand yet how deep and wide the problem really is? O’Neil’s work, along with that of other activists, paints the severity of the harms automation can create and helps design future models that minimize unintended harm for the common good.

We have seen strides in data governance in recent years as the private sector has legitimized the importance of managing data and algorithms from a legal perspective, which helps. Democratizing data models, and communicating clearly how certain models are weighted, whom they reward and, most importantly, whom they punish (and how), is crucial to keeping this conversation going. Is the data communicating insights in a way that’s accessible to all? Along with democratizing data comes the question of who is best entrusted to be its stewards. In a recent case, Boston declined to use a facial recognition program in its public spaces, citing concerns about ethics and safety:


"It has an obvious racial bias and that's dangerous...it also has sort of a chilling effect on civil liberties. And so, in a time where we're seeing so much direct action in the form of marches and protests for rights, any kind of surveillance technology that could be used to essentially chill free speech or ... more or less monitor activism or activists is dangerous".  -Councilor Ricardo Arroyo, who sponsored the bill along with Councilor Michelle Wu

Geo-tracking, for example, is pervasive, and a myriad of programs can now see where you are in a grocery store or match a photo a stranger takes of you in public with your personal information. How often should we give permission? Who can scrape this data, and for what personal or commercial uses? Recently there was controversy over OkCupid, Tinder, and Grindr customers’ dating preferences and location data being scraped and used by advertising and marketing companies in ways that may violate privacy laws.

Then the conversation broadened into the issue of ethics. Data scientists in master’s programs and (most) boot camps and MOOCs are taught something about ethics alongside A/B testing and experimental design. Is this enough? Probably not, but it’s hard to say. Misused data, fraudulent statistics, and confirmation bias are problems even with the best of intentions; observers change the data they study to some degree. Then we get to the worst of intentions, as O’Neil warns against the dangers of lobbyists who protect the companies that pay them on issues of data privacy and weapons of math destruction.

“As far as I’m concerned we are in a war. And on the other side are the lobbyists. And we are totally outflanked.” - Cathy O’Neil

We are witnessing a wild west in the data world right now. Regulation is only now being drafted to govern this new facet of human life and, frankly, lawsuits will likely be one way this future is defined. From public school teacher assessments to sports teams to online shoppers to your social media, the private sector and governments will come together to define just where those boundaries really are.

Together we will need to focus more strongly on compassionate algorithms. If weapons of math destruction are opaque and unfair algorithms that harm many people and exacerbate inequality, sexism, and racism, then why can’t we build algorithms that direct public resources where they are most needed: housing, education, upskilling, mutual aid networks, healthcare, and going green? Building resources for the future will be crucial, considering we’re in a recession that has seen the quality of life for many people dwindle. “Data for Good” is a phrase I hear at countless events. If weapons of math destruction are causing harm, then data models for nonprofits, social causes, and non-governmental organizations, models that aren’t performative but tangibly help the well-being of many people at once, are just where we need to focus our efforts.

We joked at one point during our meeting that we need a data “Erin Brockovich.” If any of you out there are reading this and thinking, “Could this be me?”, the answer is: probably. We need passionate people who care about data ethics now more than ever! Cathy O’Neil dreams of a “Bill of Rights” for data. If we were to write such a thing, what would we put in it? Which markers of transparency and autonomy matter most to us as private citizens? My feeling is that when we get closer to answering that question, we will also be closer to a more equitable data culture.

May the odds be ever in our favor.

Irene Bratsis 

https://www.nytimes.com/2020/01/13/technology/grindr-apps-dating-data-tracking.html

https://www.wbur.org/news/2020/06/23/boston-facial-recognition-ban

https://medium.com/data-feminism/5-questions-on-data-justice-with-cathy-oneil-87f42355ce55
