Algorithms as WMDs

For data sensemakers and others who are concerned with the integrity of data sensemaking and its outcomes, the most important book published in 2016 was Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, by Cathy O’Neil. This book is much more than a clever title. It is a clarion call of urgent necessity.

[Cover of Weapons of Math Destruction]

Data can be used in harmful ways. This fact has become magnified to an extreme in the so-called realm of Big Data, fueled by an indiscriminate trust in information technologies, a reliance on fallacious correlations, and an effort to gain efficiencies no matter the cost in human suffering. In Weapons of Math Destruction, O’Neil reveals the dangers of data-sensemaking algorithms that employ statistics to score people and institutions for various purposes in ways that are unsound, unfair, and yes, destructive. Her argument is cogent and articulate, the product of deep expertise in data sensemaking directed by a clear sense of morality. Possessing a Ph.D. in mathematics from Harvard and having worked for many years herself as a data scientist developing algorithms, she is well qualified to understand their potential dangers.

O’Neil defines WMDs as algorithms that exhibit three characteristics:

  1. They are opaque (i.e., inscrutable black boxes). What they do and how they do it remain invisible to us.
  2. They are destructive. They are designed to work against the subjects’ best interests in favor of the interests of those who use them.
  3. They scale. They grow exponentially. They scale not only in the sense of affecting many lives but also by affecting many aspects of people’s lives. For example, an algorithm that rejects you as a potential employee can start a series of dominoes in your life tumbling toward disaster.
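The self-reinforcing dynamic behind these characteristics can be made concrete with a toy simulation (my sketch, not an example from the book): an opaque score gates access to some resource, and rejection worsens the very inputs the score is computed from, so early losers fall further behind through no change in their own merit.

```python
import random
import statistics

def simulate_feedback_loop(rounds=10, n=1000, threshold=600, seed=42):
    """Toy model of a WMD-style feedback loop.

    Everyone starts with a score drawn from the same distribution.
    Each round, the (opaque) model approves anyone at or above the
    threshold, which improves their circumstances and raises their
    next score; rejection degrades circumstances and lowers it.
    The numbers (threshold, step size) are arbitrary illustrations.
    """
    rng = random.Random(seed)
    scores = [rng.gauss(threshold, 50) for _ in range(n)]
    spread_before = statistics.pstdev(scores)
    for _ in range(rounds):
        scores = [s + 5 if s >= threshold else s - 5 for s in scores]
    spread_after = statistics.pstdev(scores)
    return spread_before, spread_after

before, after = simulate_feedback_loop()
# The dispersion of scores grows even though no one's underlying
# merit changed: the algorithm manufactures the inequality it measures.
assert after > before
```

The point of the sketch is only that the widening gap is produced entirely by the scoring loop itself, which is the pattern O’Neil describes across lending, hiring, and insurance.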

O’Neil identifies several striking examples of WMDs in various realms, including evaluating teachers, identifying potential criminals, screening job applicants and college admissions candidates, targeting people for expensive payday loans, and pricing loans and insurance variably to take advantage of those who are most vulnerable.

During the Occupy Wall Street movement, following the financial meltdown that was caused in part by WMDs, O’Neil became increasingly concerned that the so-called Big Data movement could lead to harm. She writes:

More and more, I worried about the separation between technical models and real people, and about the moral repercussions of that separation. In fact, I saw the same pattern emerging that I’d witnessed in finance: a false sense of security was leading to widespread use of imperfect models, self-serving definitions of success, and growing feedback loops. Those who objected were regarded as nostalgic Luddites.

I wondered what the analogue to the credit crisis might be in Big Data. Instead of a bust, I saw a growing dystopia, with inequality rising. The algorithms would make sure that those deemed losers would remain that way. A lucky minority would gain ever more control over the data economy, raking in outrageous fortunes and convincing themselves all the while that they deserved it.

WMDs are misuses of computers and data. She writes:

WMDs…tend to favor efficiency. By their very nature they feed on data that can be measured and counted. But fairness is squishy and hard to quantify. It is a concept. And computers, for all of their advances in language and logic, still struggle mightily with concepts…And the concept of fairness utterly escapes them…So fairness isn’t calculated into WMDs. And the result is massive, industrial production of unfairness.

WMDs are sometimes the result of good intentions, and they are passionately defended by their creators, but that doesn’t excuse them.

Injustice, whether based on greed or prejudice, has been with us forever. And you could argue that WMDs are no worse than the human nastiness of the recent past. In many cases, after all, a loan officer or hiring manager would routinely exclude entire races, not to mention an entire gender, from being considered for a mortgage or a job offer. Even the worst mathematical models, many would argue, aren’t nearly that bad.

But human decision making, while often flawed, has one chief virtue. It can evolve. As human beings learn and adapt, we change, and so do our processes. Automated systems, by contrast, stay stuck in time until engineers dive in to change them…Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that’s something only humans can provide.

This book is more than an exposé. O’Neil goes on to suggest what we can do to prevent WMDs. It is vitally important that we take these steps, and that we start now. Many of the industrial revolution’s abuses were eventually curtailed through a heightened moral awareness and thoughtful regulation. Now is the time to clean up the abuses of algorithms much as we once cleaned up the abuses of slavery and sweatshops. I heartily recommend this book.

Take care,


One Comment on “Algorithms as WMDs”


By Nick Desbarats. January 3rd, 2017 at 7:04 am

I emphatically agree that this book is a must-read, particularly for anyone who’s actually working on algorithms of the types described by O’Neil. I’ve already recommended it to several of my contacts who fall into that group. I suspect that many of them already have some vague sense that the algorithms on which they’re working might not be the unbiased, pure oracles that they and their colleagues assume them to be, but the book does a great job of bringing that fact into jarringly sharp focus.