We live in the age of the algorithm. Increasingly, the decisions that affect our lives -- where we go to school, whether we get a car loan, how much we pay for health insurance -- are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: Everyone is judged according to the same rules, and bias is eliminated. But as Cathy O'Neil reveals in this book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination: If a poor student can't get a loan because a lending model deems him too risky (by virtue of his zip code), he's then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a 'toxic cocktail for democracy.' Welcome to the dark side of Big Data.

Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These 'weapons of math destruction' score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set parole, and monitor our health. O'Neil calls on modelers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives.
Bomb parts: What is a model?
Shell shocked: My journey of disillusionment
Arms race: Going to college
Propaganda machine: Online advertising
Civilian casualties: Justice in the age of big data
Ineligible to serve: Getting a job
Sweating bullets: On the job
Collateral damage: Landing credit
No safe zone: Getting insurance
The targeted citizen: Civic life