Algorithms are everywhere. They are what advertisers use to target users online and what search engines use to cough up all those results in a particular order. Even governments feed the data they collect into algorithms that then track, flag, or analyze whatever it is they are looking to monitor.
But there’s a growing fear that these algorithms are learning stereotypes and thereby abetting data discrimination. Some algorithms, for instance, make assumptions about an individual’s ability to repay debt based on race. Vast amounts of data go into these “black box algorithms,” as they are known, and the results they produce are often discriminatory.
“I call it a black box because we don’t have access to these sorts of algorithms,” says Frank Pasquale, a University of Maryland professor of law.
The algorithms produce results based solely on the data fed to them, but the trouble is that no one knows exactly how they crunch that data. Yes, algorithms are racist, Pasquale says, but they are also “reflecting the preferences of thousands and possibly millions of users.”
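To see how supposedly neutral inputs can smuggle bias back in, here is a minimal, hypothetical sketch (entirely synthetic data, not an example from Pasquale’s book): a lending model is trained without the protected attribute, yet a correlated proxy such as zip code lets it reconstruct the bias baked into its historical labels.

```python
# Hypothetical illustration: a model trained only on "neutral" features
# still reproduces discrimination via a proxy. Synthetic data throughout.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (e.g., race), never given to the model directly.
group = rng.integers(0, 2, n)

# Proxy feature: zip code correlates strongly (80%) with group membership.
zip_code = np.where(rng.random(n) < 0.8, group, 1 - group)

# A genuinely predictive feature: income, identically distributed in both groups.
income = rng.normal(50, 10, n)

# Historical repayment labels carry past bias: group 1 fared worse
# for reasons unrelated to income.
p_repay = 1 / (1 + np.exp(-(income - 50) / 10)) - 0.15 * group
y = (rng.random(n) < np.clip(p_repay, 0, 1)).astype(int)

# Train only on income and zip code -- the proxy sneaks the bias back in.
X = np.column_stack([income, zip_code])
model = LogisticRegression().fit(X, y)

scores = model.predict_proba(X)[:, 1]
print("mean approval score, group 0:", scores[group == 0].mean().round(3))
print("mean approval score, group 1:", scores[group == 1].mean().round(3))
# The gap persists even though 'group' was never a model input:
# the zip-code column lets the model reconstruct it from the data.
```

Dropping the proxy column would help in this toy case, but in real datasets many features correlate with protected attributes at once, which is why critics like Pasquale focus on scrutinizing results rather than trusting inputs.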
He sees this as a problem because it’s likely to influence even those who don’t buy into such stereotypes; they may start thinking like the algorithm. He recommends something akin to “an anti-discrimination type of approach.”
If it’s true that we can never know how these algorithms work, then we must not allow certain results, he says. “We need to move beyond saying we just reflect what people think,” Pasquale says, “and make them [algorithms] more progressive.”
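One concrete form such an approach could take is an outcome audit: rather than trying to pry open the black box, measure whether its decisions fall disproportionately on one group. The sketch below is illustrative only; the four-fifths threshold is borrowed from U.S. employment-law practice on adverse impact, not from Pasquale’s book.

```python
# Hypothetical outcome audit: flag a model whose favorable-decision rates
# differ too much across groups, without inspecting its internals.
import numpy as np

def disparate_impact_ratio(decisions: np.ndarray, group: np.ndarray) -> float:
    """Ratio of favorable-outcome rates between groups 0 and 1."""
    rate_0 = decisions[group == 0].mean()
    rate_1 = decisions[group == 1].mean()
    return min(rate_0, rate_1) / max(rate_0, rate_1)

rng = np.random.default_rng(1)
group = rng.integers(0, 2, 1000)
# Hypothetical model decisions that favor group 0 (1 = approved).
decisions = (rng.random(1000) < np.where(group == 0, 0.7, 0.5)).astype(int)

ratio = disparate_impact_ratio(decisions, group)
print(f"disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:  # the conventional four-fifths threshold
    print("FLAG: outcomes differ markedly across groups; investigate.")
```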
As reviewer David Wheeler wrote: “If you want to understand how the next couple of decades are going to unfold, this book is required reading. I find it baffling that people get upset over NSA surveillance, but they don’t mind being spied on by companies using secret algorithms that negatively affect them in their day-to-day lives. Pasquale points out the many ways that companies secretly use data against you, and the whole system is shrouded in secrecy.
“Are you okay with getting charged more for a product because you live in a certain zip code? Are you okay with retail stores using algorithms to discover that you’re probably pregnant (and marketing things to you accordingly)? Are you okay with facing a higher credit card interest rate because you signed up for marriage counseling, thus making you more of a ‘risk,’ according to a secret algorithm? There’s no end to the unethical ways data can and will be used against us: ‘rape victim,’ ‘gullible elderly,’ and other ‘ghoulish categories,’ as Pasquale calls them. And unlike with a credit score, if a company has a data profile on you, you’re not allowed to see it. Pasquale opens our eyes to how this is already unfolding, and he offers several suggestions for how to intervene before things get even worse.”
Law Professor Frank Pasquale has been a Visiting Fellow at Princeton’s Center for Information Technology Policy and a Visiting Professor at Yale Law School and Cardozo Law School. He was a Marshall Scholar at Oxford University. He has testified before the Judiciary Committee of the House of Representatives, appearing with the General Counsels of Google, Microsoft, and Yahoo. Pasquale serves on the advisory boards of the Data Competition Institute, Patient Privacy Rights, and the Electronic Privacy Information Center. He is on the editorial boards of the Journal of Legal Education and the Oxford Handbooks Online in Law.