Buy Now

Booktopia: For Delivery in Australia & NZ
Book Depository: For Free Delivery Worldwide
iBooks: iPhone, iPad, iPod Touch and Mac
Kindle

Algorithms can learn. Algorithms can acquire bias. In The Black Box Society: How Secret Algorithms Control Money and Information, eminent law professor Frank Pasquale exposes the invasive technology running our lives. In the age of terror, these systems can extend the control of dark agencies into every corner of our lives. Every day, corporations are connecting the dots about our personal behavior, silently scrutinizing clues left behind by our work habits and Internet use. The data compiled and the portraits created are incredibly detailed and invasive. But who connects the dots about what firms are doing with this information? The Black Box Society argues that we all need to be able to do so, and to set limits on how big data affects our lives.


Hidden algorithms can make or ruin reputations, decide the destiny of entrepreneurs, or even devastate an entire economy. Shrouded in secrecy and complexity, decisions at major Silicon Valley and Wall Street firms were long assumed to be neutral and technical. But leaks, whistleblowers, and legal disputes have shed new light on automated judgment. Self-serving and reckless behavior is surprisingly common, and easy to hide in code protected by legal and real secrecy. Even after billions of dollars of fines have been levied, underfunded regulators may have only scratched the surface of this troubling behavior.

Frank Pasquale exposes how powerful interests abuse secrecy for profit and explains ways to rein them in. Demanding transparency is only the first step. An intelligible society would ensure that key decisions of its most important firms are fair, nondiscriminatory, and open to criticism. Silicon Valley and Wall Street need to accept as much accountability as they impose on others.

Algorithms are everywhere. They are what advertisers use to target users online and what search engines use to cough up all those results in a particular order. Even the data collected by governments is used to build algorithms that then track, flag, or analyze whatever the government is looking for.


But there’s a growing fear that these algorithms are learning stereotypes, and therefore abetting data discrimination. Some algorithms, for instance, make an assumption about an individual’s ability to pay debt based on race. Basically, a lot of data goes into these “black box algorithms,” as they are known, and they produce results that are often discriminatory.

“I call it a black box because we don’t have access to these sorts of algorithms,” says Frank Pasquale, a University of Maryland professor of law.

The algorithms produce results based solely on the data fed to them, but the trouble is that no one knows exactly how an algorithm is crunching that data. Yes, algorithms are racist, Pasquale says, but they are also “reflecting the preferences of thousands and possibly millions of users.”
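The dynamic Pasquale describes, where a seemingly neutral rule trained only on past decisions quietly inherits whatever bias those decisions contained, can be illustrated with a toy sketch. Everything below is invented for illustration: the zip codes, the loan records, and the scoring rule are hypothetical, not drawn from any real system.

```python
# Toy sketch: how a "neutral" scoring rule can reproduce historical bias.
# The scorer never sees race; it only sees zip code. But if past approval
# decisions were biased and zip code correlates with race, the learned
# rule inherits the bias. All data here is invented for illustration.

from collections import defaultdict

# Hypothetical historical loan decisions: (zip_code, approved)
history = [
    ("10001", True), ("10001", True), ("10001", True), ("10001", False),
    ("60623", False), ("60623", False), ("60623", True), ("60623", False),
]

def learn_approval_rates(records):
    """'Train' by computing the past approval rate per zip code."""
    totals = defaultdict(lambda: [0, 0])  # zip -> [approved_count, seen_count]
    for zip_code, approved in records:
        totals[zip_code][0] += int(approved)
        totals[zip_code][1] += 1
    return {z: a / n for z, (a, n) in totals.items()}

def score_applicant(rates, zip_code, threshold=0.5):
    """Approve if the zip code's historical approval rate clears the bar."""
    return rates.get(zip_code, 0.0) >= threshold

rates = learn_approval_rates(history)
print(score_applicant(rates, "10001"))  # True:  75% historical approval
print(score_applicant(rates, "60623"))  # False: 25% historical approval
```

Nothing in the code mentions a protected attribute, yet applicants from the disfavored zip code are systematically rejected, which is exactly why Pasquale argues that "we just reflect the data" is not a defense.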

He sees this as a problem because it’s likely to influence even those who don’t buy into such stereotypes. And they may start thinking like the algorithm. He recommends something akin to “an anti-discrimination type of approach.”

If it’s true that we can never know how these algorithms work, then we must not allow certain results, he says. “We need to move beyond saying we just reflect what people think,” Pasquale says, “and make them [algorithms] more progressive.”


As reviewer David Wheeler wrote: “If you want to understand how the next couple of decades are going to unfold, this book is required reading. I find it baffling that people get upset over NSA surveillance, but they don’t mind being spied on by companies using secret algorithms that negatively affect them in their day-to-day life. Pasquale points out the many ways that companies secretly use data against you — and the system is completely hidden in secrecy.

“Are you okay with getting charged more for a product because you live in a certain zip code? Are you okay with retail stores using algorithms to discover that you’re probably pregnant (and marketing things to you accordingly)? Are you okay with facing a higher credit card interest rate because you signed up for marriage counseling, thus making you more of a ‘risk,’ according to a secret algorithm? There’s no end to unethical ways data can and will be used against us — rape victim, gullible elderly, and other ‘ghoulish categories,’ as Pasquale calls them. And unlike with a credit score, if a company has a data profile on you, you’re not allowed to see it. Pasquale opens our eyes to how this is already unfolding, and he offers several suggestions for how to intervene before things get even worse.”

Law professor Frank Pasquale has been a Visiting Fellow at Princeton’s Center for Information Technology Policy, and a Visiting Professor at Yale Law School and Cardozo Law School. He was a Marshall Scholar at Oxford University. He has testified before the Judiciary Committee of the House of Representatives, appearing with the General Counsels of Google, Microsoft, and Yahoo. Pasquale serves on the Advisory Boards of the Data Competition Institute, Patient Privacy Rights, and the Electronic Privacy Information Center. He is on the editorial boards of the Journal of Legal Education and the Oxford Handbooks Online in Law.
