Our daily lives are filled with data. Data drives everything from the ads we see all over the Internet to the GPS directions we call up on our phones, to the coupons that flood our inboxes. Corporations have long seen the benefits of using big data to streamline their operations, target new customers, and filter job applicants. Government agencies are also adopting big data and algorithm-based decision-making tools to increase efficiency, shape service delivery, and inform policy.

But big data is not impartial. Without careful design, application, and oversight, big data and technology can reinforce existing inequalities and harm the communities where these inequalities exist. If we want to use these tools in ways that are both fair and effective, we need to understand the biases that inform how they function.

With that in mind, the Ford Foundation brought together leaders from the public and private sectors to discuss how the government and the tech sector can work together to make sure big data serves all Americans. Following are some highlights from the conversations that took place at that event, “Fairness by Design.”


With big data as a factor, privacy issues can become justice issues

New technologies and big data collection often give rise to concerns about privacy. But when technology is involved, a privacy concern can escalate into a criminal justice problem—one that falls especially heavily on minority communities. For example, when a police officer enters a person's home because they were targeted by "predictive policing," that is a criminal justice issue, even though it began as a privacy issue. What do civil rights mean in this era of big data? Answering that complicated question is a first step toward understanding how big data and new technologies can inadvertently harm communities—and how to prevent that harm.


This video was created by an external party and may not be fully accessible.

Real world justice requires digital justice

The increased digitization of our world has created opportunities for equity as well as injustice. And some inequalities are being encoded into the very digital systems we’ve come to rely on. In response, foundations, organizations, and advocates fighting for civil rights and social justice need to look at the issues they champion—voting rights, education policy, housing rights—through a digital lens. Ford Foundation President Darren Walker emphasizes that philanthropy can help ensure the private and public sectors have the resources, talent, and foresight to address bias, discrimination, and injustice in the use of big data and technology—especially by the government.


When it comes to criminal justice, biased algorithms have consequences

Though intended to avoid human bias, some decision-making software—especially in the criminal justice system—actually replicates it. Julia Angwin of ProPublica examined sentencing software that uses algorithms to predict a defendant's risk of committing a future crime, and found that it was twice as likely to incorrectly score black defendants as high risk, and twice as likely to incorrectly score white defendants as low risk. This bias cannot be detected by examining the algorithm itself; it becomes evident only when real data is run through it.
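The point that bias surfaces only in the outputs, not the code, can be made concrete. The sketch below (not ProPublica's actual analysis—the data and function names are hypothetical, invented for illustration) shows how comparing false positive rates across two groups can reveal a disparity that inspecting the scoring rule alone would not:

```python
def false_positive_rate(records):
    """Share of people who did NOT reoffend but were still scored high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Hypothetical score/outcome records for two groups. The same scoring
# rule was applied to both, but its errors fall unevenly.
group_a = [
    {"high_risk": True,  "reoffended": False},  # wrongly flagged
    {"high_risk": True,  "reoffended": False},  # wrongly flagged
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": False},
]
group_b = [
    {"high_risk": True,  "reoffended": False},  # wrongly flagged
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": False},
]

print(false_positive_rate(group_a))  # 0.5
print(false_positive_rate(group_b))  # 0.25
```

With made-up numbers like these, group A's members are wrongly flagged at twice the rate of group B's—an audit of outcomes, not source code, is what exposes the gap.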


Not all data is good data—but no data is bad data

The lack of accurate data on policing and crime is a major impediment to democracy in the US. Existing crime data is often outdated, incomplete, or not readily available. “We just have really, really bad crime data. And…there’s a question of, where’s violence and violent crime in America? The real problem is, no one knows,” said Roy L. Austin Jr., deputy assistant to the president for urban affairs, justice, and opportunity.

Without accurate data, it is impossible to assess risk or understand the reality of public safety and recidivism. Similarly, a lack of data and transparency about police activity—including officer-involved shootings, fines, and use of force—prevents meaningful analysis of current policies and of their long-term impact on trust between communities and police officers.



The bottom line: Technology and big data can help streamline and strengthen law enforcement and criminal justice—but they can also do great harm. In this digital era, we need rigorous analysis and oversight of these tools to ensure they are not abused, but are instead used to advance fairness and justice.