In the wake of high-profile data scandals and breaches involving Cambridge Analytica, Yahoo, Equifax, and other companies, privacy issues are increasingly the focus of attention and conversation. But privacy is a complicated issue, and often a contested one: Is privacy a human right? How much privacy are we willing to give up in exchange for convenience or public safety? Should the industry self-regulate, or should government step in? These are just some of the questions still up for debate.
With the rise of emerging technologies like artificial intelligence, predictive analytics, biometric and facial recognition systems, and blockchain, we need a diversity of voices and perspectives to help answer these urgent and critical questions. Increasingly, we also need to consider how data is being used to make decisions that can marginalize people, exacerbate inequalities, perpetuate bias, and chill fundamental freedoms. To this end, activists and organizations have been documenting how over-collection of data, and the lack of comprehensive legal protections related to data and privacy, are hurting people and communities.
- The internet of things can inadvertently enable domestic and gender-based violence
- Advances in technology have led to increasing surveillance of human rights activists and advocates
- Employers are tracking workers’ movements, enabling a culture of digital workplace surveillance, and
- Predictive pretrial risk assessments can exacerbate existing inequalities.
States like California, Virginia, Vermont, and Colorado are proposing new legislation to protect online privacy, and court decisions like Carpenter v. United States are weighing in on law enforcement’s access to cell phone data. Meanwhile, advocates are calling for strengthening regulatory bodies like the Federal Trade Commission to oversee industry data collection practices and set privacy standards, and industry is making a greater push for technical standards like “privacy by design.” Against this backdrop, this is a critical moment to build public support for laws, regulations, and interventions to promote privacy—and to ensure that the voices of the people and communities most affected are taken into account.
How you can help
Now more than ever, the voices and perspectives of young people, ethnic and racial communities, LGBTQ people, and people with lower skill and educational levels can make a difference.
Journalists: Explore opportunities to partner with local NGOs and communities on the ground to cover more local, state, or regional stories about how privacy issues affect people in their daily lives. The communities most affected by privacy violations and data breaches are rarely reflected in news coverage—and analysis of how the media covers privacy issues has revealed that stories often lack the analysis, explanation, and real-world examples that would help readers relate to these issues and understand their urgency.
Direct service agencies and organizations, including those serving youth, seniors, domestic violence victims, and immigrants: Consider surveying the privacy needs and risks of the communities you serve, and reach out to journalists or digital rights privacy groups to document cases of harm.
The technology sector: Consider what you can do to explain the tenets of “privacy by design” to the general public. For help, check out the digital standards guide developed by Consumer Reports.
Advocates and academics: Explore how you can effectively communicate policy proposals and evidence-based research to the public. Work to bridge and connect the conversations taking place in industry meetings, government, and academia with the grassroots community.
Increasingly, questions about digital privacy impact almost every area of our lives. Addressing them will take support from people working in business, government, media, academia, technology, and the nonprofit sector—and at their intersections—strategizing together to find solutions that meet the needs of all communities.