Éireann Leverett: The Internet is one of the greatest public commons we have. It’s a shared safe space and should remain so.
[Éireann Leverett, Ford-Mozilla Open Web Fellow, Privacy International. A white man with gray and white hair, black framed glasses, and a white gingham shirt with the sleeves rolled up.]
I’m Éireann Leverett and I’m a hacker for change, for good. I like tracking down threats against civil society. My career in security started out working on industrial systems, and, in particular, public utilities—so, the electric grid, water facilities, sewage facilities, oil and gas, telephone companies. And that really motivated my early thinking about security as a public good—and, in particular, security of everyday infrastructure and services. And as I realized that most of those services were coming to rely on the internet, I realized that the internet is a public service and a public utility.
Traditionally, when we go into a debate about national surveillance or nation-state hacking as Privacy International, we run into this problem where the intelligence agents or the police or the government officials on the other side say, “If you want security from terrorists, you’re going to have to give up some privacy.” I personally don’t believe this is true. I think they can continue to do their counterterrorism work, but we can also make everyday people safer. And the best way to do that is to reframe privacy and security as a public good, as a consumer rights issue. Regardless of whether they’re trying to track terrorists in the rest of the world, you deserve not to have your emails read.
In the Internet of Things, security and privacy are both public goods. What do we mean by public good? We mean, I can’t get security for myself unless I can also get it for you. Right? If I secure only my phone, it doesn’t actually secure wider society. It’s very important that we start to view security and privacy as a public good, because we will fund it differently. As a technologist, you don’t always have to work to defend the powerful, to defend the companies, to defend the traditional bases of power and influence in the world. You could go and work at a regulator. You could work at a safety regulator. You know, we’ve seen the hacking of cars recently. There’s no reason that the kind of crash testing that they do on cars couldn’t be a role for a computer technologist, who’s literally crash testing the devices in a computer science sense to see if they are still safe when we have automated vehicles in the future. And that’s true of any sort of regulator, right? I can imagine people working at the Environmental Protection Agency. I can imagine people working at the UN on human rights and the intersection with technology. Just having people in society do those roles will become a very powerful change in the future.
[Ford Foundation logo: a globe made up of a series of small, varied circles. Mozilla logo.]
[Danny Weitzner, Online Privacy Expert, White House Deputy CTO for Internet Policy, 2011-2012, and Founder of the Center for Democracy and Technology. A white man wearing a gray suit and multicolored bowtie.]
My name is Danny Weitzner. I run the new MIT Internet Policy Research Initiative. We’re really in an era where the question of privacy is just fundamental. For four years now, MIT and Georgetown Law School have taught together a course on privacy, technology, and legislation. We bring together about 12 computer science students and 12 law students every year, and we present them with privacy technology challenges. That is, we say, “Look, there’s smart city technology developing now, which is going to keep track of where people are driving, the license plates of their cars, where they park, whether they pass a mosque or a church or a synagogue every day and get out on the way to work. What should be the privacy rules associated with that new set of smart city technologies?”
If you’re a lawyer, you might say, “OK, what laws apply?” And the answer would be, “Well, probably not many.” And then you get a little stuck, because you’re not quite sure what to do. If you’re a more technology-oriented person, a computer scientist, you look and say, “Wow, we can do all these cool things with all that data. We can learn all kinds of things about people. What can we do with that data? Well, we’re not really sure because we don’t really know what the rules are or what the rules should be.” We bring together these two worlds and try to figure out a solution. So, our challenge to groups of students that we get together every year is to understand the technology context really deeply so that we understand what the privacy risks are, what the privacy opportunities might be, what kinds of privacy protections could we possibly build into technology—to then actually develop a legislative proposal that could be brought to either state legislators or members of Congress.
[Photo of David Vladeck, a white man with gray hair, wearing a gray suit and striped purple tie, standing at a podium. His credentials appear: online privacy expert; Professor, Georgetown University Law Center; Director of the Bureau of Consumer Protection at the FTC, 2009-2012.]
When David Vladeck and I started this course, we thought we were teaching about privacy, technology, and law. What we’ve learned is that we’re teaching students an even more essential skill: how to be a public interest technologist—someone who can think deeply about the public policy questions that are raised by technology. What we know is that there is enormous demand for students who have training on both sides of this divide. We know that governments need students like this. We know that regulatory agencies, civil society organizations, and companies need students like this. All of us together as a society really have to be directly engaged in these public interest technology questions to make sure that we’re making the most of these new tools that we have in a way that really supports human values.
[This is tech at work for the public! Hashtag Public Interest Tech. Ford Foundation dot org forward slash tech. Ford Foundation logo: a globe made up of a series of small, varied circles.]
And if you like this video about public interest technology, watch the video about Joy and her incredible work on facial recognition and you’ll really understand the impact of this kind of training.