[A medley of diverse tech fellows and tech inequality experts from all over the world.]
Matt Mitchell: Hi. I’m Matt, and I'm a hacker.
Suchana Seth: I'm Suchana, I'm a data scientist, and I'm a champion for data for good.
Sid Rao: My superpower is to help you protect your digital privacy.
Eireann Leverett: As a technologist, you don't always have to work to defend the powerful, to defend the companies, to defend the traditional bases of power and influence in the world.
[Animation of a computer appears on screen. Hands type: “Since 2015, the Ford Foundation and the Mozilla Foundation have provided fellowships to technology experts to work with organizations advancing social justice”.]
Etienne Maynier: Because there is surveillance everywhere, because a lot of people are excluded, because there is strong inequality in how we access and understand technology, technology won't improve our society unless we fight for a better society with it.
Berhan Taye Gemeda: I came into this fellowship trying to understand what it means to be a technologist in a social justice space, who the communities are that we're trying to serve, and how we can best serve them.
Sid Rao: As public interest technologists, we have to bridge the gap between what technologists are trying to build and what social scientists are trying to solve.
Jennifer Helsby: I worked on a technology project called SecureDrop, which is an anonymous whistle-blowing platform. It's a critical tool for journalists, just like an anonymous tip line.
Sid Rao: I built a tool which anyone can use to see how their Internet service providers can see what they're doing, and how they can build a digital persona.
Matt Mitchell: It's important that people know that it's not a matter of if you will be hacked as an organization, it's a matter of when. Which is why, as an organization, you need to have an incident response plan. "When this happens, we do this, and then if this happens, we bring it back down a step."
Eireann Leverett: Even if you're not a technologist, you should have opinions about how technology is used in society and how it magnifies bias.
Steffania Paolo Costa Di Albanez: The way technologies are being developed, they are being built, by default, with preconceived, biased algorithms.
Suchana Seth: The other critical piece is to educate data scientists about the ways we have developed to identify, measure, and correct algorithmic bias. How do we educate them about different kinds of fairness metrics, and what's the right way to apply them?
[Animation of a person at a computer. A thought bubble appears asking the question, “Can computers be racist?”.]
Jennifer Helsby: In the future, I hope that working in public interest technology will be seen as on par with going to Google or Facebook or another tech giant.
Matt Mitchell: There are way too few of us doing social good work. It's something that we need to develop and raise up.
Suchana Seth: And it's up to technologists like us, who care about social good, to keep reminding everyone about this, and to keep demonstrating through our work that technology can indeed make a difference.
[Ford Foundation logo: a globe made up of a series of small, varied circles. Mozilla logo.]