From Yahoo to Sony to Target to Citibank, not much time passes these days without news of an online security breach. Consumers have gotten used to hearing about vulnerabilities across industries, companies, and products, often accompanied by advice to change their passwords. Many people feel overwhelmed by this, helpless to protect their information. Others aren’t sure why they should be concerned about companies having their data in the first place.

To add some much-needed clarity, today Consumer Reports (CR) launched a Ford Foundation-supported effort called the Digital Standard. An open-source work in progress, it aims to help consumers understand how a range of digital products measure up when it comes to security and privacy, so they can make educated choices about what to buy and use. It is also meant to guide companies in building and designing products that take consumer privacy and security seriously. The Standard is open to feedback and contributions from anyone who cares about these issues.

Across connected devices, software, and mobile apps, the Standard looks at things like product stability, data control, and transparency about terms of service. And it tests them with the same rigor CR has brought to everything from cars to mattresses to fire extinguishers over its 80-year history. We asked Maria Rerecich, CR’s director of electronics testing, to tell us more about how the Standard works and why it matters.

Ford Foundation: Tell us a bit about your job. What does it mean to be the director of electronics testing at Consumer Reports? 

Maria Rerecich: I’m responsible for all the electronics testing we do. Traditionally that’s been functional testing, performance testing of all the electronic gadgets you use: TVs, phones, computers, smartwatches, headphones, printers…anything like that. We test comparatively; we run the same type of products through the same type of testing and the same test procedures for every model in a category. So we will test all the headphones the same way; we test all the phones the same way. And then we’re able to rate and rank them. By doing that, we’re not just saying, “Hey, this is a pretty good phone,” we’re saying, “This phone is better than that phone in certain aspects, but maybe not in others.”

Our testing has historically been based more on the hardware performance, but a product is no longer just what you hold in your hand. It’s really a system of the hardware and the software together that affects the consumer experience. So we are starting to branch out into the areas of privacy, security, and data practices, which impact products like refrigerators and thermostats in addition to traditional consumer electronics products.

As those things have changed, how has CR’s approach to consumer protection evolved? How do you get from testing cars and dishwashers for safety and effectiveness, to figuring out what to test for in a whole range of new, digital consumer products and technologies? 

It really centers on benefits and protections for consumers. What do consumers expect, what do they want, and what do they need that they may not know they want? Going back to 1936, CR has a long history of rating products with consumer safety front and center. We’ve stood up for seatbelts in cars and for safety in child car seats and strollers, and we’ve called out hazards like arsenic in rice. We can do that because we test. We’re able to show that one product has a higher level of vulnerability than another. Showing that it’s possible for a product to be better shows that other manufacturers can do better, too. Manufacturers do pay attention to our ratings and findings, because they respect the rigor of our testing.

What specific problems is the Digital Standard aiming to address, or what gaps is it trying to fill?

People find vulnerabilities all the time, but there doesn’t seem to be a coordinated effort to compare products. So when vulnerabilities are found, companies usually address them one at a time. With the Standard, we’re trying to say: “Here’s what an ideal state looks like. Pay attention to these criteria as you are developing products.” It’s always better to design goodness in, rather than to test for it and screen out the bad afterward.

We’re trying to present an overall picture of goodness for digital products: For products that connect to the Internet, products that incorporate software, what is an ideal state? We tried to frame it from a consumer viewpoint, considering not just privacy and security but also governance aspects that relate to privacy policies and how good those policies are for the consumer.

We looked at four areas. Is the product private? That means looking at permissions and data sharing. Is it secure? That means looking at things like encryption and security updates. Then there’s governance: Does the company protect consumers’ privacy and freedom of expression? Finally, there are issues of ownership, which include the right to repair. Those aren’t all necessarily related to data privacy, but general digital goodness is what we’re going for.

I think sometimes people struggle to understand why it’s important to have strong security and privacy protections—there’s a general sense that they should be concerned about it, but these issues can seem complicated and abstract. From a consumer protection and advocacy perspective, how do you explain it?

A lot of companies collect data even if they don’t need it for the function of the product. People are used to giving that data because they don’t think it has value, but it has a lot of value to the company. It also means the company is storing that data somewhere, and if there’s a data breach, all that information gets out. Say you have two similar products from two different companies: one of them only keeps the information it needs for the product, and the other one asks for all kinds of other information. If there’s a data breach, which one is worse? Well, the one that’s got more data on you.

I was on a panel at the FTC on smart TVs, and we were talking about how the TVs monitor what you watch, and that information goes back to the manufacturer. In many cases, they use that data to target advertisements. People think it doesn’t matter; they don’t care if they get an ad based on what companies think they want to see. But how do they figure out who gets those ads? If a particular TV is always showing cartoons between 3 and 5 in the afternoon, you can be pretty sure a child is watching that TV. What do companies do with that information?

When people don’t know these things are happening, they can’t decide for themselves if they want them or not. Consumers can make their own decisions, but they need information so they can make those decisions intelligently. Then they can decide, “Yes, it’s worth it to me to give my data because I want to get that free game. But I know what data I’m giving up, and I know what I’m getting for it.”