By Terrell McSweeny, Former Commissioner of the Federal Trade Commission
Data has been likened to the blood, oil, water – even the gold – of our Internet economy. Its value has increased alongside the migration of our lives to digital. Paradoxically, we are sharing more data than ever before, even as we struggle to understand the implications of doing so.
Most Americans believe they have lost control over how their personal information is collected and used. Our once blithe confidence in new technology is now complicated by questions about how to protect privacy, autonomy, and healthy competition in the age of connected everything. Against this backdrop, it’s tempting to presume that privacy can only be protected by severely restricting the flow of data – by closing off one provider’s data from another’s. Such an approach, however, could limit the ability of users to easily move their data around and reduce the potential for innovative new entrants to markets and all the benefits that may flow to consumers from them. Privacy is a crucial aspect of consumer rights in the digital age – and openness is another. The right balance will be found in policies that give users meaningful control over their digital identities but that also foster competition and innovation.
Most of us are now familiar with the grand bargain of our digital economy. We enjoy the benefits of the innovative products and services available to us, often for free, in exchange for our data. We experience the pleasure and ease they bring to our lives when we download apps for work, entertainment, transportation, food — almost anything. For the most part, we don’t dwell on who has our data and how it’s being used. That is, until something unexpected occurs that frays our trust and calls into question whether consumers are bearing too much risk in the deal.
For years privacy experts have rightly warned that consumers relinquish too much information with too little understanding of how it is used and too little recourse when it is mishandled. Under the current US data protection framework, so long as a person is properly notified and consents to how their data will be used, nothing more is required. Now this framework is being challenged – in the US and elsewhere – and one of the central issues in the global discussion about privacy is how much control consumers should have over how their data flows. This conversation is long overdue.
Europe has set a high bar by granting broad rights to consumers over their data. The General Data Protection Regulation, commonly referred to as the GDPR, gives users rights to access, correct, delete, restrict, and move their data from one service provider to another. It is a strict approach that is moving other major markets to consider strengthening their own data privacy laws, even at the state level. California, for example, recently passed a new privacy law that requires companies to disclose the types of data they collect and allows consumers to opt out of having their personal information sold. The California law does not include all of the same rights as the GDPR, but, importantly, it does include a right for users to access and move their data from one service provider to another.
Models that vest more control in users are essential consumer protections. To be useful, they should include meaningful data portability. Most of the large platforms already allow users to download data such as email archives, photos, posts, and messages. But one of the challenges is whether consumers can easily use that information once they’ve accessed it. For example, if you receive a print-out of an archive you cannot reasonably move — or port – from one service to another, what real good does that do? On the other hand, users who receive data in a machine-readable format can transfer it to other services more easily – particularly if open standards are used for moving data around. Interoperability – the degree to which data, once accessed, is usable elsewhere – is therefore an essential aspect of giving consumers functional control over their information. Recent efforts by Facebook, Google, Microsoft, Twitter, and other organizations participating in the Data Transfer Project to find a common way for people to transfer data into and out of services are promising steps toward a more useful data ecosystem for consumers.
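To make the difference concrete, consider a toy sketch of why a machine-readable export matters. The field names and format below are purely illustrative – they are not any platform’s actual export schema – but they show how a receiving service can parse and remap structured data automatically, something no print-out allows:

```python
import json

# Hypothetical export: a user's contacts in a machine-readable format.
# Field names ("name", "email") are illustrative, not a real platform schema.
exported_archive = json.dumps([
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "Grace", "email": "grace@example.com"},
])

def import_contacts(archive: str) -> list:
    """A new service can parse a structured export directly and remap
    fields to its own schema -- no manual re-entry required."""
    return [
        {"display_name": c["name"], "address": c["email"]}
        for c in json.loads(archive)
    ]

contacts = import_contacts(exported_archive)
print(contacts[0]["display_name"])  # Ada
```

If both services agree on an open standard for the export format – the kind of common ground the Data Transfer Project aims to establish – the remapping step shrinks or disappears entirely.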
But even when consumers do have greater control, challenging privacy questions remain that can affect, even stymie, those rights. For example, users may grant permission for an app to access their contact list so that they can use some useful feature like a chat service. Yet granting that access implicates the privacy interests of the people in that user’s digital address book. An even trickier issue arises in sorting through the privacy and security interests involved in allowing users to move data generated by their activity on a platform – like their social graph, an individual’s network of friends and interactions with them. Some have called for policies to “free the social graph” to make it easier for people to move between social networking services without having to completely rebuild their networks manually. But if you want to move to a new social network and take personally identifiable information about your friends with you, should you have to ask your friends for consent first? What happens if one of your friends doesn’t want you to be able to move that data? Answering these questions requires a careful balancing of privacy with user control. A solution might be to build a way for users to easily invite friends to join a new service with them.
Protecting consumer rights is essential, but it isn’t the only factor at play. When users can seamlessly transfer their data between service providers, start-ups and other competitive services can enter a market more easily. That fosters competition and innovation. If, on the other hand, people are deeply tethered to platforms through data that is difficult, costly, or even impossible to move, entry may be too challenging for competitive services, limiting the emergence of new platforms and the benefits they could bring.
Facilitating innovation will require data policies that go beyond strong user centered controls and privacy protections. This is particularly true for foundational platforms that serve as intermediaries between most people and the Internet. For example, the mobile app marketplace, which didn’t even exist a decade ago, is projected to generate around $189 billion in revenue in 2020. There are now millions of apps available through Apple’s App Store, Google Play, and other app stores. Consumers rely on feeding data through them to access new products and services. Businesses rely on accessing that data in order to grow.
Positive developments for consumer privacy do not need to come at the expense of innovation. But getting the balance right will require appropriate guardrails to protect people’s privacy and security and to ensure their data is handled consistent with their expectations. Computers may process data in bits, but the choice between privacy and openness isn’t binary. Both are required to protect consumers’ rights and attain the innovative potential of the digital age.