How to make tech companies earn people’s trust: Editorial

By Bloomberg

WE ALL agree, they said. Please make more rules for us, they said. Give more money to our regulators, they said. When an assemblage of savvy corporate lawyers converges on such improbable sentiments, skepticism is usually in order. Last week’s privacy hearing on U.S. Capitol Hill demanded a load of it.

Representatives from numerous tech and telecom luminaries — including Amazon, AT&T, Google and Twitter — told the Senate Commerce Committee on Wednesday that there was “widespread agreement” in their industries about the need for a new federal privacy law. It all sounded quite cooperative and public-spirited.

Unfortunately, “We care about your privacy” is up there with “Your call is very important to us” in the canon of corporate insincerity. A federal law is desirable for these companies not because privacy is a “core value” or “human right,” as the lawyers professed, but because they’d no longer have to comply with different rules in different states.

It’s hard to blame them for pursuing their interests. The problem is that their preferred cure would create more problems than it would solve.

A federal privacy law of the kind they envision — offering more “transparency” about data collection, giving users more “control” over their privacy — would be of no benefit to anyone. In the abstract, these seem like worthy principles. In practice, they invariably mean privacy policies no one reads, settings and options no one understands, and chipper compliance notices intended to reassure users while allowing companies to mostly keep doing what they’re doing.

Their business models, after all, depend on it. Much of the digital economy is premised on consumers getting free services — search, maps, e-mail, social media, and so on — in exchange for their personal information. Consumers have grown accustomed to freely traversing the web, unimpeded by contractual demands from individual sites and services. And tech companies have grown accustomed to quietly tracking them as they do. It all works brilliantly so long as users don’t think too hard about what they’re divulging. Giving them the illusion of transparency and control turns out to be just the trick.

Efforts to improve this situation bureaucratically have almost universally failed. Europe has enacted a highly specific and laborious privacy regime, the General Data Protection Regulation, which it hoped would compel companies to offer the best of both worlds: free services and privacy alike. Instead, it is imposing immense compliance costs, impeding innovation, burdening small businesses, hampering competition, irritating users and wasting everyone's time.

A better idea is to dispense with this approach to regulating privacy altogether. In its place, Congress could offer an “information fiduciary” standard. By becoming fiduciaries, companies would agree to a set of best practices. For instance, they’d refrain from exploiting data to manipulate users, sharing it with unscrupulous third parties, or using it in unexpected ways. Much as a doctor must protect a patient’s medical details, information fiduciaries would be required to use data in the best interests of their users. Companies would be free to sign up for the standard or not. Those that did could be offered protection from certain lawsuits and immunity from state and local privacy laws.

This would grant companies the nationwide consistency they desire and reduce their legal uncertainty, while imposing clear duties on them in return. It would give users more confidence that their data is in good hands and relieve them of the burden of trying to parse opaque privacy policies. It might even allow for new business models: Fiduciaries that prove less attractive to advertisers might charge privacy-conscious consumers for access instead, thereby putting a price on personal information and making the data-for-services model more transparent.

Under such a system, companies that claimed to care about your privacy would actually mean it — and that would be to everyone’s benefit.