Advice from the Future on Privacy: Get a Good Deal for Your Data

Adam Heimlich
Mar 11, 2021 · 4 min read

When the average consumer thinks about privacy, they mean “none of your business.” The very phrase implies a carve-out for people we’re doing business with, and that’s how privacy laws work. But it’s not a good way to think about privacy anymore. Google, YouTube, Instagram and Facebook are products we pay for entirely with our data. Netflix and Uber charge us while also reaping great value from customer data. You have to share a lot of data to get a bank account, a credit card or a mortgage. Sharing your medical data could lead to better outcomes for other patients. All our credit card and stock market transactions are aggregated and shared. What we get in return for our data — and what alternatives we might have to giving it all away — are well worth thinking about.

If this issue seems more philosophical than pressing, please let me assure you it won’t for long. Consumer data is like coal at the beginning of the industrial revolution, when engines were rare. That is, it’s about to become much, much more valuable. Soon, most businesses will run on data. That’s why you see companies spending billions to mine it.

As CEO of Chalice Custom Algorithms, a company that lives in this future, I want to share what it's like to work with these newfangled data-driven engines. We think anyone seeing what we're seeing will be as certain as we are about your data's skyrocketing value.

In short, we are exceedingly productive. Every day, my co-founders do things with data in a few hours that used to take months. We have one small team of data scientists and engineers working seamlessly together. Our monthly cost for storing, processing and using enormous volumes of data is less than what we pay for a few hours of legal advice. Only three years ago, what we're doing would have required millions of dollars just to get started.

The outputs we’re producing so efficiently are predictions. While our domain is advertising, there are companies like ours doing similar work in every other field that operates online, plus many that don’t (like mining and medicine).

The reason we're so productive is that a few big corporations and smaller, VC-funded startups recently invested ungodly sums in infrastructure to build prediction engines. They're competing for billions of dollars' worth of monthly SaaS fees for cloud-based data storage and processing. This competition is so extreme that the sheer weight of the bets on the table is tipping it. Businesses powered by data will crush competitors running on older engines, as Netflix crushed Blockbuster, if only because their cost to serve customers is so much lower.

It can be hard to get your head around what prediction machines mean for business. But everyone knows their outputs so far: Movie recommendations, voice assistants, driving directions, product suggestions and content feeds. All of these take data in, process it quickly in the cloud, model predictions and automate decisions to serve customers.
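
To make that loop concrete, here is a purely illustrative sketch in Python. It is not Chalice's system; the viewers, titles and counting method are all made up. It only shows the shape of "data in, model, predict, automate a decision" for something like a movie recommendation.

```python
from collections import Counter

# 1. Data in: which titles each viewer watched (hypothetical logs).
watch_logs = {
    "viewer_1": ["space_drama", "heist_thriller", "space_drama_2"],
    "viewer_2": ["space_drama", "romcom"],
    "viewer_3": ["heist_thriller", "space_drama_2"],
}

# 2. Model: count co-watches, i.e. "people who watched X also watched Y".
def co_watch_counts(logs):
    counts = {}
    for titles in logs.values():
        for a in titles:
            for b in titles:
                if a != b:
                    counts.setdefault(a, Counter())[b] += 1
    return counts

# 3. Predict and decide: automatically suggest the most co-watched unseen title.
def recommend(viewer, logs, counts):
    seen = set(logs[viewer])
    scores = Counter()
    for title in seen:
        scores.update(counts.get(title, {}))
    for title, _ in scores.most_common():
        if title not in seen:
            return title
    return None

counts = co_watch_counts(watch_logs)
print(recommend("viewer_2", watch_logs, counts))  # suggests a title viewer_2 hasn't seen
```

A real prediction engine swaps the toy counting for statistical models trained on far more data and runs the loop continuously in the cloud, but the pattern is the same.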

If you drive into this future while looking in your rearview mirror, you will see the end of privacy. Opting out of data collection is already tantamount to opting out of society. It’s alarming!

Unfortunately, some of the companies comfortably residing in the prediction-engine future are manipulating an anxious public in an effort to hoard their fuel: your data. These companies' message is "trust nobody but us." In fact, it doesn't make sense to trust them (or us) to do anything with that fuel except use it to make money. Exhibit A is a raft of opt-out designs that don't bear scrutiny.

Governments have not been helpful so far. In crafting new legislation, regulators in several jurisdictions have privileged data collection by companies we do business with, thereby reinforcing the outmoded carve-out for "private" business. This has only accelerated the concentration of data resources, as companies with novel data, like Fitbit, are snatched up by giants like Google. Regulators are being heavily lobbied to mandate exactly the "opt-out" practices the giants have already circumvented (e.g., Google encourages you to opt out of cookie-based but not login-based tracking).

That said, there's no doubt that protecting citizens from nefarious or unfair use of personal data is a job for elected governments. Under authoritarian regimes, people will have no such protections. Unfortunately, by treating the mining of customer data as a private matter, democracies risk unwittingly helping a few companies amass more power than any government. The advance of technology absolutely does not require this.

In the new industrial revolution, the critical resource belongs to the people. That could give citizens more say in whether companies feel pressure to compete on benefiting society as they earn money.

A strong movement demanding consumer control over consumer data would help put us on a better path forward.

There are several public and private initiatives driving in this direction, and many new concepts worth researching. The most important may be “data portability.” This is the idea that no one you choose to do business with has the right to “lock in” your data. If data becomes portable, like your phone number did, it will be easier for consumers to hold businesses accountable for how they use or misuse data.
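
To make portability concrete, here is a hypothetical sketch, in Python, of what it could look like in practice: a service you leave exports your data in a documented, machine-readable format, and a competing service imports it instead of starting from zero. The schema and field names below are invented for illustration and don't reflect any real company's API.

```python
import json

# Hypothetical portable export of one customer's data (invented schema).
portable_export = {
    "schema": "portable-profile/0.1",
    "customer_id": "user-123",
    "watch_history": ["space_drama", "heist_thriller"],
    "preferences": {"subtitles": True, "autoplay": False},
}

def export_profile(profile: dict) -> str:
    """The old service serializes your data so you can take it with you."""
    return json.dumps(profile, indent=2)

def import_profile(blob: str) -> dict:
    """A competing service reads the same documented format."""
    profile = json.loads(blob)
    if profile.get("schema") != "portable-profile/0.1":
        raise ValueError("unsupported export format")
    return profile

blob = export_profile(portable_export)
print(import_profile(blob)["watch_history"])  # the new service starts with your history
```

The point isn't the particular format; it's that once export is a right and the format is shared, switching providers stops being an act of starting over.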

If we amble into the Age of AI with three or four companies having a few billion of us locked in, the public/private distinction won’t matter a whole lot. Privacy will be whatever they say it is.

Earth’s data landscape, circa 2031

Written by Adam Heimlich

CEO of Chalice Custom Algorithms, an optimization platform for ad delivery.
