Privacy v/s the Economy - There Can Be Only One
There are growing privacy concerns in the United States, Europe, and other regions around the world about the personal privacy safeguards in the current digital ecosystem of search, social media, and cloud providers. This concern has surfaced in several recent high-profile events, including:
1) The Facebook privacy scandal involving Cambridge Analytica, in which over 50 million user profiles were provided to a third party seeking to predict and influence an election.
2) The General Data Protection Regulation (GDPR), which took effect in Europe in May 2018 and seeks to fundamentally change the way data is handled across nearly every sector, with major fines for non-compliance (e.g., British Airways faced a proposed $230M fine and Marriott a proposed $124M fine).
3) GDPR's influence has spread, with California and Japan taking up similar privacy legislation.
4) The Federal Trade Commission (FTC) recently imposed a record $5 billion fine on Facebook for violating consumer privacy.
The net effect is a backlash against large-scale companies for failing to protect consumer privacy, with regulators responding by passing new laws and levying new fines to incentivize better behavior. On the face of it, this makes perfect sense. In general, people value their privacy, and many are willing to make reasonable trades to protect it. For example:
1) Users would like to understand what information is being tracked and to "opt out" if they so choose.
2) They may be willing to pay a modest fee for an otherwise free service in order to limit the data collected about them.
3) They may be willing to trade some level of convenience, by limiting the effectiveness of ads and suggested products/services, in exchange for improved privacy.
All of these trades fundamentally come down to short-term convenience and consumer choice. In that context, they are all fair and reasonable expectations. In the short term, they can and will push companies in these regulated countries to improve their privacy programs and offer consumers more choice. This approach will make privacy stakeholders quite happy, and it should incrementally move the privacy needle forward. However, the reality is that few consumers are voting with their wallets on privacy. How many people do you know who have:
1) Dropped their Facebook account after the Cambridge Analytica incident?
2) Left their Marriott points behind and switched to Hilton after Marriott's GDPR fine?
The answer is not very many; a shockingly low number. The reality is that unless privacy starts to affect consumer buying decisions, many of these companies simply won't take privacy as seriously as they should. So what should we do about it? I am going to argue absolutely nothing...
I realize this is a controversial position for someone with a strong cybersecurity and data privacy background. However, I also believe there is a set of darker, deeper truths lurking that are not being fully acknowledged. In particular, I believe there are two fundamental truths that should be the guiding principles in how we think about privacy:
1) Privacy is an illusion and the ability to maintain it over time will steadily decrease
2) Allowing companies to store and process as much data as possible is significantly more important than our privacy
Let's start with the privacy illusion. The fact is that most companies know far more about you than you think. Even for the most paranoid people, who safeguard their privacy religiously, data is harvested from public sources, purchasing records, and internet history to build a remarkably accurate profile of their demographics, their interests, and what they are likely to buy. In general, this is a good thing: your consumer experience is more efficient, your costs are lower because companies pay less to acquire customers, and company profits are higher. The younger generation takes this for granted, having grown up in a digital economy with a large number of convenience services and easy ways to connect and share with people online. The fact is that the large majority are perfectly willing to trade some level of privacy for the convenience these services offer.
In addition, it will become increasingly difficult to maintain your privacy over time. While GDPR will help on issues like browser cookies, it will be impossible to combat the coming wave of machine vision. Everywhere you go, you now leave a digital footprint behind. Cameras are ubiquitous, and over time they will become intelligent. They will know it is you, what brands you are wearing, and where you were located during different periods of the day, and they will feed that data to algorithms that can better target goods and services to your specific needs. Machine vision is inevitable, and progress on its algorithms is moving at an exponential pace. Short of a Harry Potter invisibility cloak, how do you protect your privacy in a world of machine vision?
This brings us to the last and most important point. We have to find a way to allow US companies to securely store and process as much data as possible. I would argue that this should be a top-level national goal of US policy. While the current privacy debate has focused on privacy versus convenience, I believe that is a false choice and a very short-sighted view. The reality is that we are in an exceptionally dangerous time where the true choice is privacy v/s the economy, and we are in a Highlander situation: there can be only one!
The rise of Artificial Intelligence (AI) is occurring much faster than most people realize, and progress is accelerating on an exponential curve (to better understand this problem, I highly recommend Thomas Friedman's book "Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations"). While nobody knows exactly when, it is widely acknowledged that 20-50% of jobs could ultimately be replaced by AI. This could happen in 10 years or in 50 years (if I were a betting man, I would bet on sooner rather than later). Regardless of when it happens, it is inevitable that it will happen. And once it does, the country that controls the AI wins. Since most of the jobs will go away, the wealth will accrue to the country that dominates in AI (allowing it to subsidize its job losses with measures like Universal Basic Income). If we don't win the AI arms race, it will devastate our economy and shift the balance of power in the world to the AI winner.
There are only two real players in this game: China and the US. In general, they have the most computing power, the most intellectual bandwidth to throw at the problem, and the economic ability to invest. So what is most likely to decide the game? It will all come down to data. There have been few fundamental new algorithmic breakthroughs in the last few decades. Most of the progress has come from applying older algorithms with almost unlimited processing power and huge amounts of data. With the hyperscale cloud, almost anyone can gain access to unlimited computing power. The real differentiator, then, will be access to data. So who has the advantage?
China does... and it isn't even close. What advantages does it hold? Let's list a few:
1) China has over 4x the population of the United States, meaning 4x the personal data it can collect.
2) Its citizens have no expectation of privacy, meaning the government can collect any data it wants, at any time and in any way it wants. In contrast, the US, Europe, Japan, and other large democratic economies are moving toward stronger privacy laws and less data collection.
3) In 2019, China passed the US in retail sales to become the largest market in the world, while also having a more digital ecosystem for buying goods and services (less reliance on cash transactions means more data collected).
What this means is that China is best positioned to win the AI race strategically. The currently dominant non-China economies are making short-term bets on privacy in response to consumer sentiment. However, I do not believe consumers understand the choice they are making. I also don't believe most consumers actually care that much about their privacy. What if the choice were framed as, "Would you be willing to improve the privacy of your data in exchange for a 20-50% chance you will not be able to find a job in the next 10-15 years?" I don't know many people who would make that bet on privacy; not when it is privacy versus the economy and there can be only one....
Interested in learning more about big data and privacy? Contact Us today to learn how to unlock your data and thrive in the next-generation digital economy.