Is it possible to protect privacy in a big data age?
I’m trying to accept my loss of privacy in the digital age.
But it’s not going so well.
Many of you probably know the routine: I install privacy extensions on my browsers, I refuse to post on Facebook, I hold on to outdated e-mail addresses from a variety of providers, and I try to use non-tracking search engines.
It never lasts long. I feel great for about a week but then the loss of functionality starts to grate, and maybe a few weeks later, I slide right back into the old habits.
Talking data and privacy at the AT&T Futurecast
This tension between data and privacy was the focus of the most recent AT&T Futurecast at the Ericsson Experience Center in Santa Clara. Andreas Weigend – former chief data scientist at Amazon – spoke with moderator Andrew Keen about his new book Data for the People: How to Make Our Post-Privacy Economy Work for You.
In his book, Weigend contends that we’ve already lost control of our data to the “data refineries,” meaning the companies that can take the massive amounts of raw data we produce and turn it into something valuable. And he urges consumers to accept this, but at the same time work to make the best possible deal with the companies collecting the data.
Weigend is no stranger to the potential terror of surveillance either. He grew up in communist East Germany, and he writes in his book about how his father spent years in an East German prison. After the government fell in 1989, he eventually asked for his father's secret police files. It turned out his father's files had been lost, but then he was asked a chilling question: did he want to see his own?
And yet he still embraces open data. From an excerpt of his book in Quartz:
Your power lies in choosing to use those data refineries which offer tools that increase transparency and agency for users—including tools that allow you to evaluate how much benefit you get in exchange for the data you share—not in asking for more privacy.
You can watch the whole event on Facebook.
Trading personal data for quality services
There is no question that I'm the norm when it comes to the push-pull between privacy and data. We say we want privacy, but then we always choose convenience. This is borne out in an Ericsson ConsumerLab study from 2014, which showed that 90 percent of consumers said privacy concerns affected their online behavior. But how many people then said they would use the internet less?
Only four percent.
Weigend made some good points along these lines: that privacy is a fluid concept, with once seemingly invasive services like Caller ID now seen as essential privacy protections, and that it's somewhat dependent on culture and circumstance. Instead of fighting these slippery battles, we're better off with a clear negotiation between all parties:
“I would like to see incentive alignment (between companies and consumers). I would love a world with more transparency where [bad behavior] gets called out … so we can see if our interests aligned with that of the company.”
Is privacy binary?
What struck me the most at the event was the attitude, echoed strongly in the audience, that privacy has already essentially ceased to exist. The data refineries know too much. It becomes about acceptance and, as Weigend argues, making the best possible deal from what seems a position of relative weakness.
This struck me as a somewhat false dichotomy. And a dive down the rabbit hole of Ericsson privacy content brought me some unexpected comfort.
In 2012 Ericsson published a paper called Privacy in the Networked Society that made it clear that privacy can be treated as something to be protected, not dismissed, in a data-based world. And in a recent article in Ericsson Technology Review, Stefan Larsson of Lund University in Sweden acknowledged that the public sector will continue to lag behind private data collection practices.
But instead of putting all the responsibility on users, Larsson cautions that companies will have to come to the bargaining table too. The consumer tradeoff of privacy for services may not last forever:
“It is therefore crucial for the private sector to take a proactive approach to addressing normative and ethical questions as part of the service design and development process. Otherwise, there is a significant risk that consumers’ trust in digital services will decline in the mid to long term. A low level of trust in new features, services and devices could substantially reduce their potential scalability, and consequently have a negative impact on the digital economy as a whole.”
Then there is this video from Pat Walshe, the former director of privacy for the GSMA, in which he cuts right to the heart of the problem: people want privacy and services. How do we give them both?
Because a loss of trust could eventually doom the data economy.
Information rules and intermediate states
Thinking about the event, I've been particularly influenced by a post and paper by Ericsson's Jonathan King, who is Head of Portfolio Management for Cloud but also an active legal scholar, currently researching and publishing articles on big data and privacy as a Visiting Scholar at Washington University School of Law.
Jonathan makes the argument that privacy should absolutely not be considered a binary state. He uses the example of secrets. Just because I tell you my secret doesn’t mean it’s not still a secret. Governments circulate all sorts of “top secret” information that is seen by hundreds, if not thousands, of people and yet still remains clearly secret.
So can our private data be safely held in this kind of “intermediate state,” in which it is shared and used, yet also protected?
From his post on Ericsson’s Hyperscale Cloud blog:
“So privacy is not merely about keeping secrets, but about the rules we use to regulate information. Privacy rules are information rules, and in an information society, information rules are inevitable. When properly understood, privacy rules will be an essential and valuable part of our digital future – a future not ordained to take a single, shiny, privacy-denying form but, instead, a human creation.”
Finding a balance between big data and privacy
The history of privacy is also not necessarily as simple as "we didn't have it, and then we got it, and now we're going to lose it." It does seem true that modern privacy is a relatively new concept, and that each wave of new technology sparks a crisis of privacy, with the technology always coming first and society and government always following after.
And along these lines, near the end of the event, Weigend spoke beautifully about the benefits of radical transparency, referencing the civil rights movement in the US and the fight for gay rights:
“We live in a world where, because we can’t hide, we should just embrace who we are … We have come a long way. And I really hope that we come even further so that people can embrace diversity, embrace their idiosyncrasies. Many years ago people hired me for who I was not. Now people hire me because of who I am. That makes me much happier.”
But why does the development of privacy need to be a circle back to what came before? Why can't it evolve into something new, as our technology and society do? Even if I'm doomed to be perpetually at the mercy of the data refineries, it doesn't feel right to write off privacy. I hope that King is right and that we will find a new formal interplay between privacy and technology. Perhaps the new equilibrium will be a balance I find unsatisfying. Maybe not.
But I do hope we find it.