
Exploring privacy-preserving data analysis

Many of our clients in the automotive sector face numerous obstacles when sharing data. These clients need an effective solution to protect their customers’ privacy while leveraging metadata to optimize products and improve user experience. The same is true for many actors beyond the automotive sector.

Senior Solutions Architect, IoT & Connected Vehicles

Co-founder & CEO/CTO, DPella

Co-founder & Chief Scientist, DPella


What is the current state of privacy regulation?

Protecting customer privacy remains a crucial issue for corporations of all sizes. The growing scale of digital operations now grants companies vast quantities of consumer data, which can unlock insights into their preferences and behaviors as well as power the next generation of innovations.

Consumers, CEOs and government regulators are rightfully concerned about the privacy of raw data and how it is used and processed. To protect consumers and shield corporations from liability, governments worldwide have instituted a number of data privacy laws and regulations. Many regard Europe’s General Data Protection Regulation (GDPR) as the global standard. It sets safeguards and mandates that personal data must be “processed lawfully, fairly and in a transparent manner in relation to the data subject”. One such safeguard concerns techniques that prevent the identification of individuals: once the possibility of identifying an individual either directly or indirectly is removed, GDPR no longer applies to that data.

Other regions have adopted similar consent rules inspired by GDPR, including the California Consumer Privacy Act and China’s Personal Information Protection Law. Just as with GDPR, these regulations do not mandate any specific technologies for safeguarding individuals.

How to ethically process and protect user data

Regardless of the legal requirements of the operating region, there is no path to sustainable digitization without privacy protection. Poor handling of large amounts of information puts companies at risk of severe, possibly irreparable damage to their public perception, as well as financial or criminal penalties.

The natural first step toward protection is removing personal identifiers, but that alone does not guarantee success. Cross-referencing multiple sources of data can reveal a great deal of information about customers. Netflix, for example, once publicly released anonymized movie-rating data for a recommendation contest, but when that data was cross-referenced with public ratings on IMDb (a global film directory website), preferences could be accurately attributed to individual users, leaving them vulnerable. Another case showing the weakness of simple anonymization dates back to the mid-90s, when the Massachusetts Group Insurance Commission released anonymized data about hospital visits. A computer science graduate student was able to re-identify individuals in the dataset and sent the governor of Massachusetts a detailed list of his own hospital visits, including diagnoses and prescriptions. For more information on anonymization techniques, see the Article 29 Data Protection Working Party.
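Re-identification attacks like these follow a common pattern: join the “anonymized” release with a public dataset on shared quasi-identifiers such as ZIP code, birth date and sex. A minimal sketch, with all records invented for illustration:

```python
# Linkage-attack sketch: names are gone, but quasi-identifiers remain.
# Every record below is invented for illustration.
anonymized = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "90210", "dob": "1980-01-15", "sex": "F", "diagnosis": "asthma"},
]

# A public dataset (e.g. a voter roll) that still carries names.
public = [
    {"name": "W. Weld", "zip": "02138", "dob": "1945-07-31", "sex": "M"},
    {"name": "J. Doe",  "zip": "90210", "dob": "1980-01-15", "sex": "F"},
]

def link(anon, pub):
    """Join the two datasets on their shared quasi-identifiers."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names_by_key = {key(r): r["name"] for r in pub}
    return [
        {"name": names_by_key[key(r)], "diagnosis": r["diagnosis"]}
        for r in anon
        if key(r) in names_by_key
    ]

for match in link(anonymized, public):
    print(match)  # each "anonymous" patient is re-identified by the join
```

The combination of ZIP code, birth date and sex is surprisingly identifying on its own, which is why removing names is not enough.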

Differential privacy

Automating the data analysis process can enable the ethical management of large volumes of data. Differential privacy (‘DP’) focuses on describing the patterns of groups within a dataset while withholding information about the individuals in it. The approach originated in academic research; it ensures a higher security threshold and enables companies to gain insights more safely.

Differential privacy systems rely on statistical noise (carefully sampled random numbers) to distort, in a controlled manner, any means of recognizing individual identities. Going further than simply scrubbing names and addresses, DP systems group similar profiles together and present the trends of that group in a protected way. This is why such systems are already used by the tech giants to better protect their user bases.
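As a concrete illustration (a minimal sketch of the classic Laplace mechanism, not DPella’s product or any production system), consider a counting query: adding or removing one person changes the count by at most 1, so Laplace noise with scale 1/ε makes the released count ε-differentially private. The data and parameters below are invented:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon):
    """Epsilon-DP count: a counting query has sensitivity 1 (one person's
    record changes the count by at most 1), so noise of scale 1/epsilon
    is enough to mask any individual's presence."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical per-vehicle speed readings (km/h).
speeds = [52, 47, 61, 55, 49, 70, 44]
noisy = dp_count(speeds, lambda s: s > 50, epsilon=0.5)
print(round(noisy, 1))  # near the true count of 4, but randomized
```

The ε parameter tunes the privacy/utility trade-off: smaller ε means more noise and stronger privacy, larger ε means a more accurate but less protected answer.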

Data in automotive development

While most conversations around data focus on its economic value, these discussions overlook the essential role of data in the advancement of the greater good. Just as it can be used in medicine to identify diseases and develop cures, data can make an immense impact on public safety when properly leveraged by the automotive sector.

Transportation is built around a shared public interest in safety. Drivers depend not only on their own capabilities, but also on those of millions of other road users. As such, being able to aggregate metadata and gain insights into drivers’ habits, behaviors and likely reactions to various situations can have an immeasurable impact when it comes to protecting people from the dangers of traffic. This kind of information can not only lead to financial savings; it can save lives.

Of course, such insights should not be procured from data in which individuals can be identified, and that is not where the key value lies anyway. Intelligent differential privacy systems could derive these life-saving insights without exposing everyday users.
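As a hedged sketch of how such an aggregate could be released safely (hypothetical values, not any deployed system), a differentially private mean first clips each contribution to a known range, so no single driver can shift the result by more than (upper − lower)/n, then adds Laplace noise scaled to that sensitivity:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_mean(values, lower, upper, epsilon):
    """Epsilon-DP mean over a dataset of known size n. Clipping bounds
    each contribution, so the mean has sensitivity (upper - lower) / n."""
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    sensitivity = (upper - lower) / n
    return sum(clipped) / n + laplace_noise(sensitivity / epsilon)

# Hypothetical driver reaction times (seconds) from connected vehicles.
reaction_times = [0.9, 1.2, 0.7, 1.5, 1.1, 0.8]
print(round(dp_mean(reaction_times, lower=0.0, upper=3.0, epsilon=1.0), 2))
```

The released average still reflects the group’s behavior, while any single driver’s contribution is hidden in the noise.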


Unfortunately, only a handful of key actors in the automotive ecosystem are truly invested in improving the sector for all road users. To advance innovation and collaboration in the automotive industry, Ericsson has partnered with six other companies – CEVT, Polestar, Veoneer, Volvo Cars, Volvo Group and Zenseact – to build MobilityXlab.

MobilityXlab is a collaboration hub founded in 2017 to create and develop new innovations within future mobility, both between the partners and together with startups. During its first five years, MobilityXlab received applications from startups in 50 countries. The collaboration platform has resulted in 75 proofs of concept and 12 accelerations, in the form of commercial contracts or partnerships.

One example is the startup DPella, which offers deep expertise in differential privacy. Ericsson has been working with DPella to explore their software tool and models to better study data patterns and insights while protecting the privacy of the individual contributors to the raw data being studied.

Differential privacy – The future of data analysis?

Differential privacy is a novel technology that has already seen a lot of traction in the public and private sectors. In the United States, the U.S. Census Bureau pioneered its use in the public sector to protect the privacy of the U.S. population while collecting and releasing data used for political and economic decision-making. In the private sector, tech giants such as Google, Apple and Facebook have used this technology to better protect their user bases, suggesting that the time may be right to test DP in Europe as well. As more industries embrace the advantages of differential privacy practices, mindsets will shift and anonymized insights will ultimately change how we manage, protect and share data. In the end, an ecosystem that shares data between its actors is an ecosystem that improves much faster.

With regard to the automotive industry, a lowered barrier to data sharing paired with increased analytic potential could revolutionize the industry and improve mobility for all. Organizations willing to take an early exploratory step could find themselves far ahead of the bend in the road.

Read more

MobilityXlab


The Ericsson Blog
