How can we stop technology from inheriting our bias?

Developments in artificial intelligence and immersive experiences are paving the way for exciting possibilities in healthcare, business, entertainment, and education. But are these technologies also inheriting our negative and institutional biases? What can we do to build accountability or go even further and make inclusion a part of the evolution of intelligent technologies?


To explore these and other questions about the connection between technology and bias, the AT&T Foundry and Ericsson presented a private screening of the documentary Bias to a packed crowd at Landmark's Aquarius Theater in Palo Alto. After the film, Alka Roy, product and technology leader at the Foundry, moderated a panel discussion on the implications of unconscious bias and how technology can amplify or mitigate it.

"[Unconscious bias] is one of the biggest collective challenges of our times," Roy said. "How we think about fairness, privacy and inclusion when building products and intelligent systems with AI—affects our lives and the world we build for our future. We can't afford not to have this conversation."

The event kicked off with the screening of the film by Robin Hauser, director and producer of Bias and of the award-winning film CODE: Debugging the Gender Gap. The film follows Hauser on a journey to uncover her own hidden biases. She reveals how unconscious bias shapes relationships, work settings, our justice system, and even technology, especially in the areas of gender and race.

Through compelling interviews with leading thinkers, researchers, and data scientists, the audience learns about the nuances of negative and unconscious bias, and how, if left unchecked, the technology we build will propagate the biases of a small group to a much broader audience.

What can we do about bias?

The question central to the film is: what can we do about negative biases in the workplace, in the AI-powered products and tools we build, and ultimately in our own brains? As the panel discussion afterwards revealed, there are no easy answers.

The place to start is the obvious question—are we motivated to remove bias? "We tend to do what is easiest and quickest, and that is where 'like me' bias comes into play," Hauser explained. "It's easiest to hire someone who is just like us, because that is who we feel most comfortable with."

Hauser pointed out that research has consistently shown that diverse teams perform better and lead to higher profitability. But it can be difficult for this message to break through with startups that are already generating strong revenue.

Panelist Jerry Kang, Vice Chancellor for Equity, Diversity, and Inclusion and Distinguished Professor of Law at the UCLA School of Law added: "Diversity will increase the universe of possibilities, solutions, and ideas considered. But it often creates a sort of friction, so it slows things down. It's not always that it creates a different answer. But the way the answer is created, and the number of solutions are expanded—that's what's improved."

Panelists pointed to tools being built to try to remove bias in hiring and in some aspects of the criminal justice system, but stressed the need for transparency about the algorithms involved. These tools can backfire if they are not built by people with a deep understanding of the problem, and indeed, many predictive tools have come under scrutiny.

"We live in a society that defines certain systems as color-blind. Well, that is just not the case," said Ron Tyler, professor of law and director of the Criminal Defense Clinic at Stanford University. "The very notion of justice is a blindfolded woman holding scales. In criminal justice, bias matters a great deal."

The film itself also highlighted how technology can be used to change our initial response to unconscious bias in this field, and possibly even retrain people's brains. Virtual simulations are being built to help coach law enforcement agents by altering the brain's initial bias trigger, to build empathy with "the other," and to create educational and journalistic experiences.

Understanding and tackling bias in technology

James Zou is an assistant professor at Stanford University, where he leads a laboratory focused on machine learning, genomics, and health, and participates in Stanford's Human-Centered AI initiative. He recently published a paper on algorithms and bias, finding that while technology can help us face our biases, the readily available data used to build many AI systems and models is not without bias of its own.

In his research, he uses algorithms that translate English words into vectors, and has found that many gender, ethnic, and cultural biases are already embedded in those representations.

"A big part of our work is to identify biases in the algorithms that you're interacting with daily and come up with ways to de-bias these algorithms," Zou said. "As a computer scientist, it's easier to de-bias algorithms than de-bias humans."
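The kind of analysis Zou describes can be sketched with a toy example: build a "gender direction" from word vectors, score words against it, and project the direction out as one simple de-biasing step. The vectors below are invented purely for illustration; real work uses embeddings such as word2vec or GloVe trained on large text corpora, where these associations emerge from the data.

```python
import numpy as np

# Toy 4-dimensional word vectors, invented for illustration only.
# In trained embeddings, associations like these emerge from the corpus.
vectors = {
    "he":       np.array([ 1.0, 0.0, 0.2, 0.1]),
    "she":      np.array([-1.0, 0.0, 0.2, 0.1]),
    "engineer": np.array([ 0.6, 0.8, 0.1, 0.0]),
    "nurse":    np.array([-0.6, 0.8, 0.1, 0.0]),
}

# A simple "gender direction": the normalized difference between
# the vectors for "he" and "she".
gender = vectors["he"] - vectors["she"]
gender = gender / np.linalg.norm(gender)

def bias_score(word):
    """How strongly a word's vector leans toward the 'he' pole."""
    return float(vectors[word] @ gender)

def debias(v):
    """One simple de-biasing step: project the gender component out."""
    return v - (v @ gender) * gender

print(bias_score("engineer"))  # 0.6  (leans toward "he")
print(bias_score("nurse"))     # -0.6 (leans toward "she")
print(float(debias(vectors["engineer"]) @ gender))  # 0.0
```

Projection is only one of several proposed de-biasing strategies, and deciding which directions to remove is itself a value judgment, which is exactly the harder question raised next.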

Of course, this raises an even trickier question: how do you decide what to de-bias? And how do you do it?

"The question is how do you link up the science of today with the history of the past?" Kang asked.

Ultimately, he said, the intelligence systems themselves aren't the problem, but the human beings those systems are emulating, and the standards used to build them.

Zou has one possible solution. He and his team have been exploring the biases of people in the U.S. over the past hundred years. To do this, they took digitized American texts from across that period and had algorithms analyze them.
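A minimal sketch of that idea: posit a separate embedding per decade and track how strongly an occupation word associates with a gender direction over time. The decades, vectors, and scores below are all invented to show the bookkeeping; the real study derived per-decade embeddings from historical corpora.

```python
import numpy as np

# Unit "he minus she" gender direction in a toy 2-D embedding space.
gender = np.array([1.0, 0.0])

# Hypothetical per-decade vectors for one occupation word. In practice,
# a separate embedding would be trained on digitized texts from each
# decade; these numbers are made up purely for illustration.
secretary_by_decade = {
    1910: np.array([0.5, 0.9]),
    1960: np.array([-0.2, 0.9]),
    2010: np.array([-0.6, 0.8]),
}

# A drifting score across decades would mirror a changing stereotype.
trend = {decade: float(v @ gender)
         for decade, v in secretary_by_decade.items()}
for decade, score in sorted(trend.items()):
    print(decade, round(score, 2))
```

Plotting such scores over time turns a century of text into a quantitative record of how associations shifted, the "powerful mirror" Zou refers to below.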

"That's one example where, if we use technology properly, it can really become a powerful mirror to study our own biases," Zou said.

The reality is that developments in AI will continue. Technology will enable business solutions that we haven't yet imagined. "What we need to do is partner and create an open dialogue about how we build a better framework for AI and immersive solutions that are not only good for business but also good for people," suggested Roy. "While documentaries like Bias can help start the conversation, it's up to the collective to be deliberate and ask the hard questions and make decisions about the kind of world we want to build."

At Ericsson, we believe the power of you defines the power of us. We are more than 100,000 people with diverse experiences, perspectives and ideas. It is our diversity and inclusion that brings us closer together, provides the catalyst for innovation and success, and helps us make a difference. Learn more about our perspective on diversity and inclusion, unconscious bias, and how we overcome it.
