Diversity & Inclusion Friday news round-up: Dec 14, 2018

Welcome to the latest edition of our Diversity & Inclusion Friday News Round-Up – a special “Bias” edition. We just welcomed a new member to our family, so for the next few weeks (until January) I won’t be sharing any new content, but I have prepared a few “Best of” special editions for you. Happy Friday!

Diversity & Inclusion Round Up

Head of Talent Acquisition, MMEA

Gender Bias

Who is the boss? As a promotion for their start-up competition, Uber and Girlboss published an advert illustrating gender bias. Worth watching!
Class Bias

An interesting article about class bias in the workplace. So-called “class migrants” (professionals born to blue-collar families) experience challenges on a regular basis, and according to studies, biases affect them not only during hiring but also once employed, e.g. at company events or in their career development.

Racial Bias

After an incident in the US, Starbucks closed 8,000 locations to offer a four-hour racial bias training workshop to all their employees. As part of the training, they shared this video from the award-winning documentary maker Stanley Nelson.

And just for fun: here is the video Trevor Noah and the Daily Show suggested Starbucks could use for their training.

Gender Bias

Researchers from Harvard University, the United States Naval Academy, and the Naval War College looked at language and gender bias in over 80,000 performance evaluations of students from the Naval Academy. The military is obviously a traditionally male environment, but there has been an increased effort to overcome this and provide equal opportunities. While there were no gender differences in objective measures (e.g. grades and fitness tests), the researchers found that evaluators used significantly more negative attributes for women than for men. Read more here.

Algorithmic Bias

MIT researcher Joy Buolamwini evaluated facial analysis programs from major tech companies like IBM and Microsoft, looking at how well they recognize gender across different skin tones. While error rates were marginal for white men, the programs struggled to process darker skin tones, especially for women, due to algorithmic bias (watch the video).

The Ericsson Blog

At the Ericsson Blog, we provide insight to make complex ideas on technology, innovation and business simple.