7: Unchecked Biases

 

SEASON 1: EPISODE 7


EPISODE OVERVIEW

With advancements in the everyday use of artificial intelligence and machine learning, researchers are finding that troubling human biases are making their way into our tech products as well.

On this episode of the T+P Podcast, Amber Anderson talks with scholar and best-selling author, Safiya Umoja Noble, Ph.D., about the consequences of unchecked biases in tech.

 

TOPICS COVERED

  1. Women and the internet: In 2009, Noble began evaluating search engines and found a series of layers that revealed how racism and sexism influenced the results of what would become the world’s most popular search engine, Google. Hidden in plain view was how the world, or at least the algorithm, perceived women of color: pornography dominated the first page of the search results for “Black girls,” “Asian girls,” and “Latina girls.”

  2. People and society: More diverse teams, drawing on a wider range of educational and professional backgrounds, help ensure that the people making your product can relate to the people buying it — a fundamental step toward eliminating bias. The key is to make certain that underrepresented and marginalized groups are given the same level of power and authority as their peers.

  3. What is diversity: The data shows that diverse teams are more valuable to companies than non-diverse teams, but it matters how inclusion is implemented. Businesses need to think far beyond gender when adding diversity to their teams. How businesses view and value the race, socioeconomic status, gender, and educational background of their people is just a start. If your entire team is filled with Ivy League graduates with technical backgrounds, you’re missing opportunities. Try recruiting from different schools or exploring different majors. Evaluating diversity in all forms — including intellectual diversity — is key.

  4. An introspective view: Your small team will one day be bigger. It’s time to invite people to the table who ask the hard questions and embrace differing views. If you’re taught that your way is the objective one, it’s hard to be introspective. How often is your team asked to examine its beliefs? Small startups should spend some time holding themselves accountable — investing in self-reflective practices and a diverse company culture promotes growth.

  5. Marketing + Ethics: The ethical limitations of traditional advertising channels are tied to the product and to public policy. The question of when and where targeted messages should enter people’s lives is important. What is a marketer’s ethical role, and where should regulation intercede?

  6. What’s next: How do we protect people moving forward? Noble believes new policies should be implemented to help protect people from biases. She also expects companies to be held more accountable for what happens on their platforms.


Safiya Noble

Safiya Umoja Noble, Ph.D., is an Associate Professor at UCLA in the Departments of Information Studies and African American Studies, and a visiting faculty member at the University of Southern California’s Annenberg School of Communication.

She is the author of a best-selling book on racist and sexist algorithmic bias in commercial search engines, entitled Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press). She holds a Ph.D. and M.S. from the University of Illinois at Urbana-Champaign.

She is regularly quoted for her expertise on issues of algorithmic discrimination and technology bias by national and international press including The Guardian, the BBC, CNN International, USA Today, Wired, Time, and The New York Times, to name a few.

Web: safiyaunoble.com

Twitter: @safiyanoble

LinkedIn


References:

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.

Noble, S. U., & Tynes, B. M. (Eds.). (2016). The Intersectional Internet: Race, Sex, and Culture Online. Peter Lang Digital Formations series.

Noble, S. U. (2016, June). Challenging the Algorithms of Oppression. Invited talk at the Personal Democracy Forum, hosted by New York University.

Noble, S. U. (2013). Google Search: Hyper-visibility as a Means of Rendering Black Women and Girls Invisible. InVisible Culture, 19.


Noble, S. U. (2012). Missed Connections: What Search Engines Say About Women. Bitch Magazine.

Noble, S. U. (2014). How Biased Are Our Algorithms? TEDxUIUC.

Rock, D., & Grant, H. (2016). Why Diverse Teams Are Smarter. Harvard Business Review.


Credits:

Produced by: Kai-Saun Anderson
Music by: Podington Bear
Photo: NASA


WANT MORE? SUBSCRIBE TO AND RATE THE SHOW

iTunes  •  Stitcher  •  Google Play

Thanks for listening! We hope you’ll subscribe on iTunes, Stitcher, Google Play, or wherever you get your podcasts.

Questions or comments? Email us at hi@toteandpears.com.