Pinterest’s team knows that people want to feel included. When platforms lack representation, they signal that the way someone looks or where they come from isn’t the ‘norm.’ It’s more important than ever to design inclusive systems that remove historical biases. Nadia Fawaz is the senior staff applied research scientist and technical lead of inclusive AI at Pinterest, which hosts over 400 million users who speak over 35 languages across 8 billion boards. In this keynote, Fawaz explains how ML models learn implicit bias and what algorithmic fairness means, how to design inclusive systems with cross-functional teams, how Pinterest learned from its errors and built models that learn from theirs, and how Pinterest has developed inclusive features, particularly around skin tone and hair patterns, to create a more engaging platform. Fawaz discusses how it is possible to change technology with intent.
Fawaz’s research and engineering interests include machine learning for personalization, AI fairness, and data privacy; her work aims to bridge theory and practice. Before joining Pinterest, she was a Staff Software Engineer in Machine Learning at LinkedIn, a Principal Research Scientist at Technicolor Research lab, and a Postdoctoral Researcher at the Massachusetts Institute of Technology’s Research Laboratory of Electronics.