Is it okay for machines of silicon and steel or flesh and blood to erase our contributions? Is it okay for a machine to erase you and me? Is it okay for machines to portray women as subservient? Is it okay for Google and others to capture data without our knowledge? These questions and new research led by Allison Koenecke inspired the creation of “Voicing Erasure”: a poetic piece recited by champions of women’s empowerment and leading scholars on race, gender, and technology.
A recent research study led by Allison Koenecke reveals large racial disparities in the performance of five popular speech recognition systems, with the worst performance on African American Vernacular English speakers. See Original Research
Voice recognition devices are known for "listening in" on our conversations and storing that information, often without our knowledge.
These systems are frequently given women's voices and subservient "personalities," which reinforces negative stereotypes of women as submissive.
Lead Author of “Racial Disparities in Automated Speech Recognition” Study