Amazon’s facial-recognition software is coming under fire again, this time from a group of prominent artificial intelligence researchers.
Equitable AI is focused on empowering people. Accountable AI is focused on holding creators and sellers of AI responsible for their impacts.
Facial recognition relies on AI to learn the patterns of a human face, and it is a widely used technology by corporations and the government.
The documentary follows AJL's founder on her mission to call for U.S. legislation to protect us against racial bias in algorithms.
AI experts, lawyers, and law enforcement urged US Congress to regulate the use of facial recognition technology.
Over 35 organizations joined the letter, which has been entered into the record at the House Homeland Security Committee hearing.
Facial recognition systems widely used by law enforcement misidentified people of color more often than white people.
Experts, advocates, community leaders, and concerned voters testify in front of the Joint Judiciary Committee on Tuesday, October 22.
A former Wall Street quant sounds an alarm on the mathematical models that pervade modern life — and threaten to rip apart our social fabric.
Sasha Costanza-Chock explores how community-led design can help dismantle structural inequality and advance collective liberation.
AJL's founder is an acclaimed TED talk speaker. Watch her famous talk highlighting her story and research that led her to launch the Algorithmic Justice League.
Debuted at the Museum of Fine Arts Boston, “The Coded Gaze” mini documentary follows Poet of Code Joy Buolamwini's personal frustrations with facial recognition software and the need for more inclusive code.
The Committee on Oversight and Reform hearing examines the use of facial recognition technology by government and commercial entities and the need for oversight.
Attendees at the workshop explore identity, gender presentation, face surveillance, and the consequences of algorithmic bias.
Gender Shades is an excavation of inadvertent negligence that will cripple the age of automation and exacerbate inequality if left to fester.
This article highlights AJL’s study that tested facial analysis technologies from Amazon, IBM, Microsoft, Face++, and Kairos.
Joy Buolamwini is featured in a Telegraph article highlighting her revolutionary work with the Algorithmic Justice League.
An Opinion piece in the New York Times about the AJL's research into biased AI, highlighting real-world examples like the hiring practices of HireVue’s algorithms.
A collaborative video response to IBM's "Dear Tech..." Ad which speaks to the potential of technology without addressing the role of tech companies in propagating harms.
A spoken word piece that highlights the ways in which artificial intelligence can misinterpret the images of iconic black women: Oprah, Serena Williams, Michelle Obama, and more.
Directed by Shalini Kantayya, Coded Bias illuminates the harms that AI poses over people's lives, especially minorities. Official Sundance Selection 2020.
On November 4th, the Coded Gaze Exhibition debuted at the Museum of Fine Arts, Boston, where the InCoding Manifesto was also screened.
AJL's founder believes the "hygiene" of artificial intelligence will determine its longer-term success.
In business, government, philanthropy, and arts, Fortune Magazine's sixth annual 50 Greatest World Leaders are transforming the world.
Forbes's annual 30 under 30 list that chronicles the brashest entrepreneurs across the United States and Canada features AJL's founder.
WIRED25 features AJL's founder and offers real hope that we can fix the mistakes of the past and still have a chance for a future we can survive.
Black women engineers, professors, and government experts speak about being on the front lines of the civil rights movement.
PBS Frontline interviews Joy Buolamwini on the biases of AI in conjunction with the release of their documentary, In The Age of AI.
The Algorithmic Justice League is quoted in an in-depth article about the harms of AI and how to avoid them. AJL's research is referenced in BBC News.
Joy Buolamwini writes an Op-Ed in TIME Magazine’s 2019 Optimists issue, guest-edited by Ava DuVernay.
Joy Buolamwini, a poet of code, tells stories that make daughters of diasporas dream and sons of privilege pause.
Joy Buolamwini participates in a debate at the Barbican on whether creativity is a uniquely human trait.
The Open Mind was the first to interview iconic civil rights leaders like Martin Luther King Jr. on national television.
On BBC Newsnight Live, Founder of AJL Joy Buolamwini reflects on facial recognition technology.
Facial recognition programs can be biased against darker skin tones and against women. NBC's Stephanie Ruhle explores why it matters.
In an interview with Soledad O’Brien, Joy says that facial recognition technology needs more regulation to ensure its accuracy.
AJL's founder sits down with Doha Debates Correspondent Nelufar Hedayat to talk about how existing power structures can lead to unintended bias in AI.
The Doha Debate tackles the most contentious question of all: Will AI help or harm humans globally?
Joy Buolamwini performs her spoken word piece “AI, Ain’t I A Woman?”, her response to algorithmic bias, in the Vision & Justice segment.
“The Coded Gaze” keynote at Stanford HAI 2019 Fall Conference.
As artificial intelligence continues to transform our daily lives and power our world, are we stopping to ask ourselves, "Do these technologies benefit all of us?"
We risk losing the gains made with the civil rights movement and women's movement under the false assumption of machine neutrality.
In the wake of #BlackLivesMatter and #TimesUp, this exhibition highlights that democracy, time and memory are as fragile as our breath.
AJL's founder is interviewed in Part 2 of the episode "Can We Trust The Numbers?".
Joy Buolamwini, Latanya Sweeney, and Darren Walker come together to discuss the limits of technology in the face of algorithmic bias.
The Gender Shades research used 1,270 faces to reveal that systems from IBM, Microsoft, and Face++ were better at guessing the gender of male faces than female faces.
“AI: More Than Human” looks at AI’s real-life application in fields such as healthcare, journalism and retail.
A collective of tech leaders sign a letter demanding that the tech industry stop the spread of hate and terrorism on digital platforms.
Joy Buolamwini founded the Algorithmic Justice League to make people aware of the bias embedded in our networks.
From law enforcement to talent acquisition, computer vision and algorithms are increasingly making selections with societal consequences.
“Can You See My Face?” is a poetic video by AJL's founder that illustrates the bias in coded systems.
How intelligent can artificial intelligence be? And more importantly: what effects will the advances in this field have on our society?
The 13th edition of GETXOPHOTO International Image Festival addresses the challenges faced by individuals in a present world powered by AI.
In the 19th century, criminologists used the new medium of photography to classify and predict the "criminal type."
Testimony at the U.S. House of Representatives Committee hearing on Artificial Intelligence: Societal and Ethical Implications.
A letter to Jeff Bezos uncovering bias in Amazon's Rekognition software and urging him to stop selling it to law enforcement.
Joy Buolamwini discusses “Compassion Through Computation: Fighting Algorithmic Bias” at the World Economic Forum.
Brooklyn Tenants Protest Against Facial Recognition Entry Systems and AJL signs letter to New York State Homes and Community Renewal.
The statement ahead of the White House's summit with technology companies on violent online extremism.
New York bills A6787/S5140 would ban the use of biometric surveillance technology in schools and require a commission to study the effects of this sort of technology on children.
The aim of the Global Tech Panel is to foster more cooperation between diplomacy and technology to address challenges and threats.
AJL at Data for Black Lives's opening panel: "We are the Leaders We've Been Looking For: Organizing for Algorithmic Accountability".
Safiya Noble raises clear alarms about the ways Google shapes our lives, minds, and attitudes.
When Safiya Googled keywords "black girls," "latina girls," and "asian girls," the first page of results were pornography.
Data skeptic Cathy O’Neil uncovers the dark secrets of big data, showing how our "objective" algorithms could in fact reinforce human bias.
A powerful investigative look at data-based discrimination―and how technology affects civil and human rights and economic equity.
Eubanks shows the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America.
Our collective enthusiasm for applying computer technology to every aspect of life has resulted in poorly designed systems.
Ruha Benjamin cuts through tech-industry hype to understand how new technologies can reinforce White supremacy and deepen social inequity.
Although algorithmic auditing has emerged as a key strategy to expose systematic biases in software platforms, we struggle to understand the real impact of these audits.
Algorithms falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces.
Big Bang Data explores the intersections of culture, technology, and society in the digital age.
AVATARS//futures documents how artists represent their bodies, their ideas, their nations, and their dead in today’s digital world.
The National Science and Media Museum’s exhibition explores trends around internet-connected devices, which now outnumber humans.
When the Future Is Now: On Understanding AI and Being a Misfit Artist in a Family of Scientists.
Coded Bias, which will premiere at Sundance, follows the journey to pass the first-ever legislation to govern A.I. in the US.
Directed by Shalini Kantayya, Coded Bias sheds light on the harms and biases of artificial intelligence impacting our society.
AJL's founder urges public and private organizations including IBM, Microsoft, Google, Facebook, Amazon, and more to sign the Safe Face Pledge.
A powerful spoken word piece that discusses how even iconic Black women figures are mislabeled and mis-gendered by facial analysis tools.