CSETv1 Taxonomy Classifications
Taxonomy Details
Incident Number
115
Special Interest Intangible Harm
no
Notes (AI special interest intangible harm)
There is no evidence or indication that the system led to any special interest intangible harms through its use or deployment.
Date of Incident Year
2020
Date of Incident Month
07
Date of Incident Day
CSETv1_Annotator-1 Taxonomy Classifications
Taxonomy Details
Incident Number
115
Special Interest Intangible Harm
yes
Notes (AI special interest intangible harm)
It is unclear whether any harmed entities can be identified in this incident. Although the AI did exhibit bias against women, the harm extended no further than the tool's ineffectiveness.
Date of Incident Year
2020
CSETv1_Annotator-3 Taxonomy Classifications
Taxonomy Details
Incident Number
115
Incident Reports
Reports Timeline
Some tech companies make a splash when they launch, others seem to bellyflop.
Genderify, a new service that promised to identify someone’s gender by analyzing their name, email address, or username with the help of AI, looks firmly to be in th…
Just hours after making waves and triggering a backlash on social media, Genderify — an AI-powered tool designed to identify a person’s gender by analyzing their name, username or email address — has been completely shut down.
Launched last…
The creators of a controversial tool that attempted to use AI to predict people's gender from their internet handle or email address have shut down their service after a huge backlash.
The Genderify app launched this month, and invited peop…
Variants
Similar Incidents