Jul 08, 2020
Tiny Images, Outsized Biases: Why MIT withdrew the Tiny Images dataset
MIT withdrew a popular computer vision dataset after researchers found that it was rife with social bias. The dataset, Tiny Images, a collection of nearly 80 million 32-by-32 pixel color photos, contained racist, misogynistic, and otherwise demeaning labels.