Amazon’s facial analysis technology might have a woman problem. A new study from MIT and University of Toronto researchers has found that the technology tends to mistake women, especially those with dark skin, for men. Released Thursday, the study found that the facial analysis technology misidentified darker-skinned women as men 31 percent of the time. Lighter-skinned women, by comparison, were misidentified just 7 percent of the time, and men of any skin tone were virtually never misidentified.

Facial detection software dates back to the late 1980s. Available for commercial use, it is designed to identify someone from an image or video. Companies market the technology to consumers who need to sort through collections of images for the same face, to retailers who want to know whether customers are having a positive in-store experience, and in other settings.

Read more at Vox.