Deep Hysteria is a still-image series that repurposes algorithmic bias in the service of unraveling a deep human bias. Artworks are generated using deep learning algorithms trained on still frames of thousands of YouTubers speaking to the camera. Generated individuals are then algorithmically gender-adjusted, and the variations are fed to Amazon Rekognition, a commercial deep learning–based facial analysis service that attempts to classify faces according to the subject’s gender, age, and emotional appearance. Despite the marketing of such tools, reading emotions solely by analyzing a person’s face is a feat that neither humans nor “AIs” can reliably perform. Further, these deep learning algorithms are themselves trained on data categorized by humans, so they reflect human biases. The side-by-side images in Deep Hysteria compare Rekognition’s interpretations of similar expressions on more masculine and more feminine versions of the same face. The comparisons interrogate how humans perceive emotion differently, often in alignment with stereotypes, when observing people of differing genders.
In this talk, Amy Alexander will discuss historical and contemporary social perceptions of female “hysteria” and the ways such subtle social biases embed themselves in deep learning systems. She’ll examine how recent research on race and gender bias in objective facial analysis systems, such as gender and age classifiers, informs current research on bias in subjective systems like emotion classifiers. Finally, she’ll walk through the process behind the Deep Hysteria project, proposing it as one approach by which art and research can redeploy algorithmic bias in the service of helping us better recognize deeply embedded human biases.