- 20th September 2017


Preparation:

Deep Neural Networks (DNNs) can detect sexual orientation from faces


This study by Yilun Wang and Michal Kosinski claims to show that faces contain information about sexual orientation and that AI can interpret this information better than humans.



“Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 71% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expressions and grooming styles.”

As companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, the authors argue that their findings expose a threat to the privacy and safety of gay men and women.
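A minimal sketch of the kind of evaluation described in the quote above, not the authors' actual pipeline: it assumes face images have already been reduced to fixed-length feature vectors (here random placeholder data stands in) and trains a simple binary classifier, reporting accuracy and AUC. All names and numbers below are illustrative assumptions.

```python
# Illustrative sketch only (not the study's method): evaluating a binary
# classifier on pre-computed facial feature vectors. The data is random
# placeholder data, so results hover around chance level by construction.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)

# Placeholder: 1000 "faces", each a 128-dimensional feature vector,
# with a binary label (the attribute the classifier tries to predict).
X = rng.normal(size=(1000, 128))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

pred = clf.predict(X_test)
scores = clf.predict_proba(X_test)[:, 1]

print("accuracy:", accuracy_score(y_test, pred))
# AUC corresponds to the pairwise "correctly distinguish in X% of cases"
# framing quoted above: the probability that a randomly chosen positive
# example is ranked above a randomly chosen negative one.
print("AUC:", roc_auc_score(y_test, scores))
```

With random labels this stays near 50%, which is the chance baseline against which figures like the quoted 54–61% (human judges) and 71–91% (classifier) are meant to be read.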




> History of physiognomy > universally rejected as a mix of superstition and racism disguised as science. 
Jenkinson, J. (1997). Face facts: A history of physiognomy from ancient Mesopotamia to the end of the 19th century. The Journal of Biocommunication, 24.