Scientists have discovered that artificial intelligence is capable of figuring out whether someone is gay or straight simply by looking at pictures of their faces, the Guardian reports.
A study out of Stanford University, published in the Journal of Personality and Social Psychology, analyzed a sample of more than 35,000 facial images of men and women from a U.S. dating website. Researchers extracted features from the images using "deep neural networks," sophisticated mathematical systems that learn to recognize patterns in large sets of data. The team found that an algorithm could tell whether men were gay or straight 81 percent of the time, compared with 74 percent for women.
The Guardian notes that the finding raises ethical questions about face-detection technology and privacy risks for LGBT people. The study also bears on the biological origins of sexual orientation: it provides "strong support" for the theory that sexual orientation stems from hormonal exposure before birth (in other words, that being queer is not a choice), although the lower accuracy for women may suggest that female sexuality is more fluid.
Per the Guardian:
The research found that gay men and women tended to have “gender-atypical” features, expressions and “grooming styles”, essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.
And perhaps unsurprisingly, the study also found that the algorithm is far better at identifying a person's sexual orientation than human judges, who accurately identified men as gay or straight 61 percent of the time and women 54 percent of the time. The authors explained that, in general, "faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain."