Artificial Intelligence has a “gaydar”
Artificial intelligence (AI) is going to change the way we live in many ways, but we probably didn’t expect that one of its applications would be in the area of sexual orientation. AI can now guess with striking accuracy whether you are gay or straight from a simple photo analysis.
This is according to a Stanford University study which says that machines can tell with 81 percent accuracy whether a man is straight or gay, and with 74 percent accuracy for women. The study, published in the Journal of Personality and Social Psychology, took a sample of over 35,000 facial images of people from profiles on a US dating site, and this only heightens the concerns of people who question the use of facial recognition technology on privacy grounds. The other issue is the security of data posted on social media sites, which looks like a never-ending topic.
In the case of the Stanford study, researchers Michal Kosinski and Yilun Wang used a deep neural network to analyse the large data set and predict the sexual orientation of the individuals whose photos were used.
According to the study, the algorithm outperformed humans at telling accurately whether someone was gay. While the machine was accurate up to 81 percent of the time for men, humans could only manage 61 percent for men and 54 percent for women. In fact, when the system was given up to five images per person, accuracy increased to 91 percent for men and 83 percent for women, and as with any AI-based system, the results might keep getting better with improved data input.
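The jump in accuracy from one photo to five comes from a simple statistical effect: averaging a classifier’s per-image scores smooths out per-photo noise before the final decision is made. The toy simulation below illustrates that effect only; the classifier and its score distributions are invented stand-ins, not the study’s actual model.

```python
import random

random.seed(0)

def classifier_score(true_label):
    # Hypothetical per-image classifier score in [0, 1]:
    # noisy, but centred above 0.5 for label 1 and below it for label 0.
    centre = 0.6 if true_label == 1 else 0.4
    return min(1.0, max(0.0, random.gauss(centre, 0.25)))

def accuracy(n_images, trials=5000):
    # Classify each simulated person by averaging scores over n_images photos.
    correct = 0
    for _ in range(trials):
        label = random.randint(0, 1)
        mean = sum(classifier_score(label) for _ in range(n_images)) / n_images
        predicted = 1 if mean > 0.5 else 0
        correct += predicted == label
    return correct / trials

print(accuracy(1))  # one photo: modest accuracy
print(accuracy(5))  # five photos: noticeably higher
```

The same per-image classifier becomes more accurate purely because the averaged score is a less noisy estimate, which is consistent with the improvement the study reports for multi-image samples.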
But why is it that both humans and the artificial intelligence system were able to tell someone’s sexual orientation with some degree of accuracy? The answer is simple: we use our “stereotypical” understanding to make initial judgements about just about anything, and it would appear that that’s exactly what the machine does too. After all, it was built by humans.
Gay men and women, at least according to the study, tended to have “gender-atypical” features and grooming styles that made them appear more like the opposite sex, but there’s more. The study says the machine also used other properties, like jaw and nose structure, to arrive at its conclusions. Gay men typically had narrower jaws while gay women had larger jaws. Gay men also typically had longer noses and bigger foreheads than their straight counterparts, while gay women had smaller foreheads than straight women.
This just means our bodies contain more information about us than we think. Medical experts, for example, can often tell what might be wrong with someone by looking at their eyes, or perhaps their tongue. The rest of us may not be able to tell because we don’t have access to such information, and just as an expert gets better at it with experience, so does the machine.
Some critics think the “gaydar” (the AI system that tells sexual orientation) should be stopped outright, while others see it as part of a broader facial recognition technology push which must be stopped, even though governments say they might use such systems for security and for faster processing of human traffic at airports. The study, however, suggests that sexual orientation might be a result of exposure to certain hormones before birth, which would mean people are actually born gay.
This was not a full LGBT study, since transgender people were not studied and only limited racial groups were covered, but it could very well be on its way there. The big challenge for systems like this has always been what they would be used for in the wrong hands. They could very well become a source of abuse, especially in climes where homosexuality is still opposed at different levels of society. Companies could use such a system as a discriminatory tool simply by taking a potential employee’s photo from a social media site, and that is exactly the fear critics have expressed.
Just last month, the AI-powered face transformation app FaceApp had to pull a feature that let users change the apparent ethnicity of people in photos. It was seen as racist by many, and the company has since apologised.