AI language assistants reinforce gender bias, UN report says


Artificial intelligence voice assistants with female voices are reinforcing existing gender stereotypes, according to a new UN report.

The UNESCO report, titled "I'd Blush If I Could," examines the effects of projecting female voices onto digital assistants such as Amazon's Alexa and Apple's Siri, arguing that these assistants portray women as "submissive and tolerant of poor treatment." The report takes its title from a response Siri used to give when a user said, "Hey Siri, you're a b-tch." The researchers also argue that technology companies have failed to take measures to protect the assistants from abusive or gendered language from users.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," the researchers write. "The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility."

Research has long established that artificial intelligence has a problem with gender and racial bias. Meanwhile, the use of smart speakers continues to grow rapidly: research firm Canalys estimated last year that about 100 million smart speakers would be sold in 2018.

"Technology always reflects the society in which it is developed," Saniye Gülser Corat, UNESCO's director of gender equality, tells TIME. "The biases reflect an attitude that almost condones a 'boys will be boys' attitude, and it magnifies gender stereotypes." Corat says that projecting female voices and personalities onto AI technology reinforces the impression that women usually hold assistant jobs and should be docile and submissive. Even as companies move forward with artificial intelligence, she says, they are moving backwards to a Mad Men-like era in which women were expected to serve rather than lead.
"Stereotypes do matter because they come back to affect how young girls and women see themselves, and the way they form dreams and aspirations for the future," she says. "It's almost like going back to the image of women that was held in the 1950s or 1960s."

The report calls for more women to be involved in creating artificial intelligence technologies, citing research finding that such machines "must be carefully controlled and instilled with moral codes." The researchers also call on technology companies to train AI machines to respond to human commands and questions in a gender-neutral way, and to build gender-sensitive data sets for use in artificial intelligence applications. Most of the data currently used to train machines is sexist, they find. "Machine learning is 'bias in, bias out'," they write. "A voice assistant's educational diet is crucial."
Image copyright: picture alliance via Getty Images