Microsoft's Cortana digital personal assistant has been the victim of sexual harassment from human users, including sexual and insulting comments. This is becoming a bigger trend as artificial intelligence (AI) tech makes virtual assistants like Apple's Siri and Google Now more human-like.
Deborah Harrison, an editorial writer for Microsoft's Cortana division, told CNN that users often try to talk dirty to Cortana, confess their love for her, or attempt to role play.
One key factor is that most of today's AI personal assistants have female voices, according to Digital Trends. They include Apple's Siri and Amazon's Alexa.
The science fiction movie "Her" (2013) likewise features a seductive futuristic operating system, voiced by Scarlett Johansson.
However, the use of female voices for virtual assistants does more than reinforce old-fashioned gender roles of women serving men. Some Windows 10 users are also asking their digital assistants risqué questions.
Harrison explains that when Cortana launched in 2014, many of the early queries were about her sex life. However, the Cortana team is helping the AI fight back.
The Microsoft employee explained that Cortana will now get angry if people say rude things. Eight Microsoft writers, including Harrison, are responsible for figuring out how Cortana responds to questions, and they are careful about how they design the virtual helper.
Cortana is clearly female. A female avatar is used to represent her, and a human woman provides her voice.
However, Harrison says the Microsoft team deliberately kept Cortana free of stereotypically female traits, such as apologizing constantly or being subservient to men.
Studies show that people feel comfortable talking with voice assistants, and features such as names, genders, emotions, and personalities help build trust with users.
Nevertheless, some virtual assistant developers take a different approach. Robin Labs CEO Ilya Eckstein says there is a big demand for a personality that is friendlier and more passive, according to CNN.
Microsoft says that talking to people who have worked as personal assistants helps the team build a digital version that behaves like a real one. This improves the Microsoft product and, the company hopes, helps reduce sexual harassment in the real world.