Many observers expect the rise of virtual assistants, such as Apple’s Siri and Amazon’s Alexa, to be among the top technology news stories of 2017.
Yet a growing number of people, both inside and outside the tech field, find the gender assigned to Siri, Alexa, and other virtual assistants problematic. These assistants are designed and marketed with female voices, which critics say may reinforce stereotypes of women as helpmeets rather than as independent or authoritative figures, even though a helpmeet is not itself a stereotype but a biblical construct that in no way diminishes a woman’s individuality, intellect, or abilities.
Approachable and Unintimidating Helpers = Women?
Virtual assistants are designed to assist: to make life more comfortable and convenient for their users. For some, that places them in a subservient role to whichever human is using them. They can be asked for help with a wide range of tasks, and many can prompt their users with questions.
The developers of virtual assistants therefore had to program them to seem approachable rather than intimidating. When posing questions in particular, they had to offer help or instructions without seeming annoyed or appearing to second-guess the user’s choices.
The solution of many developers, as a recent New Yorker article pointed out, was to make the voices female. That way, a virtual assistant could ask “Do you want me to turn the radio on now?” without fear of seeming patronizing, or “Do you want me to show you?” without it being interpreted as implying a skill shortage on the part of the user, in the way male voices might have been.
Given that the overwhelming majority of artificial intelligence workers in tech are men – women constituted less than 14% of attendees at one major conference – there is growing concern that this gender disparity will shape the way tech workers conceptualize virtual assistants. Some worry that male developers may be more likely to conceive of women as subordinates rather than equals.
One magazine cover drew fire for depicting robots as women seated at typewriters, reinforcing the stereotype of the male boss and the female typist. The picture might have come out of the sexist era of early Mad Men, with the bots as the only futuristic element in a retrograde scene.
However, designers have many choices in how virtual assistants are presented. First, it should be pointed out that many currently available virtual assistants can be programmed with either a male or a female voice. Advertisements and media, such as the film Her, in which a man falls in love with his Scarlett Johansson-voiced virtual assistant, tend to portray them as female, but that is not the only option.
Second, some virtual assistants are designed to be genderless. One notable example is Kai, an online banking assistant whose name is intended to be genderless. If users interpret Kai as female and ask flirty questions, it is programmed to respond, “I’m picturing white sand and a hammock. Try me again when you’re ready.”
As the New Yorker points out, virtual assistants can be programmed to exhibit the qualities that make them approachable without resorting to gendered traits. Sony’s Aibo robots, for example, flash lights that resemble a smile, giving an appearance of engagement and extroversion as they serve.
With thought and care, designers can develop the most engaging and appropriate personae for their bots.