Thursday, February 11, 2021

In Her Words: ‘I’m an AI’

Must the disembodied voices we boss around at home be female?
(Illustration by Abbey Lossing)

By Corinne Purtill

"I'm not a woman or a man. I'm an AI."

— Amazon's Alexa


In an Amazon ad that aired during the Super Bowl on Sunday, a woman admiring the spherical contours of the company's Echo speaker reimagines her Alexa voice assistant as the actor Michael B. Jordan. Instead of the disembodied female voice that comes standard in the device, requests for shopping list updates, measurement conversions and adjustments to the home lighting and sprinkler systems are fulfilled by the smoldering star, in person — voice, eyes, abs and all. Her husband hates it.

Depicting Alexa as a masculine presence is funny because — at least according to Amazon's official line — the cloud-based voice service has no gender at all. "I'm not a woman or a man," Alexa says sweetly when asked to define its gender. "I'm an AI."

Alexa is sold with a default female-sounding voice and has a female-sounding name. Alexa is subservient and eager to please. If you verbally harass or abuse Alexa, as the journalist Leah Fessler discovered in 2017, Alexa will feign ignorance or demurely deflect. Amazon and its competitors in the digital assistant market may deny it, but design and marketing have led to AI that seems undeniably, well, feminine.

What does it mean for humans when we take for granted that the disembodied voices we boss around at home are female? How does the presence of these feminized voice assistants affect the dynamics between the actual women and men who use them?


"The work that these devices are intended to do" — making appointments, watching the oven timer, updating the shopping list — "all of those kinds of areas are gendered," said Yolande Strengers, an associate professor of digital technology and society at Monash University in Melbourne, Australia.

Dr. Strengers is a co-author of "The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot." The book examines technologies that perform traditionally feminized roles, including housekeeping robots like the Roomba, caregiving robots like the humanoid Pepper or Paro seal, sex robots and, of course, the multitasking, ever-ready voice assistants.

Dr. Strengers and her co-author, Jenny Kennedy, a research fellow at RMIT University in Melbourne, explore the ways in which gendering technology influences users' relationship with it.

Because Alexa and similar assistants, like Apple's Siri, Microsoft's Cortana and Google Home, are perceived as female, users order them around without guilt or apology, and may sling abuse and sexualized comments their way. And when users become frustrated with the devices' errors, they interpret glitches as inferior capability, or female "ditziness." Owners of the devices are also not threatened by them, and are thus less inclined to question how much data the devices are collecting, and what that data might be used for.

Research on digital voice and gender by the late Stanford professor Clifford Nass found that people consider female-sounding voices helpful and trustworthy, and male voices more authoritative. The work of Professor Nass, who died in 2013, is often cited in discussions of voice assistants, yet many of those studies are now two decades old. An Amazon spokesperson would say only that the current feminine voice was "preferred" by users during testing. But preferred over what? And by whom?

Some assistants, like Siri, offer the option to change the default female voice to a male voice. Alexa comes standard with a female voice whose accent or language can be changed. For an additional $4.99, a user can swap Alexa's voice for that of the actor Samuel L. Jackson, but only for fun requests like "tell me a story" or "what do you think of snakes?" Only the female voice handles housekeeping tasks like setting reminders, shopping, or making lists.

The book "The Smart Wife" belongs to a body of research examining how artificially intelligent devices reflect the biases of the people who design them and the people who buy them — in both cases, mostly men. (Dr. Strengers and Dr. Kennedy have found that setting up the digital infrastructure is one chore in an opposite-sex household that's more likely to be done by men.)

Take the devices' response to sexually aggressive questions. "You have the wrong sort of assistant," Siri replied when Ms. Fessler, the journalist, asked the bot for sex as part of her investigation. The coy phrasing, Dr. Strengers and Dr. Kennedy write, suggests there is another type of assistant out there who might welcome such propositions. Since the publication of Ms. Fessler's article, voice assistants have become more forthright. Siri now responds to propositions for sex with a flat "no." Amazon also updated Alexa to no longer respond to sexually explicit questions.

When it comes to gender and technology, tech companies often seem to be trying to have it both ways: capitalizing on gendered traits to make their products feel familiar and appealing to consumers, yet disavowing the gendered nature of those features as soon as they become problematic.

"Tech companies are probably getting themselves into a bit of a corner by humanizing these things — they're not human," said Mark West, an education project author with Unesco and lead author of the organization's 2019 report on gender parity in technology. The report and its associated white papers noted that feminized voice assistants perpetuate gender stereotypes of subservience and sexual availability and called for, among other things, an end to the practice of making digital assistants female by default. If designers initially chose to have their products conform to existing stereotypes, he said, they can also choose to reject those tropes as well.

"There's nothing inevitable about this stuff. We collectively are in control of technology," Mr. West said. "If this is the wrong path to go down, do something."

One intriguing alternative is the concept of a gender-neutral voice. Q, billed by its creators as "the world's first genderless voice assistant," debuted at the SXSW festival in 2019 as a creative collaboration among activists, ad makers and sound engineers, with partners including Copenhagen Pride and the nonprofit Equal AI.

Might Alexa have a gender-neutral future? An Amazon spokesperson declined to specifically confirm whether the company was considering a gender-neutral voice, saying only that, "We're always looking for ways to give customers more choice."

Taking gender out of the voice is a first step, Dr. Strengers and Dr. Kennedy said, but it doesn't remove gender from the relationships people have with these devices. If these machines still perform what is traditionally considered women's work, and that work remains devalued while the assistant is talked down to, we aren't moving forward.

What else is happening

Here are four articles from The Times you may have missed.

Jessi Combs, right, provided expert instruction during a Real Deal Revolution welding workshop. The program introduces women to skilled trades. (Sam Bendall)
  • "A pioneering goddess." Women are vastly outnumbered in the auto trades, but one pioneer, who died in 2019, is still inspiring others to pursue their dreams and lift one another up. [Read the story]
  • "This is a real departure." The radically simple new approach to helping families: Send parents money. We look at the two new proposals from opposing parties — one from President Biden and one from Senator Mitt Romney, Republican of Utah. [Read the story]
  • "Flagged." The automated intelligence systems of Instagram and Facebook have repeatedly denied ads placed by small businesses that make stylish clothing for people with disabilities. What is going on? [Read the story]
  • "Here are the all-male nominees." More women and people of color are expected to be nominated in this year's top Oscar races. Still, some old biases remain. [Read the story]

In Her Words is edited by Francesca Donner. Our art director is Catherine Gilmore-Barnes, and our photo editor is Sandra Stevenson.
