Amazon uses child’s dead grandma in morbid demo of Alexa audio deepfake

The 4th-gen Amazon Echo Dot smart speaker.

Amazon

Amazon is figuring out how to make its Alexa voice assistant deepfake the voice of anyone, dead or alive, with just a short recording. The company demoed the feature at its re:MARS conference in Las Vegas on Wednesday, using the emotional trauma of the ongoing pandemic and grief to sell interest.

Amazon’s re:MARS conference focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, showed off a feature being developed for Alexa.

In the demo, a child asks Alexa, “Can grandma finish reading me The Wizard of Oz?” Alexa responds, “Okay,” in her typical feminine, robotic voice. But next, the child’s grandmother’s voice comes out of the speaker to read L. Frank Baum’s tale.

You can watch the demo below:

Amazon re:MARS 2022 – Day 2 – Keynote.

Prasad only said Amazon is “working on” the Alexa capability and didn’t specify what work remains or when/if it will be available.

He did provide some minute technical details, however.

“This required invention where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in a studio,” he said. “The way we made it happen is by framing the problem as a voice-conversion task and not a speech-generation task.”

Prasad very briefly discussed how the feature works.
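To make that distinction concrete, here is a minimal, purely illustrative Python sketch of the kind of pipeline Prasad’s framing implies. It is not Amazon’s code, and every function in it is a hypothetical placeholder: a speech-generation approach would need a model trained on hours of the target speaker, while the voice-conversion framing synthesizes speech in a stock voice first and then re-renders it using vocal characteristics extracted from a short sample.

# A hypothetical sketch, not Amazon's implementation. All names below
# are placeholders invented for illustration.

def generate_speech(text: str) -> bytes:
    # Speech generation: a generic text-to-speech model renders the text
    # in a stock voice. Cloning a specific person this way traditionally
    # requires hours of studio recordings of that person.
    return f"<stock-voice audio of {text!r}>".encode()

def extract_speaker_profile(reference_audio: bytes) -> list[float]:
    # Distill the target speaker's vocal characteristics (timbre, pitch)
    # from under a minute of reference audio. Dummy values stand in for
    # a learned embedding.
    return [0.0] * 256

def convert_voice(stock_audio: bytes, profile: list[float]) -> bytes:
    # Voice conversion: re-render already-synthesized speech so it
    # carries the target speaker's characteristics.
    return b"<same words, rendered in the target speaker's voice>"

# The reframing: synthesize in any voice first, then convert, so a short
# recording of the target speaker is enough.
stock = generate_speech("Grandma finishes reading The Wizard of Oz.")
profile = extract_speaker_profile(b"<under a minute of grandma's voice>")
output = convert_voice(stock, profile)

The payoff of this framing, as Prasad described it, is that only the conversion step needs to know anything about the target speaker, which is why a sub-minute sample can suffice.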

Of course, deepfaking has earned a controversial reputation. Still, there has been some effort to use the tech as a tool rather than a means for creepiness.

Audio deepfakes in particular, as noted by The Verge, have been leveraged in the media to help make up for when, say, a podcaster messes up a line or when the star of a project dies suddenly, as happened with the Anthony Bourdain documentary Roadrunner.

There are even cases of people using AI to create chatbots that work to communicate as if they are a lost loved one, the publication noted.

Alexa wouldn’t even be the first consumer product to use deepfake audio to fill in for a family member who can’t be there in person. The Takara Tomy smart speaker, as pointed out by Gizmodo, uses AI to read children bedtime stories with a parent’s voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. Notably, though, this differs from Amazon’s demo in that the owner of the product decides to provide their vocals, rather than the product using the voice of someone likely unable to give their permission.

Beyond worries of deepfakes being used for scams, rip-offs, and other nefarious activity, there are already some troubling concerns about how Amazon is framing the feature, which doesn’t even have a release date yet.

Before showing the demo, Prasad talked about Alexa giving users a “companionship relationship.”

“In this companionship role, human attributes of empathy and affect are key to building trust,” the exec said. “These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

Prasad added that the feature “enables lasting personal relationships.”

It’s true that countless people are in serious search of human “empathy and affect” in response to emotional distress brought on by the COVID-19 pandemic. However, Amazon’s AI voice assistant isn’t the place to satisfy those human needs. Alexa also can’t enable “lasting personal relationships” with people who are no longer with us.

It’s not hard to believe that there are good intentions behind this developing feature and that hearing the voice of someone you miss can be a great comfort. We could even see ourselves having fun with a feature like this, theoretically. Getting Alexa to make a friend sound like they said something silly is harmless. And as we’ve discussed above, there are other companies leveraging deepfake tech in ways similar to what Amazon demoed.

But framing a developing Alexa capability as a way to revive a connection to late family members is a giant, unrealistic, problematic leap. Meanwhile, tugging at the heartstrings by bringing in pandemic-related grief and loneliness feels gratuitous. There are some places Amazon doesn’t belong, and grief counseling is one of them.
