
Amazon’s Alexa is digitally raising the dead


“I am haunted.”

This is one of several reactions on social media to Amazon.com Inc.'s Alexa digital assistant impersonating a grandmother reading an excerpt from “The Wonderful Wizard of Oz.”

During a company presentation on Wednesday, Alexa chief scientist Rohit Prasad attempted to demonstrate the digital assistant’s humanlike demeanour, Bloomberg reported.

Prasad stated that he was surprised by the companionable relationship users developed with Alexa and wanted to investigate this further. Human characteristics such as “empathy and affect” are essential for establishing trust with others, he said.

During the ongoing pandemic, when so many of us have lost someone we love, AI cannot take away the pain of loss, he said, but it can certainly make those memories last.

Judging from the presentation, Amazon is pitching the service as a tool for digitally raising the dead. In a subsequent interview on the sidelines of Amazon’s re:MARS technology conference in Las Vegas, however, Prasad clarified that simulating the voices of dead people was not the service’s primary purpose.

“It’s not about people who are no longer with you,” he explained. “But it’s about your grandmother; if you want your child to hear grandma’s voice, you can do so if she is unavailable. That is something I would like.”

The creep factor dominated the discussion as the presentation spread across the internet, but more serious concerns emerged as well. One was the possibility of using the technology to create deepfakes: taking a legitimate recording of someone’s voice and using it to make them appear to say something they never actually said.

Siwei Lyu, a computer science and engineering professor at the University at Buffalo whose research focuses on deepfakes and digital media forensics, expressed concern about the development.

“There are certainly benefits to Amazon’s voice conversion technologies, but we should be aware of potential misuses,” he said. “For example, a predator can pose as a family member or a friend over the phone to entice unsuspecting victims, and a forged audio recording of a high-level executive commenting on her company’s financial situation could send the stock market haywire.”

Amazon did not specify when the new Alexa feature would be available, but the underlying technology could make such mischief much easier in the future. According to Prasad, Amazon has learned to simulate a voice from less than a minute of a person’s speech; previously, doing so required hours of recordings in a studio.
