Amazon announced new AI capabilities for its Alexa products last week, based on a model it’s calling AlexaLLM (LLM stands for “large language model”). The technology will make Alexa “more personalized to your family” and allow it to remember relevant context throughout conversations like a human, Amazon said.

But along with those new capabilities, Dave Limp, Amazon’s senior vice president of devices and services, said the company would use some users’ voice interactions with Alexa to train its AI model.

Amazon says the new AlexaLLM is “the largest integration of a large language model” that provides real-time services across a suite of devices. It could be integrated with smart speakers such as the Echo or with Ring doorbells, which monitor visitors at the door and can control door locks.

Like any LLM, though, it requires training and updating. 

In response to a viewer question in a Bloomberg TV interview, Limp said that by agreeing to use a more “customized” version of Alexa, users would be volunteering their voice data and conversations for Amazon’s LLM training purposes.

It’s not clear how much voice data is actually necessary to train Amazon’s models and to what degree it might be used for other purposes. 

An Amazon spokesperson said in an email: “Customers can still access the same robust set of tools and privacy controls that put them in control of their Alexa experience today. For example, customers will always know when Alexa is listening to their request because the blue light indicator will glow and an optional audible tone will sound.”

Alexa is activated when it hears one of its wake words, such as “Alexa,” “Echo” or “Computer.” With AlexaLLM, though, users can enable a new Visual ID function, which lets them activate Alexa not with a wake word but simply by facing their smart display devices.

Another feature, “Alexa Let’s chat,” allows users to have extended conversations with Alexa, making as many follow-up requests as they want without repeating the wake word.

Amazon’s spokesperson said that while audio data from the interactions with Alexa would be stored in the cloud, no images or videos would be stored. Users can disconnect their cameras by either pushing the camera-off button or using the built-in camera shutter. 

John Davisson, the director of litigation and senior counsel at the Electronic Privacy Information Center, said consumers should question Amazon’s interest in keeping and using voice data. 

“I don’t think we should accept that Amazon needs to retain those data for product improvement, and consumers often don’t understand what that means. They need affirmative opt-in confirmation to join these programs instead of being set at default,” he said.

Users do have the choice to opt out of the voice recording function, but the option wasn’t offered until 2019, after the company faced strong backlash over privacy concerns related to its human review program.

Davisson stressed that both audio and video are important and sensitive forms of biometric data. Moreover, Amazon has a recent track record of data privacy issues involving minors and Alexa devices.

In May, the Federal Trade Commission charged Amazon with illegally preventing parents from requesting the deletion of records relating to their children.

Davisson said, “That alone to me is a red flag for any privacy assurance they make about Alexa, and that would apply to children. So they should really be greeted with a lot of skepticism.”

The FTC also charged Amazon with mishandling user data through third-party contractors, alleging that customer videos recorded by Amazon Ring were accessed and downloaded by a third-party contractor in Ukraine even when doing so wasn’t necessary to perform the contractor’s job.

An Amazon spokesperson said that when Amazon Kids is enabled or when a child’s voice profile is recognized, certain features, such as “Alexa Let’s chat,” won’t be available. 

Davisson said using children’s voices to train AlexaLLM could have various consequences: “Political bias, factual accuracy and unexpected, bizarre behaviors from the model could creep into the data provided for both adults and children.”

Source: This article originally appeared on NBCNews.com.
