
Academic researchers have uncovered a vulnerability in Amazon Echo smart speakers that can be exploited to hijack the devices. An attacker can use the speaker to issue unwanted commands, such as unlocking doors, making phone calls and unauthorized purchases, and controlling furnaces, microwave ovens, and other smart appliances.

The attack uses the device’s own speaker to issue voice instructions. As long as a command begins with the device’s wake word (“Alexa” or “Echo”), the Echo will carry it out, observed the researchers from Royal Holloway, University of London, and Italy’s University of Catania.

The requirement for verbal confirmation before executing sensitive commands can be circumvented by appending the word “yes” about six seconds after the command. Attackers can also exploit the “full volume vulnerability,” or “FVV,” which allows the Echo to issue commands to itself without lowering its volume.

Because the hack uses Alexa functionality to force the speaker to issue commands to itself, the researchers call it “AvA,” short for Alexa vs. Alexa. The attacker only needs to be near a vulnerable device while it is on in order to instruct it to pair with an attacker-controlled Bluetooth device. Once paired, the attacker can command the Echo for as long as the attacker’s device remains within the Echo’s radio range.

The attack “is the first to exploit the vulnerability of self-issuing arbitrary commands on Echo devices, allowing an attacker to control them for a prolonged amount of time,” the researchers wrote in a paper published two weeks ago. “With this work, we remove the necessity of having an external speaker near the target device, increasing the overall likelihood of the attack.”

Another version of the attack uses a malicious radio station to generate the self-issued commands. That variant is no longer possible because Amazon, which makes the Echo, released security patches after the research paper highlighted the vulnerability. The researchers stated that the attacks work against 3rd- and 4th-generation Echo Dot devices.

AvA begins when a vulnerable device connects over Bluetooth to the attacker’s device. From that point on, the attacker can stream voice commands via a text-to-speech app or other means.

The researchers identified a range of commands that can be carried out in AvA attacks, each with privacy or security implications. The article lists a range of malicious actions:

  • “Controlling other smart appliances, such as turning off lights, turning on a smart microwave oven, setting the heating to an unsafe temperature, or unlocking smart door locks. As noted earlier, when Echos require confirmation, the adversary only needs to append a “yes” to the command about six seconds after the request.
  • Call any phone number, including one controlled by the attacker, so that it’s possible to eavesdrop on nearby sounds. While Echos use a light to indicate that they are making a call, devices are not always visible to users, and less experienced users may not know what the light means.
  • Making unauthorized purchases using the victim’s Amazon account. Although Amazon will send an email notifying the victim of the purchase, the email may be missed or the user may lose trust in Amazon. Alternatively, attackers can also delete items already in the account shopping cart.
  • Tampering with a user’s previously linked calendar to add, move, delete, or modify events.
  • Impersonate skills or start any skill of the attacker’s choice. This, in turn, could allow attackers to obtain passwords and personal data.
  • Retrieve all utterances made by the victim. Using what the researchers call a “mask attack,” an adversary can intercept commands and store them in a database. This could allow the adversary to extract private data, gather information on used skills, and infer user habits.”

Reference:

https://arstechnica.com/information-technology/2022/03/attackers-can-force-amazon-echos-to-hack-themselves-with-self-issued-commands/