

Alexa Reporting Veterans To VA IG

March 28, 2019

USVCP Staff Writers


It appears that a few disabled veterans, whose disability ratings were recently and significantly downgraded, have claimed that their home smart devices reported them to Department of Veterans Affairs (VA) Inspector General (IG) agents.  The veterans are on record stating that Alexa, Google Home, and Siri have all been recording conversations in their homes.


While some see these devices as beneficial assistants, others view them as detrimental to privacy, dangerous, and as Trojan horses in the age of digital surveillance.


For example, Amazon’s voice-controlled Alexa products are considered "always-on" devices.  Alexa is programmed to “listen” to everything within range and wait for instructions.


Alexa is constantly listening for a user to say a "wake word," which triggers Alexa to begin recording voice data and respond to commands.


Alexa is programmed to record voice data and send it to the cloud only after it detects a voice command.
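To illustrate the mechanism described above, here is a minimal conceptual sketch in Python of how a wake-word gate might work. This is not Amazon's actual code; the wake word, function name, and word-by-word transcription are all assumptions for illustration only.

```python
WAKE_WORD = "alexa"  # assumed wake word for illustration

def process_audio(transcribed_words):
    """Conceptual wake-word gate: speech heard before the wake word
    stays on the device; speech after it is recorded and uploaded."""
    uploaded = []
    listening = False
    for word in transcribed_words:
        if not listening:
            # Device listens locally but records and uploads nothing.
            if word.lower() == WAKE_WORD:
                listening = True  # wake word detected; start recording
        else:
            uploaded.append(word)  # recorded and sent to the cloud
    return uploaded

# Only the speech following the wake word is uploaded:
print(process_audio(["hey", "alexa", "play", "music"]))  # ['play', 'music']
print(process_audio(["just", "a", "private", "chat"]))   # []
```

The sketch also shows how misrecognition could happen: any background word that sounds close enough to the wake word would flip the device into recording mode, which is the failure mode Amazon describes below.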


There is speculation that VA IG cyber agents are using secret government software to gain access to the cloud, and from there into the account of any veteran who owns an Alexa, Google Home, or Siri device.

Case in point, last year, a family in Portland, Oregon, discovered that its Alexa-powered Echo device had recorded conversations in their home and sent the conversations to a random person in their Alexa contacts list.


Privacy advocates have warned the public for a few years that such devices not only had the capability to capture private conversations, but also the recordings could be used to exploit users.


Amazon officials say they know what happens when Alexa records private conversations.  According to Amazon, “Alexa will sometimes mistakenly hear a series of requests and commands that sound like ‘wake words’ and ‘record words,’ at which time Alexa operates as programmed.”

“Echo woke up due to a word in background conversation sounding like ‘Alexa,’” Amazon said in a statement. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely.”


If you own an Echo and are concerned about what it might be recording, an Amazon help page explains that you can review, listen to, and delete the audio and other interactions in the settings menu.

Whether this is fact or fiction remains to be determined, but one thing is certain: some veterans may feel more comfortable having conversations out of listening range of their virtual assistant devices.



Comments

Dennis Willard, 4/3/19

Our days of privacy are over!


Russell Streiber, 4/3/19
Flagrant violation of Privacy Act. Legal action should be taken against manufacturers. Any and all excuses not accepted.


John Fellows, 4/2/19

Someone better get a fix on this right away. What happened to privacy in this world?


James Bowerman, 4/1/19

People need to read the ENTIRE USER AGREEMENT of ALL devices and apps that they decide to use. For instance, Skype has clauses that allow them to monitor your bank account and other financial information. Now WHY would a video streaming app./utility need to do THAT? I was blown away when I read that BS, late in the agreement, and there is more still, thus I will NOT use Skype. Really folks, we need to be vigilant and informed before we decide what we're willing to sacrifice or compromise for simple "convenience". BTW, just why is the VA IG monitoring the "cloud" of such apps./services/devices?


William Napoli, 4/1/19

Big brother invading ur home with an invite. Dracula anyone?


Jim Hood, 4/1/19

Get rid of it!


Jim Duszynski, 4/1/19

This has happened more than once; a friend of mine also had this happen.