Amazon, Apple and Google all employ staff who listen to customer voice recordings from their smart speakers and voice assistant apps.
News site Bloomberg highlighted the topic after speaking to Amazon staff who "reviewed" Alexa recordings.
All three companies say voice recordings are occasionally reviewed by humans to improve speech recognition.
But the reaction to the Bloomberg article suggests many people are unaware that humans may be listening.
The news site said it had spoken to seven people who reviewed audio from Amazon Echo smart speakers and the Alexa service.
Reviewers typically transcribed and annotated voice clips to help improve Amazon's speech recognition systems.
Amazon's voice recordings are associated with an account number, the customer's first name and the serial number of the Echo device used. Workers told Bloomberg that they shared amusing voice clips with one another in an internal chat room.
They also described hearing distressing clips, such as a potential sexual assault. However, they were told it was not Amazon's job to intervene.
What did Amazon say?
The terms and conditions for Amazon's Alexa service state that voice recordings are used to answer your questions, fulfill your requests and improve your experience and services. Human reviewers are not explicitly mentioned.
In a statement, Amazon said it took security and privacy seriously and only annotated "an extremely small sample of Alexa voice recordings".
"This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone," it said.
"We have strict technical and operational safeguards and have a zero tolerance policy for the abuse of our system. Employees do not directly access information that can identify the person or account as part of this workflow."
What about Apple and Siri?
Apple also has human reviewers who make sure its voice assistant Siri is interpreting requests correctly.
According to Apple's security policy, voice recordings lack personally identifiable information and are linked to a random ID number, which is reset every time Siri is switched off.
Any voice recordings kept after six months are stored without the random ID number.
Its human reviewers never receive personally identifiable information or the random ID number.
What about Google and Assistant?
Google said human reviewers could listen to audio clips from its Assistant, which is embedded in most Android phones and the Home smart speaker.
It said clips were not associated with personally identifiable information, and the company also distorted the audio to disguise the customer's voice.
Are smart speakers recording all my conversations?
A common fear is that smart speakers are secretly recording everything that is said in the home.
While smart speakers are technically always "hearing", they are typically not "listening" to your conversations.
All the major home assistants record and analyse short snippets of audio internally, in order to detect a wake word such as "Alexa", "Ok Google" or "Hey Siri".
If the wake word is not heard, the audio is discarded.
But if the wake word is detected, the audio is kept and recording continues so that the customer's request can be sent to the voice recognition service.

It would be easy to detect if a speaker was continuously sending entire conversations back to a remote server for analysis, and security researchers have not found evidence to suggest this is happening.
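The loop described above can be sketched roughly as follows. This is an illustrative simplification only: the function and variable names are hypothetical, and real devices detect wake words with on-device acoustic models rather than text matching.

```python
# Hypothetical sketch of a wake-word loop: audio snippets stay local and are
# discarded until the wake word is heard, then the request is sent onward.
from collections import deque

WAKE_WORDS = {"alexa", "ok google", "hey siri"}  # example triggers
BUFFER_SIZE = 3  # only the last few short snippets are held in memory

def process_stream(snippets):
    """Return only the snippets that would be sent to the voice service."""
    local_buffer = deque(maxlen=BUFFER_SIZE)  # old snippets fall off the end
    sent = []
    recording = False
    for snippet in snippets:
        if recording:
            sent.append(snippet)       # request audio goes to the server
            if snippet == "<silence>":
                recording = False      # request finished; stop sending
        elif snippet.lower() in WAKE_WORDS:
            recording = True           # wake word heard: start recording
        else:
            local_buffer.append(snippet)  # ordinary audio stays on-device
    return sent

# Ordinary conversation is never sent; only the request after the wake word is.
stream = ["chat", "more chat", "alexa", "weather today", "<silence>", "chat"]
print(process_stream(stream))  # ['weather today', '<silence>']
```

The key point the sketch illustrates is that audio heard before the wake word only ever lives in a small, constantly overwritten local buffer.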
Can I stop human reviewers listening to my voice clips?
Amazon's Alexa privacy settings do not let you opt out of voice recording or human review, but you can stop your recordings being used to help develop new features. You can also listen to previous voice recordings and delete them.
Google lets you listen to and delete voice recordings on the My Activity page. You can also stop Google saving voice recordings to your account.
Apple does not let you listen back to Siri recordings. Its privacy portal, which lets you download a copy of your personal data, says it cannot provide information "that is not personally identifiable or linked to your Apple ID".
To stop Siri recordings being made, go to the Siri & Search menu in Settings and switch Siri off. Then go to the Keyboard menu (found in the General section) and switch off Dictation.