Apple contracts with workers around the world who regularly listen to recordings captured by Siri and grade them in order to improve the personal assistant, so that Siri can better understand commands and queries. However, one of these workers has shared alarming details about what they hear while reviewing the data Siri records. Are they listening to everything that happens around users' devices? What was Apple's response? And is there a way to prevent this? Read on.

Siri contractors around the world are listening to your sensitive information


The Guardian has learned that the contractors Apple employs around the world to help develop Siri listen to what happens around users' devices: confidential medical conversations between doctors and their patients, business deals, drug and gun deals, and even couples' private moments in the bedroom. The contractors regard this as part of their job of helping to improve Siri.

Although Apple does not explicitly disclose this to users in its privacy policy, a small percentage of Siri recordings are passed on to these contractors, who are tasked with grading Siri's responses in various situations: determining whether Siri was activated accidentally or deliberately through "Hey Siri," and judging how appropriate Siri's response was.

Apple says this data is used to help Siri and voice dictation better understand the user and recognize what he is saying. But Apple does not explicitly state that these people listen, intentionally or otherwise, to recordings of users in all kinds of situations. The company also said that the recordings remain anonymous: their source is unknown, and it is not known who is speaking.

Apple told The Guardian: "A small portion of the data recorded by Siri is analyzed to improve Siri and voice dictation. Siri responses are analyzed in completely secure facilities, and all reviewers adhere to Apple's strict confidentiality requirements." The company added that only a very small random sample, less than 1% of daily Siri activations, is used for grading, and those samples are typically only a few seconds long.


One "corruption detection" observer working for Apple, who requested anonymity, expressed concern about this, especially given the unintended automatic Siri activations and recording of highly sensitive personal information.

Siri can be triggered by mistake when it hears something resembling the wake phrase "Hey Siri," and such accidental activations happen a lot. This happened to the British politician Gavin Williamson during a BBC interview about the situation in Syria: Siri interrupted him, apparently activated by the word "Syria," and announced that it had found something on the web about Syria. He remarked, "What a very strange thing," and the audience laughed.

Siri can also be activated in other ways. For example, if the Apple Watch detects that you have raised your wrist, Siri automatically activates and starts listening to your speech.

The whistleblower said: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, criminal dealings, and more. These recordings are accompanied by user data showing location, contact details, and app data."

That accompanying information can be used to verify whether a request was handled successfully. In its privacy documents, Apple says: "Siri data is not linked to other data that Apple may have from your use of other Apple services." No name or identifier is attached, and no individual recording can be linked to other recordings.


Although Siri is present on most Apple devices, the Apple Watch and the HomePod smart speaker were singled out as the most frequent sources of accidental recordings. The whistleblower said: "The regularity of accidental triggers on the watch is incredibly high, and the watch can record snippets of up to 30 seconds. That is not a long time, but you can gather a good idea of what is happening."

Apple isn't alone in hiring workers around the world to monitor personal assistants. Last April, it was revealed that Amazon employed similar staff to listen to some Alexa recordings, and earlier this month Google workers were found to be doing the same with Google Assistant.

But Apple differs from those companies in one respect: Amazon and Google allow users to opt out of some uses of their recordings, while Apple offers no similar option short of disabling Siri entirely.


How do I disable Siri?

Apple has always championed user privacy, and even uses it as a competitive advantage against other companies; it once displayed a billboard in Las Vegas reading, "What happens on your iPhone, stays on your iPhone." Apple should disclose this human review to users and add an option that lets them disable it entirely at any time, so that such intrusions cannot happen even if Apple cannot trace their source. For now, we recommend disabling the "Listen for Hey Siri" option so that Siri is not triggered automatically.
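At the time of writing, the steps (whose exact wording may vary slightly between iOS versions) are roughly as follows: open Settings, go to Siri & Search, and turn off "Listen for Hey Siri." To disable Siri completely, also turn off "Press Side Button for Siri" (or "Press Home Button for Siri" on devices with a Home button) and confirm "Turn Off Siri" when prompted. On the Apple Watch, you can additionally turn off "Raise to Speak" in the Watch app under Siri so that raising your wrist does not start a recording.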

Do you think companies should do this under the pretext of developing a personal assistant? Or is this a blatant infringement of user privacy? Let us know in the comments.

Source:

The Guardian
