Apple contracts with workers around the world who regularly listen to audio captured by Siri and grade it to improve the personal assistant, so that Siri can better understand commands and queries. However, one of these contractors has revealed alarming details about what they hear while reviewing Siri's recordings. Are they listening to everything that happens around users' devices? What was Apple's response? And is there a way to prevent this? Read on.
The Guardian learned that the Apple contractors working on improving Siri listen to what happens around users' devices: confidential medical information, conversations between doctors and their patients, drug and arms deals, and even what goes on between couples in the bedroom. They consider this part of their job in helping to develop Siri.
Apple does not explicitly disclose this to users in its privacy policy. A small percentage of Siri recordings are passed on to these contractors, who are tasked with evaluating Siri's answers in various situations: determining whether Siri was triggered accidentally or deliberately via "Hey Siri", and judging how appropriate Siri's response was.
Apple says this data is used to help Siri and voice dictation better understand users and recognize what they say. But Apple does not explicitly state that these people, whether the recordings were made intentionally or not, listen to users in all kinds of situations. The company also says the recordings are anonymized: their source is unknown, and it is not known who is speaking.
Apple told The Guardian: "A small portion of Siri requests are analyzed to improve Siri and voice dictation. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The company added that a very small random sample, less than 1% of daily Siri activations, is used for this analysis and grading, and these samples are typically only a few seconds long.
One quality-control reviewer working for Apple, who requested anonymity, expressed concern about this, especially given the frequency of accidental Siri activations that capture highly sensitive personal information.
Siri can be activated by mistake when it mishears a phrase as "Hey Siri", and such accidental activations happen a lot. That is what happened to the British politician Gavin Williamson: while he was speaking about the situation in Syria, Siri interrupted him, apparently triggered by the word "Syria", and announced that it had found something on the web about Syria. He remarked, "What a very strange thing," and the audience laughed.
Siri can also be activated in other ways. For example, if the Apple Watch detects that you have raised your wrist, Siri wakes automatically and starts listening to your speech.
The reviewer said: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, and more. These recordings are accompanied by user data showing location, contact details, and app data."
This accompanying information can be used to verify whether a request was handled successfully. In its privacy documents, Apple says: "Siri data is not linked to other data that Apple may have from your use of other Apple services." No name or identifier is attached to a recording, and no individual recording can be linked to other recordings.
Although Siri is present on most Apple devices, the Apple Watch and the HomePod smart speaker were singled out as the most frequent sources of accidental recordings. The reviewer said: "The frequency of accidental triggers on the watch is incredibly high, and the watch can record snippets of up to 30 seconds. That is not a long time, but a good idea of what is happening can be gathered."
Apple is not alone in employing people around the world to review personal assistants. Last April, it was revealed that Amazon employed staff to listen to some Alexa recordings, and earlier this month Google workers were found to be doing the same with Google Assistant.
But Apple differs from these companies in one respect: Amazon and Google allow users to opt out of some uses of their recordings, whereas Apple offers no similar option short of disabling Siri entirely.
How do I disable Siri?
Apple has always prized user privacy, even using it as a competitive advantage against other companies; it went so far as to display a billboard in Las Vegas reading, "What happens on your iPhone, stays on your iPhone." Apple should disclose this human review to users and add an option that lets us completely disable it at any time, so that such violations do not happen even if the recordings are anonymized. For now, it is recommended that you turn off the "Listen for 'Hey Siri'" option (under Settings > Siri & Search) so that Siri is not activated automatically.
Source:
Thank you! Believe it, I have seen this feature in action. Many times the Apple Watch wakes up when I say certain words. For example, when I say "oh" while addressing one of my friends, my device lights up with Siri ready to respond. One other thing: this feature only works when the screen has turned on from raising your wrist, so when the screen is off there is no reaction to saying "oh"!?
Finally, Apple has been exposed.
😂😂😂😂😂😂😂😂😂😂😂
You have mentioned some very important information
As long as your phone is connected to the Internet
Never expect privacy
Hey Siri 🤫
👍
To everyone who says they have no sensitive information worth disabling Siri or Google Assistant over, or that they do not care about such matters: I would love to tell you that you are not the only one whose data is being collected and whose behavior is analyzed. Studying and analyzing your information on its own may hardly help me. But if I studied and analyzed every member of your family, and learned the similarities and differences between you, what you love and what you hate, what makes you happy, what upsets you, and what each of you wants, then after collecting this information and learning your way of thinking, I could do a great deal without moving a muscle or carrying a weapon. Apply this at the level of a whole state, and you can imagine what caused all the ideological wars taking place now, how weakness and discord arose between brothers whom Islam had gathered under one roof, how nations were displaced, and how an entire people turned against its rulers overnight. The point is: do not underestimate your information, no matter how small or unimportant it seems, and do not give it away. This data becomes enormously valuable when its pieces are linked together, so do not trust the reassurances of companies like Google, Amazon, Facebook, or even Apple itself. It could topple a state out of thin air!
Your words are sound
I salute you
Believe 👍
It is nice to see the wide variation in the responses here compared with the reactions to the same issue when it happened with Alexa and Google
What happens on the iPhone stays on the iPhone
Apple has always boasted about this slogan 😬😬
It is even posted on a billboard in a city in America 😷😷
Connecting to the Internet in any way will make you lose a lot of your privacy
The important thing is that it only happens after you activate Siri, not all the time
The first truly professional and impartial post on this blog
I always say Siri is watching me and Tim Cook can hear me. God protect us tonight 🙄
Why disable Siri? I am neither a head of state nor a king. Even if Siri hears everything I say, what important secrets do I have? At worst, something like what is in this video will happen to me. It is all good; Siri is a blessing that helps you quickly and responds faster than people do, even to the Arabic speech that comes out of us. Do not be afraid: the words leave the device encrypted and reach Apple like talismans, and by God, in Arab dialects there are words that have no meaning even in Google Translate 🤣🤣
See Brother @Keenan's reply, which responds to your comment
I went back to the comment; may Allah reward him well for this information
If it only listens to what is said after you activate Siri, then there is no problem with that
If every company adhered to safety standards, there would be no risk
This is a breach of privacy, and it is an offense for which Apple, Google, and Amazon should be punished
Every day, privacy decreases. With the development of smartphones, our lives have become truly bare!
Exactly
Our life is bare! I want to go dress life and privacy in some clothes. Oh, shame on life and privacy, what do they have left to be ashamed of? God cover us tonight 🤣