Apple is expected to launch its new CSAM child-safety feature later this year with the release of iOS 15, but a developer says he found the algorithm behind the feature in older versions of the iPhone operating system and discovered a technical flaw in it. So what is this flaw, and how did Apple respond?
CSAM image analysis and scanning feature
A Reddit user says he discovered a version of the NeuralHash algorithm used to detect CSAM images within iOS 14.3.
Apple's response was: The extracted version is not up to date and will not be used.
The developer, known as AsuharietYgvar, found the NeuralHash code buried inside iOS 14.3. He says he reverse-engineered it and reconstructed a working model in Python that can be tested by passing images through it.
The developer claims that the algorithm is not fooled by image compression or resizing, but it cannot handle cropping or rotation. Using the Python model, users on GitHub began examining how the algorithm works and whether it can be misused in a way that threatens iPhone owners. They found it is possible to craft a completely innocent-looking photo that nevertheless deceives the algorithm and produces a match with CSAM hashes. In theory, a hacker could send such natural-looking photos, for example as a wallpaper or an ordinary landscape shot, to Apple users to trigger the algorithm, trap them, and cause them problems.
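The behaviour described above can be illustrated with a toy example. The code below uses a simple "average hash" (aHash), which is far cruder than Apple's real NeuralHash, but it shows the same qualitative property the developer reports: resizing an image preserves its perceptual hash, while cropping destroys it.

```python
# Toy demonstration only: an average hash (aHash), much simpler than Apple's
# NeuralHash, used to illustrate resize-robustness vs. crop-sensitivity.

def resize(pixels, size):
    """Nearest-neighbour resize of a 2D grayscale image to size x size."""
    h, w = len(pixels), len(pixels[0])
    return [[pixels[i * h // size][j * w // size] for j in range(size)]
            for i in range(size)]

def average_hash(pixels, size=4):
    """Shrink the image, then emit one bit per pixel: 1 if above the mean."""
    small = [p for row in resize(pixels, size) for p in row]
    avg = sum(small) / len(small)
    return "".join("1" if p > avg else "0" for p in small)

# An 8x8 "image": bright left half (200), dark right half (10).
original = [[200] * 4 + [10] * 4 for _ in range(8)]
resized  = [[200, 200, 10, 10] for _ in range(4)]   # same scene at 4x4
cropped  = [row[:4] for row in original]            # left half only

print(average_hash(original) == average_hash(resized))  # True: resize survives
print(average_hash(original) == average_hash(cropped))  # False: crop does not
```

A collision attack works the other way around: because the hash discards so much detail, an attacker can search for an innocent image whose hash happens to equal a target hash.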
Not the final version
Despite these flaws, the discovered algorithm is likely not the final version: Apple has been building its CSAM image-detection algorithm for years, so earlier test versions of the code are to be expected.
As for the possibility of creating fake photos and sending them to iPhone users to entrap them, it is not that simple: Apple does not flag photos unless they are synced to iCloud, and a human review ensures the photos actually contain child-abuse material. All of this means the attack is plausible in theory but difficult in practice.
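These safeguards can be sketched as a simple gate. Everything in the snippet below is an illustrative assumption, not Apple's actual pipeline or values; the hash database, the Hamming-distance tolerance, and the match threshold are stand-ins (Apple has publicly mentioned a threshold of roughly 30 matches, and even then matches only go to human review):

```python
# Hypothetical sketch of the safeguards described above. Database contents,
# distance tolerance, and threshold are illustrative, not Apple's real values.

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

KNOWN_HASHES = {"1100110011001100"}  # stand-in for the CSAM hash database

def matches_database(photo_hash, max_distance=2):
    return any(hamming(photo_hash, h) <= max_distance for h in KNOWN_HASHES)

def flag_for_human_review(photo_hashes, synced_to_icloud, threshold=30):
    """Nothing is reported from a single match: photos must be synced to
    iCloud, the match count must reach a threshold, and even then the photos
    go to a human reviewer, who would see only innocent-looking images."""
    if not synced_to_icloud:
        return False
    return sum(matches_database(h) for h in photo_hashes) >= threshold

# A single colliding wallpaper is nowhere near enough to cause trouble:
print(flag_for_human_review(["1100110011001100"], synced_to_icloud=True))  # False
```

This is why the attack remains theoretical: an attacker would have to plant dozens of colliding images, have the victim sync them all to iCloud, and still get past a human reviewer.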
Apple responded
Apple said in a statement that the NeuralHash code discovered in iOS 14.3 is not the final version. No one has yet found a CSAM database or matching algorithm in the iOS 15 beta, but the feature is still expected to ship in full with the release of iOS 15.
Source:
They'll arrest me for eating children, because my phone is full of videos of me playing with my kids and pretending to eat them, hahahahaha
Apple still hasn't said anything about notifying or compensating users whose photos were subjected to human review and found innocent.
Will Apple compensate them for violating their privacy by letting Apple employees see their private photos?
Of course it will be used to search for other things without our knowledge, and governments could use it to search for anything else.
Apple is not an angel; in the end it is an exploitative company interested only in making money.
Thank you for your effort and interest