Apple NeuralHash
Apple has always emphasized one thing across its ad campaigns: "what happens on your iPhone stays on your iPhone".
Apple has now announced new technology to detect child sexual abuse material, which seems to call that tagline into question. The announcement has raised concerns about privacy, surveillance, and tech-enabled crime.
According to Apple, these new features help prevent child abuse without sacrificing privacy. The question being raised, however, is what such technologies could be used for in the future.
The new detection tool, called NeuralHash, can identify images of child abuse stored on an iPhone without decrypting those images. Once detected, the images will be reported to the National Center for Missing and Exploited Children (NCMEC) and then to law enforcement.
There will be multiple checks to reduce the chance of a false match before any report is sent to NCMEC.
Companies like Google, Microsoft, and Dropbox already scan material stored on their servers. The premise here is the same; the only difference is that some of Apple's scanning will occur on the iPhone itself.
Apple's on-device scan sends an unreadable hash (a kind of code derived from the image) to the company, which is then matched against a database of known child-abuse image hashes maintained by NCMEC.
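NeuralHash itself is proprietary, but the matching idea it relies on can be illustrated with a generic perceptual hash. The sketch below uses a simple "average hash" (not Apple's algorithm) with toy 4x4 images and a hypothetical bit-difference threshold, purely to show how a hash can be compared against a database of known hashes without ever inspecting the image content:

```python
# Illustrative sketch only: NeuralHash is proprietary. This uses a simple
# "average hash" -- a generic perceptual hash -- to show the matching idea:
# visually similar images produce similar hashes, and a device can compare
# a hash against a database of known hashes without sharing the image.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) into a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image's mean.
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_database(image_hash, known_hashes, threshold=3):
    """True if the hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, k) <= threshold for k in known_hashes)

# A 4x4 toy "image" and a slightly perturbed copy of it.
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
perturbed = [[12, 198, 10, 200],
             [200, 10, 202, 10],
             [10, 200, 10, 200],
             [200, 12, 200, 10]]

database = [average_hash(original)]  # hashes of known images
print(matches_database(average_hash(perturbed), database))  # True
```

The key property is robustness: unlike a cryptographic hash, where one changed pixel flips the whole hash, a perceptual hash changes only slightly under small edits, so near-duplicates of known images can still be flagged.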
This feature applies only to people who upload photos to iCloud. Those iCloud accounts are not end-to-end encrypted, so law enforcement agencies can already access them; hence, according to an Apple spokesman, there is no effect on privacy.
Now, the question being asked is: why is Apple not scanning images as they are uploaded to iCloud, just as other companies do? Why is there a need to create a new technology to check the images on the device?
The iPhone was one of the first personal devices to be encrypted automatically, and iMessage is a popular messaging app with end-to-end encryption. iCloud backups, however, were never end-to-end encrypted.
Governments have long pressured big tech firms to give law enforcement special access to encrypted data in order to prevent crime, and child abuse has always been at the top of that list.
Apple could eventually encrypt iCloud end-to-end and still use this new on-device technology to identify child-abuse images and report them to NCMEC.
At present, the new scanning technology is being released only in the USA.
Apple has to decide its next steps carefully. Such features have the potential to be weaponized by governments to broaden surveillance, and in that case Apple would fail on its promise of ensuring privacy.