Apple Plans To Scan U.S. iPhones For Images Of Child Sexual Abuse

By Ayoola
August 6, 2021, in Enterprise

Tech giant Apple has unveiled plans to scan U.S. iPhones for images of child sexual abuse, earning accolades from several child protection groups, while some security researchers have raised concerns that the system could be misused by entities such as governments for illegal surveillance.

Apple plans to use a tool called “neuralMatch” to scan images for child sexual abuse material before they are uploaded to iCloud. If a suspected image is reviewed by a human and confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified. Apple also plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.

Apple has given assurances that the system can only detect images that are already in the Center’s database of known child pornography.
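
As a rough, purely illustrative sketch of that matching step, the Python snippet below checks an image fingerprint against a set of known fingerprints. The function name, the use of SHA-256 and the plain set lookup are assumptions made for illustration only; Apple’s tool reportedly computes a perceptual fingerprint and matches it through a more involved, privacy-preserving process.

    import hashlib

    def matches_known_database(image_bytes: bytes, known_fingerprints: set[str]) -> bool:
        # Illustrative only: flag an image if its fingerprint already appears in
        # the database of known material. A real system would use a perceptual
        # hash rather than SHA-256, so re-encoded or resized copies still match.
        fingerprint = hashlib.sha256(image_bytes).hexdigest()
        return fingerprint in known_fingerprints

    # Hypothetical usage: the fingerprint database would come from a body such
    # as the National Center for Missing and Exploited Children.
    known = {hashlib.sha256(b"previously catalogued image").hexdigest()}
    print(matches_known_database(b"previously catalogued image", known))  # True
    print(matches_known_database(b"a brand-new photo", known))            # False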

Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be put to nefarious uses. It could be used to implicate someone by sending them seemingly harmless pictures designed to trigger matches for child pornography, tricking Apple’s algorithm into alerting law enforcement.

“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked, listing other implications of implementing the plan. He expressed concern that the leading tech organization would be unable to say no.

The concerns of privacy advocates are not far-fetched: Apple, along with other companies that have embraced end-to-end encryption, has for years resisted government pressure to allow greater surveillance of encrypted data. The new safety measures would require Apple to strike a delicate balance between protecting children and preserving the integrity of its privacy commitments. The company said it would release the changes as part of updates for iPhones, Macs and Apple Watches.

Apple said the feature will use on-device machine learning to identify and blur sexually explicit images on children’s phones and to alert the parents of younger children whose phones have been enrolled. It would also “intervene” when users search for topics related to child sexual abuse.
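
The parental-safety flow described above can be pictured as a simple decision rule. The threshold, parameter names and opt-in flag in this sketch are assumptions made for illustration; Apple has not published how its on-device classifier scores images or exactly when a parent is alerted.

    EXPLICIT_THRESHOLD = 0.9  # assumed cut-off, not a published Apple value

    def handle_incoming_image(explicit_score: float,
                              is_child_account: bool,
                              parent_alerts_enabled: bool) -> dict:
        # Illustrative rule: blur images the on-device classifier scores as
        # explicit on a child's account, and optionally alert a parent.
        blur = is_child_account and explicit_score >= EXPLICIT_THRESHOLD
        return {"blur_image": blur, "notify_parent": blur and parent_alerts_enabled}

    # Hypothetical usage with an assumed classifier score of 0.95.
    print(handle_incoming_image(0.95, is_child_account=True, parent_alerts_enabled=True))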

Hany Farid, a researcher at the University of California, Berkeley, argues that many other programs designed for protection raise similar concerns. For example, he says, “WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.”

Julia Cordua, the CEO of Thorn, a non-profit that uses technology to help protect children from sexual abuse, said that Apple’s technology balances “the need for privacy with digital safety for children.”

However, the Center for Democracy and Technology in Washington has called on Apple to abandon the changes, arguing that they would effectively end Apple’s assurance of “end-to-end encryption.”

Apple responded that the new features would not jeopardize the security of private communications.

Tags: Apple, child sexual abuse, iPhone, society
Ayoola Faseyi is an Abuja-based journalist with an interest in technology and politics. He is a versatile writer with articles in many renowned news journals, and the co-founder of the media brand The Vent Republic.
