All You Dog Fuckers Better Switch to Android

Aug 11, 2021
Adult Business

Apple is about to start going through your pictures, using child sexual abuse as the probable cause. This would be great if it were limited to ONLY child pornography. But like Alana and the Union's naked-tweet removal quest, it's gonna be up to someone else to determine what's acceptable and what's not. You would think something like that is cut and dried. It's not. There are people in today's society who feel that breastfeeding your child is pornographic.

And what's gonna happen when a picture of a dog eating out some random porn chick pops up? Where is the line drawn? There's supposed to be a poop orgy later this month; will that trigger someone or some program?

While in theory I’m all for this, I know how invasive Apple already is…

Apple Will Scan U.S. iPhones For Images Of Child Sexual Abuse

Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
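The “fingerprint” idea is just perceptual hashing: boil an image down to a short bit string that survives resizing and recompression, then compare that string against a database of hashes of known images. Here is a minimal sketch of the concept, using a simplified average-hash rather than Apple’s actual NeuralMatch system; the function names and the blocklist values are made up for illustration:

```python
from PIL import Image  # pip install Pillow

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail, then emit one bit per
    pixel: 1 if the pixel is brighter than the mean, else 0."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of fingerprints of known images (stand-in values,
# not real data; the real system matches against the center's database).
KNOWN_BAD_HASHES = {0x0F0F0F0F0F0F0F0F}

def flag_before_upload(path: str, threshold: int = 5) -> bool:
    """Mimic the pre-upload check: flag the image if its fingerprint is
    within `threshold` bits of any known one."""
    h = average_hash(Image.open(path))
    return any(hamming(h, bad) <= threshold for bad in KNOWN_BAD_HASHES)
```

The point is that the matcher only ever sees the short bit string, never the picture itself, which is both the privacy argument for the design and the reason false matches are possible.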

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
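Green’s warning is easy to picture with the toy hash above: because the fingerprint encodes nothing but a coarse brightness pattern, anyone who knows a target hash can manufacture an innocent-looking image that collides with it. A hypothetical sketch, reusing `average_hash` and `hamming` from the previous example (a real perceptual hash is harder to invert than this, but the failure mode is the same):

```python
from PIL import Image

def image_with_hash(target: int, size: int = 8, scale: int = 32) -> Image.Image:
    """Build a bland gray texture whose average-hash equals `target`:
    a bright block wherever a bit is 1, a dark block wherever it is 0."""
    tile = Image.new("L", (size, size))
    tile.putdata([200 if (target >> (size * size - 1 - i)) & 1 else 50
                  for i in range(size * size)])
    # Upscaling keeps each bit as a uniform block, so the fingerprint
    # survives while the image stays visually meaningless.
    return tile.resize((size * scale, size * scale), Image.NEAREST)

decoy = image_with_hash(0x0F0F0F0F0F0F0F0F)  # the "known bad" hash above
print(hamming(average_hash(decoy), 0x0F0F0F0F0F0F0F0F))  # 0 or near 0: a collision
```

Send someone a few images like that and their account gets flagged for a human reviewer without a single illegal picture ever touching their phone.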

Read More on NPR
