Apple's new feature to scan iPhones for child sex abuse images
Techfader said:

Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.

Before an image is stored in iCloud Photos, the technology will search for matches against already known CSAM. Apple said that if a match is found, a human reviewer will assess it and report the user to law enforcement.

However, there are privacy concerns that the technology could be expanded to scan phones for prohibited content or even political speech. Experts worry that the technology could be used by authoritarian governments to spy on their citizens.

Apple said that new versions of iOS and iPadOS, due to be released later this year, will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".

The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. Those images are translated into "hashes", numerical codes that can be "matched" to an image on an Apple device. Apple says the technology will also catch edited but similar versions of original images.

'High level of accuracy'

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple said.

The company claimed the system had an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".

Apple says that it will manually review each report to confirm there is a match. It can then take steps to disable a user's account and report to law enforcement.

The company says that the new technology offers "significant" privacy benefits over existing techniques, as Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account.

However, some privacy experts have voiced concerns.

"Regardless of what Apple's long term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," Matthew Green, a security researcher at Johns Hopkins University, said.

"Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."

Source: https://www.bbc.com/news/technology-58109748
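To make the hash-matching idea above concrete, here is a minimal Python sketch. It uses a simple average hash plus Hamming-distance comparison as a stand-in for Apple's actual (proprietary) NeuralHash; the hash database, distance tolerance, and per-account review threshold are all invented for illustration.

```python
# Toy illustration of hash-based image matching. The hash function,
# tolerance, and threshold below are stand-ins invented for this sketch;
# Apple's real system uses a proprietary neural hash and NCMEC's database.
from typing import List, Set


def average_hash(pixels: List[List[int]]) -> int:
    """Collapse a grayscale pixel grid into a 1-bit-per-pixel fingerprint.

    Each bit records whether a pixel is brighter than the image mean, so
    mild edits (re-compression, small brightness shifts) leave most bits
    unchanged -- the property that lets edited-but-similar copies match.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")


def matches_known_hash(h: int, known: Set[int], tolerance: int = 3) -> bool:
    """True if h is within `tolerance` bits of any hash in the database."""
    return any(hamming_distance(h, k) <= tolerance for k in known)


# Hypothetical stand-in for the database of known hashes.
original = [
    [200,  30, 180, 190],
    [ 20,  25, 150,  10],
    [170,  15,  30, 160],
    [ 10,  10, 180, 200],
]
KNOWN_HASHES = {average_hash(original)}

# A uniformly brightened copy still matches: the mean rises with the
# pixels, so the brighter-than-mean bit pattern is unchanged.
edited = [[min(255, p + 5) for p in row] for row in original]
assert matches_known_hash(average_hash(edited), KNOWN_HASHES)

# Per the article, an account is only surfaced for human review once it
# holds a collection of matches; the exact count here is made up.
REVIEW_THRESHOLD = 10  # hypothetical


def flag_for_review(match_count: int) -> bool:
    return match_count >= REVIEW_THRESHOLD
```

Note that this sketch does a plain set lookup; the "new applications of cryptography" Apple describes (so that matching happens on-device without revealing non-matching photos) are omitted here for readability.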