
Apple Kills Its Plan to Scan Your Photos for CSAM. Here’s What’s Next


In August 2021, Apple announced a plan to scan photos that users store in iCloud for child sexual abuse material (CSAM). The tool was meant to preserve privacy while allowing the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups, who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “gather input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead.

Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse material, and to provide resources on the spot to report the content and seek help. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched, and to reduce the creation of new CSAM.

“After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for all of us.”

Apple’s CSAM update arrives alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored in the cloud. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the grave problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, however, that end-to-end encryption is a vital safety tool for protecting human rights, and that the downsides of implementing it do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments that users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple does not even learn that a device has detected nudity.

The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication apps. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.

“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company said in its statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage.”

Similar to other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations of abuse to advocacy organizations and law enforcement.

Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids around the world, and it is still unknown how much traction Apple’s bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.
