Apple’s anti-sexting tool for iOS will warn kids about nudes

But the company dropped a component of the feature that would have notified parents when their kids under 13 send or view nude images. That decision has drawn praise from some advocates who say it protects the privacy of children’s communications, and rankled others who say it could leave children vulnerable to predators and other threats.

Stephen Balkam, chief executive officer of the nonprofit Family Online Safety Institute, said, “They should have kept the parental notification for children under the age of 13. There’s no reason why we should be asking an 11-year-old to keep themselves safe online.”

Apple’s new feature, expected this month in iOS 15.2, raises questions about how tech companies should balance the rights of parents and children when they develop products, and how much control kids should have over their own tech use.

Many parents would say it is normal, even expected, for them to know what their children are doing on their devices. Sexting among children under the age of 13 has risen recently, and so has online predation.

Thorn, a technology nonprofit focused on protecting children from online sexual abuse, released a report last month showing that 14% of the 9- to 12-year-olds it surveyed said they had shared sexually explicit images of themselves in the past year, up from 6% a year earlier. Thorn conducted online surveys of 1,000 children aged 9 to 17 in both 2019 and 2020. In the 2020 survey, 21% of 9- to 12-year-olds said they consider it normal for kids their age to share nudes, up from 13% a year earlier.

According to Apple, giving kids a chance to think before sending or opening a nude image can prevent them from making hasty decisions that could have big consequences. Children sometimes share nudes of other children as a form of bullying, and some teens have died by suicide after such incidents. In many states, it is illegal for children to have nude photos of other minors on their phones, even if the exchange was consensual.

Mr Balkam’s organization has been urging Apple for years to do something about sexually explicit images involving children. He commended Apple for taking the step, though he said parents shouldn’t be left out. “I think they’re 85% there,” he said.

This anti-sexting tool for Messages was announced earlier this year along with another initiative aimed at protecting victims of child pornography. Apple planned to introduce a system that would identify known child-pornography images and alert the company if a certain number of them were uploaded to iCloud, Apple’s cloud-storage service. After critics warned that governments could abuse a tool capable of flagging users’ content, Apple said it would take more time to collect input. That component will not be part of the iOS 15.2 release.
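Apple’s shelved design relied on cryptographic protocols (private set intersection and threshold secret sharing) precisely so that no individual match would be visible to anyone; as a simplified, hypothetical sketch of only the threshold idea the article describes, counting matches against a list of known-image hashes and flagging past a cutoff, it might look like this in Swift (all names and values here are illustrative, not Apple’s):

```swift
// Hypothetical sketch: count uploads whose hashes match a set of known
// images, and flag only once a threshold is crossed. Apple's actual
// (shelved) design used perceptual hashing plus cryptography so that no
// single match was visible to the server; none of that is modeled here.
struct MatchTracker {
    let knownHashes: Set<String> // hashes of known abusive images (placeholders)
    let threshold: Int           // matches required before anything is flagged
    var matchCount = 0

    // Records one upload; returns true once the threshold is reached.
    mutating func record(uploadHash: String) -> Bool {
        if knownHashes.contains(uploadHash) {
            matchCount += 1
        }
        return matchCount >= threshold
    }
}

var tracker = MatchTracker(knownHashes: ["hash-a", "hash-b"], threshold: 30)
if tracker.record(uploadHash: "hash-a") {
    // Only at this point would an account be surfaced for human review.
    print("Threshold reached: flag account for review")
}
```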

Some privacy experts say Apple’s anti-sexting solution for Messages is a compromise that protects children and their privacy.

Elissa Redmiles, a privacy researcher and faculty member at the Max Planck Institute for Software Systems, said, “The idea of notifying parents of children under the age of 13 operates on the assumption that the child’s relationship with the parent is a safe one.”

In the case of LGBTQ youth, for example, a parent seeing an image that reveals something about their child’s sexuality can lead to conflict or abuse, she said.

She said notifying parents also tells children they are being watched.

An Apple engineer, whom the company made available to discuss the feature, said that under an earlier plan, children who had been warned about nude photos would also have been told that their parents would be notified if they went ahead. While that message was designed to be clear to children, the engineer said, some might have glossed over it or not understood the implications.

He said the system Apple decided on still provided useful protection. Parents can choose to monitor their kids’ devices in other ways, such as by manually reviewing their texts and photos (if kids don’t delete them first).

The company also hopes to help prevent children from sharing photos with predators. Thorn’s study found that of the 9- to 12-year-olds who shared nude photos, 36% did so with people they believed were 18 or older.

How will Apple’s new safety feature work?

When Apple releases iOS 15.2, it is expected to include the new communication-safety feature as part of Family Sharing, and parents must choose to turn it on from their own phones. When enabled, artificial-intelligence software on the child’s device will detect nude images that the Messages app receives, or that the child attaches to an outgoing message. A nude image will appear blurred in the app unless the child chooses to open it.
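Apple hasn’t published the feature’s internals, but the pattern described above, an on-device classifier plus blurring until the child opts in, can be sketched. In the hypothetical Swift sketch below, isLikelyNude stands in for Apple’s unreleased on-device model; only the Core Image blur call is a real API:

```swift
import UIKit
import CoreImage

func isLikelyNude(_ image: UIImage) -> Bool {
    // Stand-in for Apple's unreleased on-device classifier (e.g. a
    // Core ML model). Nothing leaves the device in this design.
    return false // placeholder: always "safe" in this sketch
}

func displayableImage(for received: UIImage) -> UIImage {
    guard isLikelyNude(received), let input = CIImage(image: received) else {
        return received // not flagged: show as-is
    }
    // Show flagged images blurred; the original stays intact so it can
    // be revealed if the child taps through the warnings.
    let blurred = input.applyingGaussianBlur(sigma: 40)
    let context = CIContext()
    guard let cg = context.createCGImage(blurred, from: input.extent) else {
        return received
    }
    return UIImage(cgImage: cg)
}
```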

Children attempting to send or open such an image will get a message asking if they are sure they want to proceed. They’ll also get a second message urging them to talk to someone they trust if they feel pressured to see or send nude photos, and they’ll have the option to text a trusted adult for help from within the Messages app.
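The two prompts described above amount to a small state machine: nothing is shown or sent until the child confirms twice. A hypothetical Swift sketch, with states and names that are illustrative rather than Apple’s:

```swift
// Hypothetical sketch of the two-step flow described above.
enum SafetyPrompt {
    case confirmProceed      // first warning: "Are you sure you want to proceed?"
    case suggestTrustedAdult // second nudge, with the option to text a trusted adult
    case revealed            // both prompts passed: the image is shown or sent
    case cancelled           // the child backed out: the image stays blurred or unsent
}

func nextStep(after current: SafetyPrompt, childProceeds: Bool) -> SafetyPrompt {
    guard childProceeds else { return .cancelled }
    switch current {
    case .confirmProceed:       return .suggestTrustedAdult
    case .suggestTrustedAdult:  return .revealed
    case .revealed, .cancelled: return current
    }
}
```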

Apple said the company would not have access to the messages or images on its servers, as they are still encrypted end-to-end.

The feature will work on any child-designated account for children under 18, as long as the children’s accounts are linked to a parent’s account through Family Sharing.

The feature is also scheduled for the next updates to iPadOS 15 and macOS, which have their own versions of the Messages app.

Image blurring and warnings will only work in Apple’s Messages app. The feature will not apply to third-party messaging apps like WhatsApp and Signal, or photo-sharing social-media apps like Instagram and Snapchat.

—For more family and tech columns, advice, and answers to tech questions about your family, sign up for my weekly newsletter.
