Some tech companies take a hard stance on protecting children from online dangers. But for the most part, it’s up to parents to protect their kids. Tap or click here for five ways to protect your child online.
Apple’s iOS 15.2 is almost ready for full release. It adds a handful of features, but one in particular has drawn both praise and criticism.
Read on to find out what Apple will be implementing and why it’s causing a huge debate.
Here’s the backstory
iOS 15.2 brings several changes spanning security, privacy and a new Apple Music Voice Plan. The new App Privacy Report shows you which apps have accessed your location or photos in the last seven days.
It’s a significant step toward online privacy and can be a good resource for parents. But one addition to iOS 15.2 is raising eyebrows. The Messages app is getting an update that will warn children before they view images containing nudity.

However, it comes with a twist. While the child using Messages will see the warning about harmful content, their parents won’t be alerted. Apple initially announced that parents would be notified, but that has since changed.
The release notes for iOS 15.2 state that “Communication safety setting gives parents the ability to enable warnings for children when they receive or send photos that contain nudity. Safety warnings contain helpful resources for children when they receive photos that contain nudity.”
In short, you will be able to set up warnings for your child but won’t be notified when they receive such communication.
What you can do about it
The iOS update will use on-device machine learning to scan images the minor’s Messages app receives for nudity. The same technology will also scan images the child tries to send through the app.

If a flagged image is received, or the child attempts to send one, it will appear blurred in the app until the child chooses to view it. Before that, several warnings will pop up, urging the minor to reconsider before proceeding. There will also be an option to message a trusted adult for help.
The feature only works if the device is designated as being used by a minor and is connected to an adult’s account through Apple’s Family Sharing. A word of caution, though: the child-safety feature works only in Messages, not in third-party apps like WhatsApp or Instagram.
To set up Family Sharing for Apple devices, you must first create a new family group. Once that’s done, you can invite others to join. On an iPhone:
- Tap Settings, then tap your name.
- Tap Family Sharing in the second block, then tap Set Up Your Family.
- Follow the instructions onscreen to complete the setup process.
🚨 What it means for you
The impact technology has on children is once again making headlines, especially with the recent release of internal documents showing the toxic effect Instagram can have on mental health. This Apple update is meant to improve child safety, with a few caveats.
✅ With these updates, images received or about to be sent through iPhones, iPads and Macs used by children will be scanned for nudity. If any is detected, the image will be blurred and the child will be warned.
✅ Just know that Apple decided to remove the part of the feature that notifies parents. If you would like help setting boundaries, Kim created a tech safety contract for kids you can check out here.
✅ This feature is separate from Apple’s eventual plan to scan all iCloud photos for signs of child abuse. Tap or click here for Kim’s take.