
In-app events hit the App Store, TikTok tries Stories, Apple reveals new child safety plan – TechCrunch

Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest mobile OS news, mobile applications, and the overall app economy. The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spending in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV. The average American watches 3.7 hours of live TV daily but now spends four hours per day on their mobile devices.

Apps aren’t just a way to pass idle hours — they’re also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies — a figure up 27% year-over-year. This Week in Apps offers a way to keep up with this fast-moving industry in one place, with the latest from the world of apps, including news, updates, startup fundings, mergers and acquisitions, and suggestions about new apps and games to try, too.

App Store

Apple to scan for CSAM imagery.

Apple announced a significant initiative to scan for CSAM imagery. On Thursday, the company detailed a new set of features, arriving later this year, that will detect child sexual abuse material (CSAM) in its cloud and report it to law enforcement. Companies like Dropbox, Google, and Microsoft already scan for CSAM in their cloud services, but Apple had allowed users to encrypt their data before it reached iCloud. Now, Apple’s new technology, NeuralHash, will run on users’ devices to detect when known CSAM imagery is being uploaded, without decrypting the images. It can identify the imagery even if it has been cropped or edited to evade detection.
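To make the idea of robust image matching concrete, here is a minimal sketch of perceptual-hash matching. This is not Apple’s NeuralHash, and the file names and threshold are illustrative assumptions; it uses the open-source Python imagehash library, whose hashes stay close for visually similar images, so a cropped or re-encoded copy can still match an entry in a list of known hashes.

```python
# Illustrative perceptual-hash matching (not Apple's NeuralHash).
from PIL import Image
import imagehash

# Hypothetical set of hashes of known images (illustrative only).
known_hashes = {imagehash.phash(Image.open("known_image.jpg"))}

def matches_known(path, max_distance=8):
    """Return True if the image's perceptual hash is within
    max_distance bits (Hamming distance) of any known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in known_hashes)

# A lightly cropped or re-saved copy of the known image should still match.
print(matches_known("upload_cropped.jpg"))
```

The key design point is that matching happens on hashes rather than on the image content itself, which is why such a system can flag known material without inspecting or decrypting other photos.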

Meanwhile, on iPhone and iPad, the company will roll out protections to Messages app users to filter images and alert children and parents if sexually explicit photos are sent to or from a child’s account. Children will not be shown the photos but will see a grayed-out image instead. If they try to view the image through the link, they’ll be shown interruptive screens explaining why the material may be harmful and warned that their parents will be notified.

Some privacy advocates pushed back against such a system, believing it could expand to end-to-end encrypted photos, lead to false positives, or set the stage for more on-device government surveillance. But many cryptography experts believe the system Apple developed provides a good balance between privacy and utility and have endorsed the technology. In addition, Apple said reports are manually reviewed before being sent to the National Center for Missing and Exploited Children (NCMEC). The changes may also benefit iOS developers who deal in user photos and uploads, as predators will no longer store CSAM imagery on iOS devices in the first place, given the new risk of detection.

