Welcome back to This Week in Apps, the weekly series that recaps the latest in mobile OS news, mobile applications, and the overall app economy. The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spend in 2020. Consumers also spent record amounts of time in apps on Android devices alone. And in the U.S., app usage surged ahead of time spent watching live TV. The average American watches 3.7 hours of live TV daily but now spends four hours per day on their mobile device.
Apps are also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies, a figure up 27% year-over-year. This Week in Apps offers a way to keep up with this fast-moving industry in one place, with the latest news, updates, startup fundings, mergers and acquisitions, and suggestions about new apps and games to try, too.
Apple to scan devices for CSAM imagery
Apple this week detailed a significant initiative to scan devices for CSAM imagery. The company on Thursday announced a new set of features, arriving later this year, that will detect child sexual abuse material (CSAM) in its cloud and report it to law enforcement. Companies like Dropbox, Google, and Microsoft already scan for CSAM in their cloud services, but Apple has long resisted scanning users' files in the cloud, giving users the option to encrypt their data before it ever reaches Apple's iCloud servers. Now, Apple's new technology, NeuralHash, will run on users' devices to detect when users upload known CSAM imagery, without decrypting the images. It can even detect the imagery if it's been cropped or edited in an attempt to avoid detection.
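The general idea behind this kind of matching is perceptual hashing: unlike a cryptographic hash, which changes completely when a single pixel changes, a perceptual hash is designed so that visually similar images produce similar hashes. The sketch below is not Apple's NeuralHash (which uses a neural network and private set intersection); it is a minimal "average hash" over a flattened grayscale grid, just to illustrate why small edits don't defeat the comparison.

```python
# Illustrative perceptual hashing sketch. NOT Apple's NeuralHash;
# a toy "average hash" showing why lightly edited images still match.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is brighter than the
    image's mean brightness, else 0."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of differing bits; small distance = likely the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# An 8x8 "image" flattened to 64 brightness values (0-255):
# a dark half and a bright half.
original = [30] * 32 + [220] * 32

# A lightly edited copy: small brightness tweaks simulating
# re-encoding or minor edits.
edited = [p + 5 for p in original[:32]] + [p - 5 for p in original[32:]]

# A structurally different image (alternating dark/bright pixels).
unrelated = [30, 220] * 32

print(hamming(average_hash(original), average_hash(edited)))     # -> 0
print(hamming(average_hash(original), average_hash(unrelated)))  # -> 32
```

The edited copy hashes identically (distance 0), while the unrelated image differs in half its bits, so a match can be declared below some distance threshold without ever comparing, or decrypting, the raw image contents.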
Meanwhile, on iPhone and iPad, the company will roll out protections to Messages to filter images and alert children and parents if sexually explicit photos are sent to or from a child's account. Children will not be shown the photos but will see a grayed-out image instead. If they try to view the image through the link, they'll be shown interruptive screens explaining why the material may be harmful and warning that their parents would be notified.
Some privacy advocates pushed back against such a system, believing it could expand to other photos, lead to false positives, or set the stage for more on-device government surveillance. But many cryptology experts believe the system provides a good balance between privacy and utility and have endorsed the technology. In addition, Apple said reports are manually reviewed before being sent to the National Center for Missing and Exploited Children (NCMEC). The changes may also benefit iOS developers who deal in user photos and uploads, as predators will no longer store CSAM imagery on iOS devices in the first place, given the new risk of detection.