Apple illegally goes into people’s phones to scan photos. Why? John Iadarola and Brett Erlich break it down on The Damage Report.
Become a TDR YouTube Member: http://www.youtube.com/thedamagereport/join
Follow The Damage Report on Facebook: https://www.facebook.com/TheDamageReportTYT/
Help build the Home of the Progressives http://tyt.com/JOIN
Subscribe to The Damage Report YouTube channel: https://www.youtube.com/thedamagereport?sub_confirmation=1
Follow The Damage Report on TikTok: https://www.tiktok.com/@thedamagereport?lang=en
Follow The Damage Report on Instagram: http://www.instagram.com/thedamagereport/
Follow The Damage Report on Twitter: https://twitter.com/TheDamageReport
Read more here: https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/
"Apple told TechCrunch that the detection of child sexual abuse material (CSAM) is one of several new features aimed at better protecting the children who use its services from online harm, including filters to block potentially sexually explicit photos sent and received through a child’s iMessage account. Another feature will intervene when a user tries to search for CSAM-related terms through Siri and Search.
Most cloud services — Dropbox, Google, and Microsoft to name a few — already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM. But Apple has long resisted scanning users’ files in the cloud by giving users the option to encrypt their data before it ever reaches Apple’s iCloud servers."
#TheDamageReport #JohnIadarola #TheYoungTurks