if the off switch works yet
https://www.youtube.com/watch?v=Wm9KnhCB0oc&list=UU9rJrMVgcXTfa8xuMnbhAEA - video
https://pivottoai.libsyn.com/20251012-microsoft-lets-you-disable-ai-face-recognition-three-times-a-year - podcast
time: 4 min 03 sec
I am told that Apple, Dropbox, etc. have done this for years, often in the name of "fighting CSAM" or "helping you organize your photos". https://support.apple.com/en-us/108795 I agree that it's a very good reason not to touch corporate cloud services, and not to let people take digital photos of your face even if they promise not to share them! I do not trust any company with physical assets in the USA not to be penetrated by three-letter organizations and data brokers.
Apple does not look at your data. Several years ago they announced plans to scan for "harmful content" and quickly abandoned them when watchdogs called them out on the plan being a horrendous privacy violation.
https://appleinsider.com/articles/23/08/31/apple-provides-detailed-reasoning-behind-abandoning-iphone-csam-detection
Didn't they recent-ish have an "oops, you weren't supposed to see that we're making backups of your deleted photos, sry not sry" incident?
Don't know about that incident, but that is different from scanning for CSAM, which is different from scanning for faces, which is different from feeding your images into a genAI training set.
You could think that they are doing all this anyway (which I think the AI-only companies do, btw, but I doubt the bigger ones do, especially Apple).