this post was submitted on 29 Mar 2024
1022 points (98.0% liked)

Curated Tumblr

4820 readers

For preserving the least toxic and most culturally relevant Tumblr heritage posts.

The best transcribed post each week will be pinned and receive a random bitmap of a trophy superimposed with the author's username and a personalized message. Here are some OCR tools to assist you in your endeavors:

Don't be mean. I promise to do my best to judge that fairly.

founded 2 years ago
[–] federalreverse@feddit.de 72 points 1 year ago (3 children)

Apple, afaik, used to do this on-device rather than in the cloud. Not quite sure about the situation today.

[–] lseif@sopuli.xyz 22 points 1 year ago (1 children)
[–] Septimaeus@infosec.pub 33 points 1 year ago (1 children)

I don’t. Corps gonna corp, if they can. But I’ve checked this using all the development, networking, and energy monitoring tools at my disposal, and Apple’s E2E and on-device guarantee does appear to hold. For now.

Still, those who can should audit periodically, even if they’re only doing it for the settlement.

[–] brbposting@sh.itjust.works 8 points 1 year ago (1 children)
[–] Septimaeus@infosec.pub 5 points 1 year ago* (last edited 1 year ago) (1 children)

Security is in my interest, but yw

[–] Hawk@lemmynsfw.com 6 points 1 year ago (1 children)

They were running CNN inference on a mobile device? I have no clue, but that would be costly battery-wise at least.

[–] didnt_readit@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago) (1 children)

They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve had dedicated ML inference cores in their chips for a long time too, which helps the battery life situation.
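A rough back-of-envelope sketch of why dedicated inference cores matter for battery life. All the figures below (model FLOPs, efficiency numbers) are illustrative assumptions for orders of magnitude only, not measured values for any particular chip:

```python
# Back-of-envelope energy cost of one CNN forward pass on-device.
# All constants are illustrative assumptions, not measured values.

def inference_energy_mj(gflops_per_inference: float, gflops_per_watt: float) -> float:
    """Energy in millijoules for one forward pass."""
    joules = gflops_per_inference / gflops_per_watt
    return joules * 1000.0

# Assume a MobileNet-class CNN: ~1 GFLOP per inference.
model_gflops = 1.0

# Assumed efficiency figures (rough orders of magnitude):
cpu_efficiency = 10.0     # GFLOPS per watt on a general-purpose mobile CPU
npu_efficiency = 1000.0   # GFLOPS per watt on a dedicated ML accelerator

cpu_mj = inference_energy_mj(model_gflops, cpu_efficiency)
npu_mj = inference_energy_mj(model_gflops, npu_efficiency)

print(f"CPU: {cpu_mj:.1f} mJ/inference")   # 100.0 mJ
print(f"NPU: {npu_mj:.1f} mJ/inference")   # 1.0 mJ
print(f"Ratio: {cpu_mj / npu_mj:.0f}x")    # 100x
```

Under these assumed numbers, a dedicated accelerator is around two orders of magnitude cheaper per inference, which is why occasional on-device inference (photo tagging, text recognition) need not be a battery disaster.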

[–] Hawk@lemmynsfw.com 1 points 1 year ago

It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.