submitted 5 months ago* (last edited 5 months ago) by Ghostalmedia@lemmy.world to c/apple_enthusiast@lemmy.world
[-] GlitterInfection@lemmy.world 31 points 5 months ago

This article says some funny things:

While more advanced features will ultimately require an internet connection

Ok, then?

On-device processes could help eliminate certain controversies found with server-side AI tools. For example, these tools have been known to hallucinate, meaning they make up information confidently.

What? How would on-device processes have any effect on hallucination in LLMs?

Or are you trying to tell us that this article was written by an LLM and that the whole thing is a confidently made up hallucination?

[-] Fake4000@lemmy.world 10 points 5 months ago

That's what they all say. But a lot of these so-called AI features require more power than a phone has. Offloading to a server is sometimes a must.

[-] fartsparkles@sh.itjust.works 18 points 5 months ago

Quantised models can be surprisingly small. And if Apple aren’t targeting LLMs for local use, more specific/tailored models absolutely can run on device.
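For a sense of scale (illustrative back-of-envelope math, not Apple's figures), the weight footprint shrinks roughly linearly with bit width:

```python
# Rough memory-footprint estimate for model weights at a given bit width.
# Parameter count and bit widths are illustrative, not any specific product.

def model_size_gb(num_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GiB (ignores activations and KV cache)."""
    return num_params * bits_per_weight / 8 / 2**30

params_7b = 7e9
print(f"fp16:  {model_size_gb(params_7b, 16):.1f} GiB")  # ~13.0 GiB
print(f"4-bit: {model_size_gb(params_7b, 4):.1f} GiB")   # ~3.3 GiB
```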

That said, given the precedent set by Siri, their next progression of Siri into an LLM will absolutely require a network connection and be executed server side.

[-] kadu@lemmy.world 8 points 5 months ago* (last edited 5 months ago)

Samsung's version on One UI 6.1 lets you toggle between running the local models on the phone's NPU versus connecting to their servers.

The local version is slightly slower and produces worse results, but can be used for privacy or without the internet. The remote version is what you'd expect.
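A minimal sketch of that kind of dispatch, assuming hypothetical `run_on_npu` / `run_on_server` backends (both are stand-ins, not any vendor's real API):

```python
# Hypothetical local-vs-remote toggle like the one described above.
# run_on_npu / run_on_server are placeholder backends, not real APIs.

def run_on_npu(prompt: str) -> str:      # placeholder: local model on the NPU
    return f"[local] {prompt}"

def run_on_server(prompt: str) -> str:   # placeholder: remote inference call
    return f"[server] {prompt}"

def generate(prompt: str, prefer_local: bool, online: bool) -> str:
    """Honor the privacy toggle, and fall back to local when offline."""
    if prefer_local or not online:
        return run_on_npu(prompt)
    return run_on_server(prompt)

print(generate("summarize my notes", prefer_local=False, online=False))
# -> [local] summarize my notes
```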

The thing is, these AI features are mostly features already present in some form or another, just with an emphasis on content generation and AI branding slapped on.

[-] Daxtron2@startrek.website 0 points 5 months ago

Sure, if you're running large models like GPT. Smaller models tailored to specific use cases can absolutely run on phones. Whether or not they get their implementation right is a different story, though.

[-] aeronmelon@lemmy.world 5 points 5 months ago

So it will very slowly find some results on the web for you.

[-] chrash0@lemmy.world 10 points 5 months ago

you’d be surprised how fast a model can be if you narrow the scope, quantize, and target specific hardware, like the AI hardware features they’re announcing.

not a 1-1, but a quantized Mistral 7B runs at ~35 tokens/sec on my M2. that’s not even as optimized as it could be. it can write simple scripts and do some decent writing prompts.
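that figure lines up with the usual back-of-envelope heuristic: autoregressive decoding is roughly memory-bandwidth bound, so tokens/sec is about bandwidth divided by bytes read per token (approximately the model size). 100 GB/s is the base M2's memory bandwidth; the model size is approximate:

```python
# Back-of-envelope decode-throughput estimate: decoding reads (roughly) the
# whole model per token, so tokens/sec ~= bandwidth / model size.
# 100 GB/s is the base M2's memory bandwidth; 3.5 GB ~= a 4-bit 7B model.

def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    return bandwidth_gb_s / model_size_gb

print(round(est_tokens_per_sec(100, 3.5)))  # ~29 tok/s, same ballpark as ~35
```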

they could get really narrow in scope (super simple RAG, limited responses, etc), quantize down to even something like 4 bit, and run it on custom accelerated hardware. it doesn’t have to reproduce Shakespeare, but i can imagine a PoC that runs circles around Siri in semantic understanding and generated responses. being able to reach out on Slack to the engineers that built the NPU stack ain’t bad neither.
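a toy sketch of what symmetric 4-bit quantization looks like (real schemes add per-group scales, bit packing, and outlier handling; function names here are made up):

```python
# Toy symmetric 4-bit quantization of a weight vector (pure-Python sketch).
# Production schemes quantize per group with packed storage; this just shows
# the core round-to-grid idea.

def quantize4(weights):
    scale = max(abs(w) for w in weights) / 7  # int4 range -8..7; map max to ±7
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize4(q, scale):
    return [v * scale for v in q]

w = [0.12, -0.7, 0.35, 0.01]
q, s = quantize4(w)
approx = dequantize4(q, s)
print(q)       # small integers in [-8, 7]
print(approx)  # close to the original weights, within half a scale step
```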

[-] IchNichtenLichten@lemmy.world 2 points 5 months ago

Hopefully Apple will bump up the memory and storage because of this, it's long overdue.

[-] steal_your_face@lemmy.ml 2 points 5 months ago

Isn’t Siri server-side now?

[-] Ghostalmedia@lemmy.world 11 points 5 months ago

Siri was originally in the cloud, but Apple has been trying to handle more Siri requests locally so that requests can be handled faster and without internet access.

this post was submitted on 16 Apr 2024
82 points (95.6% liked)
