456
submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa

[-] Soundhole@lemm.ee 47 points 11 months ago* (last edited 11 months ago)

That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there's no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.
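For reference, getting llama.cpp running on a CPU-only machine is roughly this (a sketch only: binary and flag names vary between llama.cpp versions, and the model filename is a placeholder for whatever quantized GGUF file you download):

```shell
# Build llama.cpp from source; CPU-only inference works out of the box
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run a quantized model with a prompt (-n caps the number of tokens generated)
./main -m ./models/llama-2-7b-chat.Q4_K_M.gguf \
       -p "Hello, how are you?" -n 128
```

koboldcpp wraps the same backend in a one-file launcher with a web UI, so it's an even lower-effort way to try the same models.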

Hell, you can even run llama.cpp on Android phones.

This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

[-] Zetta@mander.xyz 7 points 11 months ago* (last edited 11 months ago)

Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.

[-] Soundhole@lemm.ee 8 points 11 months ago

Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.

[-] Chreutz@lemmy.world 15 points 11 months ago

Never underestimate human ingenuity

When they're horny

[-] das@lemellem.dasonic.xyz 2 points 11 months ago

And where would one look for these sexy sexy AI models, so I can avoid them, of course...

[-] Soundhole@lemm.ee 2 points 11 months ago* (last edited 11 months ago)

Huggingface is where the models live. Anything that's uncensored (and preferably based on Llama 2) should work.

Some popular suggestions at the moment might be Hermes-LimaRP L2 7B and MythoMax L2 13B for general roleplay that can easily include NSFW.

There are lots of talented people releasing models everyday tuned to assist with coding, translation, roleplay, general assistance (like chatgpt), writing, all kinds of things, really. Explore and try different models.

General rule: if you don't have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.

[-] Zetta@mander.xyz 1 points 11 months ago

Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

[-] Soundhole@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

Hey, I replied below to a different post with the same question, check it out.

[-] Zetta@mander.xyz 1 points 11 months ago

Oh I see, sorry for the repeat question. Thanks!

[-] Soundhole@lemm.ee 1 points 11 months ago

lol nothing to be sorry about, I just wanted to make sure you saw it.

[-] MaxHardwood@lemmy.ca 4 points 11 months ago

GPT4All is a neat way to run an AI chat bot on your local hardware.

[-] Soundhole@lemm.ee 2 points 11 months ago

Thanks for this, I haven't tried GPT4All.

Oobabooga is also very popular and relatively easy to run, but it's not my first choice, personally.

[-] teuast@lemmy.ca 3 points 11 months ago

it does have a very funny name though

[-] scarabic@lemmy.world 1 points 11 months ago

Don’t these models require rather a lot of storage?

[-] Soundhole@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

It's all relative, so I guess if you're comparing that to an Atari 2600 cartridge then, yeah, it's hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
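Those sizes follow from simple arithmetic: parameter count times bits per weight. A back-of-envelope sketch (assuming the common ~4-5 bits per weight "Q4" quantizations; `est_model_gb` is an illustrative helper, not a real tool):

```python
def est_model_gb(n_params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Rough on-disk size of a quantized model, in GB.

    bits_per_weight is ~4-5 for the popular Q4 quantization schemes;
    this is an estimate, not an exact file size.
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# 7B at ~4.5 bits/weight is just under 4 GB; 13B is a bit over 7 GB,
# which matches the 3-6 GB and 6-10 GB ranges quoted above.
print(round(est_model_gb(7), 1))
print(round(est_model_gb(13), 1))
```

Higher-bit quantizations (Q5, Q6, Q8) trade more disk and RAM for slightly better output quality, which is why each model size spans a range rather than one number.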

[-] scarabic@lemmy.world 1 points 11 months ago

Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

It is a lot to download. If we’re talking about ordinary consumers. Not unheard of though - some games on Steam are 50GB+

So okay, storage is not prohibitive.

[-] art@lemmy.world 1 points 11 months ago

Storage is getting cheaper every day and the models are getting smaller with the same amount of data.

[-] scarabic@lemmy.world 1 points 11 months ago

I’m just curious - do you know what kind of storage is required?

[-] teuast@lemmy.ca 1 points 11 months ago

> In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

You're probably right, but I kinda hope you're wrong.

[-] teuast@lemmy.ca 1 points 11 months ago

Call it paranoia if you want. Mainly I don't have faith in our economic system to deploy the technology in a way that doesn't eviscerate the working class.

[-] Soundhole@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

Oh, you are 100% justified in that! It's terrifying, actually.

But what I am envisioning is using small, open source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality: local AI models can be run on Android phones, but the small AI "brains" that are best for phones are still pretty stupid (for now).

Of course, living in our current Capitalist Hellscape, it's hard not to imagine that going awry to the point where we'll all 'rent' AI from some asshole who spies on everything we do, censors the AI for our own 'protection', or puts ads in there somehow. But I guess I'm a dreamer.

this post was submitted on 25 Sep 2023
456 points (95.6% liked)
