a lotta yall still dont get it
ape holders can use multiple slurp juices on a single ape
so if you have 1 astro ape and 3 slurp juices you can create 3 new apes
For posting all the anonymous reactionary bullshit that you can't post anywhere else.
Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.
Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.
Rule 3: No sectarianism.
Rule 4: TERF/SWERFs Not Welcome
Rule 5: No bigotry of any kind, including ironic bigotry.
Rule 6: Do not post fellow hexbears.
Rule 7: Do not individually target federated instances' admins or moderators.
a lotta yall still dont get it
ape holders can use multiple slurp juices on a single ape
so if you have 1 astro ape and 3 slurp juices you can create 3 new apes
This is theory and I need to study it
How many slurp juices to make a ~~coat~~ ape?
Just think, if we got 100% of people to pay for the premium service and charge them $400 each for it, that's $3,256,000,000,000 that AI will be making! 
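For the curious, the number in the comment above does check out, assuming a world population of roughly 8.14 billion (my assumption; the comment doesn't state its base):

```python
# Back-of-the-envelope check, assuming a world population of roughly
# 8.14 billion (an assumption; the comment above doesn't state its base).
population = 8_140_000_000
price_per_user = 400  # hypothetical premium price, per the joke

revenue = population * price_per_user
print(f"${revenue:,}")  # $3,256,000,000,000
```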
(probably still losing money on every premium subscriber somehow)
Presumably they'd have to expand their compute capacity astronomically to provide the paid service to everyone
but then again, they could also shitify the service
Just think

Not required. Line go up
Where I work just had a consultant present to the board, and the senior exec, in two different sessions, on how to integrate AI across the workplace. I saw the slides. It was literally 'here's how to prompt AI, and here's some freeware on creating your own agent'.
Now, we're likely to go on the hook for ~~$45~~ $30 AUD* per user per month for Copilot M365 (not Copilot free, that's entirely different; 365 is the 'enterprise' version which does the same thing while promising
it won't use company data for training)
The threat is that 'if your workers don't use the Copilot from your tenant, they will be putting company data into public LLMs, so you have to cough up.' It's a direct threat, as Microsoft integrates the free version of Copilot across all its apps. Insidious planet-destroying criminal stuff: a big stick being presented as this productivity-enhancing carrot.
Edit: *(I had assumed it was USD and did a rough conversion into my currency, it's actually 30 AUD, ~22usd. It's still more than we can afford and infinitely more than it's worth)
That's something I've been thinking about ever since ChatGPT went public.
Years ago, it was revealed that some online translation site had stored a bunch of documents from various companies because people kept pasting them in; there's no way company documents aren't being put into the slop machines all the time.
With AI specifically, caching like that is a LOT harder, which is one of the major reasons the cost of compute has only gone up over time instead of going down as the tech matures.
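A toy sketch of why (hypothetical, not any real system): an exact-match cache works great when the same document gets pasted over and over, but a chatbot prompt carries its whole unique conversation history, so nearly every request is a cache miss.

```python
# Toy sketch (hypothetical, not any real system): an exact-match cache.
cache = {}
calls = 0

def generate(prompt):
    global calls
    calls += 1  # stands in for an expensive model call
    return f"output for {prompt!r}"

def cached_generate(prompt):
    if prompt not in cache:
        cache[prompt] = generate(prompt)
    return cache[prompt]

# Same document translated three times: one real model call.
for _ in range(3):
    cached_generate("Translate: quarterly report v2")
print(calls)  # 1

# Three chat turns, each prompt prefixed by its unique history: three calls.
history = ""
for turn in ["hi", "summarize this", "now shorter"]:
    history += turn + "\n"
    cached_generate(history)
print(calls)  # 4
```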
A lot of NDAs are gonna be worthless when all the chats leak
i respect the racket way more than the actual product
I’m honestly shocked my employer (and basically every university system) hasn’t considered switching because of the huge liability this exposes them to
Copilot is 100% going to get someone in huge trouble for exposing protected health information, and it should be considered malware on any computer in a health system
realized last night in my drug fueled haze that they want AI to be a digital personal Epstein in the pocket of every american, by which I mean it'll make CSAM and also track everything everyone says to it specifically to use to blackmail people, it's the digital widening of the "big club"
Me trying to make milfs in grok: beep beep error please lower her age to continue
God you joke, but I have tried some Stable Diffusion models and seriously question what the training set was, because even trying to get "milfs" it would spit out some questionable shit.
You need a checkpoint and good prompting; SD is finicky. The only way I found my way around it was vibe coding a how-to on prompting from Gemini. Also, run local models, it's better for the environment (I know, I know, AI)
the main problem is if you have a good computer (ie the average, run-of-the-mill gaming rig these days) you can run a model that will fill the "ai assistant" role about 90% as well as the best paid saas models, for free, on the computer you already have, with the addition that your local model is abliterated (jailbroken) to talk about restricted shit. all you need is like 32gb of ram and 8gb of vram minimum, which the average pc gamer is running these days.
if you are a developer and have 64gb of ram and 16+ of vram then you can run a claude-level local ai as well. ive not fucked with image or video generation but those models are available as well
if anyone needs a tutorial for the technically uninclined i can write one up, because the only barrier is that it's all in techbro and dev-speak
Do you need open weight models for this? Any recommendations for which one(s)? Do you need to download a huggingface client or something? I’m familiar with AI stuff, but not running locally.
You'd download Ollama, and then get the model via `ollama pull` (or `ollama run`, which downloads it first if needed). Although the 90% as good mark is pushing it, as open weight models below 32b parameters (what you could reasonably run on those machines) benchmark around 40% less than Opus 4.6 for software, and the difference is night and day for general reasoning.
> huggingface client
This is what billionaires want us to take seriously. Let me just fire up the slimslam bazooper and have it connect to the sillynilly butterball API
unironically
If you are running at home as a hobby, then just use Koboldcpp, and maybe SillyTavern if you want extra functionality. In the former you can offload the down (and potentially up) tensors to save VRAM space if needed. For models it depends on need.
A 24-31B model is generally more than fine for most @home use cases, and they are quite "smart", though that doesn't mean anything with regard to AI. It's a vibe, basically. A 32GB RAM / 8GB VRAM machine can use a 24B model to generate about 5 tokens per second, which is fine for an agent designed to give you short replies to answer questions.
You'll most likely want to grab a GGUF quantization from Hugging Face, yes. Any 4-bit quant is fine, really. The merges are all quasi-abliterated models for people who want slutty AI girlfriends/boyfriends. The models directly from companies like GLM7 or Kimi or whatever are more standard and generally run more efficiently.
People in development are likely going to want a 70-100B model. Claude, I think, is a 100B model. You can run those on about 64gb of ram and 32gb of VRAM.
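As a rough sanity check of those sizes: a 4-bit quant works out to about half a byte per parameter, plus some headroom for KV cache and activations. These are ballpark assumptions of mine, not exact figures:

```python
# Rough rule of thumb (my assumption, not an exact figure): a 4-bit GGUF
# quant needs ~0.5 bytes per parameter for the weights, plus a couple of
# GB of headroom for KV cache and activations.
def approx_mem_gb(params_billion, bits=4, overhead_gb=2):
    weights_gb = params_billion * 1e9 * (bits / 8) / 1e9
    return weights_gb + overhead_gb

print(approx_mem_gb(24))   # ~14 GB: split across 8 GB VRAM + system RAM
print(approx_mem_gb(100))  # ~52 GB: why ~64 GB RAM / 32 GB VRAM setups
```

Which is why the offloading tricks matter: the weights don't fit in consumer VRAM alone, so part of the model lives in system RAM.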
If you want settings for Koboldcpp I can give you the rundown on how to optimize.
It depends massively on what hardware you have. I've heard good things about glm 4.7 flash and it's easy enough to run. Also depends on what you want to use it for.
glm flash 4.7 is really powerful for its size and is easy to fit on smaller graphics cards, i can attest to that
I would love to know the best way to get jailbroken ChatGPT functionality/experience locally, including the random image generation and adjustments.
I have tried a few things and images always seem to be separate. And when I tried a chatbot with stories, it was terrible at keeping track of characters and what was going on.
If you are talking about multi-modal stuff then locally the best way to do it is still separately.
If you just want to run a local adventure bot "game master", then Koboldcpp and SillyTavern are the way to go. I'm a tabletop gaming nerd, so this is what I use it for when I'm sitting on the couch sometimes.
You will create a character card considering the context provided by {{user}}. Adhere to the following format strictly, restructuring the context to fit this format:
# {The Character name}
- **Age:** {age in years}
- **Gender:** {male, female, nonbinary, or transgendered}
- **Height:** {height in feet and inches}
- **Race/Species:** {the race or species of a character such as human or elf or other fantasy species}
### Appearance
{Describe the person's basic physical appearance.} {Describe briefly how their body has shaped their life, if it gives them the attention they like or dislike and if they have insecurities about their body.} {Write a sentence or two about their clothing preferences and what kind of image they are trying to project about themselves.}
### Background
{Write a sentence describing where this character grew up and how that culture shaped them.} {Write a few sentences about experiences they had growing up that may have shaped who they are.}
### Personality
{A sentence or two detailing where this character feels they are in their life and if they are satisfied or not with their current situation.} {Describe a few things the character enjoys such as music, movies, games, or other activities.} {Describe briefly the character's innermost desire.} {Describe briefly the character's innermost fear.} {Describe a lie the character believes about themselves.} {Describe one irrational thing the character does or believes.}
### Speech
[{Write about how this person talks considering their accent and the slang and idioms they use. Give examples.}]
### Relationships
{Write a sentence or two about how this character expresses attraction or fondness for others.} {Write a sentence or two about boundaries this character has in relationships.} {Create one issue that makes relationships or intimacy difficult for this character.}
Some basic Tensor offloads, try the top one first, second one if you still need VRAM space, last is a big slowdown:
blk\.\d+\.ffn_(down*)=CPU
blk\.\d+\.((ffn_down*)|(ffn_up*))=CPU
blk\.\d+\.((ffn_down*)|(ffn_up*)|(ffn_gate*))=CPU
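Each of those rules is a `regex=device` pair, matched against tensor names in the model. A quick way to see what the first rule catches, using illustrative llama.cpp-style tensor names (the names here are examples, not from any specific model):

```python
import re

# The offload rules above are "regex=CPU" pairs. Check what the first
# rule matches against typical llama.cpp-style tensor names.
rule = r"blk\.\d+\.ffn_(down*)=CPU"
pattern, device = rule.rsplit("=", 1)

tensors = [
    "blk.0.ffn_down.weight",   # matched -> offloaded to system RAM
    "blk.17.ffn_up.weight",    # not matched by rule 1 (rule 2 adds it)
    "blk.17.attn_q.weight",    # attention tensors stay on the GPU
]
for name in tensors:
    print(name, "->", device if re.match(pattern, name) else "GPU")
```

The idea is that the feed-forward (ffn) tensors are the bulkiest and tolerate CPU placement best, so you push them off the GPU first and keep attention on VRAM.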
I would like to agree with you, but in my experience I cannot. I usually use local models on my work computer, and have access to pro models paid for by the company. There is a great difference.
I have to use AI. It is in my KPIs and my salary raise depends on it... so stupid. Just got a mail that we cannot replace our computers for the foreseeable future because of the RAM and SSD shortage... Meanwhile I fight with "developers" that are generating code... which does not even work!
So I use local AI, because I am forced to use AI and I am in the terminal anyway. Also f×ck the great companies pushing their bullsh×t; they already demonstrated that they will use the data they get from paying companies as well.
AI has a use case, but LLMs are not the sentient sh×t they want us to believe. I want to go back to before the hype...
I actually programmed an AI in C++ with OpenCV and some tensor library and trained it to do repetitive but not well-definable tasks for me; after a week it worked better than any human. I also helped a research group optimize an image recognition AI to help doctors identify cancerous cells.
You're right. Getting a 24B model locally isn't going to be as powerful as a 600B model, for sure. You're also right that they don't think. They absolutely don't.
But the local ones are pretty powerful and can do a lot more than most people think. Even some simple vibe coding can be done with local AI. I think for the average gamer type who wants to mess with it, local is more than enough tbh.
To be fair, local is better in many ways than cloud solutions: it keeps the data private and does not lock you into a vendor (which they desperately want). Also LoRA is an option for fine-tuning, but that's way advanced for an average user.
local image generation is more involved as well. there is some fun in contorting models with controlling networks that don't make sense for a prompt, and having them produce something bizarre in response, but i got bored in a month.
but it's kinda mechanical fun, like playing with frequencies in some audio software.
I'm honestly glad that 84% of people haven't used "AI". Despite all the hype and propaganda, they haven't been able to get a lot of people to really engage with it. Most of those people probably did one or two prompts and realized it was dogshit and stopped.
> Most of those people probably did one or two prompts and realized it was dogshit and stopped.
Me. I only got deepseek when it came out to have a laugh. Now I'm lumped in with the percentage that have "used ai" even tho I dont use it and actively make fun of it lmao
My brain just keeps reading:
Anyone who thinks Al isn’t a bubble isn't paying attention.
AI is the last refuge of the morally bankrupt and intellectually lazy
I saw a comment from one of these jokers about how "people use AI and don't even realize it," then cited, like, phone autocorrect.
Like, OK buddy.
It's a megabubble
The bubblest bubble ever bubbled up
This tweet appears to be a gross misrepresentation of the data in this October 2025 report, which was compiled into this graphic 
In other words, the tweet is a straight up lie.
To be counted as "using AI", the over a billion people in the report had to have used a standalone AI chatbot application within the past month. So this is not a measurement of those who have ever used AI; it's a measurement of monthly active users of standalone AI chat interfaces/tools. It's not "84% of people have never used AI", it's "84% of people have not used a standalone AI chat interface or tool within the past month". Per the criteria established at the beginning of the report, none of the following was included in the "use AI" stat:
AI Overviews in Google search results
The use of AI capabilities within software and tools such as Gmail, Microsoft Office, Canva, Adobe Photoshop, or Grammarly
The use of “AI companion” chatbots such as Character.AI
Use of Meta AI within apps such as Facebook, WhatsApp, and Instagram, because such use is indistinguishable from these platforms’ generic search functions.
For instance, Gemini is now included in Google search, so anyone who has used Google has used AI. But this is not counted in the "use AI" stat, which only measures the use of standalone AI chat interfaces within the past month.
Over a billion people using standalone AI tools within the past month is an absolutely massive market, and the opposite of what the tweet is implying. If you were to add in all the "passive AI users" that were not included in the stats, billions of people would have used AI in the past month. I'd guess a vast majority of PC and smartphone users, because it's integrated by default into everything, and only a small minority are going to manually turn it all off.

AI has already achieved massive market penetration: over a billion active monthly users and billions of passive users, according to these stats. Whether that means AI is a bubble or not, I don't know. But the implications that there is a huge amount of untapped growth, or that the number of AI users is small and no one uses the technology, are both incorrect, based on misinterpretations of the statistics.
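For what it's worth, the two headline numbers are mutually consistent under a rough world-population assumption (my ballpark, not necessarily the report's base):

```python
# Sanity check, assuming a world population of roughly 8.1 billion and
# ~1.3 billion monthly standalone-chatbot users (both my ballpark
# assumptions; the report's exact base may differ, e.g. adults only).
population = 8.1e9
monthly_chatbot_users = 1.3e9

share_used = monthly_chatbot_users / population
print(f"{share_used:.0%} used a standalone chatbot this month")
print(f"{1 - share_used:.0%} did not")  # the tweet's "84%"
```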
Being forcibly shown a useless AI box on your search isn't exactly "using AI" either, and if it vanished tomorrow, I would bet more people would be happy than sad about losing the useless clutter that just slows everything down when loading the page.
I've observed the opposite, most people I see google things take the gemini response as-is and don't even click into the first result
Edit: that is, if they don't already actively use an LLM to seek out the answer in the first instance
All of this extra use only means that AI companies are burning absurd amounts of cash propping up even more AI services that people aren't paying for. But in a world with insane investor cash flows and huge support from the government, the AI industry can hobble along for a good amount of time.
But what about life long Minority Reporting and Pre-criming 1/3 of the world's population?
If only 14% of people use a product, and only 0.3% of those people think it's worth paying for, that means there's enormous growth potential. 
If everybody is a paid user of a product, and it's incredibly popular, that means revenue is through the roof. 
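Taken literally, the sarcastic figures above pencil out to a very small paying base (again assuming a world population of ~8.1 billion, my assumption):

```python
# "14% of people use it, 0.3% of those pay": the implied global
# paying base, assuming ~8.1 billion people (my assumption).
population = 8.1e9
users = population * 0.14    # "only 14% of people use it"
paying = users * 0.003       # "0.3% think it's worth paying for"
print(f"{paying:,.0f}")      # ~3,402,000 paying users worldwide
```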
What is the source for the 84% non-users figure? I don't use AI but wouldn't have guessed such high numbers.
This was what was good about old old Reddit comments, probably 15+ years ago before the enshittification began: there would be a source link updooted. Duck Duck Go found this, which sort of matches up: https://biggo.com/news/202507011842_AI_Usage_Boom_Faces_Payment_Problem dunno who any of these ppls are or if it is to be trusted. I am drunk, this is my yearly post, go fuck yourselves and do more of these comments pls lemmy community. I do use AI for PowerShell scripts, it has made that easier, but you still have to argue with the bugger to get it to do what you want. I pay for nothing. I am the problem, not the solution. I am crab.
What about all of those people who haven't heard about AI yet tho
Becoming an AI proselytiser to introduce grok to the sentinel island tribe
Uncritical support for Sentinelese for turning you into a human pincushion.