this post was submitted on 16 May 2026
175 points (98.9% liked)

Programming Humor

[–] assembly@lemmy.world 8 points 1 day ago (1 children)

So we're going to assume REST API consumers use the API responsibly, instead of making a ton of additional calls for no reason that an LLM then has to interpret, rather than hitting my existing Redis cache? I'm sure an LLM will be far more efficient.
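The cache-first pattern the commenter is defending can be sketched like this. A minimal, hypothetical illustration: the dict stands in for Redis, and `llm_interpret` is a made-up stub for an expensive LLM call, not any real API.

```python
cache = {}  # stand-in for Redis; a real setup would use redis-py with a TTL

def llm_interpret(request: str) -> str:
    # Hypothetical stand-in for an expensive, token-billed LLM call.
    return f"interpreted:{request}"

def handle(request: str) -> str:
    # Cache hit: no LLM call, no token spend, microsecond latency.
    if request in cache:
        return cache[request]
    # Cache miss: pay for one LLM interpretation, then memoize it.
    result = llm_interpret(request)
    cache[request] = result
    return result
```

Every repeated call is served from the cache for free; replacing that path with per-request LLM interpretation means paying tokens for every duplicate.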

[–] Jesus_666@lemmy.world 15 points 1 day ago* (last edited 1 day ago)

I mean, it will be vastly more efficient for certain tasks. Like night and day.

Previously, a DoS attack could hope to exhaust the service's bandwidth or maybe overwhelm its load balancer, and for most services that required a large botnet. Now you can run a small-scale attack and exhaust the business's token budget or (if they don't have token limits) their operational budget. That's incomparably more efficient!
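Back-of-the-envelope math shows why. All of the figures below are assumptions picked for illustration (price per token, tokens per request, monthly budget), not real pricing for any provider:

```python
# Assumed numbers -- illustration only, not real provider pricing.
PRICE_PER_1K_TOKENS = 0.01   # assumed $ per 1k tokens
TOKENS_PER_REQUEST = 4_000   # assumed large prompt + response
MONTHLY_BUDGET = 500.00      # assumed operational budget in $

cost_per_request = TOKENS_PER_REQUEST / 1000 * PRICE_PER_1K_TOKENS
requests_to_exhaust = int(MONTHLY_BUDGET / cost_per_request)
print(requests_to_exhaust)  # number of calls that drain the budget: 12500
```

Under these assumptions, a few thousand ordinary-looking requests from a single machine drain the month's budget; no botnet, no exploit, just normal use of the endpoint.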

And all of that without needing to wait for exploits; it's an aspect of normal operation.

The future is now, old-timer.