this post was submitted on 31 Jan 2026
71 points (93.8% liked)
Showerthoughts
you are viewing a single comment's thread
An LLM can be used as a search engine for things you know absolutely zero terminology about. That's convenient. You can't ask Google for "tiny striped barrels with wires" and expect to get an explanation of resistor markings.
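And once you do have the word "resistor", the stripe code itself is easy to decode mechanically. A minimal Python sketch of the standard 4-band color code (the function name and interface are made up for illustration):

```python
# Standard 4-band resistor color code (IEC 60062):
# two digit bands, a multiplier band, and a tolerance band.
DIGITS = {
    "black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
    "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9,
}
MULTIPLIERS = {**{color: 10 ** d for color, d in DIGITS.items()},
               "gold": 0.1, "silver": 0.01}
TOLERANCES = {"brown": 1, "red": 2, "gold": 5, "silver": 10}  # percent

def decode_4band(digit1, digit2, multiplier, tolerance):
    """Return (resistance in ohms, tolerance in percent)."""
    ohms = (DIGITS[digit1] * 10 + DIGITS[digit2]) * MULTIPLIERS[multiplier]
    return ohms, TOLERANCES[tolerance]

# yellow-violet-red-gold -> (4700, 5), i.e. 4.7 kOhm +/-5%
print(decode_4band("yellow", "violet", "red", "gold"))
```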
10-15 years ago, Google returned the correct answers even when I used the wrong words. For example, it would most likely have returned resistors for that query because of the stripes, and capacitors if you left off "striped".
AI isn't nearly as good as Google was 10+ years ago.
There's a theory that it's by design. They have made search so bad that we now turn to AI to give us what search used to, and that way they can effectively charge you for searching, an idea we would otherwise have dismissed as baloney.
It also drives more revenue to Google, since there's a lower bounce rate off their search page: they get more monetization from their own sponsored results.
It hurts sites that provide information, since fewer people now click through from search: less human traffic and more bot traffic, which of course Google won't pay for, as it filters bots out of its ad network.
It worked yesterday when I tried to find a video by describing it and what I remembered from the thumbnail. That was great. I want that for my own photos and videos without having to upload them somewhere.
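That kind of local search is actually doable today with an image-text embedding model, no cloud involved. A rough sketch, assuming sentence-transformers is installed (the folder path and query text are placeholders):

```python
# Embed local photos and a text query into the same space, then
# rank photos by cosine similarity to the query. Runs entirely
# locally after the model weights are downloaded once.
from pathlib import Path
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")

photos = sorted(Path("~/Pictures").expanduser().glob("*.jpg"))
img_embs = model.encode([Image.open(p) for p in photos])

query = "a red barn next to a lake"  # describe what you remember
txt_emb = model.encode(query)

scores = util.cos_sim(txt_emb, img_embs)[0]
best = int(scores.argmax())
print(photos[best], float(scores[best]))
```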
It sounds like you might be referring to miniature striped barrels used in crafts or model-making, often decorated or with wire elements for embellishment or functionality. These barrels can be used in various DIY projects, including model railroads, dioramas, or even as decorative items.
Reverse image search would let you find that answer more accurately than some LLM.
How? And don't those image searches have LLMs under the hood?
When you see something and have no idea what it is, you just take a photo and do a reverse search, finding other similar photos and the name of the thing. You don't even need to spend time describing what you see, and there's no chance of getting a confidently wrong answer. Reverse image search has existed for more than a decade and doesn't use LLMs.
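For a sense of how the pre-LLM versions work: the classic building block is a perceptual hash, where visually similar images produce hashes that differ in only a few bits. A minimal sketch of an "average hash", using nothing but pixel math (file names are placeholders):

```python
# Average hash: shrink to 8x8 grayscale, then store one bit per
# pixel (brighter than the mean or not). Similar images give
# hashes with a small Hamming distance, so lookup is just a
# nearest-neighbor search over bit strings.
from PIL import Image

def average_hash(path, size=8):
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [p > mean for p in pixels]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

# Small distance => likely the same or a near-duplicate image.
print(hamming(average_hash("query.jpg"), average_hash("indexed.jpg")))
```

Production systems layer smarter features on top (and these days often learned embeddings), but none of it needs an LLM.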
ML is ML, no matter whether it's an LLM or not. And the question "what is this thing?" covers a negligibly tiny percentage of search requests.
It's not all the same. Application-specific ML models tend to be much smaller and demand far fewer resources than LLMs. They also tend to be more precise.
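To put rough numbers on the size gap: a task-specific image classifier is a few million parameters, while the LLMs people actually use are in the billions. A quick check, assuming torch and torchvision are installed:

```python
# Count the parameters of MobileNetV3-Small, a typical
# application-specific image classification model.
import torchvision.models as models

clf = models.mobilenet_v3_small()
n_params = sum(p.numel() for p in clf.parameters())
print(f"MobileNetV3-Small: {n_params / 1e6:.1f}M parameters")
# Prints roughly 2.5M parameters; compare with billions for an LLM.
```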
I was just addressing the given example