Right, a question that literal neuroscientists couldn't answer.
I believe the technical term is "your brain is way more fucking complex". We have something like 50 different chemicals (I'm not a neuroscientist, I just studied AI) being transmitted around the brain constantly. They're used and passed along by cells doing biological and chemical things I don't understand. Ever heard of dopamine, cortisol, serotonin? AI don't got those. We have neurons that don't connect to every other neuron - only tech bros would think that's an acceptable approximation. Our brain forms literal physical pathways, along which it transmits those chemicals. No, a physical connection is not the same thing as a higher average weight, and the people who came up with the maths behind AI back in the 50s would back me up.
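To make the "connects to every other neuron" point concrete, here's a toy NumPy sketch of a fully connected layer (sizes and numbers are made up, not from any real model): every input is "connected" to every output, and the connection is nothing more than the magnitude of a float.

```python
import numpy as np

# A "fully connected" layer: every one of the 4 inputs has a float weight
# to every one of the 3 outputs. No physical pathway, just a 4x3 grid of
# numbers. (Sizes and values here are purely illustrative.)
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))   # 4 inputs x 3 outputs, all "connected"
inputs = np.array([0.2, -1.0, 0.5, 0.0])

outputs = inputs @ weights          # one matrix multiply, nothing else
print(weights)                      # "connection strength" = size of a float
print(outputs)
```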
AI uses floating point maths to draw correlations and make inferences. More advanced AI just does more of it per second and has had more training. Its "neurons" are a programming abstraction used to describe a series of calculations and inputs; they're not actual neurons, nor some advanced piece of tech. They're not magic.
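For anyone curious what that abstraction boils down to, here's a minimal sketch of a single artificial neuron (my own toy example, not how any specific model is implemented): a weighted sum of floats pushed through a squashing function.

```python
import math

def artificial_neuron(inputs, weights, bias):
    """One 'neuron': multiply, add, squash. That's the whole abstraction."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))   # sigmoid activation

# Toy numbers, purely illustrative.
print(artificial_neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))
```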
High schoolers could study AI for a single class, then neurobiology right after, and realise just how basic the AI model is at mimicking a brain. It's not even close, but I guess Sam Altman said we're approaching general intelligence, so I'm probably just a hater.
Everything you said is right, but all it proves is that LLM weights are a severely simplified version of neurons. It neither shows that they lack consciousness nor that being a mathematical model precludes consciousness at all.
In my opinion, the current models don't express any consciousness, but I'm against denying it on the grounds that they're a mathematical model rather than on the basis of results we can measure. The fact that we can't theoretically prove consciousness in the human brain also means we can't theoretically disprove consciousness in an LLM. They aren't conscious because they haven't demonstrated enough to be considered conscious, and that's the extent of what we should claim to know.