this post was submitted on 20 Feb 2026
13 points (81.0% liked)

Science

[–] knokelmaat@beehaw.org 9 points 2 weeks ago* (last edited 2 weeks ago)

Absolutely not. It’s a tool that they used successfully, and they can credit it in their methods section or wherever, but it’s not a person. We never list other scientific applications as authors, even though they are often essential to reaching a result or an understanding. Think of all those proofs that rely on computer-based exhaustion methods and the like. I think people are confused because an LLM interacts in such a human-like way, but it is still a statistical algorithm. As long as we haven’t reached genuine general intelligence with reasoning capabilities and an identity / “soul” of its own, it is absurd to act like it’s a person. And a truly conscious, independent AI is still very far off, if it’s even possible at all, in my opinion.