submitted 2 months ago* (last edited 2 months ago) by pavnilschanda@lemmy.world to c/aicompanions@lemmy.world

A new study found that most people think AI chatbots like ChatGPT can have feelings and thoughts, much as humans do. Although experts say these AIs aren't really conscious, many laypeople believe they are. The study surveyed 300 Americans about ChatGPT, and two-thirds of them thought it might be self-aware; people who use AI more often were more likely to think so. The researchers say this matters because what people believe about AI could shape how we use and regulate it in the future, even if the AIs aren't actually conscious. They also found that most people don't understand consciousness the way scientists do, but their opinions could still influence how AI develops.

Summarized by Claude 3.5 Sonnet

[-] dch82@lemmy.zip 2 points 2 months ago

This will be the beginning of the singularity: the moment we give LLMs human rights

this post was submitted on 13 Jul 2024

AI Companions

513 readers

Community to discuss companionship powered by AI tools, whether platonic, romantic, or purely utilitarian. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create the companions, or about the phenomenon of AI companionship in general.

Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 1 year ago