submitted 1 year ago by ooli@lemmy.world to c/technology@lemmy.world
[-] CharlestonChewbacca@lemmy.world 5 points 1 year ago

Have you tried 3.5 or 4?

I haven't had many issues in 4. Occasionally it does what you're saying and I just say "bro, that doesn't exist" and it's like "oh, my bad, here you go." And gives me something that works.

[-] TurnItOff_OnAgain@lemmy.world 1 point 1 year ago

I don't remember what version. I just gave up trying.

[-] CharlestonChewbacca@lemmy.world 0 points 1 year ago

Well, don't expect it to just give magical results without learning prompt engineering and understanding the tools you're working with.

[-] TurnItOff_OnAgain@lemmy.world 2 points 1 year ago

Set-MailboxAddressBook doesn't exist.

Set-ADAttribute doesn't exist.

Asking for a simple command and expecting to receive something that actually exists is magical?
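(For the record, the working equivalents of both tasks do exist, just under different names. A rough sketch, assuming the Exchange Online and ActiveDirectory PowerShell modules are loaded, with `jdoe` and the attribute value as placeholder examples:)

```powershell
# Hide a mailbox from address lists -- the real Exchange cmdlet is
# Set-Mailbox, not the invented Set-MailboxAddressBook
Set-Mailbox -Identity "jdoe" -HiddenFromAddressListsEnabled $true

# Change an AD user attribute -- the real cmdlet is Set-ADUser with
# -Replace, not the invented Set-ADAttribute
Set-ADUser -Identity "jdoe" -Replace @{title = "Engineer"}
```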

[-] radau@lemmy.dbzer0.com 1 point 1 year ago* (last edited 1 year ago)

I used GPT-4 for Terraform and it was kind of all over the place in terms of fully deprecated methods. It felt like a nice jumping-off point, but honestly it probably would've been less work to just write it up from the docs in the first place.

I can definitely see how it could help someone fumble through it and come up with something working without knowing what to look for though.

Was also having weird issues with it truncating outputs and needing to split it, but even telling it to split would cause it to kind of stall.

Just yesterday I had 4 make up a Jinja filter that didn't exist. I told it that and it returned something new that also didn't work but had the same basic form. 4 sucks now for anything that I'd like to be accurate.

[-] Spellbind8558@lemmy.world 1 point 1 year ago

Both models have definitely decreased in quality over time.

[-] CharlestonChewbacca@lemmy.world 0 points 1 year ago

What kind of prompts are you giving?

I find results can be improved quite easily with better prompt engineering.

It makes things up whole cloth and it's the user's fault for not prompting it correctly? Come on.

[-] CharlestonChewbacca@lemmy.world 0 points 1 year ago

It's not a person. It's a tool.

this post was submitted on 30 Aug 2023
227 points (96.3% liked)
