this post was submitted on 18 Sep 2023
20 points (100.0% liked)

technology

23861 readers
179 users here now

On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

Rules:

founded 5 years ago
MODERATORS
 

they fixed it :(

top 12 comments
sorted by: hot top controversial new old
[–] stardust@hexbear.net 12 points 2 years ago (1 children)

Activating Windows without a key is already super fucking easy

[–] Mehrunes_Laser@hexbear.net 1 points 2 years ago

Yeah I used a powershell script to activate mine.

[–] FunkyStuff@hexbear.net 9 points 2 years ago (1 children)

data-laughing I'm so sad that they fixed it. It can still generate Borderlands 2 Shift codes, though.

[–] roux@lemmygrad.ml 8 points 2 years ago (1 children)

Say what??? Like do the shift codes work?

[–] FunkyStuff@hexbear.net 6 points 2 years ago (1 children)

Oh I don't know but it did generate them. If you try to get it to do Windows keys it just tells you it can't.

[–] DayOfDoom@hexbear.net 2 points 2 years ago (1 children)

Is it generating or just reposting codes already on the internet somewhere?

[–] FunkyStuff@hexbear.net 2 points 2 years ago

Generating. In theory, it has some way to find patterns in how legitimate Windows keys are generated, and it's reproducing those patterns even without knowing what the algorithm that produced them is actually doing. That means it takes a bunch of tries before you get one right because they are only educated guesses. Pretty cool that it can do that though and I wonder what else it can try to do.

[–] kristina@hexbear.net 5 points 2 years ago (1 children)

is it confirmed that they fixed this lmao

[–] FuckyWucky@hexbear.net 4 points 2 years ago (1 children)

yea i tried asking chatgpt sadness

[–] kristina@hexbear.net 3 points 2 years ago (2 children)

how many iterations did you try, or did it not work at all

[–] FuckyWucky@hexbear.net 4 points 2 years ago

it doesnt reply at all. says it wont do anything illegal.

[–] FunkyStuff@hexbear.net 4 points 2 years ago

Can confirm they patched it and I tried a few different methods. That being said, with the way all this stuff works I can see 2 cases:

  1. They just hard coded it to shut down whenever the user prompts it with some combination of "Windows" and "Keys", in which case Chat GPT can still be exploited in similar ways for a ton of other fun piracy uses.
  2. They made it "intelligently" detect when the user was trying to trick it, which is what it (nominally) has always tried to do, so there's still a billion ways to get it to give away sensitive info because AI don't real.