hatedbad

joined 2 years ago
[–] hatedbad 1 points 1 year ago (1 children)

the hostname of a website is explicitly not encrypted when using TLS: it's sent in cleartext in the SNI field of the ClientHello, before any keys are negotiated. the Encrypted Client Hello extension fixes this, but it requires DNS over HTTPS and is still relatively new.
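
for anyone who wants to see it for themselves, here's a minimal python sketch (no network needed, example.com is just a placeholder) that builds the ClientHello a client would send and checks that the hostname shows up in it unencrypted:

```python
import ssl

hostname = "example.com"
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
client = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)

try:
    client.do_handshake()      # can't complete, there is no server here
except ssl.SSLWantReadError:
    pass                       # but the ClientHello is already in the outgoing BIO

client_hello = outgoing.read() # the first bytes a client would put on the wire
print(hostname.encode() in client_hello)  # True: the hostname is right there in the clear
```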

[–] hatedbad -2 points 1 year ago (1 children)

honestly i wouldn't trust your linux example at all. what happens with run(["echo", "&& rm -rf /"])?

[–] hatedbad 8 points 1 year ago (1 children)

just a guess, but in order for an LLM to generate or draw anything it needs source material in the form of training data. For copyrighted characters this would mean OpenAI would be willingly feeding their LLM copyrighted images, which would likely open them up to legal action.

[–] hatedbad -2 points 1 year ago (1 children)

even in your hypothetical of a file name passed in through the args, either the attacker has enough access to run said tool with whatever args they want, or they have taken over that process and can inject whatever args they want.

either attack vector requires a prior breach of the system. you’re owned either way.

the only way this actually works as an exploit is if there are poorly written services out there that take user-sourced input and blindly pass it through to CreateProcess without any sanitization, and if you're doing that then no duh, you're gonna have a bad time.

cmd.exe is always going to be invoked if you're executing a batch script; it's literally the interpreter for .bat files. the issue is, as usual, code that blindly takes user input and doesn't even bother to sanitize it before using it.
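
roughly the failure mode i mean, as a sketch (windows only; the helper.bat name and its contents are made up for illustration, and exact behavior can depend on your runtime version): shell=False doesn't save you, because cmd.exe re-parses the argument string with its own metacharacter rules.

```python
import os
import subprocess
import tempfile

# hypothetical helper script; on windows, .bat files are always interpreted by cmd.exe
bat = os.path.join(tempfile.mkdtemp(), "helper.bat")
with open(bat, "w") as f:
    f.write("@echo received: %1\n")

user_input = '" & whoami & "'   # attacker-controlled value, no sanitization

# shell=True is never used here, but cmd.exe still re-parses the command line
# it receives, so the & metacharacters make it run whoami as a second command
subprocess.run([bat, user_input])
```

which is the point: sanitize or reject cmd metacharacters before they ever reach the command line, or don't hand untrusted input to a batch script at all.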

[–] hatedbad 2 points 1 year ago (6 children)

i’m not understanding how this is supposed to be so severe. if an attacker has the ability to change the arguments to a CreateProcess call, aren’t you hosed already? they could just change it to invoke any command or batch file they wanted.

[–] hatedbad 3 points 1 year ago

computer science teaches you the theory of computation, which absolutely starts with mechanical computers.

if one didn’t study Turing’s tape machine in their compsci program then they should demand their money back.
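
for the unfamiliar, the whole idea fits in a few lines. a toy sketch (the bit-inverting machine is just an example i picked, not anything from Turing's paper):

```python
# toy Turing machine: a tape, a head, a state, and a transition table.
# this particular table inverts a binary string and then halts.
def run_tm(tape, transitions, state="start", blank="_"):
    cells = list(tape)
    head = 0
    while state != "halt":
        symbol = cells[head] if 0 <= head < len(cells) else blank
        write, move, state = transitions[(state, symbol)]
        if 0 <= head < len(cells):
            cells[head] = write
        else:
            cells.append(write)
        head += 1 if move == "R" else -1
    return "".join(cells).strip(blank)

# (state, symbol read) -> (symbol to write, head move, next state)
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_tm("10110", invert))  # prints 01001
```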

[–] hatedbad 8 points 1 year ago (1 children)

open source software getting backdoored by nefarious committers is not an indictment of closed source software in any way. this was discovered by a microsoft employee because of its effect on cpu usage and the faults it introduced in valgrind, neither of which required access to the source to discover.

the only thing this proves is that you should never fully trust any external dependencies.

[–] hatedbad 4 points 1 year ago

git bisect is just this guy jumping through portals to alternate universes where the bug either exists or doesn’t

[–] hatedbad 4 points 1 year ago (1 children)

“every situation you're involved in is 10X more escalated than it needs to be because a gun is involved at all”

absolutely delusional. please explain how a vulnerable individual concealed carrying is escalating anything.

[–] hatedbad 17 points 1 year ago (5 children)

jfc what an absolute hellhole of a police state you’ve dreamt up. so many of your harebrained ideas amount to “cops should have unlimited access to your private life”. how exactly do you think this would play out given the US and its systemic racism and classism?

[–] hatedbad 3 points 1 year ago (6 children)

yeah silly me for supporting artists with my money but also downloading drm-free copies of things so I can actually exercise a semblance of ownership. but sure, keelhaul me so you can keep your sense of smug superiority.

[–] hatedbad 1 points 1 year ago (10 children)

AI is a tool that is fundamentally based on the concept of theft and plagiarism. The LLM training data comes from artists and creators who did not consent to their work being plagiarized by a hallucinating machine.
