this post was submitted on 06 Mar 2026
338 points (98.0% liked)
Technology
you are viewing a single comment's thread
I always love it when folks who don't actually know what they're talking about comment like they do...
It's not just the browser. This example happens to be the browser, but random bit flips affect the stability of your entire system.
Say there's a jump instruction whose target address is read from RAM; a bit flip occurs, and a condition like "if friend greet else kill" runs as "if friend rape else kill". Absolutely anything can happen that wasn't determined by the program's design, flaws and errors included. A digital computer is a deterministic system (barring intentional non-deterministic elements like analog-based RNGs); bit flips are non-deterministic, random changes of its state.
In concrete terms: things break without reason. A perfect program with no bugs, if such a thing exists, will still do random wrong things when bit flips occur. Clear enough?
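The point above can be sketched in a few lines of Python (a toy illustration with made-up names, not code from any real program): flipping a single bit of otherwise-correct state sends a bug-free branch down the wrong path.

```python
# Sketch (hypothetical example): one flipped bit in memory turns one
# deterministic outcome into another, with no bug anywhere in the code.

def decide(flags: int) -> str:
    """Branch on bit 0: 'greet' if set, 'kill' otherwise (toy condition)."""
    return "greet" if flags & 1 else "kill"

def flip_bit(value: int, bit: int) -> int:
    """Simulate a cosmic-ray upset: XOR flips exactly one bit."""
    return value ^ (1 << bit)

flags = 0b0001                   # program state as the program wrote it
corrupted = flip_bit(flags, 0)   # the same state after one upset in RAM

print(decide(flags))      # greet  (intended behavior)
print(decide(corrupted))  # kill   (same "perfect" code, wrong result)
```

The code itself never changes; only the data under it does, which is exactly why no amount of debugging the program can prevent this failure mode.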
In practice, perfect programs do exist; they just have to be small enough for formal verification.
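For a taste of why "small enough" matters: when the whole input space is tiny, you can even verify a property by brute force, which is a crude stand-in for real formal methods but makes the scaling problem obvious. A minimal sketch (toy function, not from the thread):

```python
# Sketch: exhaustively check a spec over every possible input of a tiny
# 8-bit function. Real formal verification proves this symbolically;
# brute force only works because the state space is 2^16 pairs.

def saturating_add(a: int, b: int) -> int:
    """Toy 8-bit saturating add: the result is clamped to 255."""
    return min(a + b, 255)

# Spec: result stays in range, and never drops below either operand.
assert all(
    0 <= saturating_add(a, b) <= 255 and saturating_add(a, b) >= max(a, b)
    for a in range(256) for b in range(256)
), "property violated"
print("verified over all 65536 inputs")
```

Double the input width a few times and exhaustive checking becomes impossible, which is why verified software (kernels, compilers) stays small and takes years of proof work.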
Blah blah blah, made up use cases.
I don't want to use the M-word or the T-word, but those "made up use cases" constitute every computer program in existence.
Sorry, let me correct that. Use cases that normal people give two fucks about, based on reality.
Each and every one of them, moron. Everything you do on a computer every moment.
Guy, you can't even separate your mental tracks on what people care about vs. what is possible.
Point to me where you live in the spectrum.
People care about what they care about breaking in their hands and exploding in their faces.
ASD and BAD, probably also ADHD.
People also love to assume that what they keep on their hard drives and memory sticks is somehow preserved over time. Bit flips and other physical effects on your imagined perfect machine are why it's not, and why it's as good as, or worse than, what's written on paper. A cat decides to piss on your grandpa's diary and there's no more diary. Or humidity slowly eats it. With computers it's even faster.
Oh, now we're launching into hard-drive degradation because Firefox crashing isn't actually enough to send normies into a rage?
Nobody but nerds optimizing for use cases for the .01%.