We also have a standard socket and a high-power socket.
Except our normal outlets provide 230 V 16 A (3.5 kW peak, 3 kW sustained), and the typical high-power outlets provide 400 V three-phase at 16 A (11 kW) or 30 A (21 kW).
Which is why typical electric stoves here use 11 kW and typical instantaneous water heaters use 21 kW.
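For reference, the standard three-phase math behind those figures (a minimal sketch, assuming unity power factor; 16 A and 30 A are the per-phase ratings implied above):

```python
import math

# Three-phase power: P = sqrt(3) * V_line * I_line (unity power factor assumed).
V_LINE = 400.0  # volts, line-to-line

for amps in (16.0, 30.0):
    power_kw = math.sqrt(3) * V_LINE * amps / 1000.0
    print(f"400 V three-phase at {amps:.0f} A -> {power_kw:.1f} kW")
```

That works out to roughly 11.1 kW and 20.8 kW, matching the stove and water-heater ratings above.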
Though probably the most noticeable advantage is in electric car charging.
Yeah, in Sweden I charge our plug-in hybrid off 240 V; it's pretty quick, and you can use any outlet.
The giant round connectors with all the holes are weird, BTW; still trying to sort that out for faster charging.
I don't think we should run 100+ volts everywhere; we should standardize on low-voltage DC in most places (basically USB-C or so), with 100 V only in kitchens and places you really need it, because higher voltage is more dangerous and can cause fires more easily.
That's a common misconception. It's the amps that cause fires, not the voltage.
The 5090 draws 600 W. At 12 V that's 50 A, but at 120 V it'd be only 5 A, and at 240 V only 2.5 A.
50 A melts cables and burns your PC down; 2.5 A won't. The only added risk of higher voltages is that they can jump across small air gaps much more easily.
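A quick sanity check of those numbers (a minimal sketch; the 600 W draw is the figure from the comment above):

```python
# Current drawn by a fixed 600 W load at different supply voltages (I = P / V).
POWER_W = 600.0  # e.g. an RTX 5090 at full load, per the comment above

for volts in (12.0, 120.0, 240.0):
    amps = POWER_W / volts
    print(f"{volts:5.0f} V -> {amps:4.1f} A")
```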
No it's not; I'm an EE.
P = I^2 * R, so the power dissipated in the wiring grows with the square of the current, while for a given load the current only falls linearly as the voltage rises.
This means high current causes far more heat dissipation in the wire, which has risks, potentially fire if you really go too far; this is why breakers trip.
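To put rough numbers on that (a minimal sketch; the 5 mOhm cable resistance is an assumed illustrative value, not from the thread):

```python
# Heat dissipated in a cable run of fixed resistance at different currents (P = I^2 * R).
CABLE_RESISTANCE_OHM = 0.005  # assumed ~5 mOhm run, purely illustrative

for amps in (2.5, 5.0, 50.0):
    loss_w = amps ** 2 * CABLE_RESISTANCE_OHM
    print(f"{amps:4.1f} A -> {loss_w:6.2f} W dissipated in the cable")
```

Same delivered power in each case, but the 50 A scenario heats the cable 400 times as much as the 2.5 A one.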
But what really causes fires (again, outside of crazy overcurrent) is arcing: from bad connections or bad insulation, or from an inductive load being disconnected, where the current in the coils tries to stay constant and produces massive voltage spikes.
The recent Technology Connections video cited a lot of statistics on this topic, and household fires, at least, are primarily caused by overcurrent, not by arcing.
You probably know more than me (I only studied compsci with EE as a minor), but from my personal experience, I've seen many cases where overcurrent caused damage, burns, or fire, yet I can't remember a single case where arcing caused actual damage.
Even in cheap chinesium power strips, the primary cause of fires is overcurrent due to AWG 22 copper-clad iron wire, not arcing. (Though the switches usually weld themselves together after a few dozen uses.)
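A rough sketch of why the undersized wire is the problem (the per-metre resistances are standard figures for pure copper; copper-clad iron is worse still, so the AWG 22 number here is a lower bound):

```python
# Per-metre heating of a power-strip conductor (P = I^2 * R).
AWG22_R_PER_M = 0.053   # ohm/m, pure-copper AWG 22; copper-clad iron is higher
AWG14_R_PER_M = 0.0083  # ohm/m, copper AWG 14, a properly sized strip, for contrast

for amps in (5.0, 10.0, 16.0):
    print(f"{amps:4.1f} A: AWG 22 -> {amps**2 * AWG22_R_PER_M:5.2f} W/m, "
          f"AWG 14 -> {amps**2 * AWG14_R_PER_M:5.2f} W/m")
```

At a nominal 16 A load, the undersized wire dissipates over 13 W per metre inside a plastic housing, versus about 2 W for the correct gauge.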
Arcing causes more fires now because overcurrent caused most of the fires until we tightened standards and rolled out dual-mode circuit breakers.
Now fires are caused by loose connections arcing, and by damaged wires arcing to flammable material.
Breakers are specifically designed to handle a sustained current, but arcing is dangerous because it tends to cascade: light arcing damages contacts, leading to more arcing, in a vicious cycle.
The real danger of arcing is that it can happen out of view and start fires that aren't caught until everything burns down.