Oh for goodness' sake. 400 MJ in for 3.15 MJ out is not a net energy gain. I wish just once they'd be honest about what they do; it's OK to do basic physics research without pretending you've saved the world every six months.
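If you want to check the arithmetic, here's a quick back-of-the-envelope sketch. The 3.15 MJ yield and the ~2.05 MJ of laser light actually delivered to the capsule are the widely reported figures; the ~400 MJ is roughly what the facility draws to fire a shot.

```python
# Back-of-the-envelope gain arithmetic for the NIF shot.
# Figures are the commonly reported ones, rounded; nothing official.
wall_plug_mj = 400.0    # energy drawn to charge and fire the laser system
on_target_mj = 2.05     # UV laser energy actually delivered to the capsule
fusion_yield_mj = 3.15  # reported fusion energy released

q_target = fusion_yield_mj / on_target_mj    # the headline "gain" number
q_facility = fusion_yield_mj / wall_plug_mj  # gain counting the whole facility

print(f"Gain counting only light on target: {q_target:.2f}")    # ~1.54
print(f"Gain counting the wall plug:        {q_facility:.4f}")  # ~0.0079, under 1%
```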
Where do you get those numbers from? They don't seem to match the figures in this article or the article it links to. I get that you're saying they leave out some important facts about the total energy used in the experiment, but I'm curious about exactly what's not documented here.
Wikipedia's figures for the last time they made this claim. The exact figures might be a bit different this time round, but I doubt they've found 99% efficiency gains. Livermore sends out this sort of press release pretty regularly, and it always comes down to the same creative accounting.
Basically, there's a whole load of input energy that they just don't count. Heat? Doesn't count. UV? Doesn't count. Plasma? Doesn't count. This diagram from the wiki might be instructive. There may be decent justifications for counting it like this - I don't know, I'm not a nuclear physicist. But I think the way they continue to report it to the media is simply dishonest.
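Roughly, the chain looks something like this (ballpark public figures, so treat the stage numbers as illustrative):

```python
# Illustrative energy chain for a single shot, wall plug to fusion yield.
# Stage figures are rough public ballparks; only the last step gets reported.
stages_mj = [
    ("Grid / capacitor banks", 400.0),  # what the facility actually draws
    ("Infrared laser light",     4.0),  # flashlamp-pumped glass is ~1% efficient
    ("UV light on target",      2.05),  # roughly half is lost converting IR to UV
    ("Fusion yield",            3.15),  # the number in the press release
]

prev = None
for name, energy_mj in stages_mj:
    note = "" if prev is None else f" ({energy_mj / prev:.0%} of previous stage)"
    print(f"{name:24s} {energy_mj:7.2f} MJ{note}")
    prev = energy_mj
```

Everything lost between the first and third rows is the part that "doesn't count".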
The logic is that they don't count ignition costs because they only have to be paid once. So it's producing more than it consumes, and would eventually start netting a surplus.
Except it's not, and it won't. It's just a fraction-of-a-second pop and done. There's no sustained reaction, because inertial confinement is by its nature extremely temporary, and there's no way to introduce new fuel. If they make some monster fuel pellet that outshines the laser, then sure - they can claim a net surplus. If they find some contrivance to keep a reaction going after it's started, then fantastic, well done, the day is saved. But they're not likely to do that at the NIF because, shhh! NIF is not really about generating energy.
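For a sense of scale on the "pop and done" problem, here's a rough sketch of what a real pulsed plant would have to manage (the plant size is an illustrative assumption, not anything NIF has claimed):

```python
# How fast would an inertial-confinement plant have to fire to be a power plant?
# All figures are illustrative assumptions.
yield_per_shot_mj = 3.15      # fusion energy per capsule, per the current result
plant_thermal_power_w = 1e9   # a modest 1 GW-thermal plant

shots_per_second = plant_thermal_power_w / (yield_per_shot_mj * 1e6)
print(f"Required shot rate: ~{shots_per_second:.0f} per second")  # ~317/s

# NIF manages on the order of one shot per day, and every shot needs a fresh
# precision-built capsule, because the reaction can't be fed or sustained.
```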
Except this one isn't basic physics research. It's an end run around nuclear weapons treaties to test how missiles and planes respond to H-bombs going off nearby.
It could have an energy application (maybe), but given that the targets are ludicrously expensive, the most viable power plant would resemble the 60s attempts to set off bombs in underground caverns to heat things up and essentially put a geothermal plant on top - except with a laser detonator rather than a fission one. The chances of making it economically viable or reliable are slim.
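On the "ludicrously expensive" point, some quick numbers (the conversion efficiency and electricity price are my assumptions):

```python
# What is one capsule's worth of electricity actually worth?
# Conversion efficiency and price are assumed figures, not from the article.
fusion_yield_j = 3.15e6       # per-shot yield from the current result
thermal_to_electric = 0.40    # assumed turbine efficiency
price_per_kwh_usd = 0.10      # assumed electricity price

electric_kwh = fusion_yield_j * thermal_to_electric / 3.6e6  # 1 kWh = 3.6 MJ
revenue_usd = electric_kwh * price_per_kwh_usd

print(f"Electricity per shot: {electric_kwh:.2f} kWh")  # ~0.35 kWh
print(f"Revenue per shot:     ${revenue_usd:.3f}")      # ~3.5 cents
```

So each target would have to cost pennies to break even, and precision-machined cryogenic capsules don't.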
Yeah, that was me being circumspect. Last time I called it a weapons facility, I got one of the researchers in my replies complaining that they totally intend to get round to some energy research one of these days. He didn't bother to correct any of the people in the same thread who were excited about their fusion power dreams finally coming true.
It's a shame. Blasting tritium into a mini sun with a massive frikken laser is plenty cool without having to misrepresent it so much.