The figures show that the estimated greenhouse gas emissions from AI use are now equivalent to more than 8% of global aviation emissions. His study used technology companies’ own reporting, and he called for stricter requirements for them to be more transparent about their climate impact. “The environmental cost of this is pretty huge in absolute terms,” he said. “At the moment society is paying for these costs, not the tech companies. The question is: is that fair? If they are reaping the benefits of this technology, why should they not be paying some of the costs?”
So that's actually not that much? After everybody was screaming that AI is boiling the world, 8% of global aviation emissions is kind of low. And you might hate AI, but it is really more useful than Katie from Sales getting skin cancer on a beach in Thailand, or that dude getting drunk on Mallorca, or whatever those billionaires are doing in their private jets.
I hate both the AI bubble (not the science behind it) and private jets and billionaires if that makes you feel better.
Also, global aviation serves an extremely useful function. Not sure that compares to fancy code autocomplete and media generation that either invalidates digital evidence in legal courts or looks like an insult to life itself.
Isn't it both? There are great use cases for global aviation (like visiting your family back home) and bad use cases (like sex tourism in a third world country). There are also great things you can do with AI and bad things.
Yes, I will concede to that. Supposedly LLMs help a lot in certain scientific research domains replacing tedious manual work.
The thing is, the prevalence of good vs bad scenarios is inverted between GenAI and aviation, I would argue. Due to a lack of legal regulation, we see insane amounts of funding being given to the most greedy, nefarious purposes, like the elimination of the working class or artists, privacy violations for the sake of control, and literal weapons out of dystopian sci-fi.
A lot of the AI you hear about in the scientific community is likely not an LLM. It might use bits of similar technology behind the scenes, but they aren't using ChatGPT to fold proteins or develop new weather models. That's the problem with lumping everything under "AI".
Unless you're talking about writing the papers afterwards, but that's back to using LLMs for paperwork, which every industry is discussing.
It's really not the same.