I'm hoping for the AI bubble to pop. I want to see Nvidia and X and Tesla and Microslop and OpenAI all crash and burn.
But there's another part of me that knows there's a good historical precedent for what happens in these situations. My brain is zapped, but Larry Fink was the architect of the previous market crash that led to BlackRock profiteering wildly and becoming the immense beast it is today.
I know that the big AI companies and AI-affiliated companies are what's keeping the US economy afloat at the moment, and I know the US will play the "too big to fail" card to do yet another immense transfer of wealth from the proles directly to these companies in the form of bailouts.
I don't think AI-only companies are gonna survive this; I think OpenAI might be the first domino to fall. But Google commands a vast amount of diversified income, unlike OpenAI. Almost every other big player in AI might get swept under, but with big bailouts and the power Google commands, it feels like they'll be poised to gobble up all the smaller fish and expand their monopoly, integrating themselves into every level of government administration as the government cuts back on expenditures to weather the fallout from the bubble popping plus the immense cost of bailouts. I can see it being a Faustian bargain: money goes into Google (either directly or indirectly), Google vacuums up all the business and especially the data centers, then Google offers the insolvent US government the "solution" of selling it terminals for every citizen-facing role that run on AI and embedding AI in all sorts of bureaucratic processes that occur mostly behind the scenes. For a "small" fee, of course. (Or maybe Palantir or some scumfuck company like Larry Ellison's swoops in and profiteers from the fire sale as the market burns.)
(I'd explain all the fuckery with Larry Fink and BlackRock and how I anticipate the parallels to play out this time around but there's a lot of threads and I'd have to have the brain power available to brush up on the sources and weave the narrative together but that's not gonna happen for me today. Has TrueAnon covered BlackRock yet?)
Strange to think that my optimistic take is the scenario where the AI bubble pops and causes all sorts of economic catastrophe for working-class people around the world while the US starts to crumble and descend into fascism and civil war, and that my doomer take is that AI collapses but doesn't take the market with it; instead AI gets monopolized, bailed out, and forcibly integrated into all levels of society while the US descends into fascism and civil war.
I don't think that AI monopolization is a big concern since it already is very monopolized. It's not like housing where you have families defaulting on their homes.
The way the bubble will burst (or rather deflate) is that AI won't be pushed absolutely everywhere (free-to-use cloud AI features in Meta/Google/Windows), some products that are currently subsidised will get massively more expensive (pay-to-use AI products like AI coding offered directly by the large AI companies), and some products will get discontinued or bought out (products by AI companies that rely on the APIs of larger companies).
That's a fair take. I'm less worried about AI monopolization per se than I am about Alphabet swooping in and expanding their monopoly, gobbling up everything and sustaining AI demand by more or less forcibly and permanently embedding itself into government functions as a way of deepening neoliberalization to a lower level of hell. (Along with the wealth transfer stuff.) Maybe this is a tinfoil hat take, but I feel like Elon Musk's real interest in his DOGE fiasco wasn't just to protect his own business interests with Tesla etc. If he weren't so apt to step on his own dick, perhaps he was planning to strip back the functions of government so much that it would have become necessary to implement Grok, or a sanitized, government-friendly version of Grok, into government functions so he could corner a market there.
I think nearly every tech-minded person is waiting with bated breath for the AI bubble to pop and bring about a return to rationality. But in a bad-case scenario, it feels like the bubble will pop and the aftermath will not be a return to normal but a readjustment that forces AI on everyone in a completely different, unavoidable, mandatory way, while still trashing the market and bringing about bailouts that only benefit the biggest players.
Sure, that's possible, but it doesn't really matter how monopolized the industry is. It's also not the same thing as the bubble bursting.
The realistic nightmare scenario is stuff like all (public school) teachers getting replaced by AI.
Many public schools and municipalities are already under intense financial strain and are implementing harsh austerity measures just to keep afloat. I hate how likely even a partial fulfillment of this scenario is.
I mean, in some respects we are already seeing the early phases of this - textbooks are being partly or wholly generated by AI (there are examples of this online), marking and feedback is being done by AI more and more, lectures and reading are being summarized by AI, obviously assignments and lectures and lesson plans are being generated by AI too.
We are rapidly approaching the point predicted for AI more generally, but it's happening in education right now: we have a recursive problem where AI is trained on education materials; it creates the textbooks, the lecture materials, and the lesson plans; it assesses work and shapes what is deemed high-quality and poor-quality work; and then it is further trained on material that is increasingly AI-produced, which accelerates the recursive effect. Education was already crumbling in the US, but this is gonna cause its collapse.
Imagine students learning from AI materials then using AI to produce work to meet the standards of AI and all of this feeds into AI models that are used to teach students. It's nightmarish.
A photocopy of a photocopy of a photocopy of a photocopy of a photocopy of a photocopy except it's the entire US education system
That's so sad.
I'm gonna listen to Nine Inch Nails' song Copy of a while I read Baudrillard's Simulacra and Simulation and I have a Philip K Dick audiobook playing in the background.
The Luddites did nothing wrong.
I think we are disagreeing on words and agreeing on the points here.
What I'm trying to say is that I'm not concerned about the monopoly, not really. Whether it's a bubble pop or an AI profits crash or the whole economy going belly-up, my real concern isn't the economic implications directly so much as that a lot of people are banking on the event being the moment where we revert to an AI-free world, or at least to the early days of AI when it was mostly used as a resource on tap rather than being embedded in everything. I see a good chance we're never gonna get the toothpaste back in that tube.
My worry is that when the instability of the current situation becomes so untenable that it has its inevitable rupture, however you think that rupture will take shape, it won't be the moment society is liberated from under the yoke of AI. Instead it's going to be a moment of fire sales and consolidation where AI concentrates into the hands of one or two big players, and the agenda to further embed AI and foster more dependence in the pursuit of profit will accelerate, as the era of the AI wild west gives way to a more "responsible" and conservative type of AI business model (i.e. Google's model instead of Grok's profiteering from spouting edgelord Nazi rhetoric and producing nonconsensual pornography of people) whose advance is effectively inexorable.
The question in my mind is this: "What if an AI crash doesn't lead to AI being wound back but instead advances the interests and consolidates the power of a big player, so that the level of AI integration in society we see today becomes the new normal, or even the low point that people look back on with nostalgia?"