Here is a good overview of trends in data centers.
-
Another, overlapping perspective: there is an AI bubble. The "High Yield" video makes these data centers sound inevitable and unstoppable. But what if all of this infrastructure isn't really producing any value? Or enough value?
OpenAI has only a handful of customers, and most of them are using it to do their homework for them. We discussed "Vibe Coding" last week, and no one can agree on whether it saves any time.
Ed Zitron makes the case for the bubble on MR.
@futurebird OpenAI only has a handful of customers? Loooots of companies building them in as a core part of their stack these days (which is itself terrifying on several levels).
-
Compared to the level of investment there just aren't as many people diving in. And some who dive in will find out it's not doing much and pull out.
-
@futurebird That’s fair. I think there is a bubble. But OpenAI calls are getting tossed into all sorts of software these days. Not to mention the large swath of startups that are basically just wrappers on top of OpenAI
-
What happens when they raise the price? Are they indispensable? Or are all these things "chat bots" that try to talk to you while you use a website, and could easily be scrapped?
-
@futurebird They’re mostly *not* chatbots. They’re mostly stuff like “look through this data and pull out these parts and format it as a json file”
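The kind of integration being described — an API call that pulls fields out of raw text and returns JSON — might look something like this minimal Python sketch. The payload shape follows OpenAI's chat completions REST API; the model name, prompt, and output schema are illustrative assumptions, and the sketch only builds the request rather than sending it.

```python
import json

# A sketch of the "pull out these parts and format it as a json file" pattern.
# The payload shape follows OpenAI's chat completions REST API; the model
# name, prompt, and output schema here are illustrative assumptions.
def build_extraction_request(raw_text: str) -> dict:
    """Build a request payload asking the model for structured JSON output."""
    return {
        "model": "gpt-4o-mini",  # assumed model name
        "response_format": {"type": "json_object"},  # request strict JSON back
        "messages": [
            {
                "role": "system",
                "content": (
                    "Extract the invoice number and total from the user's text. "
                    'Reply only with JSON: {"invoice": string, "total": number}.'
                ),
            },
            {"role": "user", "content": raw_text},
        ],
    }

# Build (but don't send) a payload; POSTing it to the chat completions
# endpoint with an API key would perform the actual extraction call.
payload = build_extraction_request("Invoice #123, total due $45.00")
print(json.dumps(payload, indent=2))
```

Once software like this is wired into a pipeline, the vendor dependency the thread worries about is baked in: every run of the feature is a metered API call.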
-
@futurebird The expectation among c levels is that prices will fall, not rise.
-
@futurebird They’re mostly *not* chatbots. They’re mostly stuff like “look through this data and pull out these parts and format it as a json file”
Is that what’s making all the terrible websites that come up in search?
-
@futurebird The expectation among c levels is that prices will fall, not rise.
I am but a simple school teacher, so perhaps I'm missing something these business-folk see that I do not. However: why do they think the price would go down? I thought this stage of the AI business model was like the early days of Uber and Amazon, wherein they corner the market, addict the consumer, and then the hammer comes down and big profits emerge (or not).
-
@futurebird Moore’s law adjacent reasons. They expect compute to become cheaper and for models to become more efficient. In the 15 years I’ve been doing this stuff, so far this has held. Models that were too big to fit on one gpu now train easily on a normal laptop that is sipping power. The LLMs of today are inconceivably large compared to models from not that many years ago.
-
@dx @futurebird The only thing they'll be able to retain in the new data centers they are building to run this stuff is the floor space (if the hardware becomes obsolete), so at the very least, most of the current investment has to be paid off, in addition to the new Moore's-lawed hardware, before they can reach profitability. Assuming that it is indeed possible to scale the processor capabilities without linearly increasing costs.
-
@toerror @futurebird To be clear, I do think a bubble is coming. I think the data centre investment is *insane*. But it's important to understand that these models are not merely being used for end-user applications like chat and code completion. They are absolutely getting integrated into major features. It makes my skin crawl that folks are wiring a 3rd-party API into some of the things I've seen, but it is happening. Many, many companies will have a reckoning if costs balloon or if OpenAI implodes.