Carmack defends AI tools after Quake fan calls Microsoft AI demo “disgusting”
-
Considering that the AI craze is what's fueling the shortage and massive increase in prices, I really don't see gamers ever embracing AI.
-
I get it, AI has some significant downsides, but people go way overboard with criticism of it. Things aren't 100% terrible or 100% righteous. You don't have to be militant over everything.

-

Somebody didn't watch Terminator 2
-
Carmack *is* an AI sent from the future, so he's a bit biased.
-
> Considering that the AI craze is what's fueling the shortage and massive increase in prices, I really don't see gamers ever embracing AI.

The Nvidia GPUs in data centers are separate from gaming GPUs (they're even built on different nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anti-competitiveness. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction.
-
> Considering that the AI craze is what's fueling the shortage and massive increase in prices, I really don't see gamers ever embracing AI.

DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
-
> The Nvidia GPUs in data centers are separate from gaming GPUs (they're even built on different nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anti-competitiveness. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction.

The fabs still have limited wafers. The chips going to datacenters could have been consumer parts instead; they (Nvidia, Apple, AMD) are all fabricated at TSMC anyway. Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI Max 300 chip (or whatever they call it) take off. Framework lets you configure a machine with that chip with up to 128 GB of RAM, IIRC. I believe that's the main reason Apple's memory upgrades cost a ton: to keep them from being a financially viable option for local AI applications.
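Rough numbers on why 128 GB of unified memory matters for local AI. A back-of-envelope sketch: weight footprint is roughly parameter count times bytes per parameter. The model sizes and quantization levels below are just illustrative assumptions, and KV cache and activation overhead are ignored:

```python
# Back-of-envelope: do a model's weights fit in a given memory pool?
# Dense model assumed; KV cache and activation overhead ignored.

def weight_footprint_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB (1 GB taken as 1e9 bytes)."""
    return params_billions * 1e9 * (bits_per_param / 8) / 1e9

for params in (8, 70):        # hypothetical model sizes, in billions
    for bits in (16, 8, 4):   # fp16 down to 4-bit quantization
        gb = weight_footprint_gb(params, bits)
        verdict = "fits" if gb <= 128 else "does not fit"
        print(f"{params}B @ {bits}-bit: ~{gb:.0f} GB -> {verdict} in 128 GB unified memory")
```

A 70B model is ~140 GB at 16-bit (too big even for the unified pool) but ~35 GB at 4-bit: comfortable on a 128 GB machine, hopeless on a typical 16 GB discrete GPU. That's the pitch for expandable unified memory.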
-
> The Nvidia GPUs in data centers are separate from gaming GPUs (they're even built on different nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anti-competitiveness. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction.

Chip fab allocations are limited, and whatever capacity AI datacenter chips take up, desktop GPUs don't get made with. And part of what's left goes to desktop chips sold for workstation AI, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8 GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.
-
> You must not have heard the dis gamers use for this tech: "fake frames." I think they'd rather have more raster and ray tracing, especially raster in competitive games.

DLSS runs on the same hardware as raytracing. That's the entire point. It's all just tensor math. DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it. I can see it being frame generation, which has downsides and is poorly marketed. But then some people seem to be claiming that DLSS does worse than TAA or older upscaling techniques, when it clearly doesn't, so it's hard to tell. I don't think all the complainers are saying the same thing or fully understanding what they're saying.
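To make "it's all just tensor math" concrete: a learned upscaler is essentially stacks of convolutions, which is exactly the workload tensor hardware accelerates. Here's a toy sketch where a fixed bilinear kernel stands in for the learned weights; this is not DLSS itself, just the shape of the computation:

```python
import numpy as np
from scipy.signal import convolve2d

def upscale2x(img: np.ndarray) -> np.ndarray:
    """2x upscale: zero-stuff the image, then convolve with a small kernel.

    A DLSS-style upscaler swaps the fixed bilinear kernel below for stacks
    of learned ones (plus motion and depth inputs), but the core operation,
    convolution, is the same dense tensor math either way.
    """
    h, w = img.shape
    stuffed = np.zeros((2 * h, 2 * w))
    stuffed[::2, ::2] = img                          # originals at even pixels
    k = np.outer([0.5, 1.0, 0.5], [0.5, 1.0, 0.5])   # 3x3 bilinear kernel
    return convolve2d(stuffed, k, mode="same")

lowres = np.arange(16.0).reshape(4, 4)
print(upscale2x(lowres).shape)  # (8, 8): four times the output pixels
```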
-
> You must not have heard the dis gamers use for this tech: "fake frames." I think they'd rather have more raster and ray tracing, especially raster in competitive games.

People only say "fake frames" about the interpolation crap they're touting as performance… I don't think many people have issues with image upscaling at a decent level (aka the quality settings).
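The latency complaint behind "fake frames" has a concrete basis: to generate a frame *between* N and N+1, you must already have rendered N+1, so everything shown lags the simulation by roughly one frame interval. A minimal sketch, with naive blending standing in for the motion-vector-guided network that real frame generation uses:

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive stand-in for frame generation: blend two rendered frames.

    Real implementations use motion vectors and a network, but share the same
    constraint: frame_b must exist before the in-between frame can be shown.
    """
    return (1 - t) * frame_a + t * frame_b

# Presentation order: A, interp(A, B), B, interp(B, C), C ...
# B finished rendering *before* interp(A, B) hits the screen, so motion looks
# smoother while input latency gets slightly worse, not better.
rendered = [np.random.rand(2, 2) for _ in range(3)]
shown = []
for a, b in zip(rendered, rendered[1:]):
    shown += [a, interpolate(a, b)]
shown.append(rendered[-1])
print(f"{len(rendered)} rendered frames -> {len(shown)} presented frames")
```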
-
> DLSS runs on the same hardware as raytracing. That's the entire point. It's all just tensor math. DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it. I can see it being frame generation, which has downsides and is poorly marketed. But then some people seem to be claiming that DLSS does worse than TAA or older upscaling techniques, when it clearly doesn't, so it's hard to tell. I don't think all the complainers are saying the same thing or fully understanding what they're saying.

The Lemmy userbase seems to have this echo chamber effect where anything to do with AI is categorically bad, no matter what it is or how it performs. Also, mentioning AI gets your comment downvoted, further increasing the echo chamber effect.
-
> Oh man this thing is amazing, it's got some good memory: room layouts weren't changing on me whenever they left the view, unlike previous attempts I've seen with Minecraft. https://copilot.microsoft.com/wham?features=labs-wham-enabled

What are you talking about? It looks like shit, it plays like shit, and the overall experience is shit. And it isn't even clear what the goal is. There are so many better ways to incorporate AI into game development, if one wanted to, and I'm not sure we want to.

I have seen people argue that this is what the technology can do today, so imagine a couple of years from now. That seems very naive. The rate at which barriers are reached has no impact on how hard it is to break through those barriers. And as so often in life, diminishing returns are a bitch.

Microsoft bet big on this AI thing because they have been lost about what to do ever since they released things like the Windows Phone and Windows 8. They don't know how to innovate anymore, so they are going all in on AI, shitting out new gimmicks at light speed to see which gain traction.

(Please note I'm talking about the consumer and small-business side of Microsoft. Microsoft is a huge company with divisions that act almost like separate companies within it. Their Azure branch, for example, has been massively successful and innovates just fine.)
-
> DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling

Why? AI doing one good thing doesn't erase the dozens of bad ways it's utilized. I'm interested to see AI used on a larger scale in really specific ways, but the industry seems more interested in using it to take giant shortcuts and replace staff. That is going to piss people off, and it's going to *really* piss people off when it also delivers a shit product.

I'm fine with DLSS, because I want to see AI *enhance* games. I want it to make them better. So far, all I can see is that it's making them worse, with one single upside that I can just toggle off on my end if I don't like it.
-
> The Lemmy userbase seems to have this echo chamber effect where anything to do with AI is categorically bad, no matter what it is or how it performs. Also, mentioning AI gets your comment downvoted, further increasing the echo chamber effect.

I guess. It's not like downvotes mean anything here beyond... dopamine hits, I suppose?

I don't know that it's Lemmy. Both supporters and detractors don't seem to have a consistent thing they mean when they say "AI". I don't think many of them mean the same thing or agree with each other. I don't think many understand how some of the things they're railing about are put together or how they work.

I think the echo chamber element comes in when people who may realize they don't mean the same thing don't want to bring it up, because they broadly align ideologically (AI yay or AI boo; again, it happens both ways), and so the issue gets perpetuated.

Aaaand we've now described all of social media, if not all of human discourse. Cool.