Carmack defends AI tools after Quake fan calls Microsoft AI demo “disgusting”
-
The Nvidia GPUs in data centers are separate products from gaming GPUs (even built on different process nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anticompetitive behavior. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are actual relatives, first cousins once removed), and Intel unfortunately not getting traction.
-
> The Nvidia GPUs in data centers are separate products from gaming GPUs (even built on different process nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anticompetitive behavior. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are actual relatives, first cousins once removed), and Intel unfortunately not getting traction.

The fabs still have limited wafers. The chips going to datacenters could have been consumer parts instead. Besides, they (Nvidia, Apple, AMD) all fabricate at TSMC. Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI Max 300 chip (or whatever they call it) take off; Framework lets you configure a machine with that chip with 128 GB of RAM, iirc. It's the main reason I believe Apple's memory upgrades cost a ton: so Macs aren't a financially viable option for local AI applications.
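To put rough numbers on the unified-memory point: a model's weights alone need roughly parameter count times bytes per parameter, which is why 24 GB of VRAM is cramped for local AI while 128 GB of unified memory is comfortable. A minimal sketch in Python (napkin math only; it ignores KV cache, activations, and framework overhead, and the model sizes are just common examples):

```python
# Approximate memory needed just to hold model weights:
# weights_bytes ~= parameter_count * bytes_per_parameter.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_gb(params_billion: float, precision: str) -> float:
    """GB required for the weights alone, before any runtime overhead."""
    return params_billion * BYTES_PER_PARAM[precision]  # (1e9 params * bytes) / 1e9 bytes/GB

for model_b in (7, 13, 70):
    for prec in ("fp16", "int8", "int4"):
        gb = weights_gb(model_b, prec)
        print(f"{model_b:>3}B @ {prec}: ~{gb:6.1f} GB"
              f"  | fits 24 GB card: {gb <= 24}  | fits 128 GB unified: {gb <= 128}")
```

A 70B model at fp16 needs about 140 GB, which even 128 GB can't hold, but the same model quantized to 4-bit fits in roughly 35 GB: out of reach for a 24 GB gaming card, easy on a 128 GB unified-memory machine.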
-
DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
-
> The Nvidia GPUs in data centers are separate products from gaming GPUs (even built on different process nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD. …No, it's just straight-up price gouging and anticompetitive behavior. It's just Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are actual relatives, first cousins once removed), and Intel unfortunately not getting traction.

Chip fab allocations are limited, and every wafer that goes to AI datacenter chips is a wafer of desktop GPUs that doesn't get made. And some of what's left are desktop chips sold for running workstation AI models, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8 GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.
-
> You must not have heard the dis gamers use for this tech: "fake frames". I think they'd rather have more raster and ray tracing, especially raster in competitive games.

DLSS runs on the same hardware as ray tracing. That's the entire point: it's all just tensor math. DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it. I can see it being frame generation, which has downsides and is poorly marketed. But then some people seem to be claiming that DLSS does worse than TAA or older upscaling techniques, when it clearly doesn't, so it's hard to tell. I don't think all the complainers are saying the same thing or fully understand what they're saying.
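To illustrate the "it's all just tensor math" point: a learned upscaler really is just convolutions, i.e., the same class of tensor operations that tensor cores accelerate. Here is a toy ESPCN-style 2x upscaler in PyTorch. To be clear, this is not how DLSS is actually built (DLSS also consumes motion vectors and temporal history, and its network is proprietary); it's a minimal sketch of the general technique:

```python
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Toy ESPCN-style super-resolution: convolutions + sub-pixel shuffle."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            # Predict scale*scale output pixels per input pixel, per channel...
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            # ...then rearrange those channels into a higher-resolution image.
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.body(low_res)

frame = torch.rand(1, 3, 540, 960)   # a 960x540 internal render
print(ToyUpscaler()(frame).shape)    # torch.Size([1, 3, 1080, 1920])
```

Untrained, it outputs garbage pixels, but the structure is the point: upscaling is dense tensor arithmetic, the same workload the tensor silicon already exists to serve.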
-
> You must not have heard the dis gamers use for this tech: "fake frames". I think they'd rather have more raster and ray tracing, especially raster in competitive games.

People only apply "fake frames" to the interpolation crap they're touting as performance... I don't think a lot of people have issues with image upscaling at a decent level (i.e., the quality settings).
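For anyone unclear on what the "fake frames" complaint targets, here's a deliberately naive sketch of frame interpolation. Real frame generation (e.g., DLSS Frame Generation) uses motion vectors and optical flow rather than a plain blend, but it shares the two properties people object to: the inserted frame reflects no new player input, and the newest real frame must be held back, adding latency:

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive linear blend between two rendered frames (t=0.5 is the midpoint)."""
    return (1.0 - t) * frame_a + t * frame_b

frame_n  = np.zeros((1080, 1920, 3), dtype=np.float32)  # rendered frame N (black)
frame_n1 = np.ones((1080, 1920, 3), dtype=np.float32)   # rendered frame N+1 (white)

# The "generated" frame: it can only be shown AFTER frame N+1 exists,
# and no input sampled between N and N+1 can influence it.
generated = interpolate(frame_n, frame_n1)
print(generated[0, 0])  # [0.5 0.5 0.5]
```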
-
> DLSS runs on the same hardware as ray tracing. That's the entire point: it's all just tensor math. DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it. I can see it being frame generation, which has downsides and is poorly marketed. But then some people seem to be claiming that DLSS does worse than TAA or older upscaling techniques, when it clearly doesn't, so it's hard to tell. I don't think all the complainers are saying the same thing or fully understand what they're saying.

The Lemmy userbase seems to have this echo-chamber effect where anything to do with AI is categorically bad; it doesn't matter what it is or how it performs. Even mentioning AI gets your comment downvoted, further reinforcing the echo chamber.
-
> Oh man this thing is amazing, it's got some decent memory: room layouts weren't changing on me whenever they left the view, unlike previous attempts I've seen with Minecraft. https://copilot.microsoft.com/wham?features=labs-wham-enabled

What are you talking about? It looks like shit, it plays like shit, and the overall experience is shit. And it isn't even clear what the goal is. There are so many better ways to incorporate AI into game development, if one wanted to, and I'm not sure we want to.

I have seen people argue that this is what the technology can do today, so imagine it in a couple of years. However, that seems very naive. The rate at which barriers are reached has no impact on how hard it is to break through those barriers. And, as so often in life, diminishing returns are a bitch.

Microsoft bet big on this AI thing because they have been lost about what to do ever since they released things like Windows Phone and Windows 8. They don't know how to innovate anymore, so they are going all in on AI, shitting out new gimmicks at light speed to see which gain traction. (Please note I'm talking about the consumer and small-business side of Microsoft. Microsoft is a huge company with divisions that act almost like separate companies within it. Their Azure branch, for example, has been massively successful and innovates just fine.)
-
> DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling

Why? AI doing one good thing doesn't erase the dozens of bad ways it's utilized. I'm interested to see AI used on a larger scale in really specific ways, but the industry seems more interested in using it to take giant shortcuts and replace staff. That is going to piss people off, and it's going to *really* piss people off when it also delivers a shit product. I'm fine with DLSS, because I want to see AI *enhance* games. I want it to make them better. So far, all I can see is that it's making them worse, with one single upside that I can just… toggle off on my end if I don't like it.
-
> The Lemmy userbase seems to have this echo-chamber effect where anything to do with AI is categorically bad; it doesn't matter what it is or how it performs. Even mentioning AI gets your comment downvoted, further reinforcing the echo chamber.

I guess. It's not like downvotes mean anything here beyond... dopamine hits, I suppose? I don't know that it's Lemmy. Neither supporters nor detractors seem to have a consistent thing they mean when they say "AI". I don't think many of them mean the same thing or agree with each other, and I don't think many understand how the things they're railing about are put together or how they work. I think the echo-chamber element comes in when people who may realize they don't mean the same thing don't want to bring it up, because they broadly align ideologically (AI yay or AI boo; again, it happens both ways), and so the issue gets perpetuated. Aaaand we've now described all of social media, if not all of human discourse. Cool.
-
> I guess. It's not like downvotes mean anything here beyond... dopamine hits, I suppose? I don't know that it's Lemmy. Neither supporters nor detractors seem to have a consistent thing they mean when they say "AI". I don't think many of them mean the same thing or agree with each other, and I don't think many understand how the things they're railing about are put together or how they work. I think the echo-chamber element comes in when people who may realize they don't mean the same thing don't want to bring it up, because they broadly align ideologically (AI yay or AI boo; again, it happens both ways), and so the issue gets perpetuated. Aaaand we've now described all of social media, if not all of human discourse. Cool.

Yeah, to some people things are black and white, all or nothing. Even suggesting there might be nuance elicits a defensive knee-jerk reaction.
-
> Considering that the AI craze is what's fueling the shortage and massive increase in prices, I really don't see gamers ever embracing AI.

Speak for yourself. As an avid gamer, I am excitedly looking toward the future of AI in games. Good models (with context buffers *much* longer than the 0.9 s in this demo) have the potential to revolutionise the gaming industry. I really don't understand the amount of LLM/AI hate on Lemmy. It is a tool with many potential uses.
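Concretely, the context-buffer complaint is about how much recent history the model can condition on. A quick back-of-envelope sketch (the 10 fps generation rate here is an assumption for illustration, not a published spec of the demo):

```python
def context_frames(context_seconds: float, fps: float) -> int:
    """How many frames of history fit in the model's context window."""
    return int(context_seconds * fps)

print(context_frames(0.9, 10))   # ~9 frames: look away longer and the room is "forgotten"
print(context_frames(60.0, 10))  # a minute of memory would need ~600 frames of context
```

That's why longer context buffers matter so much: persistence of the game world is bounded by how far back the model can see.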
-
> DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it.

Mostly performance, from what I've seen: hardware requirements go up while the real (internal render) resolution goes down, and meanwhile image quality is stagnant. It's not completely DLSS's fault, though; I think temporal antialiasing never looks good. I don't really care to talk about DLSS; I just shut up and avoid upscaling (unless it's forced, grrrr).
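For reference, "real resolution goes down" refers to the internal render resolution before upscaling. A small sketch using the commonly cited per-axis scale factors for the DLSS presets (treat the exact numbers as approximate; they have varied across DLSS versions):

```python
# Commonly cited per-axis render-scale factors for DLSS presets.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

for name in PRESETS:
    w, h = internal_res(3840, 2160, name)  # 4K output
    print(f"{name:>17}: renders internally at {w}x{h}")
```

So at 4K output, Quality mode renders around 2560x1440 and Performance mode around 1920x1080; the network bridges the rest.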
-
> DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it.

> Mostly performance, from what I've seen: hardware requirements go up while the real (internal render) resolution goes down, and meanwhile image quality is stagnant. It's not completely DLSS's fault, though; I think temporal antialiasing never looks good. I don't really care to talk about DLSS; I just shut up and avoid upscaling (unless it's forced, grrrr).

See, this is the type of thing that weirds me out. Temporal AA doesn't look good *compared to what*? What do you mean "real resolution goes down"? Down from where? This is a very confusing statement to make. I don't know what it is that you're supposed to dislike, or what a lot of the terms you're using are supposed to mean. What is "image quality" in your view? What are you comparing against as a reference point on all these things that go up and down?
-
> Chip fab allocations are limited, and every wafer that goes to AI datacenter chips is a wafer of desktop GPUs that doesn't get made. And some of what's left are desktop chips sold for running workstation AI models, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8 GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.

Unfortunately, no one is buying a 7900 XTX for AI, and mostly not a 5090 either. The 5090 didn't even work until recently and still doesn't work with many projects; doubly so for the 7900 XTX. The fab capacity thing is an issue, but not as much as you'd think, since the process nodes are different. Again, I am trying to emphasize: a lot of this is just Nvidia being greedy as shit. They are skimping on VRAM and memory buses and gouging gamers *because they can*.
-
There’s what AI could’ve been (collaborative and awesome), and then there’s what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we’ve had forced upon us today.