Carmack defends AI tools after Quake fan calls Microsoft AI demo “disgusting”
-
> I guess. It's not like downvotes mean anything here beyond... dopamine hits, I suppose? I don't know that it's Lemmy. Both supporters and detractors don't seem to have a consistent thing they mean when they say "AI". I don't think many of them mean the same thing or agree with each other. I don't think many understand how some of the things they're railing about are put together or how they work. I think the echo chamber element comes in when people who may realize they don't mean the same thing don't want to bring it up because they broadly align ideologically (AI yay or AI boo, again, it happens both ways), and so the issue gets perpetuated. Aaaand we've now described all of social media, if not all of human discourse. Cool.

Yeah, to people things are black and white and all or nothing. Even suggesting there might be nuance to things elicits a defensive knee-jerk reaction.
-
> Considering that the AI craze is what's fueling the shortage and massive increase in prices, I really don't see gamers ever embracing AI.

Speak for yourself. As an avid gamer I am excitedly looking towards the future of AI in games. Good models (with context buffers *much* longer than the 0.9s in this demo) have the potential to revolutionise the gaming industry. I really don't understand the amount of LLM/AI hate on Lemmy. It is a tool with many potential uses.
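For scale, a hedged back-of-envelope (the frame rates below are illustrative assumptions, not figures from the demo): a ~0.9 s context window translates to remarkably few frames of "memory".

```python
# Hypothetical back-of-envelope: how many frames fit in a world model's
# context window at a given frame rate. The 0.9 s figure is from the
# comment above; the frame rates are illustrative assumptions.
def frames_of_memory(context_seconds: float, fps: float) -> int:
    """Number of past frames the model can condition on."""
    return int(context_seconds * fps)

for fps in (10, 30, 60):
    print(f"{fps:>2} fps -> {frames_of_memory(0.9, fps)} frames of context")
# At 10 fps that's 9 frames: walk behind a wall for a second and the
# model has no record the wall was ever there.
```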
-
What are you talking about? It looks like shit, it plays like shit and the overall experience is shit. And it isn't even clear what the goal is. There are so many better ways to incorporate AI into game development, if one wanted to, and I'm not sure we want to.

I have seen people argue this is what the technology can do today, imagine in a couple of years. However, that seems very naive. The rate at which barriers are reached has no impact on how hard it is to break through those barriers. And as often in life, diminishing returns are a bitch.

Microsoft bet big on this AI thing because they have been lost on what to do ever since they released things like the Windows Phone and Windows 8. They don't know how to innovate anymore, so they are going all in on AI, shitting out new gimmicks at light speed to see which gain traction. (Please note I'm talking about the consumer and small-business side of Microsoft. Microsoft is a huge company with divisions that act almost like separate companies within it. Their Azure branch, for example, has been massively successful and does innovate just fine.)
-
> DLSS is one of those things where I'm not even sure what people are complaining about when they complain about it.

Mostly performance, from what I've seen. That hardware requirements go up, while the real resolution goes down, meanwhile the image quality is stagnant. It's not completely DLSS's fault, though; I think temporal antialiasing never looks good. I don't really care to talk about DLSS though, I just shut up and avoid upscaling (unless it's forced, grrrr).
-
> Mostly performance, from what I've seen. That hardware requirements go up, while the real resolution goes down, meanwhile the image quality is stagnant.

See, this is the type of thing that weirds me out. Temporal AA doesn't look good *compared to what*? What do you mean "real resolution goes down"? Down from where? This is a very confusing statement to make. I don't know what it is that you're supposed to dislike or what a lot of the terms you're using are supposed to mean. What is "image quality" in your view? What are you comparing as a reference point on all these things that go up and down?
-
> Chip fabbing allocations are limited, and whatever capacity the AI datacenter chips take up, desktop GPUs don't get made with. And of what's left, the desktop chips get sold for running workstation AI models, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.

Unfortunately, no one is buying a 7900 XTX for AI, and mostly not a 5090 either. The 5090 didn't even work until recently and still doesn't work with many projects, doubly so for the 7900 XTX. The fab capacity thing is an issue, but not as much as you'd think, since the process nodes are different. Again, I am trying to emphasize: a lot of this is just Nvidia being greedy as shit. They are skimping on VRAM/buses and gouging gamers *because they can*.
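For anyone wondering why the extra memory matters at all, a hedged back-of-envelope (standard bytes-per-parameter figures, nothing specific to these cards): model weights alone blow past an 8GB card very quickly.

```python
# Hedged back-of-envelope, not a benchmark: approximate VRAM needed just to
# hold a model's weights, which is why 24GB cards get pulled into local-AI
# duty while 8GB gaming cards can't play at all.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_vram_gb(params_billions: float, precision: str) -> float:
    """GB for weights alone; KV cache and activations need headroom on top."""
    return params_billions * 1e9 * BYTES_PER_PARAM[precision] / 1024**3

for p in ("fp16", "int8", "int4"):
    print(f"7B model @ {p}: ~{weights_vram_gb(7, p):.1f} GB")
# fp16: ~13.0 GB -- already over an 8GB card before any context is cached.
```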
-
There’s what AI could’ve been (collaborative and awesome), and then there’s what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we’ve had forced upon us today.
-
> At least the DLSS I've seen looks terrible. I've tried it in a bunch of games, and it produces visible artifacts that are worse than TAA. Cyberpunk 2077 is a great example. Newer versions are supposedly better, but I haven't seen them yet.

You haven't seen it in a while, then, because that was definitely not true of the previous version and it's absolutely, objectively not true of the new transformer model version. But honestly, other than, what, the very first iteration? It hasn't been true in a while. TAA and DLSS tended to artifact in different ways (DLSS struggles with particles and fast movement, TAA struggles with most AA challenge areas like sub-pixel detail and thin lines). Honestly, for real-time use at 4K I don't know of a more consistent, cleaner AA solution than DLSS. And I hesitate to call 4K DLAA a real-time solution, but that's definitely the best option we have in game engines at this point. I don't even like Nvidia as a company and I hate that DLSS is a proprietary feature, but you can't really argue with results.
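For readers unfamiliar with why temporal techniques ghost in the first place, here's a minimal, purely illustrative sketch of the history blend at the heart of TAA-family methods. It is not how DLSS or any shipping TAA is implemented; it omits jitter, reprojection, and history clamping entirely, which is exactly why it ghosts.

```python
import numpy as np

ALPHA = 0.1  # weight of the new frame; history keeps the remaining 0.9

def accumulate(history, frame):
    """Exponential moving average over frames (the 'temporal' part of TAA)."""
    return ALPHA * frame + (1.0 - ALPHA) * history

# A bright object sits at pixel 2 for a while, then jumps to pixel 5.
width = 8
history = np.zeros(width)
for _ in range(30):                       # object stationary at x=2
    frame = np.zeros(width); frame[2] = 1.0
    history = accumulate(history, frame)
for _ in range(3):                        # object has moved to x=5
    frame = np.zeros(width); frame[5] = 1.0
    history = accumulate(history, frame)

print(np.round(history, 2))
# Pixel 2 still holds ~0.7 of the old object: that trail is ghosting.
# Motion vectors and history clamping exist to suppress exactly this.
```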
-
> Why? AI doing one good thing doesn't erase the dozens of bad ways it's utilized. I'm interested to see AI used on a larger scale in really specific ways, but the industry seems more interested in using it to take giant shortcuts and replace staff. That is going to piss people off, and it's going to *really* piss people off when it also delivers a shit product. I'm fine with DLSS, because I want to see AI *enhance* games. I want it to make them better. So far, all I can see is that it's making them worse with one single upside that I can just... toggle off on my end if I don't like it.

OK, but... you know AI isn't a *person*, right? You seem to be mad at math. Which is not rare, but it *is* weird.
-
Well, if nothing else I've made my case. I mean, I'm not gonna go build a Digital Foundry comparison video for you, but this type of argument is definitely what I'm talking about when I say I don't understand what people just claiming this out of the blue even think they're saying.
-
DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
-
> I don't think it's particularly hard to understand what I'm saying - Cyberpunk 2077 with DLSS looks worse than Cyberpunk 2077 with TAA for me. You can disagree, but please don't act like I'm saying something incredibly complex and un-understandable.

Well, for one thing, you're saying you haven't seen the latest version, so I already don't know what you were looking at or when. I don't know what settings you're using or what resolutions you were targeting. Do I think there are any settings or resolutions in Cyberpunk where TAA looks noticeably better than any DLSS after 2? Absolutely, definitely, 100% no. But hey, it'd help to get an idea of what type of images we're comparing.

And it'd help to know what sorts of things you are seeing that make you notice a difference. What is it about that DLSS image that looks worse? I mean, let's be clear, these will mostly be the same base picture, just upscaled to whatever your monitor resolution is in slightly different ways. So what is it from each of those that is leading to this assessment? What are the problems we're talking about here? Because the last time I fired up Cyberpunk (and that was a couple of weeks ago) DLSS performed noticeably faster than TAA and looked effectively indistinguishable from native rendering at 4K.

So yeah, you're saying some weird stuff right here, and from the explanations you're (not) giving, you may as well have run in from the street drenched in red fluid claiming the aliens stole your dog, and I would have about as much connection to my lived reality to parse what's going on. I'm less concerned you're going to randomly stab me over your antialiasing preferences, sure, but in terms of using words to exchange recognizable information I'm kinda lost here.
-
> DLSS (AI upscaling) alone should see gamers embracing the tech.

They do. You'll see a lot of hate for DLSS on social media, but if you go to the forums of any newly-released game that doesn't have DLSS, you'll find at least one post *demanding* that they implement it. If it's on by default, most people don't ever touch that setting and they're fine with it.
-
> Temporal AA doesn't look good *compared to what*?

Compared to some older AA tech. TAA is awful in motion in games. (edit: by default; if there's a config file it can be made better)

> What do you mean "real resolution goes down"? Down from where?

I mean internal resolution. Playing at 1080p with DLSS means the game doesn't render at your specified resolution, but at a fraction of it. Native (*for now*) is the best.

> What is "image quality" in your view?

Mostly general clarity and stuff like particle effects, textures, stuff like that, I think. You can ask those people directly, you know. I'm just the messenger; I barely play modern games.

> I don't know ... a lot of the terms you're using are supposed to mean

Yeah, that's a problem. More people should be aware of the graphical effects in games. Thankfully some games now implement previews for damn near every quality toggle.
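To put numbers on the "fraction of it" point, a hedged sketch using the commonly cited DLSS preset render-scale ratios; exact ratios can vary by game and DLSS version, so treat these as illustrative.

```python
# Illustrative only: commonly cited DLSS render-scale ratios per preset.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5,
           "Ultra Performance": 1 / 3}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the game actually renders before the upscale pass."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in PRESETS:
    w, h = internal_resolution(1920, 1080, preset)
    print(f"1080p output, {preset}: renders at {w}x{h}")
# Quality mode at 1080p renders ~1280x720 -- that's the "real resolution
# goes down" complaint; the upscaler reconstructs the rest.
```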
-
> AAA dev here. Carmack is correct. I expect to be dogpiled by uninformed disagreements, though, because on social media all AI = Bad and no nuance is allowed.

What AI tools are you personally looking forward to or already using?