Carmack defends AI tools after Quake fan calls Microsoft AI demo “disgusting”
-
DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
-
Well, if nothing else, I've made my case. I mean, I'm not gonna go build a Digital Foundry comparison video for you, but this type of argument is exactly what I'm talking about when I say I don't understand what people claiming this out of the blue even think they're saying.
-
> I don't think it's particularly hard to understand what I'm saying - Cyberpunk 2077 with DLSS looks worse than Cyberpunk 2077 with TAA for me. You can disagree, but please don't act like I'm saying something incredibly complex and impossible to understand.

Well, for one thing, you're saying you haven't seen the latest version, so I already don't know what you were looking at or when. I don't know what settings you're using or what resolutions you were targeting. Do I think there are any settings or resolutions in Cyberpunk where TAA looks noticeably better than any DLSS after 2? Absolutely, definitely 100% no. But hey, it'd help to get an idea of what type of images we're comparing. And it'd help to know what sorts of things you're seeing that make you notice a difference. What is it about the DLSS image that looks worse? Let's be clear, these will mostly be the same base picture, just upscaled to whatever your monitor resolution is in slightly different ways. So what is it from each of those that is leading to this assessment? What are the problems we're talking about here? Because the last time I fired up Cyberpunk (a couple of weeks ago), DLSS performed noticeably faster than TAA and looked effectively indistinguishable from native rendering at 4K.

So yeah, you're saying some weird stuff here, and given the explanations you're (not) giving, you may as well have run in from the street drenched in red fluid claiming the aliens stole your dog - I'd have about as much connection to my lived reality to parse what's going on. I'm less concerned you're going to randomly stab me over your antialiasing preferences, sure, but in terms of using words to exchange recognizable information I'm kinda lost here.
-
> DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling

They do. You'll see a lot of hate for DLSS on social media, but if you go to the forums of any newly-released game that doesn't have DLSS, you'll find at least one post *demanding* that they implement it. If it's on by default, most people never touch that setting and they're fine with it.
-
> See, this is the type of thing that weirds me out. Temporal AA doesn't look good *compared to what*? What do you mean "real resolution goes down"? Down from where? This is a very confusing statement to make. I don't know what it is that you're supposed to dislike or what a lot of the terms you're using are supposed to mean. What is "image quality" in your view? What are you comparing as a reference point on all these things that go up and down?

> Temporal AA doesn't look good *compared to what*?

Compared to some older AA tech. TAA is awful in motion in games. (Edit: by default - if there's a config file it can be made better.)

> What do you mean "real resolution goes down"? Down from where?

I mean internal resolution. Playing at 1080p with DLSS means the game doesn't render at your specified resolution, but at a fraction of it. Native (*for now*) is the best.

> What is "image quality" in your view?

Mostly general clarity, plus stuff like particle effects and textures, I think. You can ask those people directly, you know. I'm just the messenger; I barely play modern games.

> I don't know ... a lot of the terms you're using are supposed to mean

Yeah, that's a problem. More people should be aware of the graphical effects in games. Thankfully some games now implement previews for damn near every quality toggle.
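To make the "renders at a fraction of it" point concrete, here is a minimal sketch of the arithmetic, assuming the commonly cited DLSS 2 preset scale factors; treat the exact fractions as assumptions, since they vary by DLSS version and mode:

```python
# Rough sketch of DLSS internal render resolution vs. output resolution.
# The preset scale factors below are the commonly cited DLSS 2 values and
# are assumptions for illustration; actual factors vary by version/mode.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the game actually renders before the AI upscale pass."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    for out_w, out_h in [(1920, 1080), (3840, 2160)]:
        for preset in PRESETS:
            w, h = internal_resolution(out_w, out_h, preset)
            print(f"{out_w}x{out_h} output, {preset}: renders ~{w}x{h}")
```

So, under these assumed factors, 1080p output on the Performance preset would render internally at roughly 960x540, which is where the "real resolution goes down" complaint comes from.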
-
> AAA dev here. Carmack is correct. I expect to be dogpiled by uninformed disagreements, though, because on social media all AI = Bad and no nuance is allowed.

What AI tools are you personally looking forward to or already using?
-
> Speak for yourself. As an avid gamer I am excitedly looking towards the future of AI in games. Good models (with context buffers *much* longer than the 0.9s in this demo) have the potential to revolutionise the gaming industry. I really don't understand the amount of LLM/AI hate on Lemmy. It is a tool with many potential uses.

There's a difference between LLMs making games and LLMs trained to play characters in a game.
-
> What AI tools are you personally looking forward to or already using?

Not me personally, as AI can't really replicate my work (I'm a sound designer on a big game), but [a few colleagues of mine have already begun reaping the workflow improvements of AI at their studio.](https://gameworldobserver.com/2022/12/19/how-ai-helps-artists-gearbox-borderlands-prevent-crunch)
-
> DLSS (AI upscaling) alone should see gamers embracing the tech. https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling

First thing I turn off. It only works in tech demos with very slow-moving cameras. Sometimes.
-
> OK, but... you know AI isn't a *person*, right? You seem to be mad at math. Which is not rare, but it *is* weird...

Yeah, I'm aware AI isn't a person. I'm not sure why that's a question? Maybe I phrased things badly, but I'm not - nor have I ever been - really *mad* about AI usage. It's mostly just disappointment. It's just a technology. I just largely dislike the way it's being used, partly because I feel like it has a lot of potential.
-
> Not me personally, as AI can't really replicate my work (I'm a sound designer on a big game), but [a few colleagues of mine have already begun reaping the workflow improvements of AI at their studio.](https://gameworldobserver.com/2022/12/19/how-ai-helps-artists-gearbox-borderlands-prevent-crunch)

Obviously AI is coming for sound designers too. You know that, right? https://elevenlabs.io/sound-effects

And if you work on games and you haven’t seen your industry decimated in the past 16 months, I want to know what rock you have been living under and if there’s room for one more.
-
> What are you talking about? It looks like shit, it plays like shit, and the overall experience is shit. And it isn't even clear what the goal is. There are so many better ways to incorporate AI into game development, if one wanted to - and I'm not sure we want to.
>
> I have seen people argue this is what the technology can do today, imagine in a couple of years. However, that seems very naive. The rate at which barriers are reached has no impact on how hard it is to break through those barriers. And, as often in life, diminishing returns are a bitch.
>
> Microsoft bet big on this AI thing because they have been lost on what to do ever since they released things like the Windows Phone and Windows 8. They don't know how to innovate anymore, so they are going all in on AI, shitting out new gimmicks at light speed to see which gain traction. (Please note I'm talking about the consumer and small-business side of Microsoft. Microsoft is a huge company with divisions that act almost like separate companies within it. Their Azure branch, for example, has been massively successful and does innovate just fine.)

I'm happy to see someone else pushing back against the inevitability line I see so much around this tech. It's still incredibly new and there's no guarantee it will continue to improve. Could it? Sure, but I think it's equally likely it could start to degrade instead, due to AI inbreeding or power consumption becoming too big of an issue with larger adoption. No one actually knows the future, and it's hardly inevitable.
-
> Still have limited wafers at the fabs. The chips going to datacenters could have been consumer stuff instead. Besides, they (Nvidia, Apple, AMD) are all fabricated at TSMC. Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI MAX 300 chip or whatever they call it take off. Framework lets you config a machine with that chip to 128 GB of RAM, iirc. It's the main reason why I believe Apple's memory upgrades cost a ton - so that it isn't a financially viable option for local AI applications.

> The chips going to datacenters could have been consumer stuff instead.

This is true, but again, they do use different processes. The B100 (and I *think* the 5090) is TSMC 4NP, while the other chips use a lesser process. Hopper (the H100) was TSMC 4N, Ada Lovelace (RTX 4000) was TSMC *N4*. The 3000 series/A100 was straight up split between Samsung and TSMC. The AMD 7000 series was a mix of older N5/N6 due to the MCM design.

> Local AI benefits from platforms with unified memory that can be expanded.

This is tricky, because expandable memory is orthogonal to bandwidth and power efficiency. Framework (ostensibly) *had* to use soldered memory for their Strix Halo box because it's literally the only way to make the traces good enough: SO-DIMMs are absolutely not fast enough, and even LPCAMM apparently isn't there yet.

> AMD's Ryzen AI MAX 300 chip

Funny thing is, the community is still quite lukewarm on the AMD APUs due to poor software support. It works okay... if you're a Python dev who can spend hours screwing with ROCm to get things fast. But it's quite slow/underutilized if you just run popular frameworks like ollama or the old diffusion ones.

> It's the main reason why I believe Apple's memory upgrades cost a ton - so that it isn't a financially viable option for local AI applications.

Nah, Apple's been gouging on memory *way* before AI was a thing. It's their thing, and honestly it kinda backfired because it made them so unaffordable for AI. Also, Apple's stuff is actually... not *great* for AI anyway. The M-chips have relatively poor software support (limited PyTorch, barebones MLX, leaving you stranded mostly with GGML). They don't have much compute compared to a GPU or even an AMD APU, and the NPU part is useless. Unified memory doesn't help at all; it's just that their stuff *happens* to have a ton of memory hanging off the GPU, which is useful.
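On the bandwidth-versus-expandability point, a rough back-of-envelope helps: for a dense LLM, generating each token means streaming essentially all the weights through the chip once, so decode speed is capped by memory bandwidth divided by model size. A minimal sketch, using assumed ballpark bandwidth figures for illustration rather than measured numbers:

```python
# Back-of-envelope: upper bound on dense-LLM decode speed from memory bandwidth.
# The bandwidth numbers below are rough, assumed ballpark figures, not measurements.

def max_tokens_per_second(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Ceiling: every generated token reads roughly all weights once from memory."""
    return bandwidth_gb_s / model_size_gb

PLATFORMS_GB_S = {
    "dual-channel DDR5 desktop (~90 GB/s)": 90,
    "Strix Halo-class APU, soldered LPDDR5X (~256 GB/s)": 256,
    "Apple M-series Max unified memory (~400 GB/s)": 400,
    "high-end discrete GPU, GDDR6X (~1000 GB/s)": 1000,
}

MODEL_SIZE_GB = 40  # e.g. a ~70B-parameter model quantized to ~4-5 bits per weight

for name, bandwidth in PLATFORMS_GB_S.items():
    limit = max_tokens_per_second(MODEL_SIZE_GB, bandwidth)
    print(f"{name}: <= {limit:.0f} tokens/s")
```

Which is why soldered or on-package memory matters more here than whether the RAM is socketed, and why raw compute alone doesn't tell the story.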
-
> Obviously AI is coming for sound designers too. You know that, right? https://elevenlabs.io/sound-effects And if you work on games and you haven't seen your industry decimated in the past 16 months, I want to know what rock you have been living under and if there's room for one more.

I love when regular folks act like they understand things better than industry insiders near the top of their respective field. It's genuinely amusing.

Let me ask you a simple question: do YOU want to play a game with mediocre, lowest-common-denominator AI-generated audio? Or do you want something crafted by a human with feelings (a thing an AI model does not have) and the ability to create unique design crafted specifically to create emotional resonance with you (a thing an AI has exactly zero intuition for)? Answers on a postcard, thanks. The market agrees with me as well.

And no, I have not seen my industry decimated by AI. Talk to any experienced AAA game dev on LinkedIn; it's not really a thing. There still is and always will be a huge demand for art specifically created by humans and for humans, for the exact reasons listed above. What has ACTUALLY decimated my industry is the overvaluation and inflation of everything in the economy, and now the high interest rates put in place to counter it, which is leading to layoffs when giant games don't hit the insane profit targets the suits have set - which is likely what you are erroneously attributing to AI displacement.
-
> I love when regular folks act like they understand things better than industry insiders near the top of their respective field. It's genuinely amusing. Let me ask you a simple question: do YOU want to play a game with mediocre, lowest-common-denominator AI-generated audio? Or do you want something crafted by a human with feelings (a thing an AI model does not have) and the ability to create unique design crafted specifically to create emotional resonance with you (a thing an AI has exactly zero intuition for)? Answers on a postcard, thanks. The market agrees with me as well. And no, I have not seen my industry decimated by AI. Talk to any experienced AAA game dev on LinkedIn; it's not really a thing. There still is and always will be a huge demand for art specifically created by humans and for humans, for the exact reasons listed above. What has ACTUALLY decimated my industry is the overvaluation and inflation of everything in the economy, and now the high interest rates put in place to counter it, which is leading to layoffs when giant games don't hit the insane profit targets the suits have set - which is likely what you are erroneously attributing to AI displacement.

Do you remember the music from the last Marvel film you watched? I don't. Quality isn't directly correlated to success. Buy a modern pair of Nikes, or... go to McDonald's, play a modern mobile game. I love when industry insiders think they're so untouchable that a budget cut wouldn't have them on the chopping block. You're defensive because it's your ass on the line, not because it's true. People gargle shit products and pay for them willingly all day long. You're just insulated from it, for now.
-
> I'm pretty sure the fabs making the chips for datacenter cards could be making more consumer-grade cards, but those are less profitable. And since fabs aren't infinite, the price of datacenter cards is still going to affect consumer ones.

Heh, especially for this generation, I suppose. Even the Arc B580 is on TSMC and overpriced/OOS everywhere. It's kinda their own stupid fault too. They could've used Samsung or Intel, and a bigger, slower die for each SKU, but didn't.
-
> There's what AI could've been (collaborative and awesome), and then there's what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we've had forced upon us today.

A guy I used to work with would (at least I'd swear it) submit shit code just so I would comment about the right way to do it, no matter how many times I told him how to do something. Sometimes it was code that didn't actually do anything. Working with Copilot is a lot like working with that guy again.
-
> A guy I used to work with would (at least I'd swear it) submit shit code just so I would comment about the right way to do it, no matter how many times I told him how to do something. Sometimes it was code that didn't actually do anything. Working with Copilot is a lot like working with that guy again.

Funny enough, here's a description of AI I wrote yesterday that I think you'll relate to:

> AI is the lazy colleague that will never get fired because their dad is the CTO. You're forced to pair with them on a daily basis. You try to hand them menial tasks that they still manage to get completely wrong, while dear ol' dad is gassing them up in every all-hands meeting.