The post-GeForce era: What if Nvidia abandons PC gaming?
-
AFAIK, the H100 and up (Nvidia’s bestselling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing all those AI data centers as cloud gaming farms.
-
> I'd rather pay for Chinese GPUs than cloud gaming. The article is so narrowly focused that it almost pretends nothing else exists, especially in China, where Moore Threads GPUs are starting to show reasonable gaming performance. I don't think Europe can produce anything as long as it stays neoliberal, but some weird stuff could happen with RISC-V.

Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update It’s inexplicable to me. Why are folks ignoring Battlemage and the AMD 9000 series when they’re so good? Alas, for *whatever reason* that’s how it is, so Nvidia suddenly becoming unavailable would be a huge event.
-
> It’s inexplicable to me. Why are folks ignoring Battlemage and the AMD 9000 series when they’re so good?

Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series, and I'm pissed off they didn't try to one-up themselves in the high-end market. I'm not buying a new Nvidia card, but I'm not buying a 9000 series either, because it feels like I'm paying for a sub-par GPU compared to what they're capable of.
-
> With China working hard to catch up on chip production, it is only a matter of time before we start seeing attractively priced Chinese-made GPUs on the market. No idea how long it will take, though.

What makes you think Chinese firms won't also jump on the AI bandwagon? Someone with an actual CS/engineering background feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads. If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.
-
> Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series, and I'm pissed off they didn't try to one-up themselves in the high-end market. I'm not buying a new Nvidia card, but I'm not buying a 9000 series either, because it feels like I'm paying for a sub-par GPU compared to what they're capable of.

Well, same for me, TBH. But most people I see online want something more affordable, right in the 9000 series or Arc range.
-
> What makes you think Chinese firms won't also jump on the AI bandwagon? Someone with an actual CS/engineering background feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads. If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.

You’ve got a good point. I wouldn’t be surprised if Nvidia were working on a dedicated platform for AI to cover this exact issue. Then again, I would be equally unsurprised if they just didn’t care and didn’t mind gutting the home gaming market for short-term profit.
-
- Nvidia abandons x86 desktop gamers
- The only hardware gamers own is ARM handhelds
- Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
- The AI bubble pops
- Nvidia tries to regain x86 desktop gamers
- Gamers are almost entirely on ARM
- Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
-
Seriously. Why would I care that a billion-dollar corporation that exploited the market to maximize its revenue is leaving? "Bye bitch."
-
> Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update It’s inexplicable to me. Why are folks ignoring Battlemage and the AMD 9000 series when they’re so good? Alas, for *whatever reason* that’s how it is, so Nvidia suddenly becoming unavailable would be a huge event.

For accelerated rendering and the like, CUDA is the standard, and because of it, Nvidia. And it's like that for a lot of other niche areas. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all that these days as well, but that's a much more recent thing. If all you want to do is game, sure, that's not a big issue. But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia. To give an example of how badly AMD missed the boat: I have been using accelerated rendering on my GT 750 in Blender for a decade now. The card came out in early 2014. The first AMD card capable of that came out one month before the end of 2020. Nearly a seven-year difference! I'm looking at a recent Intel Arc or Battlemage card or a 6xxx series AMD ATM, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don't fuck it up for themselves.
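As a rough illustration of what "CUDA was the standard" means in practice, here is a sketch (assuming a recent Blender build with the Cycles add-on; the helper is hypothetical, not from the thread) of how a script picks a Cycles compute backend from Python. OPTIX/CUDA have been available for years; HIP (AMD) and oneAPI (Intel) are the late arrivals the comment above is complaining about. Which backends exist depends on the Blender version and drivers.

```python
# Hypothetical sketch: choose a Cycles GPU backend via Blender's Python API.
# Backend names are Blender's own enums; availability depends on the build and drivers.
import bpy

cycles_prefs = bpy.context.preferences.addons["cycles"].preferences

for backend in ("OPTIX", "CUDA", "HIP", "ONEAPI"):  # NVIDIA first, then AMD, then Intel
    try:
        cycles_prefs.compute_device_type = backend   # raises TypeError if unsupported here
        break
    except TypeError:
        continue

cycles_prefs.get_devices()          # refresh the detected device list
for dev in cycles_prefs.devices:
    dev.use = True                  # enable every device of the chosen backend

bpy.context.scene.cycles.device = "GPU"
print("Cycles backend:", cycles_prefs.compute_device_type)
```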
-
Games and gaming have fully become like Hollywood and Silicon Valley, and I expect zero good things from them at this point. As with movies and music, most of the good stuff will now come from individuals and smaller enterprises. The fact is, today's GPUs have enough power to do extraordinary things. Hardware moves so fast these days that no one squeezes performance out of anything the way they used to have to. And not every game needs photorealistic, ray-traced graphics, so these GPUs will be fine for many gamers as long as they remain supported through drivers.
-
My framerates have never been better since I went full AMD. My friend with a 5080 complains about low framerates on almost every new game, while I'm at max framerate. Don't let the door hit ya on the way out, Nvidia.
-
> > What makes you think Chinese firms won't also jump on the AI bandwagon?
>
> The bubble won't last that long.

The only thing that will burst the bubble is electricity. The dot-com bubble burst due to dark fiber: massive Internet backbones were easy to build, and the last mile to people's homes was not. The current electrical grid cannot support the number of data centers being built. The ones that are planned on top of that... well, dark data centers will be the new dark fiber. There's more complexity to it all, but really it all boils down to power for this particular bubble.
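To make the power argument concrete, here is a crude back-of-envelope sketch; every input below is an assumption chosen for illustration, not a sourced figure.

```python
# Illustrative arithmetic only -- all inputs are assumptions, not measured or sourced figures.
GPUS          = 100_000   # one large AI training campus
WATTS_PER_GPU = 700       # roughly an H100 SXM's rated draw, GPU alone
OVERHEAD      = 2.0       # CPUs, networking, storage, cooling (rough all-in multiplier)

campus_mw = GPUS * WATTS_PER_GPU * OVERHEAD / 1_000_000
print(f"~{campus_mw:.0f} MW for one campus")  # ~140 MW, vs ~1,000 MW for a large power plant
```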
-
> For accelerated rendering and the like, CUDA is the standard, and because of it, Nvidia. And it's like that for a lot of other niche areas. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all that these days as well, but that's a much more recent thing. If all you want to do is game, sure, that's not a big issue. But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia. To give an example of how badly AMD missed the boat: I have been using accelerated rendering on my GT 750 in Blender for a decade now. The card came out in early 2014. The first AMD card capable of that came out one month before the end of 2020. Nearly a seven-year difference! I'm looking at a recent Intel Arc or Battlemage card or a 6xxx series AMD ATM, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don't fuck it up for themselves.

Yeah, I mean, you are preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.

That being said, much of what you describe is just software obstinacy. AMD (for example) has had hardware encoding since early 2012, with the 7970, and Intel Quick Sync has long been a standard on laptops. It's just a few stupid proprietary bits that never bothered to support it. CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender's engines. But a lot of it has no good reason not to work on AMD/Intel.
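A small sketch of why PyTorch code ends up "CUDA-first" (assumes a stock PyTorch install; nothing here is from the thread): the usual device check targets torch.cuda, and AMD's ROCm builds of PyTorch expose themselves through that same API, so the same code can run on a 7900-class card when the ROCm stack behaves.

```python
# Minimal device-selection sketch. On ROCm builds of PyTorch, AMD GPUs also report
# through torch.cuda, which is exactly how "CUDA-first" code stays the default target.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():            # NVIDIA/CUDA builds *and* AMD/ROCm builds
        return torch.device("cuda")
    if torch.backends.mps.is_available():    # Apple Silicon fallback
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(2048, 2048, device=device)
y = x @ x                                    # the kind of matmul these GPUs get bought for
print(device, y.shape)
```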