Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

The post-GeForce era: What if Nvidia abandons PC gaming?

Tags: games
55 Posts, 29 Posters, 24 Views
  • ? Guest
    >What makes you think chinese firms wont also jump on the AI bandwagon? the bubble won't last that long
    Guest
    #26
    The bubble might burst, but there are real use cases for AI out there, and therefore you will see AI in use even after the current Ponzi scheme has collapsed. You can do great voice and handwriting recognition now, and speech generation, and that won't go away.
    • B brucethemoose@lemmy.world
      Yeah, I mean, you are preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state. *** That being said, much of what you describe is just software obstinacy. AMD (for example) has had hardware encoding since early 2012, with the 7970, and Intel Quick Sync has long been a standard on laptops. It's just a few stupid proprietary bits that never bothered to support it. CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender's engines. But a lot has no good reason not to work on AMD/Intel.
      Guest
      #27
      Yes, the software will get there long before many people's hearts and minds. As you said, it already has in many cases. The inertia Nvidia got by being early is why they're so dominant now. But I think Nvidia's focus on crypto and now data-center AI is set to hurt them long term. Only time will tell, and they're technically swimming in it ATM. But I'm getting out now.
      • ? Guest
        You expect zero good games to ever come out again? How edgy.
        Guest
        #28
        For real! I can't wait for Black Cocks 27: Call for Booty.
        • ? Guest
          Well, my friend is using Windows, and I'm not, so...
          Guest
          #29
          A 5080 will play everything at max on either OS, so something else is happening here.
          • ? Guest
            i love your spirit but you're either delusional or your friend is stupid or both
            Guest
            #30
            OP has a 60 Hz monitor. Friend has a 240 Hz monitor and can't enjoy a game unless it says 240 in the corner. /s
            • ? Guest
              I don't understand the hate on Nvidia. They raised their prices, people kept paying those prices. AMD has always been there, not quite as good. People who are willing to pay for the brand they want are the problem. Oh nooo I have to render at 2k instead of 4k how will my poor eyes survive?!?
              Guest
              #31
              That didn't happen in a vacuum. A lot of us do more than game, and there legitimately wasn't an alternative until much more recently. For instance, for over a decade, if you were rendering out a hardware-accelerated video through Premiere, it was likely with an Nvidia card. Ray tracing? Nvidia has been king at that since long before the 2000 series. It's changing, slowly, thank goodness. I'm more than happy to be able to ditch Nvidia myself.
              • ? Guest
                The bubble might burst, but there are real use cases for AI out there and therefore you will see AI in use even after the current ponzi scheme has collapsed. You can do great voice and handwriting recognition now and speech generation and that won't go away.
                Guest
                #32
                Yes, but the ridiculously heightened demand won't be there.
                • ? Guest
                  - Nvidia abandons x86 desktop gamers
                  - The only hardware that gamers own are ARM handhelds
                  - Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
                  - AI bubble pops
                  - Nvidia tries to regain x86 desktop gamers
                  - Gamers are almost entirely on ARM
                  - Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
                  Guest
                  #33
                  Nvidia has drivers for ARM. They're not in as good a shape as the x86 ones are, but I don't think it's that big of a roadblock.
                  • ? Guest
                    The only thing that will burst the bubble is electricity. The dot-com bubble burst due to dark fiber: massive Internet backbones were easy to build, and the last mile to people's homes was not. The current electrical grid cannot support the number of data centers being built, let alone the ones planned on top of that... Well, dark data centers will be the new dark fiber. There's more complexity to it all, but for this particular bubble it really all boils down to power.
                    Guest
                    #34
                    Or lack of use? The current trend is fueled by hype that AI can do everything and will replace 50% of the workforce (another nightmare scenario). However, current AI may be an OK tool for some jobs and not much more; the world does not need 200 GW of AI datacentres to produce memes.
                    • ? Guest
                      Yes, the software will get there long before many people's hearts and minds. As you said already is in many cases. The inertia Nvidia got by being early is why they're so dominant now. But I think Nvidia's focus on crypto and now data center AI is set to hurt them long term. Only time will tell, and they're technically swimming in it ATM. But I'm getting out now.
                      brucethemoose@lemmy.world
                      #35
                      CUDA is actually pretty cool, especially in the early days when there was nothing like it. And Intel/AMD attempts at alternatives have been as mixed as their corporate dysfunction. Nvidia has also long had a focus on other spaces, like VR/AR, dataset generation, and the like. If every single LLM thing disappeared overnight in a puff of smoke, they'd be fine. Not that I'm an apologist for them being total jerks, but I don't want to act like CUDA isn't useful, either.
                      • ? Guest
                        For real! I can't wait for Black Cocks 27: Call for Booty.
                        Guest
                        #36
                        There are other publishers besides EA out there. Literally nobody is making you buy garbage other than you.
                        • ? Guest
                          A 5080 will play everything at max on either OS so something else is happening here
                          Guest
                          #37
                          I was watching him stream Space Engineers 2 (which is admittedly an alpha) at 15-30 fps. It runs 60+ on an RX 7900 XTX.
                          • ? Guest
                            That didn't happen in a vacuum. A lot of us do more than game, and there legitimately wasn't an alternative until much more recently. For instance, for over a decade, if you were rendering out a hardware-accelerated video through Premiere, it was likely with an Nvidia card. Ray tracing? Nvidia has been king at that since long before the 2000 series. It's changing, slowly, thank goodness. I'm more than happy to be able to ditch Nvidia myself.
                            Guest
                            #38
                            I think "a lot" is probably overstating it, but I don't have numbers. I don't know much about ray tracing, but I do know AMD is bad at it. In the context of games, though: I grew up with Perfect Dark. I can handle imperfect reflections and shadows. Really, anyone can; but when you're shopping you tend to get caught up in wanting the shiny new hotness, and if you ignore the impacts of your choices and have infinite money, Nvidia is the clear winner. We have now reached the inevitable outcome of that path.
                            • ? Guest
                              This post did not contain any content.
                              mindbleach@sh.itjust.works
                              #39
                              PC gaming itself will hardly change, because AMD cards work just fucking fine. They've only ever been a little bit behind on the high end. They've routinely been the better value for money, and offered a much lower low end. If they don't have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute. Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to *clone CUDA already,* so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should've seen Nvidia chopped in half, long before this stupid bubble.

                              Meanwhile: cloud gaming isn't real. Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible and the difference in capabilities did not matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.

                              As desktop prices rose and video encoding sped up, people kept selling the idea that you'll buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well... nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines from the AOL era, *before* inflation. That console will do ray tracing, except games don't use it much, because it doesn't actually look better than how hard we've cheated with rasterization.

                              So what the fuck is a datacenter going to offer, with 50 ms of lag and compression artifacts? Who expects it's going to be cheaper, as we all juggle five subscriptions for streaming video?
                              • ? Guest
                                - Nvidia abandons x86 desktop gamers
                                - The only hardware that gamers own are ARM handhelds
                                - Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
                                - AI bubble pops
                                - Nvidia tries to regain x86 desktop gamers
                                - Gamers are almost entirely on ARM
                                - Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
                                carpelbridgesyndrome@sh.itjust.works
                                #40
                                Nvidia does not care about the ISA of the CPU at all; they don't make the CPU, after all. It's also not clear how they would kill x86. If they leave the market, they cede it to AMD and Intel.
                                • ? Guest
                                  There are other publishers besides EA out there. Literally nobody is making you buy garbage other than you.
                                  Guest
                                  #41
                                  I didn't do any of this. In fact, you were the mastermind behind all this. You did this, you need to learn that you not only caused the problem; you are the origin of the problem. You will never learn, will you!
                                  • ? Guest
                                    >It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good? Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series, and I'm pissed off they didn't try to one-up themselves in the high-end market. I'm not buying a new Nvidia card, but I'm not buying a 9000 series either, because it feels like I'm paying for a sub-par GPU compared to what they're capable of.
                                    Guest
                                    #42
                                    How is it a sub-par GPU, given that it targets a specific segment (looking at its price point, die area, and memory and power envelope) with its configuration? You're upset that they didn't aim for a halo GPU, and I can understand that, but how does this completely rule out a mid-to-high-end offering from them? The 9000 series is reminiscent of Navi 10 versus Vega 10 GPUs like the Vega 56, Vega 64, even the Radeon VII: achieving *far* more performance for less power and hardware.
                                    • C carpelbridgesyndrome@sh.itjust.works
                                      Nvidia does not care about the ISA of the CPU at all; they don't make the CPU, after all. It's also not clear how they would kill x86. If they leave the market, they cede it to AMD and Intel.
                                      Guest
                                      #43
                                      > Nvidia does not care about the ISA of the CPU at all.

                                      That’s kinda my point. They’re stuck communicating over PCIe instead of being a first-class co-processor over AMBA.
                                      • ? Guest
                                        Nvidia has drivers for arm. They're not in as good a shape as the X86 one is. But I don't think it's that big of a roadblock.
                                        Guest
                                        #44
                                        Sure, but do you need a discrete video card if you’re gaming on an ARM SoC? And we’ve seen from the struggles of x86 iGPUs that graphics APIs pretty much have to choose whether to optimize for dedicated VRAM or shared memory, because that choice has inescapable implications for how you structure a game engine. ARM APIs will probably continue optimizing for shared memory, so PCIe GPUs will always be second-class citizens.
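To make the dedicated-VRAM vs. shared-memory fork concrete: an engine's upload path genuinely branches on which memory model the hardware exposes. A minimal toy sketch of that decision follows; all names here are hypothetical illustrations, not from any real graphics API.

```python
from dataclasses import dataclass

@dataclass
class MemoryInfo:
    device_local: bool   # fast for the GPU (dedicated VRAM)
    host_visible: bool   # CPU can map and write it directly

def upload_plan(mem: MemoryInfo) -> str:
    """Pick an upload strategy for mesh/texture data."""
    if mem.device_local and not mem.host_visible:
        # Discrete GPU over PCIe: write into a host-visible staging buffer,
        # then record a GPU copy into VRAM. The engine needs a transfer
        # queue, fences, and double-buffering to hide the copy.
        return "staging-copy"
    if mem.device_local and mem.host_visible:
        # Mappable VRAM: the CPU writes directly, but every write still
        # crosses the bus, so the engine batches and write-combines.
        return "direct-map-vram"
    # Unified memory (iGPU / ARM SoC): CPU and GPU share the allocation,
    # so the engine can skip the copy entirely and write in place.
    return "shared-write-in-place"

print(upload_plan(MemoryInfo(device_local=True, host_visible=False)))   # staging-copy
print(upload_plan(MemoryInfo(device_local=False, host_visible=True)))   # shared-write-in-place
```

An engine structured around the staging path carries machinery (transfer queues, fences, copy scheduling) that is pure overhead on a shared-memory SoC, which is the "second-class citizen" problem the post is pointing at.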
                                        • ? Guest
                                          I think "a lot" is probably overstating it, but I don't have numbers. I don't know much about ray tracing, but I do know AMD is bad at it. In the context of games, though: I grew up with Perfect Dark. I can handle imperfect reflections and shadows. Really, anyone can; but when you're shopping you tend to get caught up in wanting the shiny new hotness, and if you ignore the impacts of your choices and have infinite money, Nvidia is the clear winner. We have now reached the inevitable outcome of that path.
                                          Guest
                                          #45
                                          I've been able to use CUDA-accelerated Cycles rendering in Blender with my nearly 12-year-old GT 750 for a decade. We aren't even talking RT cores, though they still have a solid lead there, yes. AMD didn't get the capability until basically the COVID chip shortage and crypto bubble, when everything was unobtainium. Likewise, go talk to anyone who edits video semi-professionally: accelerated timeline rendering via CUDA in Premiere was massive. AMD and now Intel are finally supporting both, but they're only roughly a decade late, and the software is still maturing for them. I'm looking at upgrading to a Battlemage card since it can support my workflows, gaming and 3D modeling/ray tracing. Two years ago that wouldn't have been a possibility. Nvidia made a massively good investment and position with CUDA.
