Chebucto Regional Softball Club

A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.

The post-GeForce era: What if Nvidia abandons PC gaming?

Tagged: games · 55 Posts · 29 Posters · 24 Views
• mindbleach@sh.itjust.works (#39), replying to Guest:

  > This post did not contain any content.
  PC gaming itself will hardly change, because AMD cards work just fucking fine. They've only ever been a little bit behind on the high end, they've routinely been the better value for money, and they've offered a much lower low end. If they don't have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute. Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to *clone CUDA already,* so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should've seen Nvidia chopped in half, long before this stupid bubble.

  Meanwhile: cloud gaming isn't real. Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible and the difference in capabilities did not matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.

  As desktop prices rose and video encoding sped up, people kept selling the idea that you'd buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well... nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines from the AOL era, *before* inflation. That console will do raytracing, except games don't use it much, because it doesn't actually look better than how hard we've cheated with rasterization. So what the fuck is a datacenter going to offer, with 50 ms of lag and compression artifacts? Who expects it's going to be cheaper, as we all juggle five subscriptions for streaming video?
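  For what it's worth, AMD's HIP already gets most of the way to a source-level CUDA clone, which is why this demand isn't outlandish. A toy sketch of my own, not anyone's shipping code: the vector add below is plain CUDA, and the HIP port is essentially a mechanical rename of the cuda* calls to hip* (hipify does it for you).

  ```
  #include <cstdio>
  #include <cstdlib>
  #include <cuda_runtime.h>

  // Vector add: the kind of kernel that moves to AMD's HIP with little more
  // than s/cuda/hip/ on the runtime calls; __global__ and the <<<>>>
  // launch syntax carry over unchanged.
  __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;
      if (i < n) c[i] = a[i] + b[i];
  }

  int main() {
      const int n = 1 << 20;
      const size_t bytes = n * sizeof(float);

      float* ha = (float*)malloc(bytes);
      float* hb = (float*)malloc(bytes);
      float* hc = (float*)malloc(bytes);
      for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

      float *da, *db, *dc;
      cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
      cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
      cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

      vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
      cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

      printf("c[0] = %f\n", hc[0]);  // expect 3.0
      cudaFree(da); cudaFree(db); cudaFree(dc);
      free(ha); free(hb); free(hc);
      return 0;
  }
  ```

  The hard part was never the syntax; it's that a decade of libraries, tools, and build systems assume the Nvidia side of that rename.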
• carpelbridgesyndrome@sh.itjust.works (#40), replying to Guest:

  > - Nvidia abandons x86 desktop gamers
  > - The only hardware that gamers own are ARM handhelds
  > - Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
  > - AI bubble pops
  > - Nvidia tries to regain x86 desktop gamers
  > - Gamers are almost entirely on ARM
  > - Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
  Nvidia does not care about the ISA of the CPU at all; they don't make the CPU, after all. It's also not clear how they would kill x86. If they leave the market, they cede it to AMD and Intel.
• Guest (#41), replying to Guest:

  > There are other publishers besides EA out there. Literally nobody is making you buy garbage other than you.
  I didn't do any of this. In fact, you were the mastermind behind all this. You did this. You need to learn that you not only caused the problem; you are the origin of the problem. You will never learn, will you?
• Guest (#42), replying to Guest:

  > > It's inexplicable to me. Why are folks ignoring Battlemage and the AMD 9000 series when they're so good?
  >
  > Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series, and I'm pissed off they didn't try to one-up themselves in the high-end market. I'm not buying a new Nvidia card, but I'm not buying a 9000 series either, because it feels like I'm paying for a sub-par GPU compared to what they're capable of.
  How is it a sub-par GPU, given that it targets a specific segment (looking at its price point, die area, memory, and power envelope) with its configuration? You're upset that they didn't aim for a halo GPU, and I can understand that, but how does that completely rule out a mid-to-high-end offering from them? The 9000 series is reminiscent of Navi 10 versus the Vega 10 GPUs (the Vega 56 and 64, even the Radeon VII): achieving _far_ more performance for less power and hardware.
• Guest (#43), replying to carpelbridgesyndrome@sh.itjust.works:
  > Nvidia does not care about the ISA of the CPU at all.

  That's kinda my point. They're stuck communicating over PCI-E instead of being a first-class co-processor over AMBA.
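  A rough way to put a number on that bus tax: time a large host-to-device copy with CUDA events. This is my own throwaway benchmark sketch, nothing canonical; on a typical PCIe 4.0 x16 card it lands in the low tens of GB/s, roughly an order of magnitude under what on-package memory traffic manages.

  ```
  #include <cstdio>
  #include <cuda_runtime.h>

  // Sketch: measure effective PCIe host->device bandwidth for a 1 GiB copy.
  // Pinned host memory gives the bus its best case; pageable would be slower.
  int main() {
      const size_t bytes = 1ull << 30;  // 1 GiB

      void *host, *dev;
      cudaMallocHost(&host, bytes);     // pinned host allocation
      cudaMalloc(&dev, bytes);

      cudaEvent_t start, stop;
      cudaEventCreate(&start);
      cudaEventCreate(&stop);

      cudaEventRecord(start);
      cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
      cudaEventRecord(stop);
      cudaEventSynchronize(stop);

      float ms = 0.0f;
      cudaEventElapsedTime(&ms, start, stop);
      printf("H2D: %.1f ms (%.1f GB/s)\n", ms, (bytes / 1e9) / (ms / 1e3));

      cudaEventDestroy(start); cudaEventDestroy(stop);
      cudaFree(dev); cudaFreeHost(host);
      return 0;
  }
  ```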
• Guest (#44), replying to Guest:

  > Nvidia has drivers for ARM. They're not in as good a shape as the x86 ones, but I don't think that's that big of a roadblock.
  Sure, but do you need a discrete video card if you're gaming on an ARM SoC? And we've seen from the struggles of x86 iGPUs that graphics APIs pretty much have to choose whether they're going to optimize for dedicated VRAM or shared memory, cuz it has inescapable implications for how you structure a game engine. ARM APIs will probably continue optimizing for shared memory, so PCI-E GPUs will always be second-class citizens.
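  You can watch that fork happen even inside a single compute API. A hedged sketch below: the `integrated` and `canMapHostMemory` device properties are real CUDA, but the branching policy is just my illustration of the choice an engine has to commit to: zero-copy in place on a shared-memory device, staged and resident on a dedicated-VRAM one.

  ```
  #include <cstdio>
  #include <cuda_runtime.h>

  // Sketch: the same 64 MiB buffer, set up differently for shared-memory
  // (integrated/UMA) vs dedicated-VRAM (discrete) devices.
  int main() {
      cudaSetDeviceFlags(cudaDeviceMapHost);  // allow mapped host memory

      cudaDeviceProp prop;
      cudaGetDeviceProperties(&prop, 0);

      const size_t bytes = 64 << 20;
      float* hostPtr = nullptr;
      float* devPtr = nullptr;

      if (prop.integrated && prop.canMapHostMemory) {
          // Shared-memory path: map the host allocation into the GPU's
          // address space and use it in place, no copy at all.
          cudaHostAlloc((void**)&hostPtr, bytes, cudaHostAllocMapped);
          cudaHostGetDevicePointer((void**)&devPtr, hostPtr, 0);
      } else {
          // Dedicated-VRAM path: allocate on-device, stage the data across
          // the bus once, and keep it resident there.
          cudaHostAlloc((void**)&hostPtr, bytes, cudaHostAllocDefault);
          cudaMalloc((void**)&devPtr, bytes);
          cudaMemcpy(devPtr, hostPtr, bytes, cudaMemcpyHostToDevice);
      }

      printf("%s path, device pointer %p\n",
             prop.integrated ? "shared-memory" : "dedicated-VRAM", (void*)devPtr);
      return 0;
  }
  ```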
• Guest (#45), replying to Guest:

  > I think "lot" is probably overstating it, but I don't have numbers. I don't know too much about ray tracing, but I do know AMD is bad at it. In the context of games, though, I grew up with Perfect Dark. I can handle imperfect reflections and shadows. Really, anyone can, but when you're shopping you tend to get caught up in wanting the best shiny new hotness, and if you ignore the impacts of your choices and have infinite money, Nvidia is the clear winner. We have now reached the inevitable outcome of that path.
  I've been able to use CUDA-accelerated Cycles rendering in Blender with my nearly 12-year-old GT 750 for a decade. We aren't even talking RT cores, though they still have a solid lead there, yes. AMD didn't get the capability till basically the COVID chip shortage and crypto bubble, when everything was unobtainium. Likewise, go talk to anyone who edits video semi-professionally: accelerated timeline rendering via CUDA in Premiere was massive. AMD and now Intel are finally supporting both, but they're only roughly a decade late, and the software is still maturing for them. I'm looking at upgrading to a Battlemage card, since they can now support my workflows: gaming and 3D modeling/raytracing. Two years ago that wouldn't have been a possibility. Nvidia made a massively good investment and built a strong position with CUDA.
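  Part of why a card that old kept working: apps mostly just probe the CUDA runtime for a usable device before switching the backend on, and Cycles accepted anything back to roughly compute capability 3.0 for years. A minimal sketch of that kind of probe, my own toy rather than Blender's actual logic:

  ```
  #include <cstdio>
  #include <cuda_runtime.h>

  // Sketch: enumerate CUDA devices the way a renderer's backend check might,
  // falling back to the CPU when nothing usable turns up.
  int main() {
      int count = 0;
      if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
          printf("No CUDA device found: falling back to CPU rendering.\n");
          return 0;
      }
      for (int i = 0; i < count; ++i) {
          cudaDeviceProp p;
          cudaGetDeviceProperties(&p, i);
          printf("Device %d: %s (sm_%d%d, %.1f GiB)\n", i, p.name,
                 p.major, p.minor,
                 p.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
      }
      return 0;
  }
  ```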
• Guest (#46), replying to Guest:

  > Sure, but do you need a discrete video card if you're gaming on an ARM SoC? […]
  Yes, even for applications other than gaming. There are legitimate madlads out there running Steam games, and LLMs, with discrete video cards on Raspberry Pis. Not to mention there are non-SoC ARM machines, and SoC Intel machines. Sometimes getting the integrated graphics on any of these SoCs working is a much harder prospect than getting a discrete card working.
• Guest (#47), replying to Guest:

  > Or lack of use? The current trend is fueled by hype that AI can do everything and will sub in for 50% of the workforce, another nightmare scenario… however, current AI may be an OK tool for some jobs and not much more. The world does not need 200 GW of AI datacentres to produce memes.
  Data centers are already paid for, so they're being built. But if they can't go online due to power costs, then that will burst the bubble. As for AI use... sadly, there are a bunch of people using it. And while the drop-off rate of people trying it and ditching it is steep, there's actually a readoption curve, which never fucking happens. So everyone is betting on the next model being better, and on more people giving it all a second chance... which are two open questions. But no power means no new model and no readoptions, thus no profit. Those other steps can fail, but without power it all fails.
• Guest (#48), replying to Guest:

  > I've been able to use CUDA-accelerated Cycles rendering in Blender with my nearly 12-year-old GT 750 for a decade. […] Nvidia made a massively good investment and built a strong position with CUDA.
  Again, I don't know anyone in that space, nor do I know numbers on gamers vs pros, so I cannot comment further.
• Guest (#49), replying to Guest:

  > I didn't do any of this. In fact, you were the mastermind behind all this. […] You will never learn, will you?
  Ok doomer
• Guest (#50), replying to Guest:

  > Ok doomer
  You need to do you, you /s
• Guest (#51), replying to Guest:

  > This post did not contain any content.
  They are in cahoots with the RAM cartels to push gaming onto their cloud services so that competitors like AMD don't just pick them up. Trying to make everything into a service is just a side benefit, although I'm sure they realize 16-bit SNES games are still fun and that people will just be driven to less powerful entertainment platforms.
• cancermancer@sh.itjust.works (#52), replying to Guest:

  > Because the 2022 AMD 7900 XTX has better raw performance than the 9000 series […] it feels like I'm paying for a sub-par GPU compared to what they're capable of.
  > AMD didn't make a 5090 equivalent so I won't buy their mid-tier card

  Is there a name for this thinking?
• Guest (#53), replying to Guest:
  > Data centers are already paid for, so they're being built.

  No, they are not… they are contracted on paper by OpenAI, for example, who has no way of paying for them other than "trust me bro".

  > But if they can't go online due to power costs

  It's not just the cost… the infrastructure to produce all this additional power does not exist. That's another issue with this massive bubble.

  > As for AI use... sadly, there are a bunch of people using it. And while the drop-off rate of people trying it and ditching it is steep, there's actually a readoption curve, which never fucking happens.

  That's served by the current infra. What do we need the next 200 GW for? To make memes faster?

  > So everyone is betting on the next model being better, and on more people giving it all a second chance... which are two open questions.

  By "everyone" you mean those peddling the bubble… we already saw the decline in the newer models, and we know it's mathematically impossible to get rid of the slop or to get to "gen AI" through LLMs.
• Guest (#54), replying to Guest:

  > OP has a 60Hz monitor. Friend has a 240Hz monitor and can't enjoy a game unless it says 240 in the corner. /s
  Friend plugged his screen into the onboard graphics output and hasn't figured out how to set up pass-through.
• Guest (#55), replying to Guest:

  > Sure, but do you need a discrete video card if you're gaming on an ARM SoC? […] ARM APIs will probably continue optimizing for shared memory, so PCI-E GPUs will always be second-class citizens.
  You need a discrete video card on ARM exactly as much as you do on x86. The GPU is independent of the CPU architecture, so if you don't like x86 iGPU performance, expect not to like ARM iGPU performance either. Of course you can dump a full desktop dGPU into the same package as your CPU, like Apple did with the M4 Max, but again, that can be done on x86 as well. None of it is dependent on the ISA.