The post-GeForce era: What if Nvidia abandons PC gaming?

Uncategorized · games · 55 posts · 29 posters · 24 views
#12 · Guest, in reply to the original post (which contained no text):

Good riddance, may the bubble burst and all that IP become available from a licenser that charges low license fees, or is even free!
#13 · Guest, in reply to 9488fcea02a9@sh.itjust.works:

> What makes you think Chinese firms won't also jump on the AI bandwagon? Someone with an actual CS/engineering background, feel free to correct me, but I feel like the only way out of this for gamers is if someone finds a better type of chip for AI work. GPUs just happened to be the best thing for the job when the world went crazy; they were never designed specifically for these workloads. If someone like Tenstorrent can design a RISC-V chip for these LLM workloads, it might take some demand off gaming GPUs.

The bubble won't last that long.
#14 · Guest, in reply to the original post:

- Nvidia abandons x86 desktop gamers
- The only hardware gamers own is ARM handhelds
- Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
- The AI bubble pops
- Nvidia tries to win back x86 desktop gamers
- Gamers are almost entirely on ARM
- Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else
#15 · vitorobles@lemmy.today, in reply to Guest:

> All I have to say is thank God and good riddance; may your stock price collapse.

Seriously. Why would I care that a billion-dollar corporation that exploited the market to maximize its revenue is leaving? "Bye bitch."
#16 · Guest, in reply to brucethemoose@lemmy.world:

> Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update It's inexplicable to me. Why are folks ignoring Battlemage and the AMD 9000 series when they're so good? Alas, for *whatever reason* that's how it is, so Nvidia suddenly becoming unavailable would be a huge event.

For accelerated rendering and the like, CUDA is the standard, and because of it, Nvidia. It's like that in a lot of other niche areas. Accelerated encoding? Nvidia, and CUDA again. Yes, AMD and Intel can generally do all of that these days as well, but that's a much more recent thing. If all you want to do is game, sure, it's not a big issue. But if you want to do anything more than gaming, say video editing and rendering timelines, Nvidia has been the standard, with a lot of inertia. To give an example of how badly AMD missed the boat: I have been using accelerated rendering on my GT 750 in Blender for a decade now. The card came out in early 2014. The first AMD card capable of that came out one month before the end of 2020, nearly a seven-year difference! I'm currently looking at a recent Intel Arc or Battlemage card, or a 6xxx-series AMD, because I run BSD/Linux and Nvidia has traditionally been a necessary PITA. But with both of them now supporting my workflows, Nvidia is much less necessary. Others will eventually leave too, as long as AMD and Intel don't fuck it up for themselves.
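
For the Blender case above, the backend choice is just a Cycles preference these days, which is part of why switching vendors has become realistic. A minimal sketch, assuming a recent Blender build (3.x or later) with the relevant backend compiled in:

```python
import bpy  # Blender's Python API; run this from Blender's scripting workspace

prefs = bpy.context.preferences.addons["cycles"].preferences

# Pick the backend that matches the installed GPU:
# "CUDA"/"OPTIX" for Nvidia, "HIP" for recent AMD cards, "ONEAPI" for Intel Arc.
prefs.compute_device_type = "HIP"

prefs.get_devices()          # refresh the list of detected devices
for dev in prefs.devices:
    dev.use = True           # enable every device Cycles found

bpy.context.scene.cycles.device = "GPU"   # render the current scene on the GPU
```

On an Nvidia card the same script would set "CUDA" or "OPTIX" instead; the scene-level `cycles.device` setting is what actually moves the render off the CPU.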
#17 · Guest, in reply to the original post:

My framerates have never been better since I went full AMD. My friend with a 5080 complains about low framerates on almost every new game, while I'm at max framerate. Don't let the door hit ya on the way out, Nvidia.
#18 · Guest, in reply to the original post:

Games and gaming have fully become like Hollywood and Silicon Valley, and I expect zero good things from them at this point. As with movies and music, most of the good stuff will now come from individuals and smaller enterprises. The fact is, today's GPUs have enough power to do extraordinary things. Hardware moves so fast these days that nobody squeezes performance out of anything the way they used to have to. And not every game needs photorealistic ray-traced graphics, so these GPUs will be fine for many gamers as long as they remain supported through drivers.
#19 · Guest, in reply to #17:

I love your spirit, but either you're delusional or your friend is stupid, or both.
#20 · Guest, in reply to #19:

Borderlands 4 runs at 143 fps with a 1% low of 101.
#21 · Guest, in reply to #19:

Well, my friend is using Windows, and I'm not, so...
#22 · Guest, in reply to #13:

The only thing that will burst the bubble is electricity. The dot-com bubble burst because of dark fiber: massive Internet backbones were easy to build, and the last mile to people's homes was not. The current electrical grid cannot support the number of data centers being built, let alone the ones planned on top of that. Dark data centers will be the new dark fiber. There's more complexity to it all, but for this particular bubble it really boils down to power.
#23 · brucethemoose@lemmy.world, in reply to #16:

Yeah, I mean, you're preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.

That being said, much of what you describe is just software obstinacy. AMD, for example, has had hardware encoding since early 2012, with the 7970, and Intel Quick Sync has long been standard on laptops. It's just a few stupid proprietary bits that never bothered to support them. CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender's engines, but a lot of software has no good reason not to work on AMD/Intel.
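
On the PyTorch point, part of the entrenchment is just packaging: a ROCm build of PyTorch reuses the `torch.cuda` namespace for AMD GPUs, so vendor-agnostic device selection can be a few lines. A minimal sketch, assuming a PyTorch install with either the CUDA or the ROCm backend present:

```python
import torch

def pick_device() -> torch.device:
    """Return a GPU device if this PyTorch build can see one, else the CPU."""
    if torch.cuda.is_available():
        # True on Nvidia/CUDA builds *and* on AMD/ROCm builds, which reuse the
        # torch.cuda namespace (torch.version.hip is set instead of torch.version.cuda).
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 4, device=device)  # the tensor lands on whichever device was picked
print(f"running on {device}: {x.sum().item():.3f}")
```

On a build with neither backend the same script quietly falls back to the CPU, which is roughly why the lock-in is more about shipped kernels than about user code.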
#24 · Guest, in reply to #18:

You expect zero good games to ever come out again? How edgy.
#25 · Guest, in reply to #15:

I don't understand the hate on Nvidia. They raised their prices, and people kept paying them. AMD has always been there, just not quite as good. The people willing to pay for the brand they want are the problem. Oh nooo, I have to render at 2K instead of 4K, how will my poor eyes survive?!?
#26 · Guest, in reply to #13:

The bubble might burst, but there are real use cases for AI out there, so you will see AI in use even after the current Ponzi scheme has collapsed. You can do great voice and handwriting recognition now, and speech generation, and that won't go away.
#27 · Guest, in reply to #23:

Yes, the software will get there long before many people's hearts and minds do; as you said, in many cases it already has. The inertia Nvidia got by being early is why they're so dominant now. But I think Nvidia's focus on crypto and now data-center AI is set to hurt them long term. Only time will tell, and they're technically swimming in it at the moment. But I'm getting out now.
#28 · Guest, in reply to #24:

For real! I can't wait for Black Cocks 27: Call for Booty.
#29 · Guest, in reply to #21:

A 5080 will play everything at max on either OS, so something else is happening here.
#30 · Guest, in reply to #19:

OP has a 60Hz monitor. Friend has a 240Hz monitor and can't enjoy a game unless it says 240 in the corner. /s
#31 · Guest, in reply to #25:

That didn't happen in a vacuum. A lot of us do more than game, and there legitimately wasn't an alternative until much more recently. For instance, for over a decade, if you were rendering out a hardware-accelerated video through Premiere, it was likely with an Nvidia card. Ray tracing? Nvidia has been king at that since long before the 2000 series. It's changing, slowly, thank goodness. I'm more than happy to be able to ditch Nvidia myself.
