• ryper@lemmy.ca · ↑75 ↓2 · 2 days ago

    Reminder that Bethesda is owned by Microsoft, the company that insists it’s going to end support for Windows 10 in October and wants everyone to move to Windows 11, which doesn’t officially support perfectly functional but somewhat old CPUs. So of course they don’t care about GPUs too old to support ray tracing.

    • kemsat@lemmy.world · ↑5 · 1 day ago

      At some point it was going to happen, this is just earlier than many thought. The real question is “when is AMD going to have an answer to Nvidia when it comes to RT performance?”

      • MystikIncarnate@lemmy.ca · ↑5 · 14 hours ago

        Earlier than they thought?

        How long did they think it would take before RT was a requirement? It was introduced with the GeForce 20 series more than six years ago.

        For technology, six years is vintage.

        The only people this should affect are those still using GTX 10 and 16 series cards. I dunno what’s happening with AMD/Radeon. Since they were bought by AMD, the naming schemes have gotten more and more nonsensical, so I always have a hard time knowing WTF generation a card is from by the model number.

        In any case. Yeah, people using 5+ year old tech are going to be unable to play the latest AAA games. And?

        Has there ever been a time when a 5+ year old system can reasonably play a modern AAA title without it being a slide show?

        • kemsat@lemmy.world · ↑3 · 14 hours ago

          I’m still hearing from people that they’re using an Nvidia 1000 series card. I was expecting people to be on 2000 series cards, not 1000, by the time this happened.

          • MystikIncarnate@lemmy.ca · ↑2 · 13 hours ago

            I have a 20 series card, albeit one of the higher tier ones, and I probably won’t be upgrading this year. I probably also won’t be playing any new AAA titles either.

            It’s fine to have an older card, but nobody in that position should be expecting to play the latest and greatest games at reasonable framerates, if at all.

            It is the way of things.

            I am personally rather miffed about the fact that if you want any performance from a GPU, you basically need to spend $800+. Even though some cards are listed as available for less, they almost never actually are, either due to scalping or greed (which are kind of the same thing), or something else like idiotic tariffs. I don’t have nearly a grand I can burn every year to upgrade my GPU. The last GPU I bought was a 1060, and my current card was a gift. I haven’t had a budget for a decent GPU in many, many years.

            When I upgrade, I’m likely going Intel arc, because the value proposition makes sense to me. I can actually spend less than $600 and get a card that will have some reasonable level of performance.

            • kemsat@lemmy.world · ↑1 · 11 hours ago

              The current Intel GPUs aren’t better than an RTX 2070, so that won’t be an upgrade if you’re on a higher tier 2000 series card.

              I just went up to a 4070 Ti from a 2080 Ti, because it was the only worthwhile upgrade. $660 used. So you don’t need to spend $800.

    • nossaquesapao@lemmy.eco.br · ↑28 · 2 days ago

      They make gaming a more and more elitist hobby, and then get surprised when indie games with pixel graphics that can run even on potato devices turn out to be huge successes.

    • djsaskdja@reddthat.com · ↑3 · 1 day ago

      Somewhat old CPUs are 8+ years old now? Windows 11 is crap, but I don’t think the hardware requirements are the reason.

  • Dark Arc@social.packetloss.gg · ↑47 ↓3 · 2 days ago

    They used ray tracing for the hit registration so that’s presumably why.

    It’s a really interesting idea … presumably that means there are some really flashy guns and there is a very intricate damage system that runs at least partially on the GPU.

    • BCsven@lemmy.ca · ↑3 · 14 hours ago

      W10 was OK. Slow, but OK. W11 is so much jank and buggy bullshit. I moved all my games to Linux. With Proton and Vulkan all my games work, including the RTX settings.

    • WaterWaiver@aussie.zone · ↑22 · 1 day ago

      really flashy guns and there is a very intricate damage system that runs at least partially on the GPU.

      Short opinion: no, CPUs can do that fine (possibly better), and it’s a tiny corner of game logic.

      Long opinion: Intersecting projectile paths with geometry gains nothing from being moved from the CPU to the GPU unless you’re dealing with a ridiculous number of projectiles every single frame. In most games this is less than 1% of CPU time, and moving it to the GPU will probably reduce overall performance due to the latency costs (…but a lot of modern engines already have awful frame latency, so it might fit right in fine).

      You would only do this if you have been told by higher-ups that you have to, OR if you have a really unusual and new game design (thousands of new projectile paths every frame, i.e. hundreds of thousands of bullets per second). Even detailed multi-layer enemy models with vital components are just a few extra traces; using a GPU to calc that would make the job harder for the engine dev for no gain.
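
      For a sense of scale, a single trace is just a handful of dot/cross products per triangle. Here’s a rough, hypothetical sketch of the kind of test involved (Möller–Trumbore; the Vec3 type and names are made up for illustration, not from any real engine):

      ```cpp
      #include <cmath>
      #include <optional>

      struct Vec3 { float x, y, z; };
      static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
      static Vec3 cross(Vec3 a, Vec3 b) {
          return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
      }
      static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

      // Moller-Trumbore: returns the distance along the ray to the triangle, if hit.
      // A few projectiles per frame against even a detailed hit mesh is trivial CPU work.
      std::optional<float> rayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
          const float eps = 1e-7f;
          Vec3 e1 = v1 - v0, e2 = v2 - v0;
          Vec3 p = cross(dir, e2);
          float det = dot(e1, p);
          if (std::fabs(det) < eps) return std::nullopt;  // ray parallel to triangle
          float inv = 1.0f / det;
          Vec3 t = orig - v0;
          float u = dot(t, p) * inv;                      // barycentric u
          if (u < 0.0f || u > 1.0f) return std::nullopt;
          Vec3 q = cross(t, e1);
          float v = dot(dir, q) * inv;                    // barycentric v
          if (v < 0.0f || u + v > 1.0f) return std::nullopt;
          float dist = dot(e2, q) * inv;
          if (dist <= eps) return std::nullopt;           // hit is behind the ray origin
          return dist;
      }
      ```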

      Fun answer: check out CNlohr’s noeuclid. Sadly there’s no Windows build (I tried cross compiling but ended up in dependency hell), but it still compiles and runs under Linux. Physics are on the GPU and the world geometry is very non-traditional. https://github.com/cnlohr/noeuclid

        • WaterWaiver@aussie.zone · ↑5 · 1 day ago

          Ooh, thank you for the link.

          “We can leverage it [ray tracing] for things we haven’t been able to do in the past, which is giving accurate hit detection”

          “So when you fire your weapon, the [hit] detection would be able to tell if you’re hitting a pixel that is leather sitting next to a pixel that is metal”

          “Before ray tracing, we couldn’t distinguish between two pixels very easily, and we would pick one or the other because the materials were too complex. Ray tracing can do this on a per-pixel basis and showcase if you’re hitting metal or even something that’s fur. It makes the game more immersive, and you get that direct feedback as the player.”

          It sounds like they’re assigning materials based off the pixels of a texture map, rather than each mesh in a model being a different material. ie you paint materials onto a character rather than selecting chunks of the character and assigning them.

          I suspect this either won’t be noticeable at all to players or it will be a very minor improvement (at best). It’s not something worth going for in exchange for losing compatibility with other GPUs. It will require a different work pipeline for the 3D modellers (they have to paint materials on now rather than assign them per-mesh), but that’s neither here nor there, it might be easier for them or it might be hell-awful depending on the tooling.

          This particular sentence upsets me:

          Before ray tracing, we couldn’t distinguish between two pixels very easily

          Uhuh. You’re not selling me on your game company.

          “Before” ray tracing: the technique has been around for decades, and you could have done this exact material-sensing task on a CPU or GPU, without players noticing the difference, for around 20 years. Interpolate UVs across the colliding triangle and sample a texture.
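
          To be concrete, here’s a rough hypothetical sketch of that CPU-side version: take the barycentric weights from the ray/triangle hit, interpolate the triangle’s UVs, and look the result up in a painted material map (the Material enum and the map layout are assumptions for illustration, not id’s actual data):

          ```cpp
          #include <algorithm>
          #include <cstdint>
          #include <vector>

          // Hypothetical material IDs painted into a per-character "material map" texture.
          enum class Material : std::uint8_t { Leather, Metal, Fur };

          struct UV { float u, v; };

          // The ray/triangle hit's (u, v) give barycentric weights (1 - u - v, u, v) at the hit point.
          UV interpolateUV(UV a, UV b, UV c, float w0, float w1, float w2) {
              return { w0 * a.u + w1 * b.u + w2 * c.u,
                       w0 * a.v + w1 * b.v + w2 * c.v };
          }

          // Nearest-texel lookup into a CPU-side copy of the painted material map.
          Material sampleMaterial(const std::vector<Material>& map, int width, int height, UV uv) {
              int x = std::clamp(int(uv.u * width),  0, width  - 1);
              int y = std::clamp(int(uv.v * height), 0, height - 1);
              return map[y * width + x];
          }
          ```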

          I suspect the “more immersion” and “direct feedback” are veils over the real reasoning:

          During NVIDIA’s big GeForce RTX 50 Series reveal, we learned that id has been working closely with the GeForce team on the game for several years (source)

          With such a strong emphasis on RT and DLSS, it remains to be seen how these games will perform for AMD Radeon users

          No-one sane implements Nvidia or AMD (or anyone else) exclusive libraries into their games unless they’re paid to do it. A game dev that cares about its players will make their game run well on all brands and flavours of graphics card.

          At the end of the day this hurts consumers. If your games work on all GPU brands competitively then you have more choice and card companies are better motivated to compete. Whatever amount of money Nvidia is paying the gamedevs to do this must be smaller than what they earn back from consumers buying more of their product instead of competitors.

          • Dark Arc@social.packetloss.gg · ↑3 · 17 hours ago

            Well like, basically every shooter currently uses a hitbox to do the hitscan and that never matches the model 1:1. The hitboxes are typically far less detailed and the weak points are just a different part of the hitbox that is similarly less detailed.

            I think what they’re doing is using the RT specialized hardware to evaluate the bullet path (just like a ray of light from a point) more cheaply than can be traditionally done on the GPU (effectively what Nvidia enabled when they introduced hardware designed for ray tracing).

            If I’m guessing correctly, it’s not so much that they’re disregarding the mesh but they’re disregarding hitbox design. Like, the hit damage is likely based on the mesh and the actual rendered model vs the simplified hitbox … so there’s no “you technically shot past their ear, but it’s close enough so we’re going to call it a headshot” sort of stuff.

            If you’re doing a simulated shotgun blast, that could also be a hundred pellets simulated out of the barrel towards the target. Then add in more enemies that shoot things and a few new gun designs, and… maybe it starts to make sense.
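
            As a rough illustration of the numbers involved, here’s a hypothetical sketch of generating those pellet rays (the spread model and count are made up); each resulting ray would then be traced against the scene, on RT hardware against the BVH:

            ```cpp
            #include <cmath>
            #include <random>
            #include <vector>

            struct Ray { float ox, oy, oz; float dx, dy, dz; };

            // Hypothetical pellet spread: rays in a cone around the barrel axis (+Z here).
            // 100 pellets per blast, times several shooters per frame, is where batching
            // the traces on ray tracing hardware starts to look attractive.
            std::vector<Ray> shotgunPellets(int count, float spreadRadians, std::mt19937& rng) {
                std::uniform_real_distribution<float> around(0.0f, 6.2831853f);  // angle around the axis
                std::uniform_real_distribution<float> away(0.0f, spreadRadians); // deviation from the axis
                std::vector<Ray> rays;
                rays.reserve(count);
                for (int i = 0; i < count; ++i) {
                    float phi = around(rng);
                    float theta = away(rng);
                    rays.push_back({0.0f, 0.0f, 0.0f,
                                    std::sin(theta) * std::cos(phi),
                                    std::sin(theta) * std::sin(phi),
                                    std::cos(theta)});
                }
                return rays; // each ray then gets intersected with the scene geometry
            }
            ```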

        • AdrianTheFrog@lemmy.world · ↑3 · 1 day ago

          It sounds like they’re tying the effect of attacks to the actual fine-detail game textures/materials, which I guess are only available on the GPU? It’s a weird thing to do and a bad description of it IMO, but that’s what I got from that summary. It wouldn’t be anywhere near as fast as normal hitscan on the CPU, and it also takes GPU time, which is generally the more limited resource given the thread counts on modern processors.

          Since there is probably only 1 bullet shot most of the time on any given frame, the minimum size of a dispatch on the GPU is usually 32-64 cores (out of maybe 1k-20k), just to calculate this one singular bullet with a single core. GPU cores are also much slower than CPU cores, so clearly the only possible reason to do this is if the data needed literally only exists on the GPU, which it sounds like it does in this case. You would also first have to transfer the fact that a shot was taken to the GPU, which then would have to transfer the result back to the CPU, adding a small amount of latency both ways.

          This also only makes sense if you already use raytracing elsewhere, because you generally need a BVH for raytracing and these are expensive to build.

          Although this is using raytracing, the only reason not to support cards without hardware raytracing is that it would take more effort to do so (as you would have to maintain both a normal raytracer and a DXR version)

    • ProfessorProteus@lemmy.world · ↑15 · 2 days ago

      Not disputing you, but hasn’t hitscan been a thing for decades? Or is what you’re saying a different thing?

      Also, I always thought that the CPU and GPU either couldn’t communicate with each other, or that it was a very difficult problem to solve. Have they found a way to make this intercommunication work on a large scale? Admittedly I only scanned the article quickly, but it looks like they’re only talking about graphics quality. I’d love to know if they’re leveraging the GPU for more than just visuals!

      • Dark Arc@social.packetloss.gg · ↑16 · 1 day ago

        It’s a different thing. This is pixel-perfect accuracy for the entire projectile. There aren’t hitboxes, as I understand it; it’s literally whatever the model is on the screen.

        • ProfessorProteus@lemmy.world · ↑6 · 1 day ago

          Ooh, that makes sense. Sounds like it could be much cheaper to process than heavy collision models. Thanks for the clarification!

    • saigot@lemmy.ca · ↑4 · 15 hours ago

      The first ray tracing GPU came out 7 years ago (the RTX 2080), and Eternal came out in 2020. In 2013, the top card was a GTX 680. Eternal lists its minimum spec as a 1050 Ti, and from some quick googling, people trying to run it on a 680 are getting sub-30fps on low. Of course, just supporting ray tracing doesn’t mean it will actually be playable, but Indiana Jones (the only released game that requires RT today) seems to get 50fps on low with 2080s.

      Fwiw a few 2080 supers are going for sub 50 bucks on my local buy and sell groups.

        • saigot@lemmy.ca · ↑2 · 11 hours ago

          A big part of why it doesn’t have as big a visual impact is that scenes have to be designed to be acceptable without ray tracing. Not only that, but with mandatory ray tracing, mechanics that rely on RT can be implemented.

          The opening of Indiana Jones has quite a lot that isn’t possible traditionally, and it looks pretty awesome to my eyes.

    • nul9o9@lemmy.world · ↑30 ↓1 · 2 days ago

      I’m just getting a weird feeling in general about Dark Ages. Eternal and 2016 were my jam, but this recent gameplay trailer made me worried.

      Glory kills being replaced with a silly ragdoll kick, and the snail-paced gameplay with neutered difficulty, just felt boring.

      • omarfw@lemmy.world · ↑22 · 2 days ago

        Bethesda being owned by Microsoft means they’re tainted by the influence of shareholders now. The decay is inevitable. Wall Street ruins all studios with time.

    • warm@kbin.earth · ↑23 ↓3 · 2 days ago

      Imo, they have downgraded each time.

      2016 was a great reboot of the franchise, it felt like a modern version of the originals. It looked great and ran great.

      Then Eternal added pointless wall climbing and loads of skills, and put the whole game on rails. The start of the game is like a two-hour tutorial; good game design doesn’t require telling me exactly what to do every 5 minutes. The graphics got downgraded with some shitty AA added (probably temporal, I can’t remember), and the performance was down across the board. The game just didn’t feel like DOOM anymore and was a poor continuation of 2016.

      Now we have this forced ray tracing, probably going to require DLSS and shit too. This has to be a case of Nvidia slipping them cash, right? No way they would be so stupid? It’ll probably be a case of an update removing the requirement a few months down the line.

      • wiccan2@lemmy.world · ↑10 · 2 days ago

        Agree completely. Eternal is what stopped me from pre-ordering; it was such a letdown following 2016 and really didn’t feel like Doom at all.

      • Renacles@lemmy.world · ↑10 ↓6 · 2 days ago

        Eternal was amazing. 2016 is a great shooter, but it’s way too simple and unbalanced at higher difficulties.

        • moody@lemmings.world · ↑18 · 2 days ago

          I felt it was the other way around. 2016 was simple and effective. Eternal just kept throwing shit at you, especially at higher difficulties. I kept hoping the big battles would end so I could move on, but more monsters kept coming. And then they threw the Marauders at you which just didn’t seem to fit in with the rest of the game because of how you have to fight them.

          • I Cast Fist · ↑12 · 2 days ago

            I’m with you there. 2016 has much better pacing and general feel. While both kinda feel like Serious Sam, moving from one enemy arena to another, Eternal has a much stronger sense that that’s all you’re doing. I also disliked the super limited ammo and being semi-forced to change weapons for each enemy. All the parkouring and bar swinging didn’t feel like it added “good variety” to the gameplay.

          • Renacles@lemmy.world · ↑4 · 2 days ago

            I prefer the kind of game that I can master once I get used to its mechanics, so Eternal was perfect for me, but I do understand the appeal of Doom 2016.

            Battles in Eternal don’t take too long if you know how to switch to the right weapons to maximize your DPS.

          • warm@kbin.earth · ↑12 ↓3 · 2 days ago

            Ignoring graphical aspects: it might be a good game, it’s just not a DOOM game other than in name and monsters. I played the shit out of the originals, and I felt D2016 really captured the essence of them with just a few simple new additions to keep it fresh. Then when I started Eternal, it felt like I was playing a different franchise. Wall climbing and grappling hooks? The pacing was all off. Maybe the dev team moved on, I don’t know, but I was left extremely disappointed.

  • Cassa@lemmy.blahaj.zone · ↑15 ↓1 · 2 days ago

    To be fair, ray tracing is far less of a workload for developers. Though it sucks, it isn’t too far off from when games started requiring 3D acceleration cards.

    • Justin@lemmy.jlh.name · ↑9 · 2 days ago

      That’s true. Ray tracing libraries and engines like UE5 are a lot easier to develop on than older engines.

      But I’m not sure it’s such a simple comparison. 3D acceleration made games look better, and the weakest GPUs didn’t make your FPS tank, afaik. Your average GPU these days will tank your FPS in ray tracing and cause awful visual artifacts, either from bad denoising algorithms or from the upscalers used to hide the bad FPS and bad denoising.

      This move reduces development costs, but given that the consumer doesn’t get any benefits from it, it’s hard not to have the cynical view that this is just greedy cost cutting on Microsoft’s part.

    • Agent Karyo@lemmy.world (OP) · ↑4 ↓1 · 2 days ago

      I would have to agree. And I don’t think it’s a good idea to have ray tracing as a mandatory feature.

      I think some basic level of support for ray-tracing will be mandatory for dGPUs in the coming years.

  • Justin@lemmy.jlh.name · ↑13 · 2 days ago

    Microsoft getting me to buy this game requires me to care.

    Not buying a new GPU for a new Doom game.

  • ILikeBoobies@lemmy.ca · ↑9 ↓4 · 2 days ago

    8GB is entry level for GPUs anyway, so that’s not a big deal.

    I suppose if you’re going to have ray tracing, it cuts down development time not to have to redo the lighting for when the feature is off.

    • h0rnman@lemmy.dbzer0.com · ↑6 · 1 day ago

      Fair, but it’s been shown time and time again that most users are either on “intro level” GPUs or weaker. Heck, a midrange card from 2 years ago is an 8GB card. I’m not sure how they expect to sell this game at all unless it’s just planned to be a bundle add-on for the 50xx/90xx series cards.

      • saigot@lemmy.ca · ↑1 · 15 hours ago

        Currently the most popular GPU according to the Steam survey is the 3060. That plays the only other mandatory-RT game, Indiana Jones, at 60fps on high. A 2080 can play it on low at 50.

        • h0rnman@lemmy.dbzer0.com · ↑1 · 14 hours ago

          Yeah, but in this case I’m referring to VRAM. RT is what it is, and most “recent” cards support some kind of RT, even if not well. The concern is more that, for instance, the 3070 only has 8GB. I wouldn’t ever say that the 3070 is nearing its EoL either. The 3060 is the top card on Steam, sure, but the next two dozen or so places are almost universally 8GB cards (with varying degrees of RT support), including several 40xx series. I’m just saying that I don’t see a hard RT and >8GB VRAM requirement playing out as well as a lot of people think.

      • ILikeBoobies@lemmy.ca · ↑1 · 21 hours ago

        I would say that from the 3000 series on, 8GB was entry level.

        Anything lower than that between then and now was just a scam.

    • Kogasa · ↑3 · 1 day ago

      The amount of VRAM isn’t really the issue; even an extremely good GPU like the 7900 XTX (with 24GB of VRAM) struggles with some ray tracing workloads, because ray tracing requires specially designed hardware to run efficiently.

  • baatliwala@lemmy.world · ↑2 · 2 days ago

    How do RT-only titles work on consoles? Their RT hardware really isn’t that powerful; aren’t they supposed to be equivalent to an RTX 2070 at best? It sounds like the graphics difference between PC and consoles will be quite large.

    • SoJB@lemmy.ml · ↑7 · 2 days ago

      Consoles have been seriously struggling to run anything past 30fps blurry messes for the past several years. It’s real bad on that side of the fence.

      Although PC gamers aren’t much better off, having to buy $900 GPUs every year just to run the latest AAAA blurfest at 30 FPS with AI frame gen on top of upscaling on top of interpolation frame gen.

      • burgersc12@mander.xyz · ↑4 ↓1 · 1 day ago

        No one should spend $900 on a GPU when about $500 gets you a product that’s about 90% as good

    • alphabethunter@lemmy.world · ↑2 · 1 day ago

      Both current-gen consoles are RT capable, so they’ll just use lowered graphical settings, some amount of optimization, and upscaling. Indiana Jones ran great though, way better than you’d expect. I was getting a perfectly smooth 75 fps on a 6750 XT at 1080p high, no upscaling or framegen in use.