• FizzyOrange · 1 day ago

    Not true. It takes advantage of hardware features that are available on consoles but not on PC. That isn’t laziness.

    • Rusty Shackleford · 6 hours ago

      My comment on a different post relates to this well:

      I think a lot of the plugin tooling for Unreal promotes bad practices with asset management, GPU optimization, and memory management. I’m trying to say that it allows shitty/lazy developers and asset designers to rely on overly expensive hardware to lift their unoptimized dogshit code, blueprints, models, and textures to acceptable modern fps/playability standards. This has been prevalent for a few years, but it’s especially egregious now. Young designers shipping models with polygon and vertex counts that are out of control. Extraneous surfaces and naked edges. Uncompressed audio. Unbaked lighting systems. Memory leaks.

      In my own experimenting with Unreal, the priorities match what I deal with developing DNNs and parametric CAD modelling applications for my day job: effective resource, memory, and parallelism management from the outset of a project is (or should be) axiomatic.
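
      Roughly the kind of discipline I mean, as a sketch outside any engine (all names here are made up for illustration): budget and allocate once at startup, then never touch the heap in the per-frame hot path.

      ```cpp
      #include <cstddef>
      #include <vector>

      // Illustrative only: a fixed-size pool of "particles" allocated once at
      // startup, so the per-frame update never hits the allocator.
      struct Particle {
          float x = 0.f, y = 0.f, z = 0.f;
          float life = 0.f;
      };

      class ParticlePool {
      public:
          explicit ParticlePool(std::size_t capacity) { slots_.resize(capacity); }

          // Reuse a dead slot instead of new-ing a particle mid-frame.
          Particle* spawn() {
              for (auto& p : slots_)
                  if (p.life <= 0.f) { p.life = 1.f; return &p; }
              return nullptr;  // pool exhausted: degrade gracefully, don't allocate
          }

          void update(float dt) {
              for (auto& p : slots_)
                  if (p.life > 0.f) p.life -= dt;
          }

      private:
          std::vector<Particle> slots_;  // one allocation, at startup
      };

      int main() {
          ParticlePool pool(1024);      // budgeted up front
          for (int frame = 0; frame < 60; ++frame) {
              pool.spawn();
              pool.update(1.f / 60.f);  // no heap traffic per frame
          }
      }
      ```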

      I think Unreal 5 runs exceptionally well when that’s the case. A lot of the time, you can turn off all of the extra hardware acceleration and frame-generation AI crap if your logic systems and assets are designed well.
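
      To be concrete about “turn off the extra crap”: in a UE5 project you can flip the relevant console variables at startup, e.g. from your game module. Treat this as a sketch only; the cvar names and values below are from memory and move between engine versions, so verify them against yours.

      ```cpp
      #include "HAL/IConsoleManager.h"

      // Sketch: fall back from the expensive UE5 defaults to cheaper paths.
      // CVar names/values are approximate; check them in your engine version.
      // Call this early, e.g. from your game module's StartupModule().
      static void UseLeanRenderingDefaults()
      {
          auto Set = [](const TCHAR* Name, int32 Value)
          {
              if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
              {
                  CVar->Set(Value, ECVF_SetByCode);
              }
          };

          Set(TEXT("r.DynamicGlobalIlluminationMethod"), 0); // no Lumen GI
          Set(TEXT("r.ReflectionMethod"), 0);                // no Lumen reflections
          Set(TEXT("r.Shadow.Virtual.Enable"), 0);           // no virtual shadow maps
          Set(TEXT("r.AntiAliasingMethod"), 2);              // TAA instead of TSR
      }
      ```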

      I know this is a bit of an “old man yells at cloud” rant, but if you code and make models like ass, of course your game is gonna turn out like ass. And then people turn around and say “tHe EnGiNe SuCkS”.

      No. Fuck you. You suck.

    • Mia@lemmy.blahaj.zone · 20 hours ago

      Which ones? Consoles just use AMD APUs, which have the exact same hardware features as AMD’s current desktop CPUs and GPUs. UE5 games run like crap on consoles too.

      • FizzyOrange · 19 hours ago

        It literally says so in the article: hardware I/O controllers that handle decompression. I guess this is related to DirectStorage, but DirectStorage doesn’t seem to take advantage of dedicated hardware on PC (because as far as I know such hardware doesn’t exist there), and apparently only a handful of games actually use it.
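
        The point of that hardware is that decompressing streamed assets isn’t free on a CPU. A rough illustration of the cost being offloaded (plain zlib here, which has nothing to do with the consoles’ actual codecs):

        ```cpp
        // Rough illustration of why hardware decompression matters: inflating
        // streamed data costs CPU time that a console's I/O block absorbs instead.
        // Plain zlib here; the consoles use different codecs (e.g. Kraken/BCPack).
        #include <chrono>
        #include <cstdio>
        #include <vector>
        #include <zlib.h>

        int main()
        {
            std::vector<unsigned char> asset(64 * 1024 * 1024, 0);
            for (std::size_t i = 0; i < asset.size(); ++i)
                asset[i] = static_cast<unsigned char>(i % 251);  // mildly compressible

            uLongf packedSize = compressBound(asset.size());
            std::vector<unsigned char> packed(packedSize);
            compress(packed.data(), &packedSize, asset.data(), asset.size());

            std::vector<unsigned char> out(asset.size());
            uLongf outSize = out.size();

            auto t0 = std::chrono::steady_clock::now();
            uncompress(out.data(), &outSize, packed.data(), packedSize);
            auto t1 = std::chrono::steady_clock::now();

            double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
            std::printf("CPU decompress of 64 MiB: %.1f ms\n", ms);
        }
        ```

        On a console that time is absorbed by the dedicated decompression block instead of a CPU core.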

        They also have unified memory shared between the CPU and GPU (like Apple’s M-series chips).

        • Mia@lemmy.blahaj.zone · 12 hours ago (edited)

          All of which is completely irrelevant to why games run like crap. Those things have zero impact on a game’s framerate; they only affect asset loading and streaming, and even then they do pretty much nothing from what I can see.

          I’m not gonna say it’s just marketing, but it comes close imo. I personally benchmarked Ratchet and Clank: Rift Apart’s loading times between a PS5, an NVMe SSD, and a SATA SSD. Literally no difference, save for the SATA one being a fraction of a second slower. And that was one of the games that was supposed to showcase what that technology can do! (I know it doesn’t run on UE5, but it’s just an example.)
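
          (If anyone wants to sanity-check the drive side of that, a crude raw-read timing like the sketch below is enough to see whether the drives themselves even differ. The file paths are hypothetical.)

          ```cpp
          // Crude raw-throughput check between two drives: read the same large
          // file from each and time it. Paths are made up for illustration.
          #include <chrono>
          #include <cstdio>
          #include <fstream>
          #include <string>
          #include <vector>

          static double ReadAllSeconds(const std::string& path)
          {
              std::ifstream in(path, std::ios::binary);
              std::vector<char> buf(1 << 20);  // 1 MiB chunks
              auto t0 = std::chrono::steady_clock::now();
              while (in.read(buf.data(), buf.size()) || in.gcount() > 0) {}
              auto t1 = std::chrono::steady_clock::now();
              return std::chrono::duration<double>(t1 - t0).count();
          }

          int main()
          {
              // Same file copied to the NVMe drive and to the SATA drive.
              std::printf("nvme: %.2f s\n", ReadAllSeconds("D:/bench/assets.pak"));
              std::printf("sata: %.2f s\n", ReadAllSeconds("E:/bench/assets.pak"));
          }
          ```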

          UE5 runs like garbage on all platforms. You can load assets as fast as you want, but if the rendering pipeline is slow as hell it doesn’t matter; games will still run like garbage regardless.