• narc0tic_bird@lemm.ee · 1 year ago

    While current Nvidia cards are certainly more efficient, RDNA3 still improves efficiency over RDNA2, which itself was actually more efficient than Ampere (mostly due to Ampere being based on the Samsung 8nm process).

    A 7800 XT is more efficient than both a 6800 XT and an RTX 3080, with the RTX 4070 being the most efficient in this performance ballpark.

    I feel like you’re blowing this way out of proportion.

    • lowleveldata · 1 year ago

      What would the right proportion be? The 7800 XT uses 25% more power than the 4070 (250 W vs. 200 W). That seems like a lot to me.

      • sugar_in_your_tea@sh.itjust.works · 1 year ago

        Are you measuring the power actually used, or are you just looking at the TDP figures from the marketing material? You can’t directly compare those marketing numbers across products from different generations, much less from different companies.

        To really understand what’s going on, you need to look at something like watts per frame.
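
        For illustration, here’s a minimal sketch of what that comparison looks like, assuming you have measured average board power and average FPS from the same test scene (the card names and figures below are hypothetical, not measurements):

        ```python
        # Minimal sketch: compare GPUs by energy per frame rather than rated wattage.
        # Card names and figures are hypothetical placeholders, not measurements.

        def joules_per_frame(avg_power_watts: float, avg_fps: float) -> float:
            # Average board power divided by frame rate = energy spent per rendered frame.
            return avg_power_watts / avg_fps

        # (card, measured average board power in W, average FPS in the same scene)
        cards = [
            ("Card A", 250.0, 120.0),
            ("Card B", 200.0, 110.0),
        ]

        for name, watts, fps in cards:
            print(f"{name}: {joules_per_frame(watts, fps):.2f} J per frame")
        ```

        The card that comes out ahead on that metric isn’t necessarily the one with the lower number on the box.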

        • lowleveldata · 1 year ago

          I’m getting the numbers from the power consumption chart in GamersNexus’ review of the card.

        • DavLemmyHav@lemmy.blahaj.zone · 1 year ago

          The numbers here are the maximum number of watts used, if I recall correctly. So most of the time when you’re gaming, it’s probably going to be close to those numbers.

          • sugar_in_your_tea@sh.itjust.works · 1 year ago

            No, it’s a TDP rating, like with CPUs. A 200 W GPU needs a cooler rated to dissipate 200 W worth of thermal load (and even that isn’t exact; AMD and NVIDIA calculate it differently). The actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.

            So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. the RX 6600 and 6700), and sometimes across generations from the same company (e.g. the 6600 and 7600), and even then it only gives you a rough idea.
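
            As a toy illustration of that gap (the numbers below are made up, not logged from any real card or monitoring tool): log board power during a game and compare the average and peak against the rated figure.

            ```python
            # Toy illustration: rated TDP vs. actual draw. The trace below is made up,
            # not logged from any real card or monitoring tool.

            rated_tdp_watts = 200.0

            # Hypothetical board-power samples (W), logged once per second during a game.
            power_trace = [185, 192, 210, 178, 205, 188, 215, 181, 176, 198]

            avg_draw = sum(power_trace) / len(power_trace)
            peak_draw = max(power_trace)

            print(f"Rated TDP:    {rated_tdp_watts:.0f} W")
            print(f"Average draw: {avg_draw:.0f} W")   # sits below the rating here
            print(f"Peak draw:    {peak_draw:.0f} W")  # exceeds it during load spikes
            ```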

      • narc0tic_bird@lemm.ee · 1 year ago

        You think a 50-watt difference will noticeably heat up your room? You must have a tiny room then, or the difference will hardly be measurable.

        • lowleveldata · 1 year ago

          It’s already hot enough in here that I don’t want to add any more heat. Also, yes, I have a tiny room.