Really? The consensus I gathered from reviews was that pricing of 40 series is pretty much shit.
Nvidia isn’t the only horse in town. AMD (and to an extent Intel) usually offer much better value at these mid-range (and dare I say “low-end” at like $200) price points.
And while Nvidia probably still sells more GPUs than AMD (for whatever reason there are actually people out there buying 4060 (Ti) cards), it’s not like AMD doesn’t sell any cards. The 7800 XT was priced very well from AMD’s standpoint because it sat right at the edge of what people considered genuinely solid price to performance. It probably sold and still sells quite well.
Can anyone give me a suggestion for what cards I should be looking at to get a little over PS5 graphics without breaking the bank? It’s been a while since I built my last PC and I’m really lost these days.
This article claims your baseline should be:
Those should all be about as good or a little better than the PS5.
That said, your mileage may vary because console games may be better tuned for console hardware than for PC, even if the hardware is equivalent. So maybe go up a step to be safe. If you want ray tracing, go NVIDIA; otherwise AMD or Intel will probably offer better value.
I paid a little over $200 for my RX 6650XT, so expect to pay $200-300 to match or slightly exceed the PS5.
Thank you for this very well put together response! I definitely have an idea of what to look for now.
No problem! Good luck!
This list sounds about right. The whole “but it’s optimized for one console” thing is a pretty moot point nowadays as well. Sure, crappy ports exist, but solid ports perform on par with the console on similar hardware specs.
I have nothing against AMD but the power consumption is outstandingly high. My room is already hot enough.
While current Nvidia cards are certainly more efficient, RDNA3 still improves efficiency over RDNA2, which itself was actually more efficient than Ampere (mostly due to Ampere being based on the Samsung 8nm process).
A 7800 XT is more efficient than both a 6800 XT and an RTX 3080, with the RTX 4070 being the most efficient in this performance ballpark.
I feel like you’re blowing this way out of proportion.
What is the right proportion? The 7800 XT uses 25% more power than the 4070 (250W vs 200W). That seems outstanding to me.
Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can’t directly compare those marketing numbers on products from different gens, much less different companies.
To really understand what’s going on, you need to look at something like watts per frame.
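To make that concrete, here’s a minimal sketch of a watts-per-frame comparison. The card names and numbers are illustrative placeholders, not benchmark results:

```python
# Watts per frame: a rough efficiency metric combining measured
# average power draw with measured average frame rate.
def watts_per_frame(watts: float, fps: float) -> float:
    """Average power divided by average frame rate."""
    return watts / fps

# Hypothetical measurements from the same game at the same settings:
cards = {
    "Card A": (250, 100),  # (average watts, average fps)
    "Card B": (200, 95),
}

for name, (watts, fps) in cards.items():
    print(f"{name}: {watts_per_frame(watts, fps):.2f} W/frame")
```

With made-up numbers like these, the card drawing more total watts can still be the less efficient one per frame, which is the point of normalizing by performance instead of comparing raw wattage.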
I’m getting the numbers from GamersNexus’ power consumption chart from their review of the card.
Ok, then those numbers are at full load running a benchmark, assuming you’re talking about charts like this. Actual power usage in games could be a fair amount lower.
It could be, depending on the game. It’s still a good indicator that the 7800 XT would run hotter than the 4070 in general.
The numbers here are the maximum number of watts used, if I recall correctly. So most of the time when you’re gaming, it’s probably going to be close to those numbers.
No, it’s TDP, like with CPUs. So a 200W GPU needs a cooler rated to dissipate 200W worth of thermal load (and that’s not scientific; AMD and NVIDIA calculate it differently). The actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.
So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it’s just a rough idea.
Oh ok thanks
You think a 50 watt difference will noticeably heat up your room? You must have a tiny room then, otherwise the difference will hardly be measurable.
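For a sense of scale, here’s a crude upper-bound estimate. All the inputs are assumptions I picked for illustration (a tiny, perfectly sealed room with zero heat loss, which no real room matches), so the real effect would be much smaller:

```python
# Back-of-the-envelope: what does an extra 50 W do to a small room?
# Assumes a perfectly sealed, unventilated room with no heat loss --
# a worst case that real rooms never match, so this is an upper bound.
extra_watts = 50          # additional GPU power draw
hours = 1                 # one hour of gaming
room_volume_m3 = 10       # a tiny room (~2 x 2 x 2.5 m)
air_density = 1.2         # kg/m^3 at room temperature
air_specific_heat = 1005  # J/(kg*K)

heat_joules = extra_watts * hours * 3600
air_mass = room_volume_m3 * air_density
delta_t = heat_joules / (air_mass * air_specific_heat)
print(f"Upper-bound temperature rise: {delta_t:.1f} K per hour")
```

In a sealed 10 m³ room the bound comes out to roughly 15 K per hour, but real rooms leak heat through walls, doors, and ventilation, so the actual rise is a small fraction of that; still, in a tiny, already-warm room it plausibly isn’t nothing.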
It is already hot enough that I don’t want to add more heat to it. Also yes I have a tiny room.
That’s more or less true. NVIDIA knows they’re holding aces with DLSS + Frame Gen, which is just strictly superior to FSR, so they’ll probably try to bully the market into accepting current pricing. Better ray tracing performance on NVIDIA cards might also be a factor if we start seeing more and more games where it really makes a difference, like Alan Wake 2.
What they end up doing with the rumoured Super series coming next year will be a good indication of where we’re at I think.