What if AMD produced a desktop graphics card that effectively mirrored console GPU specifications? Would such a product guarantee console-level performance in the PC space – or are there console-equivalent GPUs already on the market that do the job, and if so, how much do they cost? It's been known for a while now that AMD's Radeon RX 6700 10GB (the non-XT model) has striking similarities to the technological make-up of the PlayStation 5's GPU – so a while back, I bought one used from eBay and the head-to-head results are certainly intriguing.
Looking at the core specs, the RX 6700 certainly looks like a ringer for the PS5's GPU – to a certain extent. Both are based on the RDNA 2 graphics architecture, both have 2,304 shaders across 36 compute units, and both have 144 texture mapping units and 64 ROPs. However, there are differences too. The RX 6700 has a 2.45GHz boost clock up against the PS5's maximum of 2.23GHz (although the way they boost is somewhat different). There are fundamental differences in the memory set-up as well: the desktop version of RDNA 2 has a different cache hierarchy, up to and including 80MB of Infinity Cache, while PlayStation 5 operates over a much wider memory interface with more bandwidth – the caveat being that this resource is shared between CPU and GPU.
On paper then, the RX 6700 is marginally faster. However, there's a range of potentially confounding factors that make any kind of head-to-head benchmarking problematic – even beyond the spec variations. Sony uses a different GPU compiler and has its own graphics API, rather than the DirectX 12 or Vulkan typically used on PC. Beyond that, while we're confident in our ability to match settings in any given game between PS5 and PC (often with developer assistance) for benchmarking purposes, some games may have console settings that aren't exposed on PC, which may subtly impact our scores.
| Specs Overview | PlayStation 5 | Radeon RX 6700 |
| --- | --- | --- |
| Architecture | Custom RDNA 2 | RDNA 2 |
| Shaders/CUs | 2,304 shaders / 36 CUs | 2,304 shaders / 36 CUs |
| Boost Clock | 2.23GHz | 2.45GHz |
| Texture Mapping Units | 144 | 144 |
| ROPs | 64 | 64 |
| Memory | 16GB GDDR6 (system-wide) | 10GB GDDR6 |
| Memory Interface | 256-bit (shared with CPU) | 160-bit |
| Memory Bandwidth | 448GB/s (shared with CPU) | 320GB/s |
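To put the "on paper" comparison in concrete terms, here's a minimal Python sketch – using only the figures from the spec table above – that derives each GPU's theoretical FP32 compute via the standard shaders × 2 ops per clock × frequency calculation:

```python
# Theoretical FP32 throughput: shaders x 2 ops per clock (FMA) x boost clock.
# Figures taken from the spec table above.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

ps5 = tflops(2304, 2.23)      # ~10.28 TFLOPS
rx_6700 = tflops(2304, 2.45)  # ~11.29 TFLOPS

print(f"PS5:     {ps5:.2f} TFLOPS")
print(f"RX 6700: {rx_6700:.2f} TFLOPS")
print(f"RX 6700 advantage: {(rx_6700 / ps5 - 1) * 100:.1f}%")  # ~9.9 percent
```

That works out to roughly 10.3 vs 11.3 TFLOPS – the circa 10 percent theoretical advantage for the RX 6700 referenced below.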
However, perhaps the most challenging aspect comes down to the quality of the PC port itself. Naughty Dog's PC version of The Last of Us Part 1 is in much better shape than it was at launch, but even so, despite so many hardware similarities between the RX 6700 and the PlayStation 5 GPU, the console version blows it out of the water. The RX 6700 is theoretically around 10 percent faster, yet the console version on matched settings (a customized version of the PC's high preset) is 30 percent faster, rising to 43 percent in another tested scene. It requires FSR 2 upscaling from 1440p for the RX 6700 to match and beat the PlayStation 5 at native 4K rendering – a remarkable state of affairs.
Looking at Avatar: Frontiers of Pandora, we have an interesting situation: the developers have talked about their bespoke console optimizations, but more than that, owing to the game's dynamic resolution, actually finding a point where PS5 performance consistently drops is challenging. My colleague Tom Morgan discovered an area where PlayStation 5 always drops frames and where we can safely assume we're at the DRS lower bound – 720p, reconstructing to 1440p with FSR 2 – something we can match on PC. Here, we find that the 6700 commands a circa 10 percent lead over the PlayStation 5, landing pretty much where expected based on theoretical performance.
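As a side note, FSR 2's presets use fixed per-axis scale factors (1.5x for Quality, 1.7x for Balanced, 2.0x for Performance, 3.0x for Ultra Performance), so a 1440p output reconstructed from 720p corresponds exactly to the Performance factor – the game's DRS is dynamic rather than a fixed preset, but the arithmetic is the same. A quick sketch for working out internal render resolutions:

```python
# FSR 2 preset scale factors (per axis), as documented by AMD.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Return the internal render resolution for a given output and preset."""
    scale = FSR2_SCALE[preset]
    return round(out_w / scale), round(out_h / scale)

# A 1440p output at the 2.0x Performance factor renders internally at 720p --
# matching the DRS lower bound found on PS5.
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
```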
Alan Wake 2 offers quality and performance modes and again, finding a stress point in the forest sees both modes drop beneath their frame-rate targets on PS5. In the quality mode, there's another six percent performance differential, but this time it's in favor of the PlayStation 5. Of course, we can run through this sequence again in the lower-quality performance mode – lower settings, lower resolution, higher frame-rates – and this time it's the 6700 that takes the lead, by 10.8 percent.
However, other games showed big differences. Hitman 3 has performance bottlenecks on PS5, and I found the RX 6700 to deliver a vast 44 percent lead in a matching area – which is perhaps down to Infinity Cache, AMD driver optimizations, or maybe the fact that the game launched relatively early in the console's lifecycle, before the development environments were mature. It's difficult to say, really.
Similarly, Monster Hunter Rise is a simplistic Nintendo Switch port with an actual PC-style settings menu on PlayStation 5 – and the ability to render at both 4K resolution and to supersample down from 2700p (!). Here, the RX 6700 outperforms PS5 by around 32 percent at 4K and is 30 points clear at 2700p. I'd say those are exceptions though, and outliers can be found in the other direction too. It's rare to find performance dips in Cyberpunk 2077's 30fps RT mode on PS5, but the market in the Phantom Liberty expansion definitely has issues – though not as many as the RX 6700 does. Here, the PS5 outperforms it by 45 percent – a remarkable state of affairs that isn't repeated in the performance mode, where the 6700 delivers much the same results as PlayStation 5.
For all the differences, there can be close parity too, more reminiscent of the Alan Wake 2 tests. In A Plague Tale: Requiem, for example, there are second-to-second variances between PS5 and RX 6700, but across the run of the benchmark sequence – the very last cutscene of the first chapter – both deliver an identical 36.5fps average.
All of which got me thinking about other GPUs, specifically GPUs we've cited in the past as delivering performance broadly in line with PS5 graphics. The RTX 2070 Super – essentially a slightly slowed-down RTX 2080 – is often used in DF coverage. We're likely to replace that with an RTX 3060 and an RTX 4060 to cover off the most popular current GPU and its Ada successor, so I tested those too. Here's the table of results, on a slightly smaller sample of tests compared to the RX 6700/PS5 benchmarks.
| Average FPS | PlayStation 5 | RX 6700 | RTX 3060 | RTX 4060 | RTX 2070 Super |
| --- | --- | --- | --- | --- | --- |
| Alan Wake 2 Quality Mode | 28.17 | 26.49 | 25.64 | 29.94 | 27.54 |
| Alan Wake 2 Performance Mode | 49.93 | 55.88 | 47.81 | 53.81 | 52.02 |
| Avatar Performance Mode | 52.03 | 57.50 | 45.99 | 61.82 | 43.48 |
| Cyberpunk 2077 RT Mode | 27.10 | 18.37 | 23.24 | 28.09 | 24.90 |
| Plague Tale Requiem Quality Mode | 36.50 | 36.50 | 30.38 | 34.13 | 35.36 |
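For anyone who'd rather eyeball relative deltas than raw averages, here's a small Python sketch that normalizes each card against the PS5 result from the table above – bear in mind that percentages quoted elsewhere in this piece come from specific stress scenes, so they won't line up exactly with these whole-run averages:

```python
# Average FPS from the table above, expressed as a percentage delta vs PS5.
results = {
    "Alan Wake 2 Quality":     {"PS5": 28.17, "RX 6700": 26.49, "RTX 3060": 25.64, "RTX 4060": 29.94, "RTX 2070S": 27.54},
    "Alan Wake 2 Performance": {"PS5": 49.93, "RX 6700": 55.88, "RTX 3060": 47.81, "RTX 4060": 53.81, "RTX 2070S": 52.02},
    "Avatar Performance":      {"PS5": 52.03, "RX 6700": 57.50, "RTX 3060": 45.99, "RTX 4060": 61.82, "RTX 2070S": 43.48},
    "Cyberpunk 2077 RT":       {"PS5": 27.10, "RX 6700": 18.37, "RTX 3060": 23.24, "RTX 4060": 28.09, "RTX 2070S": 24.90},
    "Plague Tale Requiem":     {"PS5": 36.50, "RX 6700": 36.50, "RTX 3060": 30.38, "RTX 4060": 34.13, "RTX 2070S": 35.36},
}

for game, fps in results.items():
    ps5 = fps["PS5"]
    deltas = ", ".join(
        f"{gpu} {100 * (v / ps5 - 1):+.1f}%"
        for gpu, v in fps.items() if gpu != "PS5"
    )
    print(f"{game}: {deltas}")
```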
I go into the results in more depth in the video at the top of this page, but ultimately, we're looking at a rather tight grouping, all told – though there are some trends. In almost all tests, the RTX 3060 trails the pack, while the RX 6700 delivers some wins – though not exactly game-changing ones, bearing in mind it's a product that should exist in a class above Nvidia's 60-series offerings. What's fascinating is that between the RX 6700, RTX 2070 Super and RTX 4060, we essentially get products that are all within the ballpark of the kind of performance delivered by the PlayStation 5.
What we can say with some degree of certainty at this point is that today's consoles are basically delivering the same kind of performance you can get from a £300/$300 graphics card in the PC space – but there is one major caveat we need to factor in: the amount of memory attached to any given GPU. All of the games in the table above work fine with 8GB of framebuffer memory, but two of the cards have an extra advantage: the RX 6700 ships with 10GB of GDDR6, while the RTX 3060 has a generous 12GB. And it can make a difference. With The Last of Us Part 1 on PS5-equivalent settings, you basically need around 11GB of memory to match the console's custom high preset. In the first test with the RX 6700, I had to decrease texture streaming rate and environmental texture quality to fit within its 10GB of VRAM. Ignoring that completely and running an RTX 3060 12GB vs RTX 4060 8GB comparison, Nvidia's newer offering ran much slower and suffered from egregious stuttering during traversal.
In summary then, our choice of the RTX 2070 Super has been a pretty strong one over the years in terms of saying "this class of card can deliver console-like performance", but it's 2024 now and the RTX 4060 is a touch faster. Today's $300 GPU matches and slightly exceeds 2019's $500 Nvidia offering, the only catch being its 8GB of framebuffer memory when the RTX 3060 offers 50 percent more. Even so, I do think replacing the venerable 2070 Super with both the 3060 and 4060 will give a good spread in mainstream PC game testing in future DF coverage – but the RX 6700 is certainly an intriguing proposition.
I'm not sure you can buy the RX 6700 new at this point – and it seemed to be a pretty limited release when it did launch. It can clearly do the job for 1080p gaming, just as any of the cards tested here can, but it's curious to see how this class of hardware services 4K living room TVs in the PS5 while being better suited to 1080p and 1440p screens on PC. That said, Avatar and Alan Wake 2's performance modes also output a reconstructed 1440p on PS5 – the price we pay for what is essentially low to mid-range GPU performance.
Will it stay like that? Probably not. The longer a console generation persists, the more developers manage to squeeze from fixed platforms in a way that doesn't necessarily translate to PC. The PS4's GPU is best described as a hybrid of the Radeon HD 7850 and HD 7870 with custom async compute pipelines, and yet I would bet that those products wouldn't run a game like the latest Call of Duty anything like as well as a PS4. The amount of low-level optimization there is frankly astonishing, down to a software-driven version of variable rate shading that seems to be of higher quality and with bigger performance gains than the hardware VRS built into RDNA 2.
Meanwhile, deprecated driver support, less developer focus on older kit and the general upward trend of GPU performance mean that consoles can age like fine wine, unlike PC GPUs – particularly those with lower amounts of video memory. To illustrate: a while back, I bought an AMD Radeon R9 270X with 4GB of RAM. It's essentially the same as the Radeon HD 7870, but with higher clocks and more memory – so built from the same building blocks as the PlayStation 4's GPU – yet no matter which games I tested, the theoretically less capable console wiped the floor with it. The card didn't even have the DX12 support Death Stranding required to boot!
In the meantime though, checking out the RX 6700 has been an interesting experience – serving to highlight the merits of console-equivalent or optimized settings versus whacking everything up to ultra. And while we've covered a lot of ground in stacking up mainstream GPUs against the PlayStation 5, I'm now wondering about budget PC builds – and how Intel's Arc A750 may well be the best balance of price vs performance compared to anything tested here…