RTX 4080 GeForce NOW Tier Isn’t Actually as Fast as a Real RTX 4080 GPU

Yesterday, NVIDIA started rolling out the GeForce NOW RTX 4080 SuperPODs in select locations in the United States (Los Angeles, San Jose, and Dallas) and Central Europe (Frankfurt).

However, according to the German website HardwareLuxx, the performance enabled by the upgrade doesn't quite match that of a real RTX 4080 GPU. While GeForce NOW users are limited to actual games and cannot run standalone benchmarking software, plenty of games ship with built-in benchmarks that allow testing.

For example, in CD Projekt RED's Cyberpunk 2077, the GeForce NOW Ultimate tier trailed considerably behind a real RTX 4080. At 1080p, GFN Ultimate averaged 92.8 FPS, whereas the bona fide GPU ran 53.6% faster at 142.6 FPS. At 1440p, the cloud test registered 78.1 average FPS, while a physical PC ran 37.1% faster at 107.1 FPS. At 4K, the gap was similar (38.1%), with the actual RTX 4080 coming out on top at 52.1 average FPS against the cloud's 37.7.
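
These gaps follow from a simple ratio. Here is a minimal sketch in Python for anyone who wants to reproduce them; the slight drift versus the quoted 53.6% and 38.1% presumably stems from HardwareLuxx computing against unrounded FPS values:

```python
def percent_faster(gpu_fps: float, cloud_fps: float) -> float:
    # How much faster the real card is, as a percentage of the cloud result.
    return (gpu_fps / cloud_fps - 1) * 100

# Cyberpunk 2077 figures reported above: (resolution, GFN Ultimate FPS, real RTX 4080 FPS)
for res, cloud, gpu in [("1080p", 92.8, 142.6), ("1440p", 78.1, 107.1), ("4K", 37.7, 52.1)]:
    print(f"{res}: real RTX 4080 is {percent_faster(gpu, cloud):.1f}% faster")
# Output: 53.7% at 1080p, 37.1% at 1440p, 38.2% at 4K
```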

HardwareLuxx also tested Eidos Montréal's Marvel's Guardians of the Galaxy, which, for some reason, didn't allow 4K resolution to be selected (even though GeForce NOW has supported 4K for a long time). At 1080p, the real RTX 4080 scored 157 average FPS, 46.7% higher than GFN Ultimate's 107 FPS. At 1440p, the difference shown in the website's slide is even starker at 80.5%, though we have to question the 195 average FPS recorded for the RTX 4080: if the same graphics preset (Very High) was used, the 1080p benchmark should have scored much higher than the 1440p one, not lower.
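
As a quick sanity check on that slide, the cloud figure implied by the 80.5% gap can be backed out arithmetically:

```python
# If the real RTX 4080's 195 FPS at 1440p really is 80.5% above the
# GFN Ultimate result, the implied cloud figure would be:
implied_gfn_1440p = 195 / 1.805
print(f"Implied GFN Ultimate 1440p score: ~{implied_gfn_1440p:.0f} FPS")  # ~108 FPS
```

The implied ~108 FPS is nearly identical to GFN Ultimate's 107 FPS at 1080p, while a real card gaining frames when moving from 1080p to 1440p at a fixed preset is implausible, so the 195 FPS reading does look questionable.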

Lastly, Crystal Dynamics' Shadow of the Tomb Raider painted a similar picture, except at 4K resolution, where the GeForce NOW Ultimate tier somehow beat the actual RTX 4080, albeit only slightly (137 average FPS vs. 134).

Hardware-wise, the benchmarks pointed to a 16-core AMD Ryzen CPU with 28 GB of system RAM. As for the GPU, the website's best guess is that NVIDIA is using the L40 data center GPU, which is based on the same Ada Lovelace architecture as the consumer-oriented RTX 4080.

The L40 is equipped with 48 GB of GDDR6 VRAM, though since each data center GPU serves two cloud instances, each instance gets 24 GB of VRAM (as listed in the benchmarks). The author also estimates that each GeForce NOW Ultimate cloud instance receives around 70-72 streaming multiprocessors (SMs), a few fewer than the real RTX 4080's 76. This could partly account for the lower performance.
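
Under the simplifying assumption that performance scales linearly with SM count (ignoring clocks, power limits, and memory bandwidth), a quick calculation shows how much of the gap the smaller allocation could explain:

```python
# RTX 4080: 76 streaming multiprocessors; per-instance estimate: 70-72.
rtx_4080_sms = 76
for instance_sms in (70, 72):
    deficit = (1 - instance_sms / rtx_4080_sms) * 100
    print(f"{instance_sms} SMs -> ~{deficit:.1f}% fewer units than a real RTX 4080")
# Output: ~7.9% fewer at 70 SMs, ~5.3% fewer at 72 SMs
```

A 5-8% unit deficit falls well short of the 35-50% gaps measured above, so lower clocks, stricter power limits, or virtualization overhead on the shared host likely contribute as well.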

Even with all these caveats, the upgraded GeForce NOW Ultimate tier offers the best cloud gaming experience available anywhere, with support for 4K/120 FPS streaming (in the PC and Mac apps), HDR, and ultrawide displays. While cloud play will hardly be viable for competitive eSports players, there's a new 1080p/240 FPS option to minimize latency as much as possible. Owners of G-SYNC and G-SYNC Compatible displays can even have the streaming rate varied dynamically to match the display's refresh rate, further driving down latency, though this feature is only available in Reflex-supported games.

Moreover, while it may not fully match the performance and visual quality of a real RTX 4080 GPU, the far lower local power consumption of streaming via GeForce NOW could translate into significant savings over extended periods, particularly amid the current energy crisis.
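
To put a rough number on that idea, here is a back-of-the-envelope sketch; apart from the RTX 4080's official 320 W TGP referenced in the comments, every wattage, tariff, and usage figure below is an illustrative assumption, not a measurement:

```python
# Illustrative only: the wattages, tariff, and play time are assumptions
# chosen to show the shape of the calculation. The RTX 4080's 320 W TGP
# is official; the whole-system figure below adds CPU, RAM, and the rest.
desktop_watts = 450       # assumed whole-system draw of an RTX 4080 gaming PC
client_watts = 50         # assumed draw of a thin client streaming GeForce NOW
price_eur_per_kwh = 0.40  # assumed tariff (roughly German 2022-era pricing)
hours_per_year = 2 * 365  # assumed two hours of gaming per day

saved_kwh = (desktop_watts - client_watts) * hours_per_year / 1000
print(f"~{saved_kwh:.0f} kWh/year, ~{saved_kwh * price_eur_per_kwh:.0f} EUR/year saved")
# Output: ~292 kWh/year, ~117 EUR/year under these assumptions
```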
