ASUS ROG Strix GeForce RTX 4080 O16G GAMING OC Edition Video Card Review

magoo

Introduction

Today we are reaching up to the top shelf to grab an ASUS ROG Strix GeForce RTX 4080 O16G GAMING OC Edition (90YV0IC0-M0AA00) video card to review. This video card is the top of the line available from ASUS in the newest generation of GeForce RTX graphics cards. This ROG Strix RTX 4080 16GB GDDR6X OC Edition […]

See full article...
 
1. I liked the entire article being on one page. Thank you for that.

2. Why no 7900 XT or 7900 XTX comparison? I just don't understand leaving those out but having the 6900 XT in there. It feels like it was done to make a $1,500 4080 seem more acceptable.
 
At the time of review, we did not have a 7900 XT available for comparison. The 7900 XT will be included in future 4080 reviews. This review was planned with 3090 Ti and 6900 XT comparisons before the 7900 XT launched.
 
Ah ok just a timing thing. I get it. Look forward to them!!
 
Yep, just a little insight: usually when new reviews are planned, either before or around a new video card launch, it can take some time for Rick and me to sync up on the hardware we have available to use for comparisons. Either the card isn't out yet when the review is planned, or the hardware is out but only one of us has it at any given time, so we do the best we can. In fact, I need to seed the 4070 Ti to Rick so he can get data from it for future reviews. These things take a little time since we are a distance apart.
 
I was comparing benchmarks between this one and the FE and was wondering about the discrepancies in the numbers until I checked the test bench specs. The big differences in the test setups explain a lot, but they also make comparing kinda pointless.

Still, it was semi-informative.
 
Looking at our test platforms and other published data, the 5800X3D is very competitive. A recent review at Guru3D compared DDR5-6200 and DDR5-7000 against DDR4-3600 and DDR4-4000 and found just a few percent performance difference.
A proper AM4/DDR4 platform is still stout.
The difference between reviews is most likely down to the video card.
 
I was comparing benchmarks between this one and the FE and was wondering about the discrepancies in the numbers until I checked the test bench specs. The big differences in the test setups explain a lot, but they also make comparing kinda pointless.

Still, it was semi-informative.

A couple of the games in the lineup (e.g., Dying Light 2) have had significant recent performance changes as a result of patching - that's a leading cause of making things less comparable. In general, the 4K numbers should be pretty close, but some games are very much CPU-bound at 1440p, which will cause variations there. On the FSR/DLSS side of things, this can also introduce some weirdness related to being CPU-bound, or to using the upscaler to increase quality instead of frame rate. Also, Far Cry 6 is... well... not a consistent benchmark for me no matter what I do.

That being said, we absolutely make sure the results are comparable within a single review, but it's not feasible to go back and compare against all the other reviews of that particular card type and update accordingly (though, we certainly look at those numbers as a frame of reference for performance expectations).
 
This is one of the greatest pitfalls of tech reviews. Using results from previous reviews can be very hit-and-miss, since something as simple and everyday as a driver update or OS update can make a big difference in the results obtained. Mix together OS updates, game updates, chipset updates, BIOS updates, and video driver updates, and it becomes a recipe for some really weird results.

I do like the recipe analogy here, though: you can use the same basic ingredients to make a recipe, but it can turn out very differently depending on the brands of ingredients used and small differences in quantity.
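To stretch the analogy a bit: if every result carried its full ingredient list, you could at least tell when two numbers shouldn't be compared directly. Here's a rough Python sketch of what I mean (the structure and all the field names are made up for illustration, not anything the site actually uses):

```python
# A rough sketch, not from the review: record the full "recipe" next to every
# benchmark number so mismatched results can be flagged before comparison.
# Every field name here is hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class TestEnvironment:
    os_build: str         # OS version/build at test time
    gpu_driver: str       # video driver version
    game_version: str     # game patch level
    bios_version: str     # motherboard BIOS
    chipset_driver: str   # chipset driver version

@dataclass
class BenchmarkResult:
    gpu: str
    game: str
    resolution: str
    avg_fps: float
    env: TestEnvironment

def comparable(a: BenchmarkResult, b: BenchmarkResult) -> bool:
    """Only treat two results as directly comparable when every
    'ingredient' in the test environment matches exactly."""
    return a.env == b.env
```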
 
When you say that the results can change significantly:

I'm going to discount individual game patches - those are the responsibility of the developer to produce and are out of the hands of the GPU manufacturer or the reviewer. If you are using benchmarks immature enough that early-release patches significantly swing performance, that's probably not the best review practice for hardware; it should be limited to tech reviews of the game.

Driver updates - I'd say those are important, but I'd also say that if a product comes out on day one and a later driver update changes the performance by more than a single-digit percentage, that's on the manufacturer of the card - they should do a better job on their drivers up front if they want to put their best foot forward.

I expect things to drift a bit - I do not expect to see a review state that Game XYZ got 98.2 FPS on a test rig, and then cry foul because XYZ on my rig only gets 97.4 FPS. There's some variability to be expected no matter what.

So the real question is: what's acceptable in terms of drift?
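For illustration, using my numbers from above, a drift check might look something like this (the 1% cutoff is purely an assumption on my part, not any kind of standard):

```python
# Minimal sketch: quantify drift between a published result and a reproduced
# one. The tolerance is an assumed placeholder, not an established threshold.
def drift_pct(published_fps: float, observed_fps: float) -> float:
    return abs(observed_fps - published_fps) / published_fps * 100.0

TOLERANCE_PCT = 1.0  # assumption: sub-1% drift is ordinary run-to-run noise

drift = drift_pct(98.2, 97.4)
print(f"drift: {drift:.2f}%")  # drift: 0.81%
print("within tolerance" if drift <= TOLERANCE_PCT else "worth a closer look")
```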

I lament that we do not have that database to look things up. I understand things drift; I do not expect reviews to be exact when presenting benchmarks, but I do expect them to be representative of the product at the time of the review, consistent between products, and relative to other products. In that vein, I've always thought a reference database for quickly inferring relative performance would be really, really handy. Right now I've got... UserBenchmark, and that's about it. Admittedly, that is not the best comparison to have, but it's just about the only tool I have left if I want to compare any two arbitrary pieces of hardware, particularly across generations.
 
I'm going to discount individual game patches
Recent example: Dying Light 2, a 1+ year old game, got a patch a few weeks ago that dropped about 10 fps on 4090s at 4K. These swings are not limited to new games/new drivers.
 
It's been almost a month since I commented, so I kinda forgot, but there were some decent differences in the benchmark numbers.

Now, first of all, I was disappointed that the 4080 was not tested against a 3080, which IMO would have been the logical thing to do, but I digress (mostly because I have a 3080 and a 4080 would be a logical upgrade, price notwithstanding).

If I look at Cyberpunk 2077, the 4080 Strix gets 47 fps and the 4080 FE 65.4 fps; the 3090 Ti goes from 50 to 55 and the 6900 XT from 30 to 43.5. So either the game's performance dropped significantly in two months, or the fact that the CPUs are different is the root cause of the difference.
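Running the quick math on those figures (a sketch only; I'm reading each pair as Strix review first, FE review second):

```python
# Percentage change per card between the two reviews, using the numbers above.
# If one game patch explained everything, the cards would shift by similar
# amounts; the spread (-9% vs. -28% and -31%) hints at the test setup instead.
strix_review = {"RTX 4080": 47.0, "RTX 3090 Ti": 50.0, "RX 6900 XT": 30.0}
fe_review    = {"RTX 4080": 65.4, "RTX 3090 Ti": 55.0, "RX 6900 XT": 43.5}

for card, fe_fps in fe_review.items():
    delta = (strix_review[card] - fe_fps) / fe_fps * 100.0
    print(f"{card}: {delta:+.1f}% vs. the FE review")
# RTX 4080: -28.1%, RTX 3090 Ti: -9.1%, RX 6900 XT: -31.0%
```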

Anyways, I wrongly assumed that if an outlet is going to test GPUs, they would use the same base system to make the comparison as fair and comparable as possible.
 
That's a budget issue for us at this point. We are so far in the red that I'm not willing to spend more to purchase identical platforms - priority for spending is going to content/labor. Having identical test platforms is a future goal, just not feasible right now.
 
Right now I've got... UserBenchmark, and that's about it. Admittedly, that is not the best comparison to have, but it's just about the only tool I have left if I want to compare any two arbitrary pieces of hardware, particularly across generations.

I think Linus Tech Tips' new lab is going for some kind of ultimate benchmark database, but there's no ETA on when or how.
 
Now, first of all, I was disappointed that the 4080 was not tested against a 3080, which IMO would have been the logical thing to do, but I digress (mostly because I have a 3080 and a 4080 would be a logical upgrade, price notwithstanding).

In our 4080 launch review, a 3080 Ti is included as a comparison: https://www.thefpsreview.com/2022/11/15/nvidia-geforce-rtx-4080-founders-edition-video-card-review/

Typically we go in-depth for the launch review of a new GPU, but subsequent AIB reviews don't go as deep on comparisons after that.

That said, I've always had it in my mind to do a 3080 vs. 4080 comparison review, but time has eluded me; it has been non-stop with new launches. It's still on the brain, though, as a potential specific comparison article.
 