NVIDIA GeForce RTX 3080 Founders Edition Review

I think he means a faster/beefier version of the 3080; whether it's called Super, Ti, Hyper, Ultra, or Jumbo is irrelevant.
Yeah, from memory of 'Ti' and 'Super' releases, the only real common thread is that they have better specifications than whatever they are a 'Ti' or 'Super' of. Could be the same GPU die with faster memory, more memory, more compute resources unlocked, the next largest GPU die, or some combination.
 
I have to say I'm a little bit disappointed in the DLSS+RTX performance hit, as it seems to be comparatively the same as Turing's (about 10-20% depending on the game at 4K). I'm getting this figure using 1440p RTX performance as the reference, since that's the resolution it's actually rendered at under DLSS.

I was expecting much better performance, as Ampere's tensor cores are supposedly 3x faster and its RT cores 2x faster than Turing's. Some untapped potential, maybe?
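For context on that 1440p reference point, here's a quick sketch (not from the review; the per-axis scale factors are just the commonly cited DLSS 2.0 mode values) of the internal resolutions behind a 3840x2160 output:

```cpp
// Rough sketch of the internal render resolutions DLSS 2.0 reportedly uses
// for a 3840x2160 output. Quality mode lands at 2560x1440, which is why
// native 1440p RT numbers are a reasonable reference for "4K DLSS" results.
#include <cstdio>

int main()
{
    struct Mode { const char* name; double scale; };
    // Per-axis scale factors commonly cited for DLSS 2.0 modes (assumed here).
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},  // 2560x1440 at 4K output
        {"Balanced",          0.58},       // about 58% per axis
        {"Performance",       0.50},       // 1920x1080
        {"Ultra Performance", 1.0 / 3.0},  // 1280x720
    };
    const int outW = 3840, outH = 2160;
    for (const Mode& m : modes)
        std::printf("%-17s -> %dx%d\n", m.name,
                    int(outW * m.scale), int(outH * m.scale));
    return 0;
}
```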
 
I agree.
But it's also pretty cool that $699 gets you legit 4K/60 fps in just about every game out there.
Hopefully they can tweak things a bit more down the line.
 
The full RT capability won't be seen in the older RTX titles, since they used DXR 1.0. The Wolfenstein update, which showed a much better spread between the 2080 Ti and the 3080, is more representative of the 3080's RT potential; it's pretty surely using the much better parallelism of the DXR 1.1 enhancements.
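For reference, a minimal C++ sketch (not from the thread; it assumes an ID3D12Device created elsewhere) of how an engine can check whether the driver exposes the DXR 1.1 tier that inline ray tracing requires:

```cpp
// Minimal sketch: query which DXR tier a D3D12 device exposes. DXR 1.1 adds
// inline ray tracing (RayQuery), letting any shader stage trace rays without
// going through the separate DXR 1.0 ray-generation dispatch path.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Assumes 'device' was created elsewhere (e.g. via D3D12CreateDevice).
void ReportRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
    {
        if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
            std::printf("DXR 1.1 supported (inline ray tracing available)\n");
        else if (options5.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
            std::printf("DXR 1.0 only\n");
        else
            std::printf("No hardware DXR support\n");
    }
}
```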

I put a decent amount of effort into obtaining this card through Nvidia and Best Buy; I really wanted the FE. Nvidia failed to deliver or sell me one, and I've been trying ever since without any luck. If Nvidia cannot take care of their customers, then it's best to move on.
 
My fear, regardless of the manufacturer, is that we've gotten to a point where scripts rule over the consumer base. Anyone with enough capital could control the market as long as supply is limited at release. There's no penalty for the bot world just buying up anything and everything at launch, as long as there is demand for their resale.
 
When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080 depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money :eek:.
 
From what I can tell, whenever the NDA lifts it'll probably be for a smaller selection of reviewers than what we saw for the 3080. I'd also guess the lift will happen no later than the card's on-sale date of 9/24 @ 6AM PDT. At this point, we don't have one, nor do we have any confirmed in the pipeline. As with the 3080, I'll be F5'ing to try to get one when they launch, and we'll continue to shake down manufacturers for one...
 
I'm kind of feeling the same. Haven't had this much ambivalence in a while. For me the real decision will be pricing. If it costs over $1,000 then I'll still go for the 3090. Whether or not it's GDDR6X could also be a factor.
 
It is certainly a weird position to be in.

While there are uses for the, uh, 'excess' performance, they don't seem to merit significant increases in costs.

Feels kind of like we're on a divide, where more performance isn't really useful for pure rasterization on desktops, but also isn't nearly enough for, say, VR or RT (or both).
 
Only thing really holding me back from a 3080 right now is the 10GB of memory, simply because I've seen the 11GB on my 2080 Ti maxed out in a few games at 4K resolution. And it wasn't simply usage; in those cases I experienced degraded performance until I turned down texture quality or other settings to reduce the VRAM needed. Most games fall into the 6-8GB range right now, but I'm just worried that more games will be coming down the pipe that start running into limitations with 10GB. I do understand that they probably could not hit their $700 target if they added more, though. I can see the 20GB version being $1,000 or close to it unless Micron has 16Gb chips ready when it hits production.
 
10 is still more than 8. Remember, the 3080 isn't an upgrade from the 2080 Ti; it's an upgrade from the 2080/Super. If you have a 2080 Ti, I'd recommend keeping it. The 3090 is really closer to the 2080 Ti replacement, but maybe there will be a middle card in the future, or there is of course the more expensive 20GB 3080 option.

As for games utilizing more VRAM, well, I'm not sure what the trend will be. If DLSS is used more, that's the answer to the VRAM capacity problem; it alleviates so much pressure on capacity when used.

Games are also constantly developing new compression methods and ways to load-balance everything correctly. With RTX I/O and Microsoft DirectStorage, decompression should be a lot better, and again the VRAM capacity issue won't be such a problem.
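To put rough numbers on why compression matters so much for VRAM budgets (this is plain block-compression math, separate from the asset streaming and decompression that RTX I/O and DirectStorage target), a back-of-the-envelope sketch:

```cpp
// Back-of-the-envelope VRAM footprint of one 4096x4096 texture with a full
// mip chain, uncompressed vs. block compressed. A full mip chain adds about
// one third on top of the base level.
#include <cstdio>

int main()
{
    const double texels      = 4096.0 * 4096.0;
    const double mipOverhead = 4.0 / 3.0;                    // base level + all mips
    const double rgba8Bytes  = texels * 4.0 * mipOverhead;   // RGBA8: 4 bytes per texel
    const double bc7Bytes    = texels * 1.0 * mipOverhead;   // BC7/BC3: 1 byte per texel
    std::printf("RGBA8 (uncompressed):   %.1f MiB\n", rgba8Bytes / (1024.0 * 1024.0));
    std::printf("BC7 (block compressed): %.1f MiB\n", bc7Bytes / (1024.0 * 1024.0));
    return 0;
}
```

That works out to roughly 85 MiB versus 21 MiB for a single large texture, which adds up quickly across a streamed scene.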

I know your concerns for sure, and it will really all depend on the games themselves. But I do implore you: if a new game supports DLSS, give it a try. I'm actually liking the technology now that I've used it; DLSS 2.0 gives you good image quality and a performance increase.
 
DLSS is great, I agree, but unfortunately I do not think it will become ubiquitous.
 
These are just rumors, but rumors are AMD will have something similar to DLSS coming.

If that can happen, and maybe some form of standard API can be achieved, then maybe it will be used more.

Like Ray Tracing, someone had to get the ball rolling first.
 
DLSS made enough of a difference to me personally that I simply would not buy a GPU without it.
Control and Wolfenstein: YB alone were worth the price of admission to play in 4K with an RTX 2070.

I was originally going to buy an R7. Glad I didn't.
RTX and DLSS were way more fun and useful to me than an extra 8GB of RAM could ever have been.

Maybe I should just get a 3090 this time around and game on for the next 3 years; it seems very likely that 2080 Ti owners will get 3 years out of theirs.
Something to be said about buying the best available stuff...
 
These are just rumors, but rumors are AMD will have something similar to DLSS coming.

If that can happen, and maybe some form of standard API can be achieved, then maybe it will be used more.

Like Ray Tracing, someone had to get the ball rolling first.
Is Contrast Adaptive Sharpening not AMD's version of DLSS?
 
Only thing really holding me back from a 3080 right now is the 10GB of memory, simply because I've seen the 11GB on my 2080 Ti maxed out in a few games at 4K resolution. And it wasn't simply usage; in those cases I experienced degraded performance until I turned down texture quality or other settings to reduce the VRAM needed. Most games fall into the 6-8GB range right now, but I'm just worried that more games will be coming down the pipe that start running into limitations with 10GB.
I do feel the same way; it's not even that the 3080 has less than the 2080 Ti (as noted by others, the 2080 should be the point of comparison), but that memory didn't increase much.
I do understand that they probably could not hit their $700 target if they added more, though. I can see the 20GB version being $1,000 or close to it unless Micron has 16Gb chips ready when it hits production.
I kind of feel like it's worth waiting. Part of that, at least, is coming from a 1080 Ti and not really wanting to go backward in VRAM capacity, particularly given how long I'm likely to keep the new card.

As for games utilizing more VRAM, well, I'm not sure what the trend will be. If DLSS is used more, that's the answer to the VRAM capacity problem; it alleviates so much pressure on capacity when used.

Games are also constantly developing new compression methods and ways to load-balance everything correctly. With RTX I/O and Microsoft DirectStorage, decompression should be a lot better, and again the VRAM capacity issue won't be such a problem.
As much as I admire upcoming solutions to the VRAM problem... these are 'high-end' solutions that require significant developer support. I can't help but imagine that there might be games that slip through the cracks which wind up benefiting from the increased VRAM due to lack of optimization.

That's also compounded by waiting every other generation or so to upgrade in my case. More frequent upgraders probably have less to worry about!
 
Is Contrast Adaptive Sharpening not AMD's version of DLSS?

No, that's closer to NVIDIA's sharpening filter in the control panel:

https://nvidia.custhelp.com/app/ans...-image-sharpening-in-the-nvidia-control-panel

DLSS uses AI (the Tensor Cores) to take an image and scale it up: the game renders at a lower resolution, and the AI upscales it toward a baseline, highly super-sampled (something like 16x samples) reference image processed offline on NVIDIA's servers. It's much more complex.

This is why NVIDIA's method provides faster performance: the game is rendering at a lower resolution, and then dedicated hardware basically upscales it toward that reference image with essentially no loss in performance for doing so.

AMD's method still renders at the same resolution, and there is no AI upscaling. It doesn't improve performance, only sharpens image quality when temporal antialiasing is used.
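As a point of comparison, a generic sharpening pass (illustrative only, not AMD's actual CAS shader) looks like the sketch below: it touches every pixel of the full-resolution image, which is why it can boost perceived detail but not frame rate.

```cpp
// Generic sharpening pass (illustrative, not AMD's CAS implementation).
// It runs at the output resolution, so the pixel count -- and the cost of
// rendering those pixels -- is unchanged; only local contrast is boosted.
#include <algorithm>
#include <cstddef>
#include <vector>

std::vector<float> Sharpen(const std::vector<float>& img, int w, int h, float amount)
{
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            const std::size_t i = std::size_t(y) * w + x;
            // High-frequency detail: pixel minus the average of its 4 neighbours.
            const float neighbours =
                (img[i - 1] + img[i + 1] + img[i - w] + img[i + w]) * 0.25f;
            const float detail = img[i] - neighbours;
            out[i] = std::clamp(img[i] + amount * detail, 0.0f, 1.0f);
        }
    }
    return out;
}
```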

Now, there is supposed to be a feature of CAS that can scale an image. However, I don't know of an example of it, and you really don't hear about performance increases when CAS is used. The video card is not using AI to upscale, because there's no method for that at the moment. That's what sets DLSS far apart from CAS; it's much more functional.

However, I need to read into CAS a bit more; I'm not 100% sure how exactly it works, and I need to read a whitepaper or something. But so far it hasn't been marketed as a feature to improve performance, only to improve image quality.

It's quite possible AMD could make a new version of CAS that is DLSS-like once they have the hardware to do so. Or they could brand their "DLSS" equivalent as a whole new feature name. Who knows, but the rumor is AMD will be coming out with something DLSS-like, and I'm not sure that's CAS.
 