man, I want a 3080 Jumbo!
Edit: Make that an Ultra Jumbo!!
Yeah, from memory of 'Ti' and 'Super' releases, the only real common thread is that they have better specifications than whatever they are a 'Ti' or 'Super' of. Could be the same GPU die with faster memory, more memory, more compute resources unlocked, the next largest GPU die, or some combination.
I think he means a faster/beefier version of the 3080; whether it's called Super, Ti, Hyper, Ultra, or Jumbo is irrelevant.
I agree. I have to say I'm a little bit disappointed by the DLSS+RTX performance hit, as it seems to be comparatively the same as Turing's (about 10-20% depending on the game at 4K). I'm getting this figure using native 1440p RTX performance as the reference, since that is the internal resolution DLSS renders at.
I was expecting much better performance, as Ampere's tensor cores are supposedly 3x faster and its RT cores 2x faster than Turing's. Some untapped potential, maybe?
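A minimal sketch of the arithmetic behind that 10-20% figure, with made-up FPS numbers as placeholders: since 4K DLSS Quality renders internally at 1440p, native 1440p with the same RT settings is the natural baseline to compare against.

```python
# Rough estimate of the DLSS+RT overhead described above.
# The FPS values below are hypothetical placeholders, not benchmark results.

def dlss_rt_overhead(fps_native_1440p_rt: float, fps_4k_dlss_rt: float) -> float:
    """Fraction of performance lost going from native 1440p with RT (the
    internal render resolution of 4K DLSS Quality) to 4K output via DLSS."""
    return 1.0 - fps_4k_dlss_rt / fps_native_1440p_rt

# Example: 90 FPS at native 1440p + RT vs. 75 FPS at 4K DLSS Quality + RT
print(f"{dlss_rt_overhead(90, 75):.0%}")  # -> 17%, i.e. within the 10-20% range
```

Swap in real benchmark numbers for a given game to get its actual overhead.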
The full RT capability won't be seen in the older RTX titles, since they used DXR 1.0. The Wolfenstein update, which showed a much better spread between the 2080 Ti and the 3080, is more representative of the 3080's RT potential; I'm pretty sure it's using the much better parallelism of the DXR 1.1 enhancements.
My fear, regardless of the manufacturer, is that we've gotten to a point where scripts rule over the consumer base. Anyone with enough capital could control the market as long as supply is limited at release. There's no penalty for the bot world to just buy up anything and everything at launch, as long as there is demand for their resale.
I put a decent amount of effort into obtaining this card through Nvidia and Best Buy; I really wanted the FE. Nvidia failed to deliver or sell me one, and I've been trying ever since without any luck. If Nvidia cannot take care of their customers, then it's best to move on.
When should we expect a review of the 3090? I am actually contemplating waiting for the 20GB 3080, depending on how the 3090 performs. For the first time in history I feel like the 3080 is "enough" for my gaming resolution and needs, and that paying double for the 3090 is a waste of money.
I'm kind of feeling the same. I haven't had this much ambivalence in a while. For me the real decision will be pricing: if it costs over $1,000 then I'll still go for the 3090. Whether or not it's GDDR6X could also be a factor.
It is certainly a weird position to be in.
The only thing really holding me back from a 3080 right now is the 10GB of memory, simply because I've seen the 11GB on my 2080 Ti maxed out in a few games at 4K resolution. And it wasn't simply reported usage in those cases: I experienced degraded performance until I turned down texture quality or other settings to reduce the VRAM needed. Most games fall into the 6-8GB range right now, but I'm just worried that more games coming down the pipe will start running into limitations with 10GB. I do understand that they probably could not hit their $700 target if they added more, though. I can see the 20GB version being $1,000 or close to it unless Micron has 16Gb chips ready when it hits production.
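As an aside on why the options are exactly 10GB or 20GB: the 3080 uses a 320-bit memory bus with one GDDR6X chip per 32-bit channel, so capacity is simply chip count times chip density. A small sketch of that arithmetic, assuming the 8Gb and 16Gb densities mentioned above:

```python
# Why a 320-bit 3080 lands on 10GB or 20GB: one GDDR6X chip per 32-bit
# channel, so capacity = number of chips x per-chip density.

BUS_WIDTH_BITS = 320      # RTX 3080 memory bus width
CHANNEL_WIDTH_BITS = 32   # one GDDR6X chip per 32-bit channel
GBIT_PER_GB = 8

def vram_capacity_gb(chip_density_gbit: int) -> float:
    chips = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS
    return chips * chip_density_gbit / GBIT_PER_GB

print(vram_capacity_gb(8))   # 8Gb chips  -> 10.0 GB (the launch card)
print(vram_capacity_gb(16))  # 16Gb chips -> 20.0 GB (the rumored 20GB version)
```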
While there are uses for the, uh, 'excess' performance, they don't seem to merit significant increases in cost.
Feels kind of like we're on a divide, where more performance isn't really useful for pure rasterization on desktops, but also isn't nearly enough for, say, VR or RT (or both).
10GB is still more than 8GB. Remember, the 3080 isn't an upgrade from the 2080 Ti; it's an upgrade from the 2080/2080 Super. If you have a 2080 Ti, I'd recommend keeping it. The 3090 is really closer to the 2080 Ti replacement, but maybe there will be a middle card in the future, or there is of course the more expensive 20GB 3080 option.
As for games utilizing more VRAM, I'm not sure what the trend will be. If DLSS is used more, that's the answer to the VRAM capacity problem; it alleviates a lot of pressure on capacity when used.
Developers are also constantly working on new compression methods and ways to load-balance everything correctly, and with RTX I/O and Microsoft DirectStorage, decompression should be a lot better, so again the VRAM capacity issue won't be such a problem.
I know your concerns for sure, and it really all depends on the games themselves, but I do implore you: if a new game supports DLSS, give it a try. I'm actually liking the technology now that I've used it, and DLSS 2.0 gives you good image quality and a performance increase.
DLSS is great, I agree, but unfortunately I do not think it will become ubiquitous.
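To put a rough number on the VRAM-relief argument, here is a deliberately simplified sketch: only buffers that scale with the internal render resolution shrink under DLSS, while textures, geometry, and the output-resolution buffers stay the same size. Every byte count and the 6GB asset figure below are assumptions chosen for illustration, not measurements from any real game.

```python
# Back-of-the-envelope VRAM estimate: DLSS only shrinks allocations that
# scale with the internal render resolution. All sizes are illustrative
# assumptions, not measurements of a real engine.

def estimated_vram_gb(render_px: int, output_px: int,
                      bytes_per_render_px: int = 128,     # assumed total of resolution-scaled buffers
                      fixed_assets_gb: float = 6.0,       # assumed textures/geometry (resolution-independent)
                      bytes_per_output_px: int = 16) -> float:  # assumed output/post buffers at display res
    scaled = render_px * bytes_per_render_px / 2**30
    output = output_px * bytes_per_output_px / 2**30
    return fixed_assets_gb + scaled + output

px_4k, px_1440p = 3840 * 2160, 2560 * 1440
print(f"native 4K:       {estimated_vram_gb(px_4k, px_4k):.1f} GB")      # ~7.1 GB
print(f"4K DLSS Quality: {estimated_vram_gb(px_1440p, px_4k):.1f} GB")   # ~6.6 GB, internal 1440p
```

Under these assumptions the saving is real but modest; how much it matters in practice depends entirely on how a given game budgets its memory.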
These are just rumors, but the rumors are that AMD will have something similar to DLSS coming. If that can happen, and maybe some form of standard API can be achieved, then maybe it will be used more. Like ray tracing, someone had to get the ball rolling first.
Is Contrast Adaptive Sharpening not AMD's version of DLSS?
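For context on that question: as far as I understand it, FidelityFX CAS is a post-process sharpening filter (optionally paired with a simple upscale), while DLSS is a learned temporal reconstruction, so they aren't direct equivalents. The toy NumPy snippet below only sketches the 'contrast adaptive' idea; the real CAS shader is more involved, so treat it purely as an illustration.

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Toy contrast-adaptive sharpen on a 2D grayscale image in [0, 1].
    Per pixel: estimate local contrast from the 3x3 min/max, then apply an
    unsharp-style kernel whose strength backs off on high-contrast edges.
    A simplified illustration of the idea, not AMD's actual implementation."""
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")
    # Stack the nine 3x3 neighbours as a (9, h, w) array.
    neigh = np.stack([p[y:y + h, x:x + w] for y in range(3) for x in range(3)])
    lo, hi = neigh.min(axis=0), neigh.max(axis=0)
    # Adaptive amount: less sharpening where local contrast is already high,
    # which is what keeps the filter from ringing on hard edges.
    amount = strength * np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-6), 0.0, 1.0))
    cross = neigh[1] + neigh[3] + neigh[5] + neigh[7]  # up, left, right, down
    sharpened = img * (1.0 + 4.0 * amount) - amount * cross
    return np.clip(sharpened, 0.0, 1.0)

# Usage (hypothetical frame): cas_like_sharpen(np.random.rand(1080, 1920), strength=0.4)
```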
I do feel the same way; it's not even that the 3080 has less memory than the 2080 Ti (though, as noted by others, the 2080 should be the point of comparison), but that memory didn't increase much.
I kind of feel like it's worth waiting. Part of that, at least, comes from being on a 1080 Ti and not really wanting to go backward in VRAM capacity, particularly given how long I'm likely to keep the new card.
As much as I admire upcoming solutions to the VRAM problem... these are 'high-end' solutions that require significant developer support. I can't help but imagine that there might be games that slip through the cracks which wind up benefiting from the increased VRAM due to lack of optimization.