NVIDIA GeForce RTX 3080 Ti Expected to Cost Around $1,099

Tsing (The FPS Review staff)
[Image: NVIDIA GeForce RTX 3080 key art. Credit: NVIDIA]



NVIDIA’s GeForce RTX 3080 Ti is probably going to be priced somewhere around $1,099. This figure is based on a report shared by IT Home today, which claims that the long-rumored GPU will be sold for 7,999 yuan (around $1,230) in China. Given that the price difference between the GeForce RTX 3080 Ti and GeForce RTX 3080 on NVIDIA’s official Chinese site is 2,500 yuan (around $385), it seems safe to assume that the new flagship will carry an MSRP of at least $999 in the U.S. (The GeForce RTX 3080 Founders Edition launched at $699.99.)



[Image: GeForce RTX 3080 pricing on NVIDIA's Chinese site]




The product is expected to cost ¥7,999 in China, which is presumably a recommended price with 13% VAT ($1,090 without VAT). At around $1,099 in the U.S., the GeForce RTX 3080 Ti will be considerably...

Continue reading...
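For reference, here's a rough sketch of the pricing arithmetic behind these estimates. The ~6.5 CNY/USD exchange rate is an assumption inferred from the figures quoted above; the 13% VAT rate comes from the report itself:

```python
# Rough sketch of the pricing math above. The 6.5 CNY/USD exchange rate
# is an assumption inferred from the quoted figures; the 13% VAT rate
# comes from the report itself.
CNY_PER_USD = 6.5
VAT = 0.13

china_price_cny = 7_999
china_price_usd = china_price_cny / CNY_PER_USD          # ~$1,230 (VAT included)
ex_vat_usd = china_price_cny / (1 + VAT) / CNY_PER_USD   # ~$1,089 (VAT removed)

# Estimating a U.S. MSRP from the Ti/non-Ti gap on NVIDIA's Chinese site
gap_usd = 2_500 / CNY_PER_USD                            # ~$385
estimated_us_msrp = 699.99 + gap_usd                     # ~$1,085

print(f"China price: ${china_price_usd:,.0f} (~${ex_vat_usd:,.0f} ex-VAT)")
print(f"Estimated U.S. MSRP: ${estimated_us_msrp:,.0f}")
```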


 
So the actual street price by that scale should be $4K from scalpers, right?
 
So the actual street price by that scale should be $4K from scalpers, right?
I'd hope that the idea is that they don't sell out immediately...


...and perhaps take more than 5.28 seconds this time.
 
Stupid price knowing that the 6900 XT is cheaper and faster in rasterisation.

As much as I enjoy my RT games, the fact is that today's RT games bring too many compromises to gaming, and until these lazy devs and the hardware evolve, RT gaming over rasterisation gaming is, for this generation sadly, only a pipe dream.
 
Stupid price knowing that the 6900 XT is cheaper and faster in rasterisation.

As much as I enjoy my RT games, the fact is that today's RT games bring too many compromises to gaming, and until these lazy devs and the hardware evolve, RT gaming over rasterisation gaming is, for this generation sadly, only a pipe dream.

People will buy whatever is available.

I wouldn't spend a cent on a non-RT/DLSS card tho. Because I don't have to.
 
$100 over my expectations. I still expect it to be right at the $1,000 mark when it releases in the States.
Stupid price knowing that the 6900 XT is cheaper and faster in rasterisation.

As much as I enjoy my RT games, the fact is that today's RT games bring too many compromises to gaming, and until these lazy devs and the hardware evolve, RT gaming over rasterisation gaming is, for this generation sadly, only a pipe dream.
How do you know that the 6900 XT is faster when the 3080 Ti isn't even out yet? Regardless, we know that the 3080 Ti is using the full-fat chip from the 3090, so we can expect performance to be only a few percentage points behind it. In the majority of games I've seen, the 3090 is faster than the 6900 XT, except at low resolutions, where they're basically tied as the CPU comes more into play. But nobody is buying an enthusiast card to play games at 1920x1080. If they are, then they are just wasting money.

I don't know why you think ray tracing is such a pipe dream. I've been quite enjoying every game using it since I got my 2080 Ti, and even more so now that I have a 3090. What compromises does ray tracing bring that other fancy, expensive graphical effects don't?
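For context, a quick back-of-the-envelope check on that "few percentage points" expectation, assuming the 80-SM configuration that was rumored for the 3080 Ti at the time (the 3090's 82 SMs are its published spec):

```python
# Back-of-the-envelope scaling check. The 80-SM figure for the 3080 Ti
# was the rumor at the time of this thread (an assumption here); the
# RTX 3090's 82 SMs are its published spec.
rtx_3090_sms = 82
rtx_3080_ti_sms = 80  # rumored

deficit = 1 - rtx_3080_ti_sms / rtx_3090_sms
print(f"SM deficit vs. the 3090: {deficit:.1%}")  # ~2.4%, i.e. "a few points"
```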
 
Better have your 1,000W+ PSU ready on standby.

And no way will these be retail prices; it will be double this. I see 3080s well over $1K even from the likes of Newegg and MC (when they have stock).
 
I don't know why you think ray tracing is such a pipe dream. I've been quite enjoying every game using it since I got my 2080 Ti, and even more so now that I have a 3090. What compromises does ray tracing bring that other fancy, expensive graphical effects don't?

I see RT in the same light as I see PhysX.

When PhysX first came out, it required a dedicated accelerator card. Then, even after it was purchased by NVIDIA, it still required an NVIDIA GPU for acceleration for a long while. Now... it's middleware that doesn't require anything fancy.

Now, RT isn't exactly the same as physics software. But I think we will see some parallels.

I play exactly 0 games that support RT of any flavor. There are games on my radar that do support RT - but none of them ~require~ RT.

In the same vein, back in the day a handful of high-profile games supported PhysX acceleration, but none of them required it. At least until the API was tweaked to be scalable enough to run on something that didn't have acceleration available - now physics middleware (PhysX included) is everywhere, in almost every title, and will run (to some degree) on almost every machine.

It was cool to watch Batman's cape flutter in the wind and trash blow around, but it didn't fundamentally change the game. Now, every game supports that to some degree, and it doesn't require anything specific to run.

I think RT is going to take the same avenue. Today - it requires specific, high-horsepower cards and provides some graphical flair, but nothing requires it. Tomorrow - it will probably run on every GPU produced, but still won't be "required". The day after - it will run on a potato, nearly every title will have some RT to some degree, and we won't be talking about it anymore.

By the time we get to titles that require RT, the API will look completely different than it does today, the hardware will be completely different than today, and the GPU you bought today to run RT will be inconsequential. (And hopefully we can buy today's hardware by then...)

If you want to pay for the privilege of running RT now - I have no problem with you doing so. I applaud it, in fact, as it helps push it forward as a standard. But you shouldn't go assuming that everyone will be willing to pay hundreds (thousands) of dollars just to have better shadows and ultra-reflective mud puddles. For me, RT would be a bonus, but I will buy whatever is available; RT isn't even a consideration in that. And given the choice between rasterization speed and RT speed, I'd pick the faster rasterizing card every time, because (almost) ~every~ title requires rasterization to some degree, while only a handful support RT for some additional effects.
 
Better have your 1,000W+ PSU ready on standby.

And no way will these be retail prices; it will be double this. I see 3080s well over $1K even from the likes of Newegg and MC (when they have stock).
I'm running a 3090 and 9900K on a 750W PSU just fine.
 
Stupid price knowing that the 6900 XT is cheaper and faster in rasterisation.

As much as I enjoy my RT games, the fact is that today's RT games bring too many compromises to gaming, and until these lazy devs and the hardware evolve, RT gaming over rasterisation gaming is, for this generation sadly, only a pipe dream.

I'd love to know where you are finding 6900 XTs for less than $1,100. And if you do find one, buy it, and I'll gladly send you the money for it.
 
Better have your 1,000W+ PSU ready on standby.

And no way will these be retail prices; it will be double this. I see 3080s well over $1K even from the likes of Newegg and MC (when they have stock).

It won't require a 1,000-watt power supply. Around 850W is all you really need if it's a quality unit.

I'm running a 3090 and 9900K on a 750W PSU just fine.

Not having given you any trouble at this point isn't the same thing as "just fine." The fact is, depending on how your system is configured, you are pretty close to the maximum output that a 750W PSU can handle. I've seen overclocked 9900Ks easily pull close to 300W, and a 3090 FE can pull about 350W as well. While games won't generally push your CPU too hard, some, like Cyberpunk 2077, can. Do the math and you are already somewhere around the 650W range, and that's before adding in fans, drives, RAM, motherboard, etc.

A quality unit should be able to handle that, but it's likely going to have a shorter lifespan than it would if it were sized for the system with a bit more headroom. That being said, if it's working now, I wouldn't rush out to replace it, and though NVIDIA does recommend 850W+ for an RTX 3090 FE, we all know they do this to help account for unknown system configurations and to give themselves headroom when users have less-than-stellar PSUs. If you have a lean configuration and don't play Cyberpunk 2077 all the time, it will probably last a while. Again, most games won't push a CPU super hard.
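As a rough illustration of that math (the CPU and GPU figures are the estimates from this post; the rest-of-system number is an assumption):

```python
# Rough power-budget sketch using the estimates above. The CPU and GPU
# figures are from the post; the rest-of-system figure is an assumption.
cpu_peak_w = 300   # overclocked 9900K, worst case
gpu_peak_w = 350   # RTX 3090 FE at its stock power limit
rest_w = 75        # fans, drives, RAM, motherboard (assumed)

peak_draw_w = cpu_peak_w + gpu_peak_w + rest_w
psu_rating_w = 750
print(f"Estimated peak draw: {peak_draw_w}W "
      f"({peak_draw_w / psu_rating_w:.0%} of a {psu_rating_w}W unit)")
# ~725W, i.e. ~97% of a 750W PSU -- which is the headroom concern.
```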
 
It won't require a 1,000-watt power supply. Around 850W is all you really need if it's a quality unit.

Not having given you any trouble at this point isn't the same thing as "just fine." The fact is, depending on how your system is configured, you are pretty close to the maximum output that a 750W PSU can handle. I've seen overclocked 9900Ks easily pull close to 300W, and a 3090 FE can pull about 350W as well. While games won't generally push your CPU too hard, some, like Cyberpunk 2077, can. Do the math and you are already somewhere around the 650W range, and that's before adding in fans, drives, RAM, motherboard, etc.

A quality unit should be able to handle that, but it's likely going to have a shorter lifespan than it would if it were sized for the system with a bit more headroom. That being said, if it's working now, I wouldn't rush out to replace it, and though NVIDIA does recommend 850W+ for an RTX 3090 FE, we all know they do this to help account for unknown system configurations and to give themselves headroom when users have less-than-stellar PSUs. If you have a lean configuration and don't play Cyberpunk 2077 all the time, it will probably last a while. Again, most games won't push a CPU super hard.

It's not like quality 1,000W PSUs are expensive either. Why someone buying top-tier CPUs and GPUs would skimp on the PSU makes no sense to me.
 
It's not like quality 1,000W PSUs are expensive either. Why someone buying top-tier CPUs and GPUs would skimp on the PSU makes no sense to me.

I wouldn't call them cheap, but if you bought an RTX 3090, it stands to reason you can probably afford a PSU that can handle it comfortably.
 
It won't require a 1,000-watt power supply. Around 850W is all you really need if it's a quality unit.

Not having given you any trouble at this point isn't the same thing as "just fine." The fact is, depending on how your system is configured, you are pretty close to the maximum output that a 750W PSU can handle. I've seen overclocked 9900Ks easily pull close to 300W, and a 3090 FE can pull about 350W as well. While games won't generally push your CPU too hard, some, like Cyberpunk 2077, can. Do the math and you are already somewhere around the 650W range, and that's before adding in fans, drives, RAM, motherboard, etc.

A quality unit should be able to handle that, but it's likely going to have a shorter lifespan than it would if it were sized for the system with a bit more headroom. That being said, if it's working now, I wouldn't rush out to replace it, and though NVIDIA does recommend 850W+ for an RTX 3090 FE, we all know they do this to help account for unknown system configurations and to give themselves headroom when users have less-than-stellar PSUs. If you have a lean configuration and don't play Cyberpunk 2077 all the time, it will probably last a while. Again, most games won't push a CPU super hard.
[Screenshot: CPU power draw during stress testing]

My CPU peaked at about 220W when I was stress testing my 5GHz all-core overclock with AVX in Prime95. It hovers right around 100W when gaming. Those must have been some crazy overclocks if it was seeing 300W.

I'm pulling around 400W right now on my 3090, but I have not tried overclocking it yet (just nudging the power limit). The FTW3 Ultra Hybrid can go up to 500W if you want, but I don't really see a need to.
 
In going back and looking at my data, the power draw on the 9900K was actually closer to 250W than 300W. Regardless, Skylake variants are power hungry bitches. When you push them past a certain point, they need a disproportionate amount of power to increase their clock speeds by modest amounts. 5.1GHz or 5.2GHz is the top end of what the 9900K can do on water, but that will solidly place it in the 250W range. Total system power draw at even 5.0GHz all core came out to about 285W in my testing.

Here, you can see the 9900K at stock speeds.
[Screenshot: 9900K power draw at stock speeds]

When overclocked, we see about another 22W of power consumption from that alone, as no other variables changed.

[Screenshot: 9900K power draw when overclocked]

Only about 10-15W of that was the idle GPU. The rest was motherboard, CPU, RAM, two NVMe drives, and the water pump. That's doing the math with HWInfo64 reporting about 240-250W of power draw, if memory serves. So, not exactly 300W, but again, you'd still be close to 650W either way. That's still assuming you have a lean configuration and not 10 hard drives, a custom cooling loop, and a million RGB LEDs.

Going to 5.1GHz would likely cause considerable additional draw, as it does with the Cascade Lake parts, which are themselves Skylake++++++++++++++++++++++++++++. You can see that with the 10980XE, which went from boosting to 4.8GHz on a couple of cores to a 4.7GHz all-core overclock, and well... you can see what that does to the power draw.

The point is, with your setup I'd want a bit more headroom from my power supply, but that's just me. I change my system configuration more often than most people change the oil in their cars. As a result, I never know what I'll be running in six months' time and don't want to change my power supply every time I upgrade my machine.
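Plugging the revised ~250W CPU figure into the earlier budget (the rest-of-system number is still an assumption) lands near the same total:

```python
# Updated sketch with the revised ~250W CPU figure from this post.
# The rest-of-system estimate remains an assumption.
cpu_oc_w = 250   # 9900K at ~5.0GHz, per the HWInfo64 readings above
gpu_w = 350      # RTX 3090 FE
rest_w = 50      # board, RAM, NVMe drives, pump (assumed)

print(f"Estimated peak draw: {cpu_oc_w + gpu_w + rest_w}W")  # ~650W either way
```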
 
It's really sad to see the make-believe MSRPs on these units.
First, there will be only a handful released this year.
Second, the scalper price will be three times that.

I have really given up on upgrading this year.
Maybe this fall I'll check the e-tailers again and see if I can pick up a new GPU, but I'm not holding my breath.

Sadly, I've not played a single minute of a game since this past summer.
 
I'm running a 3090 and 9900K on a 750W PSU just fine.
My AIO 1080 Ti and 9900K didn't blink on a 650W Seasonic that I'd bought used years ago. That PSU is still going strong in another build; I'd swapped in an 850W EVGA that I'd bought as a contingency and simply haven't bothered to swap them back.

Granted, the 1080 Ti wasn't as hungry as a 3090, IIRC, but I've definitely been surprised at how little power some stuff uses if you're not pushing it to the edge!
 
Just to prove a point: back when I first built my rig - a 4790K and a 980 (non-Ti) - I ran it on a 450W PSU.

It ran just fine at stock. For a couple of years. Then it started acting up. Swapped out to a 650W at that point, and it was back to normal.
 