NVIDIA GeForce RTX 4060 to Consume More Power than GeForce RTX 3070 (220 W): Report

Tsing

The FPS Review staff member
News broke yesterday of NVIDIA's alleged launch schedule for its upcoming GeForce RTX 40 Series, including a supposed CES 2023 unveiling date for a newly mentioned (but predictable) GeForce RTX 4060 graphics card.

 
My question is: how are they going to target the lower tiers? What will be the 'level' of a GPU that can rely entirely on PCIe slot power like a 1050 Ti can?
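For context, a PCIe x16 slot is specified to deliver up to 75 W on its own, which is what lets cards like the 1050 Ti skip auxiliary power connectors entirely. A minimal sketch of the budget math (connector limits per the PCIe CEM spec; the function name is just for illustration):

```python
# Back-of-the-envelope PCIe power budget (limits from the PCIe CEM spec).
# A x16 slot alone supplies up to 75 W; auxiliary connectors add headroom.
SLOT_WATTS = 75  # PCIe x16 slot limit
CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}

def board_power_limit(connectors: list[str]) -> int:
    """Upper bound on board power for a given auxiliary-connector loadout."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power_limit([]))         # 75 W:  slot-only cards like the 1050 Ti
print(board_power_limit(["8-pin"]))  # 225 W: covers the rumored 220 W figure
```

By that math, the rumored 220 W 4060 would need at least one 8-pin connector on top of the slot, while anything slot-only has to live under 75 W.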
 
nVidia's idea of efficiency = throw MOAR POWAH at it! Sure, it'll run laps around the competition, but the efficiency will blow, and not in a good way!
 
To target the lower tiers, they'll release a 4030 that's gimped to oblivion. Look at the 1630 they're set to release: a 64-bit memory bus is rumored to be in play. If that comes to fruition, I wouldn't be surprised to see a 4030 built the same way.
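For a sense of what a 64-bit bus means in practice: peak memory bandwidth is just bus width (in bytes) times the per-pin data rate. A quick sketch, with the 12 Gbps GDDR6 figure taken from the 1630 rumors and treated as illustrative:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gb_s(64, 12.0))   # 96.0 GB/s  -- rumored 1630-class config
print(mem_bandwidth_gb_s(128, 12.0))  # 192.0 GB/s -- a typical entry/midrange bus
```

Halving the bus halves the bandwidth at the same memory speed, which is exactly why a 64-bit card tends to end up "gimped to oblivion" regardless of what GPU sits behind it.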
 
On the efficiency point, both major companies have swung in both directions over the years. This is really the first time in quite a while that AMD looks to be advancing in both performance and efficiency enough to highlight Nvidia's potential lack thereof, IMO.

So, I hope they don't gimp a 4030 like that. I hope they take the low-end segment more seriously than they have been, and more seriously than AMD seems to be doing - and hopefully Intel is taking it seriously enough to spur some competition here too.

It'd be nice for a US$100 video card to actually be worth... US$100.
 
Oh, I know AMD is all too guilty of throwing power at the problem as well. It's been a LONG time since nVidia has been forced to do it, though, so it just stands out more. AMD's video card selection still hasn't been able to fully catch up, and nVidia is trying to make sure the nail is firmly planted in their coffin.
 
I think AMD is doing pretty well, IMO. They're behind on ray tracing, but not materially: their tech works, it just needs more grunt. They've made progress on smart upscaling despite lacking a true technological equivalent to DLSS, so the big question in my eyes is ongoing support for video encoding for streaming. Even Intel will beat Nvidia to the punch there if they deliver Arc before the RTX 4000-series GPUs hit.
 
Nvidia actually has some production uses that AMD is lacking in its cards: known compatibility with various encoders, the background-noise and video-background elimination features you can run for free on RTX cards... stuff like that. As an AMD card owner, that's the kind of thing I'd want to tinker with but can't. I don't run a webcam on my desktop, though I've considered it... and probably would if I were running an RTX card. I'd like to see AMD eventually reach some parity in those feature spaces as well.
 
This! AMD needs to step up to the plate with some real encoder support and utilities making use of the power they've got. I steered clear of their video cards strictly because of this.
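On the encoder point, the vendor split is easy to probe on your own machine through FFmpeg, which names the hardware encoders per vendor (NVENC for NVIDIA, AMF for AMD, Quick Sync for Intel). A small sketch, assuming ffmpeg is on your PATH; note that `-encoders` only reports what the build was compiled with, not whether a matching GPU is actually present:

```python
import subprocess

# FFmpeg's names for each vendor's hardware video encoders.
HW_ENCODERS = {
    "NVIDIA (NVENC)": ["h264_nvenc", "hevc_nvenc", "av1_nvenc"],
    "AMD (AMF)": ["h264_amf", "hevc_amf"],
    "Intel (Quick Sync)": ["h264_qsv", "hevc_qsv", "av1_qsv"],
}

def compiled_hw_encoders() -> dict[str, list[str]]:
    """List the hardware encoders this ffmpeg build was compiled with."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {vendor: [enc for enc in encoders if enc in out]
            for vendor, encoders in HW_ENCODERS.items()}

for vendor, encoders in compiled_hw_encoders().items():
    print(f"{vendor}: {', '.join(encoders) or 'none in this build'}")
```

AV1 hardware encoding (av1_qsv on Arc) is exactly the "Intel beats Nvidia to the punch" scenario mentioned above.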
 
Using antiquated GPU IP in their APUs is at least partly to blame for this - outdated tech with questionable software support is not particularly conducive to drumming up support from the software development community.

AMD is rectifying this with Zen 3+ (currently for mobile) and Zen 4 heading to the desktop next. If the hardware and drivers are there, then we're likely to get more interest from software houses.
 
Well, the flip side of that is... the tech is mature, and the software ~should~ be plentiful with widespread support.

The fact that it never really was, I think, is indicative of the problem AMD has.
 
Even if the tech were perfect (this being pre-RDNA, I can't draw conclusions about current hardware), and all indications point to it not having been, the lack of sync between APUs and GPUs works against adoption. The APUs lagging works against it even more, since we'd expect those to be more plentiful - except that AMD chose not to include graphics capabilities in its main SKUs.

And we're comparing against Intel, where SKUs without IGPs are the exception, and Nvidia, the market leader and standard bearer - both of which are broadly supported across consumer and professional product ranges.

Essentially, AMD's tech stood little chance of community acceptance and adoption - outside of the FOSS community, at least. Commercially it was a non-starter, with houses like Blender ignoring it entirely and Adobe not bothering with compatibility testing. The latter has been a pain point for me personally: Adobe products stopped crashing constantly on a desktop I maintain when I swapped out an RX 460 for what should be an inferior GTX 1050 Ti.

AMD choosing to put GPUs in its 'uncore' dies is a huge step forward here, as is the release of Ryzen 6000 mobile SKUs using RDNA IP. With Nvidia pummeling them with feature sets that extend beyond raster performance, and Intel champing at the bit to consume the bottom half of the discrete GPU market with a similar feature set, AMD seems to be in a 'now or never' position - and it's mounting a material response.
 