News broke yesterday of NVIDIA's alleged launch schedule for its upcoming GeForce RTX 40 Series, including a supposed CES 2023 unveiling date for a newly mentioned (but predictable) GeForce RTX 4060 graphics card.
Kinda funny how the RTX 4000 cards are shaping up like the GTX 400 line... hot and hungry!
They'll release a 4030 that'll be gimped to oblivion. Look at the 1630 they're set to release: a 64-bit memory bus is rumored to be in play. If that comes to fruition, I wouldn't be surprised to see a 4030 built the same way. My question is: how are they going to target the lower tiers? What will be the 'level' of a GPU that can run entirely off PCIe slot power, like a 1050 Ti can?
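For a sense of why a 64-bit bus would be so crippling, the back-of-the-envelope bandwidth math looks like this. The 1630's memory spec is only rumored, so the 12 Gbps GDDR6 figure below is an assumption; the 1050 Ti numbers (128-bit bus, 7 Gbps GDDR5, running inside the 75 W a PCIe x16 slot supplies) are the shipping spec.

```python
# Rough peak-bandwidth math: GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1050 Ti: 128-bit bus, 7 Gbps GDDR5 (slot-powered, within the 75 W PCIe x16 budget).
print(bandwidth_gbps(128, 7.0))   # 112.0 GB/s
# Rumored GTX 1630: 64-bit bus with 12 Gbps GDDR6 (assumed figure).
print(bandwidth_gbps(64, 12.0))   # 96.0 GB/s
```

Even with much faster GDDR6, halving the bus width would leave the rumored card with less peak bandwidth than a six-year-old 1050 Ti, which is roughly what "gimped" cashes out to.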
nVidia's idea of efficiency = throw MOAR POWAH at it! Sure, it'll run laps around the competition, but the efficiency will blow, and not in a good way!

Both major companies have teetered in both directions - this is really the first instance in quite some time that we're looking at AMD advancing in both performance and efficiency enough to highlight Nvidia's potential lack thereof, IMO.
So, I hope they don't do that - I hope they take the low-end segment more seriously than they have been, and than AMD seems to be doing, and hopefully Intel is taking it seriously enough to spur some competition here too.
Oh, I know AMD is all too guilty of this as well. It's been a LONG time since nVidia has been forced to do it, though - it just stands out more. AMD's video selection still hasn't been able to fully catch up, and nVidia is trying to make sure the nail is firmly planted in their coffin.
It'd be nice for a US$100 video card to actually be worth... US$100.
I think AMD is doing pretty well, IMO. They're behind on RT, but not materially - their tech works, it just needs more grunt. They've made progress on smart upsampling despite lacking a technological equivalent to DLSS, so the big question in my eyes is ongoing support for video encoding for streaming. But even Intel will beat Nvidia to the punch here if they deliver Arc before the RTX 4000-series GPUs hit.
Over AMD, Nvidia actually has some production uses that AMD is lacking in its cards: known compatibility with various encoders, the background sound and video elimination enhancements you can run for free on RTX cards... stuff like that. As an AMD card owner, that's the stuff I'd want to tinker with but can't. I don't run a webcam on my desktop, though I've considered it... if I were running an RTX card. I'd like to see AMD eventually reach some parity in those feature spaces as well.

This! AMD needs to step up to the plate with some real encoder support and utilities that make use of the power they've got. I steered clear of their video cards strictly because of this.
Using antiquated GPU IP in their APUs has been at least partly to blame for this - outdated tech with questionable software support is not particularly conducive to drumming up support from the software development community.
Well, the flip side of that is... the tech is mature, and the software ~should~ be plentiful with widespread support.
Even if the tech were perfect - and this being pre-RDNA, we can't draw conclusions about current hardware, though all indications point to it not having been - a lack of sync between APUs and GPUs works against adoption. APUs lagging works against it even more, as we'd expect those to be more plentiful - except that AMD chose not to include graphics capabilities in their main SKUs.
The fact that it never really was, I think, is indicative of the problem AMD has.