NVIDIA GeForce RTX 4090 Is Only Worth $700, according to Micro Center

Tsing

The FPS Review
Staff member
Joined
May 6, 2019
Messages
12,660
Points
113
The GeForce RTX 4090, NVIDIA's flagship gaming GPU, is worth no more than $700, according to screenshots shared online by Micro Center customers. The images show the retailer offering as little as $699.95 for custom versions of the GPU as part of its trade-in program, which turns old hardware into instant in-store credit.

See full article...
 
Given that they still keep melting in droves, it's a safety risk.
Too true. I saw those posts on Reddit over the weekend, but I still have my doubts about user error and/or third-party adapters/cables. So far I've had three of v1 and now one of v2, and all have worked out fine, but I have put extreme effort into making sure they are all connected properly.
 
Re: the 12VHPWR melting issue, it is quite clearly a problem with the inherent design of the cable and plug. Pushing that much power, with narrow margins of error and tight tolerances, through wires that are smaller and closer together, is a design issue, and PCI-SIG should never have qualified the thing for production.
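The "narrow margin" point can be put in rough numbers. A minimal sketch, assuming even load sharing across pins and the commonly cited per-pin ratings (roughly 8 A per circuit for the Mini-Fit style 8-pin, roughly 9.5 A for the Micro-Fit style 12VHPWR terminals — check the actual connector datasheet, these are assumptions):

```python
# Hedged estimate of per-pin current headroom: PCIe 8-pin vs 12VHPWR.
# Ratings below are assumptions from commonly quoted figures, not a spec citation.

def per_pin_current(power_w: float, voltage_v: float, power_pins: int) -> float:
    """Current each 12 V supply pin carries, assuming perfectly even load sharing."""
    return power_w / voltage_v / power_pins

# PCIe 8-pin: 150 W spec over 3 x 12 V supply pins
eight_pin = per_pin_current(150, 12, 3)   # ~4.2 A per pin vs an assumed ~8 A rating

# 12VHPWR: 600 W spec over 6 x 12 V supply pins
hpwr = per_pin_current(600, 12, 6)        # ~8.3 A per pin vs an assumed ~9.5 A rating

print(f"8-pin:   {eight_pin:.2f} A/pin, {eight_pin / 8.0:.0%} of assumed rating")
print(f"12VHPWR: {hpwr:.2f} A/pin, {hpwr / 9.5:.0%} of assumed rating")
```

Under these assumptions the 8-pin runs at about half its per-pin rating, while 12VHPWR sits near 90% — so a single pin making poor contact (a partially seated plug, a worn terminal) has far less headroom before it overheats.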
 
For sure it is a crap design, but it can work. I think the concept of a 12-pin, or some kind of design with more than 8 pins, makes sense, since the 3x 8-pin thing was getting ridiculous, but I do wonder why they couldn't pick a bigger one. I guess I'll count my blessings with my luck so far, because it has been on my mind, and I do periodically check all the cards I have that use it.

However, in a world where the following question still solves so many problems, one does have to wonder yet again why NV ever thought the general populace could navigate such a weak connector.

 
For sure it is a crap design but it can work. I think the concept of a 12-pin or some kind of design with more than 8-pins makes sense...

I don't want to put words into their mouths, so take what I'm about to say with a grain of salt. Allegedly, NVIDIA thought they could control the production, supply, and quality of components, and in a perfect world, they'd be right: if the components are of sufficient quality, then the tolerances are there, and it works fine. The problem is that all things are never equal. Components of lesser quality, with weaker quality control, make it into manufacturing; you ultimately cannot control the entire supply chain, and thus you end up with components that have variable tolerances.

That all said, if PCI-SIG had designed the connector with higher base tolerances in the first place, like the PCIe 8-pin, then the margins would have been there (like they were) for the new connector, even with lower-quality components.

NV is not all to blame. I do not blame them at all for wanting to push and progress the nature of the power connector; reducing the number of power cables and requiring less space on the PCB are good goals to pursue.
 
However, in a world where the following question still solves so many problems, one does have to again wonder why NV ever thought the general populace could navigate dealing with such a weak connector.
After having owned three 4090s, all from different manufacturers, I haven't had an issue. I admit I jumped ship there for a while because of the possibility of melting and the power draw associated with it. Then I decided to get a 4090 FE card, and again no issues at all. I still believe that most of this problem is on the end user. You can tell people all you want to make sure they are plugged in fully, but they will do what they want. Unfortunately, many people floating around on social media think they are experts on computers and just disregard any sound advice.
 
NV is not all to blame, I do not blame them at all
I totally agree with everything you said. I was trying to fathom a reason for this particular choice, and I think I might be onto something: ergonomics, plain and simple. A possible strategy of applying it to all cards means that at some point an AIB could make a really small or slim card, and something like this could be ideal. Just pulling that out of the air, but I was trying to find other positive, productive lines of thought. I've thought about motherboard-style connectors, or similar, but those can go to the other extreme of too much.
 
After having owned three of the 4090's all from different manufacturer's I haven't had an issue...

Unfortunately, the issue is bigger and more complex than just "the user is plugging it in wrong." While that can contribute, it is not inherently the main issue with the new cable and connector; it is ultimately a design choice that has caused these problems. It needs to go back to the drawing board.
 
Talk about ironic. I've strongly wanted to do an ITX build for a while now, and that Sharkoon C20 case I posted about the other day looks like an ideal candidate for what I want to do. It checks pretty much all the boxes for what I'm thinking about, including the ability to fit the monstrous Zotac 4080 Super I got recently. It's been incredibly happy in a HAF X case, but I think the Sharkoon's design could achieve the same results.

However, thinking about what Brent has said, plus taking into account that all my good experiences are probably directly related to using much larger cases that give that connector a lot of clearance before the bend, now I'm not so sure about wanting to go that route. Sigh, truly ironic.
 
The sad part is that even if it were priced at that, I'm pretty sure scalpers would screw it up, just like they did with the RTX 3080, when everyone praised what a great value it was and then nobody could find one for under $1,500. I know that had to do with crypto, but that insanity still exists anytime a powerful card comes out at a good price.

On the flip side, and of no real help to most, I have been noticing more and more AIB cards becoming available at their MSRP, but even then that's $300-500 more than the non-existent FE model.
 
Unfortuntely, the issue is bigger and more complex than just 'the user is plugging it in wrong'...
I agree and I'm not in any way defending the design. I'm just saying there are probably a lot of issues with user errors out there, which of course comes back to the design.
 
Talk about ironic. So I've strongly wanted to do an ITX build for a while now and that Sharkoon C20 case I posted about the other day looks like an ideal candidate...

I think that the sheer size of the 4090, in combination with the somewhat flawed design of the connector and the lack of space inside some (a lot of) cases, causes these issues, which you could in part also consider user error.
 
I think that the shear size of the 4090 in combination with the somewhat flawed design of the connector...
I totally agree, and it's something I've always thought about when I read the posts on Reddit, since some even mention how their case didn't really allow enough space for proper installation. But like Brent said, it is a problematic design. I also agree that users shouldn't have to put so much attention into such things because of a design that is inherently fragile for the application it's being used for.

The writing has been on the wall for some time that something new was needed. While it's easy to point fingers at any GPU manufacturer in regard to power efficiency, things have actually improved when you consider the increased performance of some cards. That video in the 1080 Ti thread from GN touches on that topic by explaining how some later top-end cards that outperformed it used the same or less power. But we have gotten to a point with the 4090 and 7900 XTX where cards need enough power to justify 3x 8-pin connectors, which is insane to think about compared to 5+ years ago, even if their performance advantage over the 1080 Ti scales roughly in line with that extra power.

Max TDP for the 1080 Ti was 250W, while the 4090 is rated at 450W stock with a 600W maximum power limit. According to user benchmark scores, the average performance difference is roughly similar to the power difference, so there you have it. Now, granted, most users will rarely have either of these cards peak that high, but even normal use scenarios would likely scale similarly.
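Taking the thread's figures at face value — 250W vs. a 600W peak, with the performance gap tracking the power gap — the performance-per-watt works out about even. A quick sketch (the 2.4x performance ratio is an assumption derived from the claim above, not a measured benchmark):

```python
# Rough perf-per-watt comparison using only the figures quoted in this thread.
# relative_perf is normalized to the 1080 Ti = 1.0; the 4090's 2.4x figure is
# an assumption (600 W / 250 W) implied by the post, not benchmark data.

def perf_per_watt(relative_perf: float, power_w: float) -> float:
    """Performance per watt, given relative performance and peak power draw."""
    return relative_perf / power_w

gtx_1080_ti = perf_per_watt(1.0, 250)   # baseline card at its 250 W max TDP
rtx_4090    = perf_per_watt(2.4, 600)   # assumed ~2.4x perf at the 600 W limit

print(f"1080 Ti: {gtx_1080_ti:.4f} perf/W")
print(f"4090:    {rtx_4090:.4f} perf/W")  # identical under these assumptions
```

If the 4090 is instead held to its 450W stock TDP with the same 2.4x performance, the perf/W picture improves considerably, which is the GN point about later top-end cards referenced above.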


Jump back to the end of the SLI/Crossfire era and we had already reached that point, or more, when factoring in how many power cables various mGPU setups needed. Those running two cards with two connectors each were already at 4x connectors, and the ones with three connectors each needed 6x 8-pin connectors. So yeah, the writing was on the wall that something new was needed, but I think we can all agree that 12VHPWR wasn't it.
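The cable-count arithmetic above is easy to tally. A minimal sketch, assuming the standard 150W-per-8-pin spec and ignoring the 75W the PCIe slot itself supplies:

```python
# Total 8-pin cable capacity for the mGPU setups described above.
# Assumes the 150 W-per-connector PCIe spec; slot power is excluded.

EIGHT_PIN_W = 150  # PCIe spec limit per 8-pin connector

def cable_capacity(cards: int, connectors_per_card: int) -> int:
    """Combined power capacity of all 8-pin cables in a multi-GPU setup."""
    return cards * connectors_per_card * EIGHT_PIN_W

print(cable_capacity(2, 2))  # 2 cards x 2 connectors = 4 cables, 600 W
print(cable_capacity(2, 3))  # 2 cards x 3 connectors = 6 cables, 900 W
```

So a dual-card, triple-connector SLI rig already had more cable capacity on tap than a single 600W 12VHPWR feed, just spread across six plugs instead of one.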
 
The price is probably set by depreciation value, not overinflated, idiotic street prices. Remember, even a Rolls-Royce luxury car depreciates quickly; we know someone whose $1 million Drophead RR dropped to $500k in about two years.
 