Report: NVIDIA Sacrifices Clock Speeds (Slightly) to Fix GeForce RTX 3080 Crashes

Tsing (The FPS Review staff)
[Image: NVIDIA GeForce RTX badge]



PCWorld’s Brad Chacos has published a story regarding NVIDIA’s new GeForce Game Ready 456.55 WHQL driver. As suggested by various user reports over the last few hours, the driver does appear to put a band-aid on the game-crashing issues that custom GeForce RTX 3080 owners have been complaining about over the past few days. But there’s a cost: reduced clock speeds.
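If you want to see the effect on your own card, logging the graphics clock while a benchmark runs makes the before/after comparison easy. Below is a minimal sketch that polls nvidia-smi once per second; the query flags are standard nvidia-smi options, but the peak/mean summary is just for illustration.

```python
# Minimal clock logger: polls the graphics clock once per second via
# nvidia-smi so peak boost can be compared before and after a driver update.
# Run it alongside a benchmark and stop with Ctrl+C.
import subprocess
import time

def sample_graphics_clock(gpu_index: int = 0) -> int:
    """Return the current graphics clock in MHz for the given GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip())

samples = []
try:
    while True:
        samples.append(sample_graphics_clock())
        time.sleep(1.0)
except KeyboardInterrupt:
    pass

if samples:
    print(f"peak: {max(samples)} MHz, "
          f"mean: {sum(samples) / len(samples):.0f} MHz over {len(samples)} s")
```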



“…with the original drivers, the [Horizon Zero Dawn] benchmark ran at a mostly consistent 2010MHz on...

 
I saw this coming a mile away.

However, I would counter: why not send AIBs an updated minimum hardware requirement for the reference spec instead? Amend the minimum reference spec AIBs have to design their cards to; that also solves the problem, and we'd retain the full GPU Boost clock.
 
I saw this coming a mile away.

However, I would counter: why not send AIBs an updated minimum hardware requirement for the reference spec instead? Amend the minimum reference spec AIBs have to design their cards to; that also solves the problem, and we'd retain the full GPU Boost clock.

That, and more opportunity for testing.

As this little "scandal" fades, though, no real harm done.
Availability is still a far bigger issue.
 
Also, this just goes to show what a bad choice Samsung 8nm (really a custom 10nm node) was. I don't think this would be happening on TSMC 7nm.
 
Also, this just goes to show what a bad choice Samsung 8nm (really a custom 10nm node) was. I don't think this would be happening on TSMC 7nm.

I don't get it.

This seems like a lack of time and testing, plus poor component choices, rather than the 8nm node being bad...?
I mean, these cards perform very well now (so far).

Are you saying that TSMC 7nm would have behaved differently with the same under-spec'd components?
 
I don't get it.

This seems like a lack of time and testing, plus poor component choices, rather than the 8nm node being bad...?
I mean, these cards perform very well now (so far).

Are you saying that TSMC 7nm would have behaved differently with the same under-spec'd components?
I don't think it was a poor component choice, just going with what has worked in the past. Lack of time and testing was certainly the big factor in this issue. FE cards don't seem to be seeing this issue, so it's one for AIB cards only.
 
I don't think it was a poor component choice, just going with what has worked in the past. Lack of time and testing was certainly the big factor in this issue. FE cards don't seem to be seeing this issue, so it's one for AIB cards only.
Wrong component for this type of boost; if you watch der8auer swapping out the caps, the crashes end... so too much boost for the caps used, no?

Either way, an adjustment in the drivers seems to have fixed that. So far, most testers have reported little to no negative effects that I can tell.
I don't count losing 1-2 FPS...
 
I'll add this here too:

" During testing, we also re-ran the benchmarks, and it had offset effects that are close to zero, meaning at 100 FPS you'd perhaps see a 1 FPS differential, but that can be easily assigned to random anomalies as well. As to why there is so little performance decrease is simple, not many games trigger the GPU all the way the end of the spectrum at say 2050 MHz. That's isolated to very few titles as most games are GPU bound and hover in the 1900 MHz domain."

 
I don't think it was a poor component choice, just going with what has worked in the past. Lack of time and testing was certainly the big factor in this issue. FE cards don't seem to be seeing this issue, so it's one for AIB cards only.

The FE cards don't have the issue because NVIDIA built them above and beyond the minimum reference spec, with 4 POSCAPs + 20 MLCCs.

It's a component issue primarily.
 
Remember: this generation, FE cards are NOT the reference design or spec. NVIDIA sends AIBs a separate, minimum reference spec. That spec says 6 POSCAPs is the minimum.

The Founders Edition is NVIDIA acting like an AIB and making its own card, superseding the reference spec. They put the power filters on the back in a more robust config than the minimum reference spec calls for; hence, no issues.

What has been found, though, is that with GPU Boost raising the clock, 6 POSCAPs aren't 100% stable. They're stable at the reference base clock, but GPU Boost pushes the clock much higher, and they just can't take it. However, simply replacing 1 POSCAP with a group of 10 MLCCs seems to solve the issue. And overall, the fewer POSCAPs and the more MLCCs there are, the better the overclocking potential as well.
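For the electrically curious, the intuition is that a POSCAP is one large polymer cap with comparatively high ESR/ESL, while an MLCC group is many small ceramics in parallel, which divides ESR and ESL across the group and filters fast transients far better. A rough sketch of that comparison using the standard series-RLC capacitor model; every component value below is an illustrative assumption, not a measurement from any actual 3080 board:

```python
# Rough impedance comparison of one bulk POSCAP vs. a 10x MLCC group,
# using the standard series-RLC model for a real capacitor:
#   Z(f) = sqrt(ESR^2 + (2*pi*f*ESL - 1/(2*pi*f*C))^2)
# All component values are illustrative assumptions, not measurements.
import math

def cap_impedance(f_hz, c_farads, esr_ohms, esl_henries):
    w = 2 * math.pi * f_hz
    return math.hypot(esr_ohms, w * esl_henries - 1 / (w * c_farads))

def parallel_group(f_hz, n, c, esr, esl):
    # n identical caps in parallel: capacitance multiplies, ESR/ESL divide.
    return cap_impedance(f_hz, n * c, esr / n, esl / n)

for f in (1e6, 10e6, 100e6):  # transient content rises with boost clock
    poscap = cap_impedance(f, 330e-6, 0.005, 1e-9)        # one 330 uF polymer cap
    mlccs = parallel_group(f, 10, 10e-6, 0.002, 0.3e-9)   # ten 10 uF ceramics
    print(f"{f / 1e6:>6.0f} MHz:  POSCAP {poscap * 1000:8.3f} mOhm   "
          f"10x MLCC {mlccs * 1000:8.3f} mOhm")
```

At low frequency the two look similar, but as the transient frequency climbs, the MLCC group's much lower effective ESL keeps its impedance down while the single POSCAP's impedance soars, which matches the observation that the crashes appear only at the highest boost bins.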
 
No, they had a hardware issue and fixed it with a driver.

Kind of sounds more like they had the wrong settings in the driver, settings the hardware wasn't able to utilize. I mean, isn't that the same with ALL driver issues? Something the driver requests or wants but the hardware isn't capable of providing?
 
So Nvidia has driver issues. This isn't AMD, right?
If we (and AMD) were still trying to figure out what was going on six months later, then it'd be like AMD.

It's definitely still a black mark for Nvidia, and makes two in a row with the 2080 Ti's Space Invaders.
 
Is this going to downgrade all 3080s or just the ones with lesser power delivery? That should be the real question.
 
Kind of sounds more like they had the wrong settings in the driver, settings the hardware wasn't able to utilize. I mean, isn't that the same with ALL driver issues? Something the driver requests or wants but the hardware isn't capable of providing?
Yes, I don't think this is the first time a situation like this has happened on release.
 
It looks more like the AIBs were trying to push the reference design just a tad past what it was capable of. All of the affected cards hit reference clocks with no problem. It's when they tried to boost beyond reference boost clocks that the issue presents itself.

More of a nothingburger; the AIBs just tried to squeeze a bit more performance out than they should have.
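Conceptually, the 456.55 fix amounts to trimming the top opportunistic boost bins off the frequency curve: the card still hits its rated base (1440 MHz) and boost (1710 MHz) clocks, it just no longer climbs into the unstable bins above them. A toy illustration of that idea; the 15 MHz bin step and the 1980 MHz post-fix ceiling are assumptions, not NVIDIA's actual tables:

```python
# Toy model of GPU Boost bins: the driver-side "fix" amounts to capping
# the opportunistic bins above the rated boost clock. The bin step and
# post-fix ceiling are illustrative assumptions, not NVIDIA's tables.
RATED_BASE = 1440    # MHz, reference RTX 3080 base clock
RATED_BOOST = 1710   # MHz, reference RTX 3080 boost clock
STEP = 15            # MHz per boost bin (assumed)

def boost_bins(ceiling_mhz):
    """All boost bins from the base clock up to a given ceiling."""
    return list(range(RATED_BASE, ceiling_mhz + 1, STEP))

old_bins = boost_bins(2010)  # opportunistic peak reported pre-fix
new_bins = boost_bins(1980)  # assumed post-fix ceiling

print(f"bins removed by the cap: {sorted(set(old_bins) - set(new_bins))} MHz")
print(f"rated boost ({RATED_BOOST} MHz) still reachable: {RATED_BOOST in new_bins}")
```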
 
I just think NVIDIA should have beefed up the minimum spec a bit and sent out a higher minimum spec from the beginning; problem solved up front. /shrug
 