Diablo IV Players Claim Blizzard’s New Game Is Bricking NVIDIA GeForce Graphics Cards: “RIP my RTX 3080 Ti”

Tsing
The FPS Review Staff member
Demons and monsters may not be the only things that Diablo IV players are inadvertently killing. In a scenario that seems reminiscent of the GPU drama that surrounded Amazon's MMORPG, New World, a growing number of Diablo IV players are taking to social media and other online channels to complain about how Blizzard's new game is crashing, or, worse, bricking, NVIDIA GeForce RTX 3080 Ti graphics cards. Some of the vendors that have been named include GIGABYTE and EVGA, and while many have suggested that this is strictly a hardware issue and that the developer isn't to blame, Blizzard has confirmed that it is investigating reports of the GPU issues, which appear to be increasing. Diablo IV is currently available to play as part of an early beta phase, while its open beta will run from March 24 to 26.

See full article...
 
Here we go again.

No game or compute load is capable of bricking any GPU, unless the GPU has some other defect.

There are no exceptions to this rule.

I remember when Civilization V first came out and the 2K forums were hit by a stream of players claiming that Civilization V had bricked their 9800 GTs, when the true culprit was Nvidia's well-publicized solder issue, which was never properly dealt with.

It wasn't true for Civilization V, it wasn't true for New World and it isn't true for Diablo IV.

It won't ever be true for any software, unless it is something like MSI Afterburner messing with voltages.
 
No, it's possible for a game to run a card at very high framerates in menus, cinematics, or certain maps/modes, which can expose flaws in weak GPU components. I've run into two games recently that ramp up my GPU when they shouldn't. One is a new card game that, at 60 fps, ramps the fans to 2,500 rpm unless I cap it at 30 fps in the NVIDIA Control Panel, despite minimum specs of a first-gen Intel CPU and a GTX 770. The other, Total War: Warhammer III, has had a known issue with its campaign map for a year now that makes some cards run at 100% utilization with high fan speeds, while the battles usually run smoother even with hundreds of units on screen.
 
No, it's possible for a game to run a card at very high framerates in menus, cinematics, or certain maps/modes, which can expose flaws in weak GPU components.

Yes, that is possible.

But, as a general rule:

Hardware should always protect itself from runaway conditions. Failure to do so is a flaw, whether that means otherwise weak components being exposed, manufacturing issues, "user error", or inadequate thermal protection.

Running at 100% utilization with the fans ramped up isn't a flaw per se; it's a lack of optimization. Bricking the card, however, definitely is a flaw.
 
Here we go again.

No game or compute load is capable of bricking any GPU, unless the GPU has some other defect.

There are no exceptions to this rule.

I remember when Civilization V first came out and the 2K forums were hit by a stream of players claiming that Civilization V had bricked their 9800 GTs, when the true culprit was Nvidia's well-publicized solder issue, which was never properly dealt with.

It wasn't true for Civilization V, it wasn't true for New World and it isn't true for Diablo IV.

It won't ever be true for any software, unless it is something like MSI Afterburner messing with voltages.
It's not that simple. The game is doing the equivalent of holding a car at the rev limiter constantly for no reason. That shouldn't brick the engine, but you know it will if done too much, especially with older engines tuned for performance rather than reliability. The problem is that GPUs have no hard rev limiter. Whether there should be one is another question, and one that must be put to NVIDIA.
 
The reference 30 series is known to have voltage/transient spike issues.

If software exposes this hardware defect that is still on the hardware.

Imagine if a CPU randomly spiking to 100% caused the CPU to fail. It wouldn't be on the software.

The engine comparison would be: redline is 6,000 rpm, but when you told the engine to go to 6,000 rpm, for a split second it instead spiked past 9,000 rpm.


I don't see how high framerate would do damage or expose underlying GPU flaws.

You could have one older, lighter title load the GPU less at 300 fps at lower resolutions, and a modern title at high resolutions load the GPU more at 60 fps.

Framerate is a mostly irrelevant metric here.
 
I don't see how high framerate would do damage or expose underlying GPU flaws.

You could have one older and lighter title load the GPU less at 300fps at lower resolutions, and have a modern title at high resolutions load the GPU more at 60fps.

Framerate is a mostly irrelevant metric here.
In this scenario the in-game framerate was capped, but in a cutscene it would likely jump to 1,000+ fps in an instant, which can either trip OCP and/or kill any card not built to handle the sudden increase in load.
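Uncapped menus and cutscenes are exactly the case a simple in-engine frame limiter guards against. A minimal sketch of the idea (the `FrameLimiter` class and the 100 fps target are illustrative assumptions, not anything from Diablo IV's actual engine):

```cpp
#include <chrono>
#include <thread>

// Hypothetical frame limiter: sleeps out the remainder of each frame's time
// budget so a menu or cutscene can't render thousands of frames per second.
class FrameLimiter {
public:
    explicit FrameLimiter(double target_fps)
        : frame_time_(1.0 / target_fps) {}

    // Call once per frame after rendering; blocks until the frame budget is used.
    void wait() {
        auto now = std::chrono::steady_clock::now();
        double elapsed = std::chrono::duration<double>(now - last_).count();
        if (elapsed < frame_time_) {
            std::this_thread::sleep_for(
                std::chrono::duration<double>(frame_time_ - elapsed));
        }
        last_ = std::chrono::steady_clock::now();
    }

private:
    double frame_time_;  // seconds per frame
    std::chrono::steady_clock::time_point last_ =
        std::chrono::steady_clock::now();
};
```

Drivers offer a similar safety net externally (e.g., the Max Frame Rate setting in the NVIDIA Control Panel), but shipping a cap like this in menus and cutscenes costs the developer nothing and avoids pointless multi-thousand-fps bursts.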
 