AMD: 110-Degree Temps in Radeon RX 5700, 5700 XT Nothing to Worry About

Tsing

Radeon RX 5700 and 5700 XT owners may not like the higher temperatures (e.g., 110 degrees Celsius) they're seeing in their reference cards, but the new thermals are normal and by design. That's according to a recent blog post by AMD, which elaborates on how its new cards measure temperature differently.

While older GPUs utilized a single sensor to report core temperature, the RX 5700 series (and Radeon VII) has multiple sensors spread out across the GPU, which allows for more accurate temperature reporting (there's an implication that pre-Navi cards may have run at similarly high temperatures). This behavior is inherent to AMD's new AVFS (Adaptive Voltage and Frequency Scaling) system, whereby voltage is automatically adjusted for peak performance.
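As a rough sketch of the difference (the sensor count and readings here are made up for illustration, not AMD's actual telemetry):

```python
# Hypothetical sketch of edge vs. junction temperature reporting.
sensors_c = [78, 84, 91, 97, 103, 110]  # made-up readings across the die

edge_temp = sensors_c[0]        # roughly what a single-sensor GPU reported
junction_temp = max(sensors_c)  # hottest spot anywhere on the die

print(f"edge: {edge_temp} C, junction (hotspot): {junction_temp} C")
# AVFS keeps ramping clocks until the *hottest* sensor reaches 110 C,
# so the new junction number naturally reads higher than the old
# single-point figure even if the die isn't running any hotter overall.
```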

This doesn't mean that the efficiency of these GPUs is remarkable, but it should reassure enthusiasts that the cards are operating at safe levels.

The 110-degree junction temperature is not evidence of a problem or a sudden issue with AMD graphics cards. AMD has adopted more sophisticated measuring methods: it now measures GPU temperature in more locations across the die and reports the additional data points that capture this information.
 
I remember when Hawaii came out and people scoffed at the 94C limit, and I thought it was smart to make silicon work better at higher temps. If it's within engineering specifications, then I wouldn't worry. If you like it cooler, there's always water.
 
Toasty!
I keep setting my Wattman to ramp up fan speeds to keep my card a smidge cooler... it keeps resetting and saying something crashed. Good to know I don't need that extra fan cooling power, I guess... :unsure:
 
That's quite a temperature jump!
It looks like they had no choice if they wanted to get the performance up on the smaller node.
Even if it's by design, it won't be good for longevity; solder joints and nearby components will age faster.
It will be unfortunate if the GPU's life is shortened beyond its usefulness.
 
Anyone got a thermal camera (FLIR?) to confirm and/or compare these new sensor points? I don't know a whole lot about the differences between NV and AMD PCBs, but it'd be interesting to see if these temps are really nothing new and just a different measuring spot, as AMD says.
 
Waterblock... enough said.

My 5700 XT runs at 40C overclocked to 2100 MHz.
I use an EKWB block in a custom loop.
 
I remember when Hawaii came out and people scoffed at the 94C limit, and I thought it was smart to make silicon work better at higher temps. If it's within engineering specifications, then I wouldn't worry. If you like it cooler, there's always water.

This assumes that we can trust them to get the thermal properties right. I still remember Nvidia's issues in the 8800-to-9800 era, when their solder just wasn't up to handling the temperatures and frequent heat/cool cycles that the GPUs produced.

I'd like to think that AMD would do this better when they are wandering into new temperature territory, but who knows?
 
That's quite a temperature jump!
It looks like they had no choice if they wanted to get the performance up on the smaller node.
Even if it's by design, it won't be good for longevity; solder joints and nearby components will age faster.
It will be unfortunate if the GPU's life is shortened beyond its usefulness.

Well, that depends on the area affected by that temperature and how far the heat spreads. The 110C reading could be an effect of the larger number of temperature sensors: the hotspot may be severely limited in area, and hopefully the heat is being dissipated into the heatsink.
 
Other than endothermic reactions, I'm not sure of anything that runs better at high temperatures.
 
Other than endothermic reactions, I'm not sure of anything that runs better at high temperatures.

Well, there is one advantage of higher temps, and that is the heat transfer rate.

[attached image: heat transfer rate equation]

So, heat will transfer faster if there is a greater temperature difference.

This means that if whatever you are trying to transfer heat away from can tolerate a higher temp, you can run your fans at a lower (quieter) speed.
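To put rough numbers on that, here's a back-of-the-envelope sketch of Newton's law of cooling; the area, ambient temperature, and power figures are assumed, not measured:

```python
# Q = h * A * (T_surface - T_ambient); all numbers below are assumed.
A = 0.05          # effective heatsink area, m^2 (assumed)
T_AMBIENT = 30.0  # case air temperature, C (assumed)
POWER = 180.0     # heat to remove, W (ballpark for a 5700 XT under load)

for t_surface in (75.0, 95.0, 110.0):
    # Convective transfer coefficient required to move POWER watts:
    h = POWER / (A * (t_surface - T_AMBIENT))
    print(f"{t_surface:5.1f} C surface -> h = {h:6.1f} W/(m^2*K)")
# 75 C needs h = 80; 110 C only needs h = 45. A part that tolerates
# higher temps needs less airflow, so fans can spin slower and quieter.
```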

Personally I hate fan noise...
 
Well, there is one advantage of higher temps, and that is the heat transfer rate.

[attached image: heat transfer rate equation]

So, heat will transfer faster if there is a greater temperature difference.

This means that if whatever you are trying to transfer heat away from can tolerate a higher temp, you can run your fans at a lower (quieter) speed.

Personally I hate fan noise...

I should have known that :confused:. Although the next question is where the heat will transfer first, and whether the components around it are designed to handle it. It would be nice to run components at 100C+ with almost zero noise, though, since the amount of heat dumped into the room doesn't depend on that. Sadly, I think most components' life expectancy drops exponentially as temperature increases.
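The usual rule of thumb behind that exponential intuition is the Arrhenius model; here's a rough sketch, with the 0.7 eV activation energy being an assumed, typical value rather than anything AMD has published:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
EA_EV = 0.7                # assumed activation energy, eV

def accel_factor(t_use_c: float, t_stress_c: float) -> float:
    """How much faster wear-out proceeds at t_stress_c than at t_use_c."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp((EA_EV / K_BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

print(f"{accel_factor(85.0, 110.0):.1f}x")  # ~4.4x faster aging at 110 C vs 85 C
```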
 
I should have known that :confused:. Although the next question is where the heat will transfer first, and whether the components around it are designed to handle it. It would be nice to run components at 100C+ with almost zero noise, though, since the amount of heat dumped into the room doesn't depend on that. Sadly, I think most components' life expectancy drops exponentially as temperature increases.


While these are indeed fair concerns... My question is how long SHOULD they last? My CPU before this one easily lasted 6 years. The video card lasted 2, and the memory 6. And they were all still running just fine; so fine that I sold them, and they are still running. And my 970 had a bit of coil whine to boot.

What life expectancy should we want/expect from a CPU? More than a platter drive, sure... but what is reasonable?

I've personally never hit that number, whatever it is, even with temps higher than I liked.
 