I think... That is truly bonkers. We need someone to make a BIG efficiency leap.
> That is truly bonkers. We need someone to make a BIG efficiency leap.

They get more efficient most generations. What we're looking at is not an efficiency regression - lower performance per watt - but rather an increase in the maximum power-draw limits that accompanies some performance gains.
> GeForce RTX 4090? Titan?

All of this has happened before, and all of it will probably happen again - with the improvement in both wired and wireless interlinks, for those that need on-site computing power, a move back toward centralized big iron starts making sense. Even at home!
Great way to turn those unused hydronic baseboard heaters into a central GPU heating system. Not terribly efficient but MOAR FPS!
> All of this has happened before, and all of it will probably happen again - with the improvement in both wired and wireless interlinks, for those that need on-site computing power, a move back toward centralized big iron starts making sense. Even at home!

I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell.
And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
> I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell.

Nope - serious, if not somewhat far-fetched at the moment.
> And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!

I don't think I posted about it, but I remember reading about six months ago about a country, I think in the EU, that had a data center plumbed into the local community heating setup.
> GeForce RTX 4090? Titan?

They need to bring the Titan branding back. It was never cheap, which should alleviate some of the confusion about the pricing and performance for them. I know the x90 cards that predated them were halo products unto themselves and used to feature 2x GPUs on the same PCB, but the Titan nomenclature always separated it from the pack. The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NV has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.
> They need to bring the Titan branding back.

This I agree with; it would have made much more sense for the 3090 / 3090 Ti, and would have highlighted the main driver for the product - the extra VRAM for content creators. Though I do get the 8K angle from a marketing perspective. Don't agree with it, but it was valid.
> The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NV has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.

Makes it confusing for reviewers and consumers - as well as for competitors and competitor marketing teams.
> Nope - serious, if not somewhat far-fetched at the moment.

Not really - there were companies making 'heaters' for offices that were nothing but compute engines. They would pay you to host them, the benefit being effectively free heat in colder climates. They even looked like fancy wall-mounted radiators. Not sure if it ever went anywhere. Let's see if my google-fu can find them...
> They get more efficient most generations. What we're looking at is not an efficiency regression - lower performance per watt - but rather an increase in the maximum power-draw limits that accompanies some performance gains.

Look, I HAVE the power to spare... it just seems like we need some gains in efficiency... Otherwise the power issue is going to go off the rails... if it hasn't already.
Unlike an efficiency regression - or even an efficiency-improvement (velocity) regression - we're just seeing higher-performance options become available for those willing to foot the cost of purchasing and running them.

For those worried about efficiency or just energy thriftiness, lower-powered models are certainly available.
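To put numbers on that distinction, here's a toy worked example (all figures invented for illustration, not real benchmarks) - a card can pull 50% more from the wall and still be more efficient:

```python
# Hypothetical generational comparison: the power ceiling rises,
# yet performance per watt still improves.
old_fps, old_watts = 100, 300   # invented previous-gen numbers
new_fps, new_watts = 180, 450   # invented next-gen numbers

print(f"Old: {old_fps / old_watts:.2f} fps/W")   # 0.33 fps/W
print(f"New: {new_fps / new_watts:.2f} fps/W")   # 0.40 fps/W
# Efficiency improved even though max draw rose 50% - higher power
# limits are not the same thing as an efficiency regression.
```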
> I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell.

I remember about three or four years ago someone was selling a crypto-mining space heater. It was just a headless PC for the most part - you connected it to Wi-Fi, and it had a thermostat that would start/stop computations around whatever temperature you set.
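Mechanically, that kind of device is just a thermostat-driven hysteresis loop. A minimal sketch of the idea (the callback names and thresholds here are hypothetical, not from any actual product):

```python
import time

TARGET_C = 21.0     # user-set room temperature
DEADBAND_C = 0.5    # hysteresis so the miner doesn't rapidly toggle

def heater_loop(read_temp_c, start_compute, stop_compute):
    """read_temp_c / start_compute / stop_compute are hypothetical
    callbacks standing in for the sensor and the mining workload."""
    running = False
    while True:
        temp = read_temp_c()
        if not running and temp < TARGET_C - DEADBAND_C:
            start_compute()   # room too cold: spin up the workload
            running = True
        elif running and temp > TARGET_C + DEADBAND_C:
            stop_compute()    # warm enough: idle the workload
            running = False
        time.sleep(30)        # poll the sensor every 30 seconds
```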
> Look, I HAVE the power to spare... it just seems like we need some gains in efficiency... Otherwise the power issue is going to go off the rails... if it hasn't already.

I think one of two things will happen first.
First off, there's a hard limit on what most people can pull from the wall without hiring an electrician. Most typical US homes run their bedrooms/offices on 15A, 120V circuits. Going beyond that would be a big leap - it would require a disclaimer that you can't just plug the thing into any old outlet and have it run - so I think, at least for consumer purposes, that represents a ceiling.
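For reference, the arithmetic behind that ceiling (the 80% continuous-load derating is the usual US electrical-code rule of thumb; real headroom depends on what else shares the circuit):

```python
# Back-of-the-envelope ceiling for one standard US branch circuit.
breaker_amps = 15          # typical bedroom/office breaker
nominal_volts = 120        # US wall voltage

peak_watts = breaker_amps * nominal_volts   # 1800 W absolute max
sustained_watts = peak_watts * 0.80         # 1440 W under the 80% continuous-load rule

print(f"Peak: {peak_watts} W, sustained: {sustained_watts:.0f} W")
```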
The second will be the heat/noise that a consumer will be willing to put up with. An 800W card, together with a CPU cranking 100-200W and a 150-250W monitor, plus anything else in the room... you have what amounts to a space heater. The cooling will either be very large or very noisy, and it's going to throw off a lot of heat. Enthusiasts may put up with it just so they can post those sweet screenshots of crazy benchmark numbers, but ~most~ consumers won't fool with it.
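Summing the post's own figures (midpoints assumed for the ranges) shows how close a rig like that gets to the ~1,440W sustained limit of a single circuit:

```python
# Adding up the illustrative figures from the post above.
loads_watts = {
    "GPU": 800,
    "CPU": 150,      # post's range: 100-200 W; midpoint assumed
    "monitor": 200,  # post's range: 150-250 W; midpoint assumed
}

total = sum(loads_watts.values())
print(f"Total: ~{total} W")   # ~1150 W - space-heater territory,
                              # and nearly all of it ends up as heat in the room
```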
Just like most consumers don't drive exotic sports cars, even those who could afford them - but we all like to drool over them.