Intel Tests Mesh Shading of Xe HPG GPU

Peter_Brosdahl

[Intel Xe logo spotlight image. Image: Intel]



Intel’s Raja Koduri has shared an image demonstrating the mesh shading capabilities of the company’s upcoming gaming GPU, the Xe HPG, which represents the high-performance, gaming-oriented branch of the Xe architecture. The GPU is rumored to feature 512 execution units; at the eight ALUs per EU used elsewhere in the Xe line, that should equate to roughly 4,096 shading units. It is also confirmed to feature GDDR6 memory.
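For context on what is being demonstrated: mesh shaders are a DirectX 12 Ultimate feature that replaces the traditional vertex/geometry front end with compute-style meshlet processing, and a game has to ask the driver whether the hardware supports them before using them. Below is a minimal sketch of that check in C++ against the public D3D12 API; it assumes a valid ID3D12Device has already been created and a Windows SDK recent enough to define the OPTIONS7 feature struct, and the helper name SupportsMeshShaders is just for illustration.

#include <d3d12.h>

// Returns true when the runtime/driver reports mesh shader support.
// D3D12_MESH_SHADER_TIER_1 is the only tier defined at the moment.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false; // older runtime: the OPTIONS7 struct is unknown
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

A card (or driver) that fails this check simply cannot run a mesh shader feature test like the upcoming 3DMark one, which is why the demonstration is notable for a pre-release GPU.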



Xe HPG mesh shading in action, with the UL 3DMark Mesh Shader Feature test that is coming out soon pic.twitter.com/fnYeWoM08c — Raja Koduri (@Rajaontheedge) February 10, 2021



The latest image highlights the ongoing advancements Intel is making with its Xe product line. Just a few days ago, Raja shared...

Continue reading...


 
Let me know when they've got something to challenge AMD and NVIDIA.

Availability may go a long way these days, even if it is on the lower end of the spectrum, i.e., xx50 or xx60 performance.
 
Let me know when they've got something to challenge AMD and NVIDIA.


On the 1st try? I mean... 2nd? Isn't the third time the charm? :LOL: :LOL: :unsure: :unsure:

I really don't know what to expect from Intel; just keep in mind that there are no bad products, just bad prices. So if Intel can take on, say, the RTX 2080 Ti/3070 for less cash and with actual availability, that would be great.
 
On the 1st try? I mean... 2nd? Isn't the third time the charm? :LOL: :LOL: :unsure: :unsure:

I really don't know what to expect from Intel; just keep in mind that there are no bad products, just bad prices. So if Intel can take on, say, the RTX 2080 Ti/3070 for less cash and with actual availability, that would be great.
I think you have the right idea about expectations with respect to a first-generation product, but you probably need to further temper those expectations...

So if Intel can take on, say, the 1650/1660 for less cash and with actual availability, that would be great.

That's probably as much as we can really expect from the first run.
 
I think you have the right idea about expectations with respect to a first-generation product, but you probably need to further temper those expectations...



That's probably as much as we can really expect from the first run.
Well, I was pleasantly surprised by what AMD delivered this round, not that you can buy any of their stuff, but still.

I owned an i740 back in the day; someone got it for me, IIRC. I didn't use it that much since by then I already had a TNT2. Nice card, though. I remember it being faster than my ATI 3D Rage Pro, but it was no match for my TNT2.
 
On the 1st try? I mean... 2nd? Isn't the third time the charm? :LOL: :LOL: :unsure: :unsure:

I really don't know what to expect from Intel; just keep in mind that there are no bad products, just bad prices. So if Intel can take on, say, the RTX 2080 Ti/3070 for less cash and with actual availability, that would be great.

No, there are definitely bad products. A few have crossed my test bench on occasion.

Well, I was pleasantly surprised by what AMD delivered this round, not that you can buy any of their stuff, but still.

I owned an i740 back in the day; someone got it for me, IIRC. I didn't use it that much since by then I already had a TNT2. Nice card, though. I remember it being faster than my ATI 3D Rage Pro, but it was no match for my TNT2.

Everything was faster than the vast majority of ATi cards back then. What really kept ATi in business was the 2D graphics chip that went into nearly every server made from the mid-to-late 1990s well into the 2000s. What really mattered was that ATi stayed in the game long after many companies either gave up or were forced to.
 
Is this the one with the special BIOS/OEM and some such?
Is mesh shading, like, super advanced, and something no one has?
 
No, there are definitely bad products. A few have crossed my test bench on occasion.



Everything was faster than the vast majority of ATi cards back then. What really kept ATi in business was the 2D graphics chip that went into nearly every server made from the mid-to-late 1990s well into the 2000s. What really mattered was that ATi stayed in the game long after many companies either gave up or were forced to.
Yeah, I guess you're right. I did come across a couple of turds I wouldn't even keep for free :LOL: :LOL:
 
Yeah, I guess you're right. I did come across a couple of turds I wouldn't even keep for free :LOL: :LOL:

I will say that price can dictate whether a product is deemed good or bad in a lot of cases; it can matter more than the product's technical merit. For example: there were ABIT and EPoX boards from the early 2000s which were bad products. Not by design, but because they used cheap capacitors that failed rather quickly and took the boards with them. Those are bad products at any price because they were defective. The designs weren't the issue; it was the quality of the components selected. Others in the industry fell prey to this at the time, including Apple iMacs, which continued to fail from bad capacitors long after everyone else got their act together.

Conversely, the Core i7 6950X (Broadwell-E) was a terrible product in my mind due specifically to its price. The model it replaced was the Core i7 5960X (Haswell-E). The 5960X cost around $1,049; the 6950X, on the other hand, launched at $1,599.99. At the time, it was the most expensive desktop processor Intel had ever produced, not adjusting for inflation. I thought it was a bad product because it offered a small uptick in IPC that was offset by lower clock speeds (about 200MHz less), making it no faster than its predecessor outside of the few applications that could leverage its two extra cores. Back then, many people built high-end gaming rigs using HEDT parts to get more PCIe lanes for multi-GPU configurations. If you were building a gaming rig, the 6950X was a bad product.

Then you have parts like Intel's Core i7 7740X. This was a bad product on a purely technical level: bad from a design perspective and just all-around stupid. It was a 7700K without an iGPU that could overclock about 200MHz higher than its mainstream counterpart but required the X299 platform. However, it disabled two DIMM slots, and all you got were the lanes from the PCH and the x16 lanes attached to the CPU. The extra slots and I/O of the platform basically didn't work with a 7740X installed. It needed a very expensive HEDT board to work while providing none of the features that made the platform enticing. Its only claim to fame was overclocking to 5.0GHz or higher, which is something the mainstream desktop CPUs no longer did at that time. I kept my review chip as part of my collection, but I'd never actually use it. This was a CPU so bad it reached EOL early, and the refreshed X299 motherboards released for the Core i7 9xxx series didn't support it at all.

Another example would be the 8800 Ultra. It was considerably more expensive than the 8800GTX but only offered something like a 6-8% boost in performance over the much cheaper card. There was more than a $100 difference, at a time when cards like the 8800GTX were only around $499 or so. The 8800 Ultra wasn't a bad card; it was indeed the fastest at the time, but its price point made no sense while the 8800GTX was still in production. The 8800 Ultra was simply priced wrong, making it a bad choice, but the card itself was anything but bad.
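To put rough numbers on that, here is a back-of-the-envelope sketch in C++ that treats the figures above as assumptions rather than exact historical prices: a GTX at around $499, an Ultra a bit over $100 more, and the midpoint of the 6-8% performance claim.

#include <cstdio>

int main()
{
    const double gtx_price   = 499.0;             // "around $499 or so"
    const double ultra_price = gtx_price + 110.0; // "more than $100 difference" (assumed)
    const double perf_uplift = 0.07;              // midpoint of the 6-8% claim

    // Price premium of the Ultra over the GTX, and the Ultra's
    // performance per dollar relative to the GTX (1.0 = equal value).
    const double price_premium = ultra_price / gtx_price - 1.0;
    const double value_ratio   = (1.0 + perf_uplift) * gtx_price / ultra_price;

    std::printf("Price premium: %.0f%%\n", price_premium * 100.0);   // ~22%
    std::printf("Ultra perf per dollar vs GTX: %.2f\n", value_ratio); // ~0.88
    return 0;
}

On those assumptions, the Ultra delivered roughly 12% less performance per dollar than the GTX, which is the whole argument in one number.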
 
Another example would be the 8800 Ultra. It was considerably more expensive than the 8800GTX but only offered something like a 6-8% boost in performance over the much cheaper card. There was more than a $100 difference, at a time when cards like the 8800GTX were only around $499 or so. The 8800 Ultra wasn't a bad card; it was indeed the fastest at the time, but its price point made no sense while the 8800GTX was still in production. The 8800 Ultra was simply priced wrong, making it a bad choice, but the card itself was anything but bad.
Assuming availability at MSRP, and a hypothetical 20GB 3080 Ti, isn't this what we'd call the 3090?

Lots of ifs, of course, but it's a situation we thought we'd be in before the global supply chain started disintegrating.
 
Another video card to be sold out to bots and marked up to high heaven by scalpers... no thanks.
 
Assuming availability at MSRP, and a hypothetical 20GB 3080 Ti, isn't this what we'd call the 3090?

Lots of ifs, of course, but it's a situation we thought we'd be in before the global supply chain started disintegrating.

No. The RTX 3090's price/performance ratio is bad; there is no denying that. When you compare it to the gaming performance of the RTX 3080, its price/performance ratio makes little sense for pure gamers. However, that's not the intent of the RTX 3090. It's a Titan-class card without the name: both a gaming card and a prosumer part. The reason gamers buy it is akin to why people buy Ferraris or other premium items. It is the fastest card available, especially at 4K. Also, its availability is higher than that of RTX 3080s, and you can realistically score a 3090 for what scalped 3080s go for.

The difference between an RTX 3080 and an RTX 3090 is greater than the difference between the 8800GTX and the 8800 Ultra. One can also make the argument that an RTX 3090 FE is a bargain, as it's not much more than the RTX 2080 Ti was and it's in line with previous Titan cards. Additionally, people like me used to buy GPUs in pairs, so the price of the RTX 3090 is par for the course.

Now, a theoretical RTX 3080 Ti at $999 with 20GB of VRAM might be seen as something that would make the 3090 a bad product at its price point, but the RTX 3090 was available long before a theoretical RTX 3080 Ti would be. As a result, it's only going to be a really bad buy if such a card actually comes out. Indications are, though, that we might get a 3080 Ti with 12GB of VRAM and a GPU that isn't fully unlocked but is closer to the 3090 than the 3080 is.

Given that the RTX 3080's low VRAM can be problematic today and certainly in the near future, jumping up to the RTX 3090 also makes it a potentially better buy for people who keep hardware a long time.
 