Intel Arc A770 16GB Limited Edition Video Card Review

Brent_Justice

Introduction

Intel now has a discrete graphics card; in fact, it has a few. We have all known Intel for the integrated graphics (iGPUs) inside its desktop CPUs. Intel has now branched out to take on the dedicated GPU space, providing an actual video card you can plug in and play your games on, […]

See full article...
 
Nice review. I had no idea Intel would be able to put out something that is actually competitive, AND at a lower price point than Nvidia. And I have been reading how the drivers are improving. Let's hope they keep cracking the whip on the driver team.... there is more to life than just CS:GO and F1 numbers.

I doubt this is going to supplant my 1080 Ti, but for someone on a budget or building new, this is interesting. We really do NEED competition in the GPU field to get the prices down. I know it won't compare to an $800+ 4070 Ti, but it seems to be holding up well against the $400+ 3060.

Oh almost forgot.... does this one provide any benefit to video compression work, since it supports all the common high end codecs?
 
Oh almost forgot.... does this one provide any benefit to video compression work, since it supports all the common high end codecs?

I have not personally tested video encoding, but it does famously support AV1, and according to YT videos I've watched, it's competitively fast at it.
 
Thanks, @Brent_Justice for the great review. I really am bummed with how the launch of this card went because I do believe that it is a good card and deserves better. Here's hoping the next launch goes better. I've been following the stories about the driver updates and how they've helped close the FPS gap with many games and have improved overall performance.
 
Wow the benches are all over the place -- not necessarily a bad thing, it's a new third-party entry. Just interesting to see it doesn't boil down to the usual camps of "plays well on Green" or "plays well on Red".

Hard to really compare anything with RT - the numbers are barely playable even at best, and we are looking at only last-gen stuff from AMD/nVidia (rightfully so, neither has released anything in the lower price tiers on current-gen architectures, but those current gen things have much different RT performance). That, and... I don't really care a whole lot about RT performance in the first place.
 
Nice review. This entry by Intel is definitely interesting. Truth be told, it's a lot more impressive than anything I expected out of them considering how new they are in this space. The gains from drivers are also impressive. When it was first released, a lot of people commented that the hardware seemed quite good overall but the drivers were holding it back, and that definitely seems to be the case in plenty of instances. While the card is far from perfect, it does show that Intel has promise in the GPU space.

I'd like to get my hands on one just to mess around with it and see what it does for my uses.
 

Wow, I was impressed that the A770 pretty much keeps pace with the 1080 Ti. Yeah, I know that's old hat by now, but it's still an awesome card. And doing it for under $400 is impressive.
 
Lol.

Is there any combination of title, resolution and settings in which this thing is fast enough to be playable, where it will actually use 16GB of VRAM?

I have to wonder why they decided to release this product instead of just the 8GB version.

That said, $349 is by far the cheapest 16GB video card I can think of, so if you have the need for a large amount of VRAM (and not much else) maybe this is your best choice?
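On the "when would it actually use 16GB" question, some rough back-of-envelope math (illustrative numbers, not measurements from the review): framebuffers and G-buffers are tiny next to 16 GiB; it's texture pools and streaming caches that can approach it, and usually only at settings where a card in this class is no longer playable anyway.

```python
# Back-of-envelope VRAM math (illustrative, not measured):
# one uncompressed 4K RGBA8 render target, and an assumed
# 6-target deferred G-buffer, versus a 16 GiB pool.

def target_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed render target in bytes."""
    return width * height * bytes_per_pixel

fb_4k = target_bytes(3840, 2160)   # one 4K RGBA8 target
gbuffer = 6 * fb_4k                # assume ~6 targets for a G-buffer

print(f"4K RGBA8 target: {fb_4k / 2**20:.1f} MiB")      # ~31.6 MiB
print(f"6-target G-buffer: {gbuffer / 2**20:.1f} MiB")  # ~190 MiB
print(f"Share of 16 GiB: {gbuffer / (16 * 2**30):.1%}") # ~1.2%
```

So even a heavyweight render pipeline at 4K eats only a percent or two of the pool; the rest is there for assets.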
 
Is there any combination of title, resolution and settings in which this thing is fast enough to be playable, where it will actually use 16GB of VRAM?
Well, if you think about when this card was being designed, Ethereum mining could use all the VRAM you could throw at it, and that was really the market Intel was planning on chasing. So... yeah, that's why.
 
Well, if you think about when this card was being designed, Ethereum mining could use all the VRAM you could throw at it, and that was really the market Intel was planning on chasing. So... yeah, that's why.
I thought Ethereum liked more memory bandwidth than memory size.
 
I thought Ethereum liked more memory bandwidth than memory size.
I never mined it myself, but yeah, what you say probably makes more sense. I do know there's certainly a floor, though, where you have to have a minimum amount of VRAM to mine at all.
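That floor came from the Ethash DAG, which had to fit entirely in VRAM. A minimal sketch of roughly how it grew (an approximation: the real algorithm rounds the dataset size down to a prime number of 128-byte rows, but it starts near 1 GiB and grows about 8 MiB per 30,000-block epoch):

```python
# Approximate Ethash DAG size, the "VRAM floor" for mining.
# Numbers are rough: ~1 GiB at epoch 0, growing ~8 MiB per epoch,
# where one epoch is 30,000 blocks.

EPOCH_BLOCKS = 30_000

def approx_dag_mib(block_number):
    """Approximate DAG size in MiB at a given block height."""
    epoch = block_number // EPOCH_BLOCKS
    return 1024 + 8 * epoch

# Around late 2020 (roughly block 11.5M) the DAG crossed ~4 GiB,
# which is about when 4 GB cards dropped off the network.
print(approx_dag_mib(11_500_000) / 1024)  # ~4 GiB
```

So cards below the current DAG size couldn't mine at all, while memory above that floor mostly went unused; past the floor, bandwidth was what mattered.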
 
I tinkered with mining during its peak since I had so many old cards laying around and was in the process of updating a few as well. It does like more RAM, but favors speed even more once you're at or over the 8GB mark. I have an EVGA RTX 2080 Super (GDDR6) that did pretty well because of the faster RAM. My older cards, the 1080s (from my last SLI setup) and 1080 Ti, did OK, but that Super was about 20% better than the 1080 Ti even though the Ti had more VRAM. The cards that did the best were the 2080 Ti and a 3090, though that's not counting power usage; undervolting/underclocking could do a lot to balance that.

In the end, before I stopped altogether, the optimal setup on my old X79 motherboard was using the 2080 Super, 2080 Ti, and a 3090. I have to say that the Super was an interesting beast: while in terms of rasterization it was only just more powerful than the 1080 Ti, the tensor cores + DLSS allowed it to take a step further, and that combined with the faster VRAM gave it just a bit more to boot. The other neat thing is that it only needed 2x 8-pin connectors while the 1080 Ti used 3. I originally bought that Super to use with my original 3700X build with the LG C9, since it was the only thing I could afford that could drive the C9's 120Hz HDR G-Sync, albeit at 1440p. It was also a nice BF deal in that it cost less than the 1080 Ti did at launch.
 

I had a GTX 1070 and later a GTX 1070 Ti that paid for themselves in just a few months with Ethereum. Both mined pretty much the same; I figured it was because they had the same memory bandwidth.
 
Nicely done!

It is good to see Intel doing very well in the low/mid range GPU space. Perhaps less profit per unit, but a whole lot more units sold per year.

Once Intel is comfortable with its silicon and drivers, I am hoping for them to come out swinging with a high-end GPU. It looks like they will be able to compete against the raster performance of AMD and the ray tracing/AI of NVIDIA - or so I hope.

Not this generation - I am thinking this was their test run and unofficial beta testing. In 6-18 months I expect to see second-generation Intel GPUs.
 
Not this generation - I am thinking this was their test run and unofficial beta testing. In 6-18 months I expect to see second-generation Intel GPUs.
Now that Raja Koduri is gone, I don't know if even a next gen is in the works.
 
Once Intel is comfortable with its silicon and drivers, I am hoping for them to come out swinging with a high-end GPU. It looks like they will be able to compete against the raster performance of AMD and the ray tracing/AI of NVIDIA - or so I hope.
I kinda want to see them continue to succeed in the low-to-mid tier role, given that NVIDIA has more or less abandoned it and AMD is content to play second fiddle to NVIDIA. I don't see a good reason for yet another player to go chasing the performance crown when there's a huge underserved niche in the low-to-mid tiers.
 