Intel Reiterates Q1 2022 Launch of Arc Alchemist Graphics Cards with New Gameplay Video

Tsing

The FPS Review
Staff member
[Image: Intel Arc logo with Alchemist mascot (Image: Intel)]

Intel has shared a new video reminding PC gamers that a third combatant will be entering the hotly contested GPU arena next year.

The video, released during yesterday’s The Game Awards, confirms that Intel’s Arc Alchemist graphics cards will be available sometime during Q1 2022. It also includes gameplay footage from titles such as Back 4 Blood and Hitman 3, highlighting some of the platform’s graphics features, including ray tracing and upscaling via XeSS, Intel’s competitor to DLSS and FidelityFX Super Resolution.

Viewers who reach the end of the video will be treated to an Arc-branded trio of sleek-looking hardware comprising a monitor, a laptop, and a desktop with a clear window showing an Arc graphics card inside. These are presumably just marketing renders, however.

“A new player has entered the game,” wrote Intel...

Continue reading...


 
Viewers who reach the end of the video will be treated to an Arc-branded trio of sleek-looking hardware comprising a monitor, a laptop, and a desktop with a clear window showing an Arc graphics card inside. These are presumably just marketing renders, however.

This is pretty much all I've seen of Arc. Lots of videos of things (supposedly) running on Arc. Lots of marketing renders of systems with the cards. Tons of code names and logos and everything. Lots of mentions of releases of some pretty low-end gear, and promises of better stuff, but... nothing really substantial.

And it goes beyond hardware. Remember when Intel was going to step up their driver updates for gaming and add some gamer-centric features? What happened to that? Will they have the driver support in place to keep pace?

I want this to succeed, and not be just another direct-to-miners-in-bulk style release, but I have little faith, and I'm not really seeing anything that is either giving me faith or getting me excited. If they want to build hype the right way, they should start showing more than marketing renders -- even woodscrews are better than some intern playing in Blender. The absolute best would be to get some pre-release cards out to some influencers for them to start looking at. "Lose" a couple of engineering samples on eBay, and start uploading some sexy benchmarks "accidentally" to benchmark sites -- those tricks the CPU folks pull all the **** time.

Maybe the current environment of nothing being available unless you're a miner or scalper is just getting to me. Maybe I am too old and cynical. I don't know, but I'm seeing a lot of smoke and mirrors here, and knowing that Raja Koduri is involved, that is par for the course with him - overhype and underwhelm.
 
I'm happy with what I've got for now... 1.5-3 years from now I may well be looking really hard at Gen 2/3 Arc.

Once upon a time I owned first a Matrox G400, then (gasp) a G450.

For its day it was an amazing performer.

Had Matrox had the resources to commit to an 18-month (or even 24-month) development cycle, the GPU market would (IMO) be a much different space today.

Intel does have what it takes to sustain a fast upgrade cycle, and I am looking forward to the pressure on the big green monster.
 
Intel does have what it takes to sustain a fast upgrade cycle
They do, but they aren’t immune to issues: Skylake, Skylake+, Skylake++, Skylake+++, Skylake++++, Skylake++++++ …

I kid, but only because the truth hurts
 
Intel may have what it takes to upgrade their parts on a cycle, but do they have what it takes to produce yields that satisfy demand?

My bet is that AMD is well ahead of Intel and Nvidia on this front. Chiplets and 3D stacking are the future. There's no need for a single massive piece of silicon with lower yields when you can slice it up into smaller pieces with much higher yields.
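
To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It uses a simple Poisson defect model and assumes chiplets are tested before packaging (known-good-die assembly); the defect density and die sizes are made-up illustrative numbers, not anything from a real product, and edge loss, scribe lines, and packaging yield are all ignored:

import math

# Back-of-the-envelope: monolithic vs. chiplet GPUs per wafer.
# Assumed Poisson defect model: yield = exp(-defect_density * area).
DEFECT_DENSITY = 0.001                 # defects per mm^2 (assumed)
WAFER_AREA = math.pi * (300 / 2) ** 2  # 300 mm wafer, ~70,686 mm^2

def die_yield(area_mm2: float) -> float:
    """Fraction of dies with zero defects under the Poisson model."""
    return math.exp(-DEFECT_DENSITY * area_mm2)

def gpus_per_wafer(die_area: float, dies_per_gpu: int) -> float:
    candidates = WAFER_AREA // die_area  # naive count, ignores edge loss
    good_dies = candidates * die_yield(die_area)
    return good_dies / dies_per_gpu

print(f"Monolithic (1 x 600 mm^2): {gpus_per_wafer(600, 1):.0f} GPUs/wafer")
print(f"Chiplet    (4 x 150 mm^2): {gpus_per_wafer(150, 4):.0f} GPUs/wafer")

Under those assumptions a 300 mm wafer yields roughly 64 monolithic GPUs versus about 101 chiplet-based ones, because each defect only kills a small die and the good chiplets can still be harvested and combined.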
 
Intel may have what it takes to upgrade their parts on a cycle, but do they have what it takes to produce yields that satisfy demand?

My bet is that AMD is well ahead of Intel and Nvidia on this front. Chiplets and 3D stacking are the future. There's no need for a single massive piece of silicon with lower yields when you can slice it up into smaller pieces with much higher yields.

AMD might be ahead, but Nvidia has been looking into this for a while now too: https://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs
 