Report: Cyberpunk 2077 Won’t Support Ray Tracing for AMD Radeon RX 6000 Series GPUs at Launch After All

Posted by Tsing, The FPS Review staff
Image: CD PROJEKT RED



It appears that Cyberpunk 2077 fans who wish to play the game with ray-traced effects at launch will need an NVIDIA GeForce RTX graphics card. In a statement delivered to ComputerBase today, CD PROJEKT RED confirmed that AMD Radeon RX 6000 Series GPUs will lack ray-tracing options when the game debuts on December 10.



“We are working together with AMD to integrate ray tracing options for their graphics cards as quickly as possible,” CD PROJEKT RED wrote. “However, this will not be the case when the game launches.”...

 
Hmm, that's kind of weird. I guess CP2077 has some NVIDIA RTX optimizations.
Or, if the AMD ray tracing demo is any indication, CD PROJEKT RED might need to implement RT effects in specific parts of the game alongside the regular effects.
 
Tit for tat.

AMD: Our game Godfall (and our deceitful DiRT 5 RT numbers) will not have RT support at launch for RTX cards.

NVIDIA: Oh yeah? Our super-mega game CP2077 won't have RT support at launch for RDNA 2 cards.

NVIDIA: Mic drop!
 
Yeah, this is going to be a thing, but the fact is, sad or not depending on how you look at it, that NVIDIA can afford to outspend AMD on RT title compatibility.
 
This is a case where there wasn't anything to optimize and CDPR already had too much on their plate.
That said, the system requirements make it look like DLSS is pretty much required for playable RT, so they are likely working with AMD to get Super Resolution working.

There is no excuse for Godfall or Dirt 5.

Imagine the backlash if NVIDIA gave reviewers a beta branch code for a game that ran abnormally faster on NVIDIA cards.
 
I thought ray tracing was " agnostic" as its a "direct x blah blah blah " that the hardware " blah blah blah" so the blah blah agnostic and directx... The blah blah is stuff I don't remember or understand.
So, not agnostic? So yeah it will be different when done for NV or AMD vs software or some such.
 
> Imagine the backlash if NVIDIA gave reviewers a beta branch code for a game that ran abnormally faster on NVIDIA cards.

Wha? Plenty of games run "abnormally" faster on NVIDIA cards. What game would you give a beta code for? One that runs slow on your card? I didn't think so.
 
Beta code should not be used as a comparison tool for public consumption.
The only reason AMD should have issued it was so reviewers could say, "Hey, AMD ray tracing works well in this game," not, "Look how badly NVIDIA runs this game comparatively."
 
> I thought ray tracing was " agnostic" as its a "direct x blah blah blah " that the hardware " blah blah blah" so the blah blah agnostic and directx... The blah blah is stuff I don't remember or understand.
> So, not agnostic? So yeah it will be different when done for NV or AMD vs software or some such.
It is agnostic, it's the optimizations that aren't (which is what RTX is).
Unoptimized, AMD GPUs are just too slow at RT to be usable at this time. Further, CDPR has had NVIDIA RT GPUs in hand for years, as have we all, whereas they may have had an AMD GPU to work with for perhaps a month. It's going to take some time to get all that working.
 
> Beta code should not be used as a comparison tool for public consumption.
> The only reason AMD should have issued it was specifically so reviewers could say hey AMD ray tracing works well in this game not look how bad Nvidia runs this game comparatively.
Well, duh! That's the exact point. Have you never seen marketing at work before? Bury what you're bad at under things you're good at. If the reviewer draws false conclusions from one outlier, that's on them. It wasn't for public consumption; it was for reviewers.

BTW, every game is a beta now; hell, some games are barely in an alpha state when they come out. I'm fully prepared to see something between an alpha and a beta this December, if they finally manage to get it released.
 
> It is agnostic, it's the optimizations that aren't (which is what RTX is).
> Unoptimized, AMD GPUs are just too slow at RT to use at this time, and further, CDP has had Nvidia RT GPUs available in hand for years - as have we all - where they may have had an AMD GPU to work with for perhaps a month. Gonna take some time to get all that working.
Just like everything else about DX12. Surprise!
 
He's right, though. Why use it as a marketing point when consumers can call you out for your BS and be wary of all your products in the future?

That is what you call stupid, terrible marketing for your brand.

Using this "beta code" nonsense in a head-to-head review comparison, as mentioned above, invites distrust, and AMD should be shamed for it. PERIOD!
 
Love my AMD CPU, and a new one is on the way, but I'm soooo glad I got a 3080 in time for this; it was the main reason I rushed to buy one at launch, something I've never done before, as I'm normally very patient. I assume they will have to scale it back for the 6000 series to get playable frame rates, based on benchmarks I've seen. RDNA 3, just like Zen 3, is when AMD will truly match or best the competition, but NVIDIA will be a lot tougher to beat than Intel.
 
Scale it back? The consoles have bloody ray tracing, and they're weaker AMD chips than the 6000 series. This seems like a shady exclusivity deal with NVIDIA.
 
> Scale it back? The consoles have bloody ray tracing and theyre weaker AMD chips than the 6k series. This seems like a shady exclusivity deal with Nvidia.

It's out in the open; nothing shady about it.
It works with NVIDIA because that's who they worked with on this.
AMD is just new to the scene.
 
> The consoles have bloody ray tracing and theyre weaker AMD chips than the 6k series.

And their ray tracing is even worse than the desktop AMD GPUs'. That's not NVIDIA's fault; that one belongs to AMD.

> This seems like a shady exclusivity deal with Nvidia.

How?

People were complaining about software not being optimized for AMD CPUs when Ryzen hit and literally made the same silly arguments, forgetting that AMD had been shipping a sub-par architecture for a decade that no one chose to use on purpose.

Don't forget that developers can only develop for hardware that exists. That's only 'exclusive' when the competition hasn't bothered competing in the first place.
 