I am passing, for now, but I'm also posting something right now that might catch your interest.

I honestly think I'll be passing on this gen. It sounds like this whole release will be just as bad or worse than the 30 series cards.
The DLSS3 frame insertion does not excite me, though. It will almost certainly be accompanied by input lag.
That was my understanding too, hence my expectation that it would work on Ampere even if it was not as fast.

I was wrong. The RTX 30 series and RTX 40 series both have Optical Flow Accelerators; the RTX 40 series' is just 2x faster and 2x more accurate. It is this performance and accuracy improvement that NVIDIA is leaning on for DLSS3 Frame Generation. Technically, it could be supported on the RTX 20 series with a lot of software and development work, but for now it is locked to the RTX 40 series.
That was my understanding too, hence my expectation that it would work on Ampere even if it was not as fast.
Unfortunately, that probably means there will be no more improvements for DLSS 2 either.
I think I could use some more horsepower for my 3440×1440, 144 Hz monitor, but I might need a bigger case for all these behemoths.
Also, it seems we Europeans are taking it up the *** as prices are 22% higher than in the US. I may have to skip this one or hope to grab a cheap 3090 Ti.
I just can't see that.

But they are talking about 2x performance increases.
I recall a few months back, one of NVIDIA's researchers demonstrated this tech WORKING on a 2000 series card, so, yeah...
NVIDIA is using its Reflex technology in DLSS3 Frame Generation to reduce input lag, in essence removing the render queue from the pipeline.
I just can't see that.
Maybe, in some very specific circumstances with tricks (DLSS, etc.). But an across-the-board 100% speed increase without a corresponding increase in cores or power draw -- even considering the process node improvement, there just isn't enough in the specifications to justify that, unless you have ~miraculous~ architectural improvements... which is why I seriously doubt this claim.
25% - I would readily believe. 100% - not so much
There is also core count - the 4080 is at 9,728, the 3080 Ti (since that was the TDP comparison) at 10,240 - which, once you factor in the higher clocks, results in a theoretical max performance increase of only ~38%.
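For anyone who wants to sanity-check that kind of back-of-the-envelope estimate, here is a rough sketch: raw FP32 throughput scales with cores × clock, so the 4080's slightly lower core count is offset by its much higher boost clock. The clock figures below are NVIDIA's published boost clocks (an assumption on my part -- sustained clocks differ, and this ignores architectural and memory changes entirely), so depending on what clocks you plug in you land somewhere in the ~38-43% range.

```python
# Rough theoretical throughput estimate: CUDA cores x boost clock.
# Boost clocks are the published spec-sheet values (assumption; real
# sustained clocks vary, and this ignores memory and cache changes).
cards = {
    "RTX 4080":    {"cores": 9728,  "boost_ghz": 2.51},
    "RTX 3080 Ti": {"cores": 10240, "boost_ghz": 1.67},
}

def fp32_tflops(cores, boost_ghz, flops_per_core_per_clock=2):
    # 2 FP32 FLOPs per core per clock (fused multiply-add)
    return cores * boost_ghz * flops_per_core_per_clock / 1000

new = fp32_tflops(**cards["RTX 4080"])
old = fp32_tflops(**cards["RTX 3080 Ti"])
print(f"4080: {new:.1f} TFLOPS, 3080 Ti: {old:.1f} TFLOPS, "
      f"theoretical gain: {(new / old - 1) * 100:.0f}%")
```

Either way, nowhere near 100% from the raw specs alone -- the 2x claims have to be coming from DLSS3.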
Yeah, I think it may do frame insertion to allow VSync/VRR to be smoother -- tech like that has been around forever, and is used a lot for motion blur reduction.

My humble eyes will believe it when my humble eyes see it.
Predicting the next frame solely based on the current frame is going to be iffy at best. Usually you have to wait for the next frame, then interpolate between them, but if you do that, input lag is horrendous.
I expect something like this will invariably be choppy and awkward as the system regularly predicts the wrong future, and then snaps back to reality with the next fully rendered frame.
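The tradeoff being described can be sketched in a toy example (the "frames" here are just hypothetical scalar positions of a moving object, and the 60 fps base rate is an assumption for illustration): interpolation is accurate but can only be displayed once the *next* real frame exists, while extrapolation adds no wait but guesses wrong the moment motion changes direction.

```python
# Toy illustration of interpolation vs. extrapolation for frame generation.
# A "frame" is just a scalar position; all values are hypothetical.

FRAME_TIME_MS = 1000 / 60  # 60 fps base render rate (assumption)

def interpolate(prev_frame, next_frame, t=0.5):
    # Blend two fully rendered frames -- smooth and correct, but the
    # blended frame can only be shown after next_frame is rendered,
    # so everything on screen is at least one frame interval stale.
    return prev_frame + (next_frame - prev_frame) * t

def extrapolate(prev_frame, curr_frame):
    # Predict forward from recent motion -- no added wait, but wrong
    # whenever motion changes, causing the "snap back" on the next
    # real frame.
    return curr_frame + (curr_frame - prev_frame)

# Object moving right at 1 unit/frame, then reversing: 0, 1, 2, 1
print(interpolate(1, 2))   # 1.5 -- correct midpoint, one frame late
print(extrapolate(1, 2))   # 3   -- but the real next frame was 1
print(f"interpolation latency penalty: {FRAME_TIME_MS:.1f} ms")
```

The extrapolated value overshoots exactly when the motion reverses, which is the "predicts the wrong future" failure mode; interpolation avoids that at the cost of a full frame interval of extra latency.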
Was that meant to rhyme?

Couldn't be happier, HAPPY-HAPPY, I TELL YA, with our Strix 3090Tis, especially after today's sick news and EVGA's demise!