Not really. At least not if you want to see the whole screen.
You are going to have to back up far enough that the whole screen fits in your field of view, and at that distance there is no perceptible difference between 4K and 8K.
I mean, it may be noticeable if you are one of those people who like to sit really close to the screen and keep turning your head (or at least your eyes) throughout the film, but that's not how most people like to watch films.
The only benefit I can think of is that high resolution provides a natural form of antialiasing (effectively supersampling), but it is a very computationally expensive way of accomplishing that goal: going from 4K to 8K means rendering four times as many pixels, which roughly cuts the framerate to a quarter on the same GPU.
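The "quarter the framerate" figure is just pixel-count arithmetic; as a quick sketch (using the standard pixel dimensions for 1080p, 4K UHD, and 8K UHD, and assuming the GPU cost scales roughly linearly with pixels pushed, which is a simplification for fill-rate-bound workloads):

```python
# Standard pixel dimensions (width x height) for common resolutions
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Each step doubles both dimensions, so the pixel count quadruples.
# 8K pushes 4x the pixels of 4K, hence roughly a quarter the framerate
# on the same GPU (assuming cost scales with pixel count).
print(pixels["8K"] / pixels["4K"])      # 4.0

# Likewise, rendering at 4K for a 1080p screen (as DSR/VSR does)
# costs 4x the pixels, i.e. "300% more" GPU work than native 1080p.
print(pixels["4K"] / pixels["1080p"])   # 4.0
```

The same 4x factor is why supersampling via DSR/VSR is so much more expensive than other AA techniques.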
That, and you don't need to increase the resolution of the screen to get those benefits. In fact, the results may even be better if you don't: 4K rendered via DSR/VSR and downsampled to a 1080p screen has fewer "jaggies" than native 4K. So if you really wanted this benefit, it would be better to run 8K DSR/VSR on a 4K screen than to run 8K native.
I love running DSR/VSR in older titles. Nothing looks better from an AA perspective, but needing roughly 300% more GPU power to get there makes it hugely inefficient. Every other form of AA uses far less GPU power, even 8x MSAA.