Report: PCs Are the Worst Devices to Watch Streaming Content On, Average Bitrate of 2.95 Mbps

Tsing

The FPS Review
Staff member
Image: Pixabay (FrankundFrei)



Subscribers of streaming services who want to enjoy their content at the highest quality possible will probably want to skip the PC, as that platform is getting the shaft in regard to bitrate.



This is according to a new report shared by online video optimization and analytics company Conviva, which includes a section discussing the quality of streaming content on various devices. The PC is dead last, with an average bitrate of just 2.95 Mbps versus alternatives such as Smart TVs, which boast the highest average bitrate of 8.80 Mbps.



“Picture quality […] improved across the board with gaming consoles up the most at 16 percent but at 7.89 Mbps bitrate, they came in second to smart TVs which averaged 8.80 Mbps,” the relevant section reads before going into how poorly the PC fared in contrast.



“Desktop made the least gains, up just...

Continue reading...
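For a rough sense of what those averages mean in practice, here is a quick back-of-the-envelope conversion from bitrate to data moved per hour. The device figures are the ones cited above; the conversion itself is plain arithmetic and not something taken from the Conviva report.

```python
# Rough data-per-hour estimate from an average streaming bitrate (Mbps).
# 1 byte = 8 bits, 3600 seconds per hour, 1000 MB per (decimal) GB.

def gb_per_hour(mbps: float) -> float:
    """Convert a streaming bitrate in Mbps to gigabytes per hour."""
    return mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/hour -> GB/hour

# Averages cited in the article above.
averages = {"Smart TV": 8.80, "Game console": 7.89, "PC": 2.95}

for device, mbps in averages.items():
    print(f"{device}: {mbps} Mbps is about {gb_per_hour(mbps):.2f} GB/hour")
```

At those averages, a PC stream moves roughly a third of the data per hour that a smart TV stream does, which lines up with the visible quality gap people describe below.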


 
I... don't stream anything on my PC really, apart from porn and the occasional YouTube bit. Anything I watch is on the TV, either from native smart TV apps or Apple TV. The smart TV apps are convenient (LG in my case), but I generally find the Apple TV ones look better, sound better, and run smoother.
 
I question that statement.
Which one? I'm pretty sure they do nerf the bitrate on PC; I've experienced it with Netflix and Amazon as well. That's why I don't stream on PC, I torrent. So in a supposed effort to fight piracy, they yet again make piracy more appealing.
They'll never learn.

On the other hand, the phrasing is total bull; it's not the PC's fault that streaming providers are jerks.
 
I don't see laptops in the list.

Could it be due to the low-res screens people still use? I have no idea; then again, Netflix does seem to look better on my TV via console.
 
The Netflix PC client is also locked to HD unless you have a high-end graphics card. For example, with the GT 1030 in my third PC I'm not allowed to stream in 4K.
 
The streaming ecosystem as a whole is a bit of a mess, and it's clear none of the major players have any interest in device optimization beyond TVs and consoles. Even then, the TV apps can be hit or miss when an app doesn't update and certain audio/video features get locked out (here's looking at you, Disney+). Meanwhile, most high-end Android tablets have 2K or higher screens along with some powerful processors, yet those apps are capped at 1080p. I agree the phrasing from the media on this is misleading, as it's obviously not the fault of PCs but of the developers making the apps.

We've got three smart TVs in our house, and at any given moment it's anyone's guess which feature no longer works on one of them. There's a Hisense Android 4K Dolby Vision set connected to a Dolby Atmos 5.1.2 soundbar, an LG C9 4K connected to an Onkyo 7.1 Dolby Atmos receiver, and a Sony Z9D 4K that runs its own flavor of Sony Android.

1. Disney+ is perhaps the worst. When it all works, it's the best, but on the Hisense and Sony it's a coin flip whether an update or an out-of-date app version knocks out Atmos support for the external speakers.
2. Paramount+ is just plain weird. I have to use a Shield with the C9 if we want to enjoy Dolby Vision for the shows that have it. Considering how advanced that TV is, this is just ridiculous.
3. NVIDIA Shield: overall, this is the fix so we can watch things the C9 doesn't have apps for (HBO Max). It mostly knocks it out of the park streaming 4K Dolby Vision/Atmos to the TV. However, we also use it for streaming 4K video to our projector; video is no problem, but various apps will limit audio to Dolby 5.1 when it's connected directly to the receiver.


4K on PC, at least if you're trying to do it legally, is still a mess. We started seeing MS abandon physical playback with Blu-ray, leaving users to source their own solutions. It only got worse with the insane HDCP requirements for 4K, which prevent many enthusiasts from configuring a rig that the OS, and some apps, will fully support.
 
Sheesh dude. Good gawd. These are just more reasons why the only streaming I do is from a local HDD to my monitor (my friends use NASes; I'll get on that level at some point).
 
That is exactly why the only streaming I'm comfortable with is through Serviio.

Today I tried to watch an episode of Trailer Park Boys on Netflix, and of course it crapped out twice during a 30-minute episode. First it buffered for two minutes at the start, then midway through it went into infinite buffering. The only reason I'm watching this on Netflix is that I couldn't find it elsewhere.
 
I assume these are statistical averages. Factor in all the people streaming Netflix on old potato laptops or similarly ancient computers.
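Almost certainly. As a purely hypothetical illustration (the session mix below is invented, not Conviva's data), a platform-wide average drops fast once a large share of sessions comes from clients that only ever negotiate a low-bitrate stream:

```python
# Illustration only: hypothetical mix of PC sessions and the bitrate each
# group actually gets. The shares and bitrates are made up for the example.
sessions = [
    (0.40, 1.5),  # 40% on old laptops that only hold ~1.5 Mbps
    (0.35, 3.0),  # 35% on mid-range machines / browser playback around 3 Mbps
    (0.25, 5.0),  # 25% on capable rigs actually pulling ~5 Mbps
]

average = sum(share * mbps for share, mbps in sessions)
print(f"Weighted average bitrate: {average:.2f} Mbps")  # prints 2.90
```

In other words, the average says as much about the device population and the DRM caps applied to it as it does about "the PC" as a platform.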
 