8K TV, Samsung QN800B vs QN900B

My impression is that almost all live TV in HDR uses the HLG format, correct? Or is it HDR10?

We're seeing all kinds of workflows, but yes, live is mostly HLG, predominantly using the BBC LUTs, which don't give the absolute best picture but round-trip the best.

The NBC guys have their own scheme.

Some guys run an entire SDR pipeline then color convert the program output to HDR.

Guys run full and mixed pipelines.

[Attachment: Single Master Workflow diagram]
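For anyone curious what HLG actually is under the hood, here's a quick Python sketch of the HLG opto-electrical transfer function as specified in ITU-R BT.2100. This is just the signal curve itself, not the contents of the BBC LUTs, and the example values are only illustrative:

```python
import math

# Constants from ITU-R BT.2100 for the HLG OETF
A = 0.17883277
B = 1.0 - 4.0 * A                 # 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene-linear light (0..1) to an HLG signal value (0..1)."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

# Scene light of ~0.26 lands near 75% signal, where HLG reference white sits
for e in (0.0, 1.0 / 12.0, 0.26, 1.0):
    print(f"scene light {e:.4f} -> HLG signal {hlg_oetf(e):.4f}")
```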
 
What about when presented with higher resolution but worse HDR vs. lower resolution with better HDR - what do people choose? Or can they not distinguish between the two combos? Thanks

I can't recall a test like that being performed by anyone.

Part of the reason is that resolution bandwidth is costly and color bandwidth is not.
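To put rough numbers on that, here's a back-of-the-envelope Python sketch using uncompressed rates (real codecs change the absolute numbers, but the ratios are the point): stepping up the color format costs well under 2x, while stepping up from 4K to 8K costs 4x.

```python
def raw_gbps(width, height, fps, bit_depth, samples_per_pixel):
    """Uncompressed video bandwidth in Gbit/s.

    samples_per_pixel: 1.5 for 4:2:0, 2.0 for 4:2:2, 3.0 for 4:4:4 chroma.
    """
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

base        = raw_gbps(3840, 2160, 60, 8, 1.5)   # 4K60, 8-bit 4:2:0
more_color  = raw_gbps(3840, 2160, 60, 10, 2.0)  # same pixels, 10-bit 4:2:2
more_pixels = raw_gbps(7680, 4320, 60, 8, 1.5)   # 8K60, same color format

print(f"4K60  8-bit 4:2:0: {base:6.2f} Gbit/s")
print(f"4K60 10-bit 4:2:2: {more_color:6.2f} Gbit/s ({more_color / base:.2f}x)")
print(f"8K60  8-bit 4:2:0: {more_pixels:6.2f} Gbit/s ({more_pixels / base:.2f}x)")
```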
 

Thanks. In your educated guess, is there ANY measurable merit in an 8K 85-inch TV with HDR at a 10-foot viewing distance vs. a 4K 85-inch HDR one, all else being equal?
 

Guessing 85" 10 feet away no. Measurable is the tough word there.
If the source material is 8K then it would be minimal at best, if its up-res'd then none at all.
 
While display resolution has a certain and direct impact on perceived picture quality, there are more important factors at play - factors that 99% of consumers never consider when comparing TVs.

As many know, resolution is simply the number of pixels a display contains in total. Manufacturers love this number, because it's BIG. Big numbers sell. What those manufacturers don't want you to realize is that resolution plays a relatively small part in overall picture quality. Things such as motion handling, color accuracy, contrast ratio, local dimming zones/algorithm, IPS vs. VA... those are the true deciding factors one should look for when considering a new TV.

Even in 2022, one could EASILY compare a properly calibrated Pioneer Kuro plasma 1080p set vs. a current, calibrated 4K/HDR panel, with each being fed its native resolution, and be unable to tell the difference.

Streaming services have, in my opinion, all but negated the need for constant improvement in display tech. Currently, we're blessed with 4K resolution, HDR/Dolby Vision, WCG, local dimming (OLED excluded), a myriad of motion settings, etc. Two of those directly apply to streaming content; all apply to the display that's showing said content.

Streaming services face a single major bottleneck: accessibility. Sure, Netflix can claim that they stream in "UHD with HDR", but not all video is created equal. All it takes is a peek at the metadata. A Netflix 4K HDR stream tops out at less than 20 Mbps, while a physical UHD disc can easily hit 40-50 Mbps. Therein lies the truth: you can compress anything down to adapt it to your needs, but there WILL be sacrifices.
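As a rough illustration of that sacrifice (using the ~20 Mbps and ~50 Mbps ballpark figures above; exact rates vary title to title), here's how few coded bits per pixel each one actually gets to work with:

```python
def bits_per_pixel(mbps, width=3840, height=2160, fps=24):
    """Average coded bits available per pixel at a given bitrate."""
    return mbps * 1e6 / (width * height * fps)

# Ballpark rates from the post above; exact numbers vary title to title
for label, mbps in [("4K HDR stream", 20), ("UHD Blu-ray  ", 50)]:
    print(f"{label}: {mbps} Mbps -> {bits_per_pixel(mbps):.2f} bits per pixel")
```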

As long as streaming is the de facto standard for the majority of homes, I believe the need for further improvement in consumer displays has reached a dead end.
 
Same with soundtrack compression on streaming, hence the need for physical discs in order to get lossless audio.
 

Speaking my language. I buy specific titles on disc not only for the PQ, but also because I'm not sporting a full Paradigm Studio 5.1 system with a 15" Dayton Audio sub just to be let down by "Dolby Plus" audio.

Experiencing a movie in full lossless PCM at reference volume is something to behold.
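For scale, here's the quick math on why discs matter for audio: raw 5.1 LPCM at 24-bit/48 kHz versus the ~640 kbps Dolby Digital Plus that's typical of streaming (that 640 kbps figure is a common service cap, not a universal one):

```python
def pcm_mbps(channels, sample_rate_hz, bit_depth):
    """Raw PCM bitrate in Mbit/s."""
    return channels * sample_rate_hz * bit_depth / 1e6

lossless = pcm_mbps(6, 48_000, 24)  # 5.1 LPCM, 24-bit/48 kHz, as found on disc
streamed = 0.640                    # typical Dolby Digital Plus cap, in Mbit/s
print(f"5.1 LPCM 24/48: {lossless:.2f} Mbit/s")
print(f"DD+ stream    : {streamed:.2f} Mbit/s (~{lossless / streamed:.0f}x less data)")
```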
 
Indeed. The TV is only one aspect of the experience, and the number of people who buy massive displays and then listen through the crappy built-in speakers or a soundbar is amazing to me. I'm not saying every living room needs to be a home theater, but if you are ponying up the coin for 85" of real estate, then do better on the audio too. I'm running a 5.2.2.
 

I couldn't agree more. As nice as Atmos may be, a solid 5.1 system can be just as immersive. My main setup consists of 2x Paradigm Studio 60 v2s, a Studio CC center, and Studio ADP surrounds. The surrounds are dipole, so they play much like a larger system.
 
Guessing 85" 10 feet away no. Measurable is the tough word there.
If the source material is 8K then it would be minimal at best, if its up-res'd then none at all.

So, indeed: after two or so weeks with an 8K screen (QN800B, 85-inch), I can say that 8K or 12K content from YouTube looks superb and is a measurable step up from 4K.

Also, 4K upscaled to 8K is NOT the same as original 8K or 12K content.
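A toy numpy sketch of why that is: once fine detail has been averaged away at the lower capture resolution, interpolation can only smooth between the samples that remain; it can't reinvent the detail. The signal and filters here are made up purely for illustration:

```python
import numpy as np

# A "native 8K" 1-D luma slice: fine alternating texture riding on a ramp
n = 16
native = np.linspace(0, 1, n) + 0.2 * np.array([1, -1] * (n // 2))

# "Shoot in 4K": average pixel pairs -- the fine texture cancels out right here
half = native.reshape(-1, 2).mean(axis=1)

# "Upscale back to 8K": interpolation only smooths between surviving samples
upscaled = np.interp(np.arange(n), np.arange(0, n, 2) + 0.5, half)

print("max error vs native:", np.abs(upscaled - native).max())  # ~0.2, the lost texture
```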
 

Super happy you like it.

I can't see much of a difference myself, but I don't spend much time looking at resolutions and frame rates beyond 4K60.

Always curious to know what the production chain was that delivered it and the size of the file you get - can you snapshot the screen bug?

I specifically said up-rezzed footage is not as good.
 

That's right; that's what I meant by "indeed" in a previous post: I was referring to you stating that 4K upscaled is not going to be the same as original 8K!

I don't know if this will help to determine the bitrate:

[Attached: two screenshots of the video's stream info]
 

Nice.

Can't get frame size from here either... maybe my engineers could.

I've never seen that codec, but you've certainly got HDR coming in - I'm sure it looks sweet.

I'm surprised you see a difference from 10 feet away; some calculations have 8K needing to be 3 feet away from an 85" screen.
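Those calculations are usually based on 20/20 vision resolving about 1 arcminute; here's a quick sketch of the math (simple geometry only, ignoring things like contrast sensitivity and vernier acuity):

```python
import math

def max_useful_distance_ft(diag_in, horiz_pixels, aspect=16/9):
    """Farthest distance (feet) at which one pixel still subtends the
    ~1 arcminute that 20/20 vision is said to resolve."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width from diagonal
    pitch_in = width_in / horiz_pixels                   # size of one pixel
    return pitch_in / math.tan(math.radians(1 / 60)) / 12

print(f'85" 8K: {max_useful_distance_ft(85, 7680):.1f} ft')  # ~2.8 ft
print(f'85" 4K: {max_useful_distance_ft(85, 3840):.1f} ft')  # ~5.5 ft
```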

I'd be curious to see a real Sarnoff test pattern run through YouTube.

Congrats on the new set - always great to get a big new set!
 
@UncleDave, like most Americans, you are too nice, mate! You should have just called straight BS on my claiming to be able to see a difference between 4K and 8K content - haha

Joking aside, the native 8K content is noticeably crisper vs. 4K. It's as if it's a little (but enough) less "cloudy" or "fogged"? I'm using the "nature, cities" type of showcase videos that are galore on YouTube: 4K HDR vs. 8K HDR. It probably has to do with the original recording and not the resolution it's replayed at? Or maybe it's all in my head? I know human memory, vision, and hearing are all very flawed, and the impressions of those senses can in no way be taken for granted. Blind testing in a controlled environment is necessary to determine the actual empirical reality.
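If anyone wants to actually run that blind test, a simple ABX protocol works, and judging the result is just a one-sided binomial test. A minimal sketch (the 16-trial numbers are only an example, not a recommended protocol size):

```python
from math import comb

def p_value(correct, trials):
    """Chance of getting at least `correct` answers right in `trials`
    ABX trials by pure guessing (one-sided binomial, p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"13/16 correct: p = {p_value(13, 16):.4f}")  # ~0.011 -> probably a real difference
print(f" 9/16 correct: p = {p_value(9, 16):.4f}")   # ~0.40  -> consistent with guessing
```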
 

All good fun; no reason to get all critical on a brother loving his beautiful new set.

I don't have any doubt you can see differences; resolution improvement "seems" to always carry some benefit, even if theoretical. The eye is an incredible thing.

The source material, encoding schemes, and bandwidth make utterly gigantic differences and few pieces are created in a way that allows direct comparison.
 
I'm of the opinion that I like a screen about 4 fists wide at arm's length, or maybe a bit more.

I have the 77-inch 4K LG OLED at 14 feet. It is 2.5 fists wide, and it's too small. I can just tell the difference between 1080p and 4K.
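The fist rule is easy to turn into numbers: a fist at arm's length covers roughly 10 degrees (the usual rule of thumb, not a precise constant), so here's a quick sketch of the geometry:

```python
import math

FIST_DEG = 10.0  # rule of thumb: a fist at arm's length spans roughly 10 degrees

def fists(diag_in, distance_ft, aspect=16/9):
    """Horizontal viewing angle of a screen, expressed in fists-at-arm's-length."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    angle = 2 * math.degrees(math.atan(width_in / 2 / (distance_ft * 12)))
    return angle / FIST_DEG

print(f'77" at 14 ft: {fists(77, 14):.1f} fists')  # ~2.3 -- close to the 2.5 above

# Screen diagonal needed to hit a 4-fist (~40 degree) target from 14 feet
width_needed = 2 * 14 * 12 * math.tan(math.radians(4 * FIST_DEG / 2))
print(f'4 fists at 14 ft needs ~{width_needed * math.hypot(16/9, 1) / (16/9):.0f}" diagonal')  # ~140"
```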
 