4K, 8K, HDR or UHD?
Talk about 4K/UHD video has increased considerably of late. In particular, many TV manufacturers have been proclaiming that this will be the year in which higher-resolution screens become the norm. This echoes what was said about higher-frame-rate sets a couple of years ago – even though, at the time, the only video that could be watched at higher frame rates came from video game consoles.
Whether or not that turns out to be the case, both customers and some manufacturers are using the terms “4K” and “UHD (Ultra High Definition) TV” interchangeably, and this is incorrect. Although similar in concept – creating/displaying an image roughly 4x the resolution of HD – the two terms are not technically interchangeable.
Here are some of the questions we are hearing every day:
- 4K vs. UHD – are they the same?
- If 4K is good, and 8K is better, is 16K best?
- What’s truth and what’s hype regarding HDR?
UHD is a term coined by the consumer electronics industry for this new high-resolution imagery. The image size for UHD is 3840 pixels by 2160 lines – exactly 4 times the size of an HD picture (which is 1920×1080). So, twice as many pixels horizontally and twice as many lines – therefore 4x the pixels overall.
4K is the term used by the professional video market to denote this higher resolution as used in film and TV production work. The image size in this case is 4096 pixels wide by 2160 lines, so it is slightly MORE than 4x the image size for HD.
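For those who like to check the numbers, here is a quick Python sanity check of the pixel counts; the figures come straight from the format definitions above:

```python
# Pixel counts behind the "4x" claims above.
hd  = 1920 * 1080   # HD:     2,073,600 pixels
uhd = 3840 * 2160   # UHD:    8,294,400 pixels
dci = 4096 * 2160   # DCI 4K: 8,847,360 pixels

print(uhd / hd)   # 4.0   -- UHD is exactly 4x HD
print(dci / hd)   # ~4.27 -- 4K is slightly more than 4x HD
print(dci - uhd)  # 552,960 -- 256 extra pixels per line, times 2160 lines
```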
In many cases, this slight difference in horizontal resolution doesn’t matter, but be aware that a UHD TV set cannot display a 4K image without either discarding 256 pixels from each line (128 from each side, if cropped centrally) or rescaling the image to fit.
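To make the two options concrete, here is a minimal NumPy sketch, assuming a 4K frame stored as a 2160×4096 array. The function name `fit_4k_to_uhd` and the nearest-neighbour rescale are illustrative assumptions only – real TV scalers use far more sophisticated filtering:

```python
import numpy as np

def fit_4k_to_uhd(frame: np.ndarray, crop: bool = True) -> np.ndarray:
    """Fit a 2160x4096 DCI 4K frame onto a 2160x3840 UHD panel.

    crop=True  drops 128 pixels from each side (256 per line in total);
    crop=False rescales the full width down to 3840 columns instead.
    """
    height, width = frame.shape[:2]       # expect 2160, 4096
    if crop:
        margin = (width - 3840) // 2      # 128 pixels per side
        return frame[:, margin:margin + 3840]
    # Naive rescale: nearest-neighbour sampling of 3840 columns.
    cols = np.linspace(0, width - 1, 3840).round().astype(int)
    return frame[:, cols]

# Example: a dummy grayscale 4K frame
frame_4k = np.zeros((2160, 4096), dtype=np.uint8)
print(fit_4k_to_uhd(frame_4k).shape)              # (2160, 3840)
print(fit_4k_to_uhd(frame_4k, crop=False).shape)  # (2160, 3840)
```

Either way, something is lost: cropping throws away picture at the edges, while rescaling slightly softens every line.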
For more information about 4K, 8K, HDR and UHD, download our guide here.