I hear this question pretty often. The short answer is no, especially if none of the output displays support 4K. But let’s unpack what 4K actually means before diving into the specifics.

What Does 4K Mean?
Basically, 4K has become a marketing term (much like “the cloud”) intended to make the technology easier for consumers to understand, while simultaneously leveraging the “more is better” approach to consumer sales. Prior to 4K, resolution was typically measured by image height (720p, 1080p). “4K” loosely describes the image width (roughly 4,000 pixels). Measured by image height instead, 4K resolution is called 2160p.

There are a few technical standards that are typically lumped together under the 4K umbrella. Ultra-High Definition (UHD) is the 4K standard for television, and has a resolution of 3840 x 2160 (16:9, or approximately a 1.78:1 aspect ratio). For movies, Digital Cinema Initiatives (DCI) is the dominant 4K standard, with a resolution of 4096 x 2160 pixels (approximately a 1.9:1 aspect ratio).
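The two standards above differ only in width; a quick sketch makes the aspect-ratio figures easy to verify (the pixel dimensions come straight from the UHD and DCI specs):

```python
# Compare the two common "4K" standards by resolution and aspect ratio.
def aspect_ratio(width, height):
    return width / height

uhd = (3840, 2160)   # Ultra-High Definition (television)
dci = (4096, 2160)   # Digital Cinema Initiatives (film)

print(f"UHD: {uhd[0]}x{uhd[1]}, ratio {aspect_ratio(*uhd):.2f}:1")  # ~1.78:1
print(f"DCI: {dci[0]}x{dci[1]}, ratio {aspect_ratio(*dci):.2f}:1")  # ~1.90:1
```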

But It’s So Much Better Than 2K!
If you’re looking at a still or low-motion image and searching for fine detail, 4K makes sense at reasonable viewing distances. Computer monitors and specialized applications like medical imaging, data visualization, and geologic surveys are excellent scenarios for rendering 4K, where detail is critical. However, there is a finite upper limit to the amount of detail the human eye can resolve. There aren’t any VRAM upgrades or bionic implants available yet.

Another important consideration is bandwidth: a 4K stream requires 4 times as much bandwidth as a 1080p stream, assuming all other parameters (frame rate, bit depth, chroma subsampling) are identical. While many compression algorithms, including TesiraLUX’s, can produce visually lossless output at low compression ratios, certain content types are more susceptible to artifacts. I’m looking at you, gradients.
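That “4 times” figure isn’t arbitrary: with frame rate, bit depth, and chroma subsampling held constant, raw bandwidth scales with pixel count. A back-of-the-envelope check:

```python
# Bandwidth scales with pixel count when all other stream parameters match.
uhd_pixels = 3840 * 2160   # 4K UHD frame
hd_pixels = 1920 * 1080    # 1080p frame

print(uhd_pixels / hd_pixels)  # 4.0 -- a UHD frame carries 4x the raw data
```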

Viewing Distance Is the Key
“Optimal viewing distance” is the distance beyond which some details in the picture can no longer be resolved, so fine-grained details begin to blend together. For a 4K UHD screen, the optimal distance is 1.5 times the screen height. Beyond that distance, the human eye can’t fully benefit from the added resolution of a 4K image compared to a 1080p image. Farther than about 2.5 to 3 screen heights away, there is no resolution advantage at all.

Let’s take the example of a 60-inch (~1.5 m) diagonal TV. For a 16:9 screen ratio, the screen height will be 29.4 inches (747 mm). To see all of the image detail on a UHD TV, you would need to view the image from 44 inches (1.1 m) away – assuming you have 20/20 (6/6) vision, that is. For a 1080p image, the optimal viewing distance is 88 inches (2.2 m). If your eyesight is worse than 20/20 (6/6), you may need to get even closer to the screen (or buy a bigger TV).
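The numbers above fall out of simple geometry. A minimal sketch (the `screen_height` helper is mine, not from any standard; the 1.5× and 3× multipliers are the ones discussed above):

```python
import math

# Screen height from diagonal size and aspect ratio:
# height = diagonal * ratio_h / sqrt(ratio_w^2 + ratio_h^2)
def screen_height(diagonal, ratio_w=16, ratio_h=9):
    return diagonal * ratio_h / math.hypot(ratio_w, ratio_h)

h = screen_height(60)  # 60-inch diagonal, 16:9

print(f"screen height:  {h:.1f} in")        # ~29.4 in
print(f"UHD optimal:    {1.5 * h:.0f} in")  # ~44 in (1.5 screen heights)
print(f"1080p optimal:  {3.0 * h:.0f} in")  # ~88 in (3 screen heights)
```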

Bottom line: if the people viewing the output display can’t get within 3 screen heights of it, there’s no benefit to 4K resolution, so you may as well save bandwidth and transmit 1080p. Now HDR and WCG, on the other hand…

Have questions about 4K? Leave a comment below!