A friend and contributor on WC asked me to clear up a question for him:
"If possible please do a post about the difference between UHD and 4K resolutions. We are few here a little confused about this subject and no one is better than you making clear the things."
I'll do my best to clear things up...but I don't generally succeed in doing that (lol).
From the linked article:
"The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160, and is exactly four times the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG2000, can have a bitrate of up to 250Mbps, and employs 12-bit 4:4:4 color depth.
Ultra High Definition, or UHD for short, is the next step up from what’s called full HD, the official name for the display resolution of 1,920 by 1,080. UHD quadruples that resolution to 3,840 by 2,160. It’s not the same as the 4K resolution defined above — and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which works out to an aspect ratio of 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio."
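To make those numbers concrete, here's a quick back-of-the-envelope sketch in Python. The resolutions come straight from the quote above; the helper names are just mine for illustration:

```python
# Quick arithmetic check on the resolutions quoted above.
# (Helper names are illustrative; the numbers come from the article.)

def pixels(width, height):
    """Total pixel count of a resolution."""
    return width * height

def aspect_ratio(width, height):
    """Aspect ratio expressed as width:1 (e.g. 1.78 for 16:9)."""
    return round(width / height, 2)

dci_2k  = (2048, 1080)   # DCI 2K (digital cinema standard)
dci_4k  = (4096, 2160)   # DCI 4K (digital cinema standard)
full_hd = (1920, 1080)   # Full HD / 1080p
uhd     = (3840, 2160)   # UHD -- the "4K" on most TVs

print(pixels(*dci_4k) / pixels(*dci_2k))   # 4.0  -> DCI 4K is exactly 4x DCI 2K
print(pixels(*uhd) / pixels(*full_hd))     # 4.0  -> UHD is exactly 4x Full HD
print(aspect_ratio(*dci_4k))               # 1.9  -> DCI 4K aspect ratio
print(aspect_ratio(*uhd))                  # 1.78 -> UHD (16:9) aspect ratio
```

Both jumps are exactly 4x in total pixels; the only real difference is the slightly wider cinema frame (1.9:1 vs. 1.78:1).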
So...there really isn't that much of a difference, and honestly, it looks to me like the two terms are being conflated at this point, even though they have different origins (one in cinema, one in consumer displays).
"4K" is pure Madison Avenue, because the consumer version is really 2160p...and in all reality, it's too late to expect "4K" to ever be renamed 2160p.
Hope this helps.