
Evolution of HD resolutions and how they affect PC performance

In recent years we have been able to connect our PC to a high-definition television, thanks to the fact that HD resolutions are good enough for a PC. But how have they evolved?

What are the differences between HD resolutions?

Obviously the first difference is the number of pixels on the screen: although all three resolutions share the same sixteen-ninths format, in which the ratio of horizontal to vertical resolution is 16:9, the number of pixels varies at each of them.
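As a quick orientation, here is a minimal sketch in Python that works out the pixel count of each of the three 16:9 resolutions this article deals with; the figures are the standard 1280x720, 1920x1080 and 3840x2160 frames.

    # Pixel counts of the three 16:9 television resolutions covered in this article.
    RESOLUTIONS = {
        "HD Ready (720p)": (1280, 720),
        "Full HD (1080p)": (1920, 1080),
        "UHD / 4K (2160p)": (3840, 2160),
    }

    for name, (width, height) in RESOLUTIONS.items():
        pixels = width * height
        ratio = width / height  # ~1.78, i.e. 16:9, for all three
        print(f"{name}: {width}x{height} = {pixels:,} pixels (ratio {ratio:.2f}:1)")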

All the resolutions covered in this article correspond to television standards rather than VESA ones, so we will not deal with the intermediate resolutions and image formats that can appear on various television monitors but do not comply with the standard.

The standardization of HD resolutions


The first resolution to be standardized was HD Ready, better known as 720p, but naming it by its vertical resolution was not the original intention. To understand why, we have to take into account the standard VGA resolution of 640 pixels per line, whose horizontal frequency of roughly 31 kHz is double the approximately 15.7 kHz of the American television standard, NTSC.

The idea behind the high-definition television standard was initially that television content could be reproduced in that format. We can set the European standard aside, since the consumer electronics industry in Europe had long been moribund and it was the Japanese and American manufacturers who set the standards.

So to reach the target horizontal frequency for the first HD standard, 720p or HD Ready, they only needed to quadruple the horizontal frequency of NTSC, which is the same as doubling that of VGA. This is what allowed the first HD Ready televisions to combine the typical ports of a television with those of a computer.
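As a rough sanity check of that reasoning, the following sketch uses the commonly quoted line frequencies (about 15.7 kHz for NTSC and about 31.5 kHz for VGA, values taken as assumptions here) to show that quadrupling the NTSC horizontal frequency and doubling the VGA one land on practically the same figure.

    # Horizontal (line) frequencies in kHz; commonly quoted approximate values.
    NTSC_H_FREQ_KHZ = 15.734   # American SD television standard
    VGA_H_FREQ_KHZ = 31.469    # standard 640x480 VGA

    print(f"4 x NTSC = {4 * NTSC_H_FREQ_KHZ:.1f} kHz")  # ~62.9 kHz
    print(f"2 x VGA  = {2 * VGA_H_FREQ_KHZ:.1f} kHz")   # ~62.9 kHz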

From four thirds to sixteen ninths


The other decision was the choice of the 16:9 widescreen format, and this had a practical motivation. Why not continue with the 4:3 format? Let's not forget that when Thomas Edison's laboratory defined the film frame for storing photographs and movies, it concluded that the best format was precisely 4:3, and that standard continued to be used for more than a century.

The panoramic format originated in the world of cinema. When television entered homes in the 1950s it competed with cinema for attention, so the solution of the vast majority of production companies was to use a different screen format, that is, a panoramic one. At the time several different formats were in use, such as Panavision with an aspect ratio of 2.20:1 or the famous CinemaScope with a ratio of 2.39:1.

When television panel technology began to be ready for higher resolutions, the industry sat down again to decide on a new standard. One of the proponents, Dr. Kerns Powers, took cardboard cutouts of the most important aspect ratios and laid them on top of each other around a common center.

The observation was that the largest rectangle, which contained all the others inside it, had the same aspect ratio as the innermost rectangle. Which was it? 1.78:1, that is, 16:9. From there, 16:9 was the chosen format and 720p the resolution chosen for high definition.
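That overlay can be reproduced numerically. If every candidate rectangle has the same area and they share a center, both the rectangle that covers them all and the rectangle common to them all end up with the geometric mean of the extreme aspect ratios, roughly 1.78:1; the list of ratios below is only illustrative of the cutouts Powers could have used.

    import math

    # Equal-area rectangles sharing a center, one per aspect ratio.
    # For area A and ratio r: width = sqrt(A * r), height = sqrt(A / r).
    ASPECT_RATIOS = [4 / 3, 1.85, 2.20, 2.39]  # TV, widescreen cinema, 70 mm, CinemaScope
    AREA = 1.0

    widths = [math.sqrt(AREA * r) for r in ASPECT_RATIOS]
    heights = [math.sqrt(AREA / r) for r in ASPECT_RATIOS]

    outer_ratio = max(widths) / max(heights)  # rectangle that contains every cutout
    inner_ratio = min(widths) / min(heights)  # rectangle common to every cutout

    print(f"outer rectangle: {outer_ratio:.2f}:1")  # ~1.79
    print(f"inner rectangle: {inner_ratio:.2f}:1")  # ~1.79
    print(f"16:9           = {16 / 9:.2f}:1")       # 1.78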

The evolution towards Full HD


The next standard to be chosen was Full HD, better known as 1080p, but as we said before, the initial resolutions were not chosen by taking the vertical resolution as a reference, but the horizontal one. In Full HD this is 1920 pixels, which is 50% higher than the horizontal resolution of HD Ready.

The reason is that the chips in televisions responsible for receiving the video signal still had headroom, since they consumed very little, so it was decided to create a higher-resolution version of the standard. But mainly it was because they realized that Blu-ray could hold a full-length movie at Full HD resolution without problems and without compromising anything. So Full HD was born almost by accident, when they saw that they could push the machinery a little further in every respect.

The other reason has to do with the fact that, at the distances at which people usually watch television, 1080p made a noticeable difference in image quality compared to 720p. In addition, despite needing a higher pixel density, LCD panels had become the same as those used in PCs, so the transition from 720p to 1080p could be made at very low cost.

The last of the HD resolutions: 4K


The last standard resolution used in televisions is 4K or UHD. Its history is different from that of the other two resolutions, and so is its origin, since it was created several years later.

A major leap in LCD displays was the adoption of what Apple commercially called Retina displays with its iPhone 4. The particularity? The pixel density reached roughly 300 pixels per inch, the same as in professional photography, a limit beyond which, at the distance at which we normally hold a smartphone, increasing the resolution makes no perceptible difference. How did this affect televisions? It made it possible to manufacture higher-density panels on a large scale, lowering costs.
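To illustrate where that figure of around 300 pixels per inch comes from, here is a small sketch that derives pixel density from a panel's resolution and diagonal; the 3.5-inch iPhone 4 panel and the monitor and TV sizes used are assumptions chosen for the example.

    import math

    def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
        # Pixel density: diagonal resolution in pixels divided by the diagonal in inches.
        return math.hypot(width_px, height_px) / diagonal_in

    # iPhone 4: 960x640 on a 3.5-inch panel -> roughly 330 ppi (Apple quotes 326 ppi).
    print(f"iPhone 4:      {pixels_per_inch(960, 640, 3.5):.0f} ppi")
    # A typical 24-inch Full HD monitor sits at around 92 ppi.
    print(f"24-inch 1080p: {pixels_per_inch(1920, 1080, 24):.0f} ppi")
    # A 55-inch 4K television comes in at around 80 ppi.
    print(f"55-inch 4K TV: {pixels_per_inch(3840, 2160, 55):.0f} ppi")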

The choice of 2160 pixels of vertical resolution came from the fact that it is the same vertical resolution as the one used in cinema; the difference is that in movies the horizontal resolution is 4096 pixels, while in televisions it is 3840. All this adds up to four times the pixels of Full HD, which is the largest jump there has ever been: from standard television to HD Ready it was roughly 3:1, from HD Ready to Full HD it was 2.25:1, but from Full HD to 4K it has been 4:1.
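Those jumps are easy to verify by dividing pixel counts, taking 640x480 as the square-pixel standard-definition frame (an assumption made for the sake of the calculation):

    # Pixel-count jumps between consecutive standards.
    STANDARDS = [
        ("SD (640x480)", 640, 480),   # square-pixel 4:3 frame, assumed as the SD reference
        ("HD Ready", 1280, 720),
        ("Full HD", 1920, 1080),
        ("UHD / 4K", 3840, 2160),
    ]

    for (prev_name, pw, ph), (name, w, h) in zip(STANDARDS, STANDARDS[1:]):
        jump = (w * h) / (pw * ph)
        print(f"{prev_name} -> {name}: {jump:.2f}:1")
    # SD -> HD Ready: 3.00:1, HD Ready -> Full HD: 2.25:1, Full HD -> 4K: 4.00:1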

HD resolutions on the PC today

HD resolutions are standard on the PC today, and 1080p is the standard par excellence for most people who use a PC. However, having four times as many pixels in 4K as in Full HD imposes a series of hardware requirements, which are the following:

  • The bandwidth needed to transport pixel information from the GPU to the screen is four times greater (quantified in the sketch after this list).
  • With four times more pixels, we have four times more computational needs, not counting improvements in image quality that require more processing power.
  • It also means quadrupling the video memory required.
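
To put a rough number on the first point, the sketch below estimates the uncompressed bandwidth of the video signal at 1080p and 4K, assuming 60 Hz and 24 bits per pixel and ignoring blanking and protocol overhead (both simplifications on my part); the same 4:1 factor applies to the pixel workload and, all else being equal, to the framebuffer memory.

    def uncompressed_bandwidth_gbps(width: int, height: int,
                                    refresh_hz: int = 60,
                                    bits_per_pixel: int = 24) -> float:
        # Raw pixel-data rate in Gbit/s, ignoring blanking and protocol overhead.
        return width * height * refresh_hz * bits_per_pixel / 1e9

    fhd = uncompressed_bandwidth_gbps(1920, 1080)  # ~2.99 Gbit/s
    uhd = uncompressed_bandwidth_gbps(3840, 2160)  # ~11.94 Gbit/s
    print(f"1080p60: {fhd:.2f} Gbit/s")
    print(f"4K60:    {uhd:.2f} Gbit/s ({uhd / fhd:.0f}x)")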

Of the three points, the third is the easiest to achieve, but the first two are not, since VRAM has not evolved fast enough and GPU performance has stagnated. This has led to the development of AI-based super-resolution techniques, in which the GPU only has to render the image at a lower source resolution that is then upscaled.
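
As a purely illustrative sketch of why that helps (the 1440p internal resolution is just an example, not any particular vendor's technique): rendering internally at a lower resolution and then upscaling to the 4K output cuts the number of shaded pixels per frame substantially.

    # Shaded pixels per frame: native 4K versus a lower internal resolution
    # that is later upscaled to the 4K output (illustrative example).
    native_4k = 3840 * 2160
    internal_1440p = 2560 * 1440

    savings = native_4k / internal_1440p
    print(f"Native 4K:       {native_4k:,} pixels per frame")
    print(f"1440p + upscale: {internal_1440p:,} pixels per frame ({savings:.2f}x fewer)")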

Obviously, 4K is not a problem for the world of cinema and television: a decoder capable of playing back 4K video costs a few dollars to implement, if not a few cents. Hence 4K television content proliferates, and if we add the ease of porting from cinema to television thanks to the shared vertical resolution, there is little more to say.