Image Resolution

August 2013


Do image resolutions such as 1080p, 2K and 4K cause you confusion? And even if you do know what they stand for, do you wonder why they don't round up to a neat figure?
It all goes back to the early days of computing. A single character was made up of dots called pixels. “Dot matrix” printers would print these out with inked pin strikes on paper. The characters measured 8 pixels across by 8 pixels high, which was pretty much the lowest resolution at which numbers could be distinguished from each other. This produced a total pixel count per character of 8 x 8 = 64. That is why resolutions are actually rounded up to the nearest multiple of 64. Computer memory and storage are measured in exactly the same way.
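To make that rounding concrete, here is a minimal Python sketch (the function name round_up_to_64 is just for this example). It rounds a target width up to the nearest multiple of 64, and shows how the "2K" and "4K" figures fall out:

    def round_up_to_64(n):
        # Round n up to the nearest multiple of 64.
        return ((n + 63) // 64) * 64

    print(round_up_to_64(2000))      # 2048 -- the width "2K" actually refers to
    print(2 * round_up_to_64(2000))  # 4096 -- "4K" is simply double that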
The terms 2K, 4K etc. refer to the number of pixels in the width of the image. The K is the prefix kilo, which means 1000. However, 2K does not mean exactly 2000 pixels. That is because 64 doesn't divide exactly into 2000, and the nearest multiple of 64 rounding up is 2048. 4K is double that at 4096 pixels, but Ultra HD, which is incorrectly referred to as 4K, is actually slightly narrower, weighing in at 3840 pixels across. That is because it is exactly twice the width of HD, which itself comes in a little under 2K at 1920 pixels across.

So what is 1080p? Firstly, the p refers to the method of image scanning employed (progressive as opposed to interlaced), but that's another topic. It does indicate, however, that the number refers to the height, or number of lines, in the image rather than the width. In an image of standard TV aspect ratio (i.e. 16:9) the image is 16 units wide and 9 units high. Therefore, if you divide 1080 by 9 and then multiply by 16, you get the image width of 1920, i.e. standard HD.

The overall image size in digital terms is given by the total pixel count (number of pixels wide x number of pixels high) and is often quoted in megapixels (even more rounding, this time to the nearest million; mega = million). HD is therefore roughly 2 megapixels.
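Here is a short Python sketch of that last bit of arithmetic (the names are chosen for this example only). It derives the HD width from 1080 lines at a 16:9 aspect ratio and then works out the rounded megapixel figure:

    # Derive the width of an image from its height and aspect ratio.
    def width_from_height(height, aspect_w=16, aspect_h=9):
        return height * aspect_w // aspect_h

    width = width_from_height(1080)  # 1080 / 9 * 16 = 1920, i.e. standard HD
    total_pixels = width * 1080      # 1920 x 1080 = 2,073,600 pixels
    megapixels = round(total_pixels / 1000000)

    print(width)       # 1920
    print(megapixels)  # 2 -- HD is roughly 2 megapixels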