Basics Of Computer Graphics
Screen Resolution
'Screen resolution' is not a strange concept, yet a surprising number of people sit at their computers without knowing anything about such basic concepts as 'screen resolution', 'color depth' and 'graphics memory usage'. We will try to change that now! :)
Screen resolution and color depth can be adjusted on your computer, but how you do it depends on what computer you use (Windows 95/NT: Start / Settings / Control Panel / Display / Settings), (Amiga: System / Prefs / ScreenMode), (Mac: the Apple menu / Control Panels / Monitors).
There is a simple relationship between the amount of graphics memory and the maximum resolution/color depth you can use. So, if you know how much graphics memory your computer has, you can also calculate which resolutions and color depths it should be able to display...
For non-professional users, the most common amounts of graphics memory these days are between 8 and 64 MB.
The most common resolutions are 800*600 and 1024*768.
A resolution of 800*600 means that the viewable area is divided into 800 picture elements (picture elements = pixels) horizontally and 600 pixels vertically. The total number of pixels in this case is: 800 times 600 = 480000.
How much memory this resolution requires depends on the color depth.
On the computers in use now, there are usually three alternatives when it comes to color depth: 8-bit (256 colors), 16-bit (65,536 colors, a.k.a. HighColor) and 24/32-bit (16.8 million colors, a.k.a. TrueColor).
Computer memory is measured in bytes, e.g. kilobytes (kB) and megabytes (MB); one byte = 8 bits.
This holds for computers nowadays. In the past there have been computers where one 'byte' was only 4 bits.
Now, back to the 8-bit display...
On an 8-bit display (the 8 bits describe the color depth, i.e. the number of colors), every pixel occupies 1 byte in the computer's graphics memory. In the example earlier we had a total of 480000 pixels. This means that the total memory usage is 480000 (number of pixels) multiplied by 1 (1 byte per pixel). The answer is still 480000.
Simply speaking, an 800*600 display with 256 colors requires 480000 bytes of graphics memory.
Now, if you wanted more colors, say 16-bit instead, how much memory would that require? Well, 16 bits means 2 bytes, since one byte is 8 bits: 16 bits / 8 bits = 2.
800*600 = 480000 pixels in 16-bit means 480000*2 = 960000 bytes. Quite obviously, a 16-bit display requires twice as much memory as an 8-bit display at the same screen resolution.
The 24-bit display then, quite logically, requires 24 bits / 8 bits = 3 bytes per pixel. I.e. every pixel uses 3 bytes, which leads us to the conclusion that an 800*600*24-bit display requires 480000*3 bytes = 1440000 bytes (about 1400 kB, or 1.4 MB).
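The calculation above is easy to automate. Here is a minimal Python sketch of it (the function name is my own, not from any library):

```python
# Graphics memory needed for a display, as described above:
# width * height pixels, times bytes per pixel (color depth / 8).
def display_memory_bytes(width, height, bits_per_pixel):
    bytes_per_pixel = bits_per_pixel // 8  # one byte is 8 bits
    return width * height * bytes_per_pixel

print(display_memory_bytes(800, 600, 8))   # 480000 bytes (256 colors)
print(display_memory_bytes(800, 600, 16))  # 960000 bytes (HighColor)
print(display_memory_bytes(800, 600, 24))  # 1440000 bytes (TrueColor)
```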
So if you happen to have only 1 megabyte of graphics memory on your ancient GFX card, you should be able to display 800*600 in 16-bit but not in 24-bit (16.8 million colors). In real life though, other factors such as your OS (Operating System) may be decisive...
Below I've put together a table with the most common resolutions and how much memory they require at each color depth (1 MB = 1,048,576 bytes; values rounded):
|Resolution||8-bit||16-bit||24-bit||32-bit|
|640*480||0.29 MB||0.59 MB||0.88 MB||1.17 MB|
|800*600||0.46 MB||0.92 MB||1.37 MB||1.83 MB|
|1024*768||0.75 MB||1.5 MB||2.25 MB||3 MB|
|1280*1024||1.25 MB||2.5 MB||3.75 MB||5 MB|
|1600*1200||1.83 MB||3.66 MB||5.49 MB||7.32 MB|
|2048*1536||3 MB||6 MB||9 MB||12 MB|
Alternatively, you can use this table to see how much of your total graphics memory the display itself requires. If you play 3D games that use a lot of textures, for instance, you can calculate how much memory is left for textures: [total memory]-[display memory]=[available memory for textures].
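The [total]-[display] formula can be sketched in Python. This is the simplified model from the text; a real 3D game also needs graphics memory for back buffers and Z-buffers, which this ignores:

```python
# Graphics memory left for textures after the display itself is paid for,
# using the simplified formula [total] - [display] described above.
def texture_headroom_mb(total_mb, width, height, bits_per_pixel):
    display_bytes = width * height * (bits_per_pixel // 8)
    return total_mb - display_bytes / (1024 * 1024)  # 1 MB = 1048576 bytes

# A 16 MB card running 1024*768 in 32-bit leaves about 13 MB for textures:
print(texture_headroom_mb(16, 1024, 768, 32))  # 13.0
```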
What is 'Refresh rate'?
Most computer users are familiar with the term 'refresh rate'. It is quite simply the rate at which your screen is being updated, i.e. refreshed.
For a stable, flicker-free picture, at least 70 refreshes per second are recommended. (With every 'refresh', the picture on your monitor is redrawn.) A refresh rate of 50 updates per second gives you a more 'flickery' display, and anything less is worse... (Bear in mind, however, that this only applies to monitors that use Cathode Ray Tube (CRT) technology, which is basically all monitors that are not flat.)
The refresh rate is measured in Hz (Hertz); 1 Hz = 1 time per second.
In ads for computer monitors you sometimes see something called 'Horizontal Sweep Frequency'. I'll explain what that is below...
The refresh rate tells you how often the screen is updated.
The Horizontal Sweep Frequency, however, is the number of horizontal 'pixel lines' the monitor can output per unit of time. E.g. a resolution of 640 (width) * 480 (height) means that the screen consists of 480 horizontal lines, each 640 pixels wide. The Horizontal Sweep Frequency (measured in kHz = kilohertz) tells you how many of these horizontal lines the monitor 'draws' every second. It is not your graphics card that does this job, but the monitor itself, so even if you have a very expensive graphics card in your computer, it is still the monitor that sets the upper limit for the quality of your display.
Real life example:
If you use a resolution of 800*600 pixels, you have 600 horizontal lines, each 800 pixels wide... Let's say you want a refresh rate of 76 Hz (which would give you a nice flicker-free display).
What this means for the monitor is that it must 'draw' 600 horizontal lines 76 times per second: 600*76 = 45600.
45600 horizontal lines (each 800 pixels wide) is what the monitor must manage to 'draw' each second. 45600 Hz is the same as 45.6 kHz...
To sum it up:
45.6 kHz is the Horizontal Sweep Frequency that your monitor must 'manage' if you are to display 800*600 at 76 Hz!
That's it, wasn't very strange, was it?
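The steps above can be sketched in a couple of lines of Python. Note that this is the simplified model from the text: it ignores blanking intervals, so a real monitor needs a slightly higher HSF than the number computed here:

```python
# Horizontal Sweep Frequency needed to redraw a given number of
# horizontal lines at a given refresh rate: lines * Hz, expressed in kHz.
def horizontal_sweep_khz(lines, refresh_hz):
    return lines * refresh_hz / 1000

print(horizontal_sweep_khz(600, 76))   # 45.6  (800*600 @ 76 Hz)
print(horizontal_sweep_khz(1200, 85))  # 102.0 (1600*1200 @ 85 Hz)
```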
Here below you can see a table containing the most common resolutions, refresh rates and the required HSFs (Horizontal Sweep Frequencies).
So, e.g., if you are buying a monitor and you want to use 1600x1200 @ 85 Hz, make sure it manages at least a 102 kHz Horizontal Sweep Frequency. (Keep in mind, though, that this only applies to CRT (Cathode Ray Tube) monitors. If you're buying a flat TFT screen, this is not relevant.)
|Resolution||Refresh rate||H. Sweep Freq.|
|640*480||60 Hz||28.8 kHz|
|640*480||76 Hz||36.5 kHz|
|640*480||85 Hz||40.8 kHz|
|640*480||100 Hz||48 kHz|
|800*600||60 Hz||36 kHz|
|800*600||76 Hz||45.6 kHz|
|800*600||85 Hz||51 kHz|
|800*600||100 Hz||60 kHz|
|1024*768||60 Hz||46 kHz|
|1024*768||76 Hz||58.4 kHz|
|1024*768||85 Hz||65.3 kHz|
|1024*768||100 Hz||76.8 kHz|
|1280*1024||60 Hz||61.4 kHz|
|1280*1024||76 Hz||77.8 kHz|
|1280*1024||85 Hz||87 kHz|
|1280*1024||100 Hz||102.4 kHz|
|Note that the resolution 1280x1024 does not have the same aspect ratio as the other resolutions. (Aspect ratio = width/height; in this case it is 1.25 instead of the standard 1.33.) While this isn't a problem in itself, it may cause some distortion on LCD screens, since the image would have to be scaled unevenly.|
|1600*1200||60 Hz||72 kHz|
|1600*1200||76 Hz||91.2 kHz|
|1600*1200||85 Hz||102 kHz|
|1600*1200||100 Hz||120 kHz|
|2048*1536||60 Hz||92.2 kHz|
|2048*1536||76 Hz||116.7 kHz|
|2048*1536||85 Hz||130.5 kHz|
|2048*1536||100 Hz||153.6 kHz|
The Megapixel evolution
We have been talking about screen resolutions, and in the earlier sections we also discussed how computer images are created. Obviously, if you want a picture with fine detail, it will take a vast number of individual picture elements (pixels) to record that detailed information.
Most home users are not too concerned with the technical details of their screen as long as they can do whatever they want to do and see. But since digital cameras became a cheap mainstream product, almost everyone in the developed world has seen the term 'Megapixel'.
Now, Mega is obviously a prefix from the metric system meaning million, so it's not difficult to understand that 'Megapixel' refers to a resolution measured in millions of pixels. The resolution in question is, of course, the resolution of the image that the digital camera can capture. Because there is no (chemical) film to capture the light as in older cameras, a digital camera relies on the quality of its sensor to 'record' the information. I'm not going to describe in detail how digital camera sensors work, but let me just quickly say that there are basically two types of sensors on the market today. The cheaper cameras use CMOS (Complementary Metal Oxide Semiconductor) sensors and the more expensive cameras rely on CCD (Charge-Coupled Device) sensors. The CCD is more expensive but gives better quality in terms of light sensitivity and low graininess. CMOS sensors are getting better, though, because there is a big market for cheap digital cameras out there, which means a lot of research is being done in this area.
If you have seen solar panels, then you know that light can be converted into electrons (electricity). You might also know about light-sensitive diodes; a typical example would be a door that opens when you break a beam of light, because a light sensor has registered a change in the lighting conditions. Anyhow, that is the basic principle of the sensors used in digital cameras: they get hit by photons (light) and convert this energy into electrical signals, which end up as 1s and 0s that your computer can understand. Color information is typically obtained through the use of red, green and blue filters (the exact way these filters are applied varies a lot between manufacturers and price ranges). If you see a camera with a 3xCCD label, it has a dedicated CCD for each of the three filters/colors (red, green, blue), which usually means superior color and quality.
Now let's get to the main point here. The quality and design of the sensor set the upper limit on how much light information can be gathered and then transformed into an image. So if your sensor consisted of a grid of 4 by 4 blocks, the total resolution of that image would be 16 pixels (4x4 = 16). Obviously, that wouldn't be very useful. As we have already discussed, the term Megapixel describes the potential of the imaging sensor: if your camera is labeled '1 Megapixel', the sensor in the camera is capable of capturing light information for 1 million pixels. With a more expensive camera you might be able to capture e.g. 7 Megapixel images, and so forth...
Below is a table that shows the screen resolutions (Screen Res) that correspond to different Megapixel ranges (Cam Res). It also shows the size of the image if you were to print it on a photo-quality printer at 300 dpi (if you print at 600 dpi instead, just cut each length in half). Note, however, that screen resolutions have a different shape than a typical old-style photo. A typical photo size might be 15x10 cm, meaning that the width-to-height ratio is 3/2, or 1.5 (15/10 = 1.5). A typical PC screen has a ratio of 4/3, or 1.33 (e.g. 800/600 = 1.333). What all this means is that the screen resolution might not correspond exactly to, say, 1 Megapixel; rather, it will be the highest resolution near it that follows the PC standard of the 1.333 ratio and the standard available PC resolutions. Typically, PC resolutions are evenly divisible by 16.
|Cam Res||Screen Res||Print size (300dpi)||Comment|
|1 Mpix||1024 x 768||~ 8.5 x 6.5 cm|
|2 Mpix||1600 x 1200||~ 13.5 x 10 cm|
|3 Mpix||1920 x 1440||~ 16 x 12 cm|
|4 Mpix||2240 x 1680||~ 19 x 14 cm||Beyond most monitors|
|5 Mpix||2560 x 1920||~ 21.5 x 16 cm|
|6 Mpix||2720 x 2040||~ 23 x 17 cm|
|7 Mpix||3040 x 2280||~ 25.5 x 19.5 cm|
|8 Mpix||3200 x 2400||~ 27 x 20.5 cm|
|9 Mpix||3360 x 2520||~ 28.5 x 21.5 cm||Roughly, A4 size|
|10 Mpix||3520 x 2640||~ 30 x 22.5 cm|
|12 Mpix||4000 x 3000||~ 34 x 25.5 cm|
|15 Mpix||4480 x 3360*||~ 38 x 28.5 cm||*probably (4320*3240 --> 14Mpix)|
|20 Mpix||5120 x 3840||~ 43.5 x 32.5 cm||(A3)|
|50 Mpix||8160 x 6120||~ 69 x 52 cm||Not yet available (>A2)|
|100 Mpix||11360 x 8520||~ 96 x 72 cm||Roughly 10x of normal PC screen|
|1 Gpix||> 36000 x 27000||> 300 x 230 cm||One billion (Giga) pixels|
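The arithmetic behind the table above can be sketched in a few lines of Python (the function names are my own; the only assumption is 1 inch = 2.54 cm):

```python
# Megapixel count of a resolution, and its print size at a given density.
def megapixels(width, height):
    return width * height / 1_000_000

def print_size_cm(width, height, dpi=300):
    # pixels / dpi = inches; 1 inch = 2.54 cm
    return (round(width / dpi * 2.54, 1), round(height / dpi * 2.54, 1))

print(megapixels(1600, 1200))     # 1.92 -- the '2 Mpix' row
print(print_size_cm(1600, 1200))  # (13.5, 10.2) -- ~13.5 x 10 cm at 300 dpi
```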
As we can see from this table, a 9-Megapixel camera roughly covers A4 at 300 dpi; printing A4 at full quality at 600 dpi actually takes about four times as many pixels, i.e. around 35 Megapixels. Cameras exceeding 20 Mpix are already available for professionals (or rich enthusiasts ;-).
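Reversing the calculation, a short sketch of how many megapixels a print of a given paper size needs at a given density (assuming one sensor pixel per printed dot; function name is my own):

```python
# Megapixels required to fill a given paper size at a given print density.
def megapixels_for_print(width_cm, height_cm, dpi):
    px_wide = width_cm / 2.54 * dpi   # 2.54 cm per inch
    px_high = height_cm / 2.54 * dpi
    return px_wide * px_high / 1_000_000

# A4 paper (29.7 x 21.0 cm) at 600 dpi:
print(round(megapixels_for_print(29.7, 21.0, 600), 1))  # 34.8
```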
Anyway, I hope someone finds these tables helpful.