
What is a VGA (Video Graphics Array) Camera/Display? The Difference Between VGA and Megapixel Technology

Video Graphics Array (VGA) technology was developed by IBM in 1987 for use in PC monitor displays. A VGA display has a resolution of 640x480 pixels, which works out to about 0.3 megapixels. Pixels are the tiny, dot-like elements that cover the entire screen of a display device, so an image produced by a VGA camera is 640 pixels wide and 480 pixels tall. VGA cameras are suitable for small screens only; if a VGA image is stretched or shown on a large screen, it becomes distorted and poor in quality. After the 1990s, VGA was gradually replaced by more advanced standards such as Super Video Graphics Array (SVGA) and Extended Graphics Array (XGA). Today VGA is mainly used in electronic devices with small screens, such as hand-held devices, low-end camera phones, meters that display images alongside readings, screens in children's toys, and MMS picture messages. It is also why a PC running in safe mode shows a coarse, low-resolution picture: the system falls back to the basic VGA display mode.


Compared with modern cameras, which mostly use megapixel technology, VGA quality is poor, because modern megapixel cameras have millions of pixels with which to construct an image. A typical camera today contains no fewer than 1.3 megapixels, that is, about 1.3 million pixels, while VGA, as noted above, carries only 0.3 megapixels. The exact number of pixels can be found from the resolution: a VGA display is 640 pixels wide and 480 pixels tall, and multiplying the two gives the total area of the display, 307,200 pixels. Converted to millions, that is roughly 0.3 megapixels. The more pixels a display contains, the more detail it can reveal. The quality of VGA on small screens is acceptable, but when the same images are shown on a television or monitor their quality drops: the image becomes blurred and distorted.
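
To make the arithmetic above concrete, here is a minimal Python sketch that multiplies width by height and converts the result to megapixels. The 1280x1024 resolution is just a common example of a roughly 1.3-megapixel sensor, used purely for illustration.

```python
# Rough sketch: convert a display resolution to a pixel count and megapixels.
# The example resolutions below are common values used only for illustration.

def megapixels(width, height):
    """Return the total pixel count and the same figure in megapixels."""
    total = width * height           # area of the display in pixels
    return total, total / 1_000_000  # one megapixel = one million pixels

for name, (w, h) in {
    "VGA": (640, 480),               # 307,200 pixels -> about 0.3 MP
    "1.3 MP camera": (1280, 1024),   # about 1.3 million pixels
}.items():
    total, mp = megapixels(w, h)
    print(f"{name}: {w}x{h} = {total:,} pixels ({mp:.1f} MP)")
```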

Moreover, a VGA display uses 8-bit color: each pixel's color is stored in 8 bits of memory, and no more, because that is all the memory allotted to any particular color value. Since digital devices store information in binary, raising 2 to the 8th power gives the total number of values an 8-bit memory can hold, which is 256. A VGA display mode can therefore show no more than 256 colors, and it has to build every image from only those 256 colors. The result is that image quality drops and the picture looks less true to life. Modern cameras use 24 bits to store each color, so the total number of combinations is 2 raised to the power 24, which gives 16,777,216 colors. The difference is clear: a modern display can choose from 16,777,216 colors to build an image, compared with VGA's 256-color scheme.
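
The bit-depth arithmetic can be checked with a short sketch: each extra bit doubles the number of combinations, so the count is simply 2 raised to the number of bits, as used for the 8-bit and 24-bit figures above.

```python
# Number of distinct colors representable with a given number of bits:
# each bit doubles the number of combinations, so the count is 2 ** bits.

def color_count(bits):
    return 2 ** bits

print(color_count(8))    # 256        -> VGA's 256-color palette
print(color_count(24))   # 16,777,216 -> modern 24-bit "true color"
```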


Any image has several attributes, such as pixel density and the number of colors. The greater the number of pixels, the more detail the display can reveal; it can show two closely spaced objects as separate rather than as one, which contributes to the clarity of the image. More pixels also mean the image can be enlarged further without losing quality. The other attribute, the number of colors, has nothing to do with clarity; it only helps make an image look closer to the natural view. In VGA, both of these attributes are limited.


VGA cameras and displays are used mostly in applications where memory is limited. For example, VGA is a good choice for sending a picture message because the file size is small, so it does not cost you much bandwidth, and when many images must be sent, the transfer also takes less time.
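
As a rough illustration of why VGA images are cheap to store and send, the sketch below estimates the uncompressed size of a frame from its resolution and bit depth. Real picture messages are compressed (JPEG, for instance), so actual file sizes would be considerably smaller; the point is only the relative difference.

```python
# Uncompressed frame size = pixels * bytes per pixel.
# Real MMS images are compressed, so these are upper-bound estimates.

def frame_size_kb(width, height, bits_per_pixel):
    bytes_total = width * height * bits_per_pixel / 8
    return bytes_total / 1024

print(f"VGA, 8-bit color:     {frame_size_kb(640, 480, 8):.0f} KB")    # ~300 KB
print(f"1.3 MP, 24-bit color: {frame_size_kb(1280, 1024, 24):.0f} KB") # ~3840 KB
```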
