History of the PC Monitor

Monitors, also referred to as output devices, have come a long way since the early 1970s, when blinking green displays were standard on text-based computer systems. Computer monitors are often packaged separately from other computer components. A monitor offers instantaneous feedback by displaying text or graphic images as one works or plays. Desktop monitors have typically used a cathode ray tube (CRT), while laptops come with the display firmly affixed to the rest of the unit and rely on technologies such as gas plasma or similar flat-panel designs, liquid crystal display (LCD), and light emitting diode (LED) panels. Because of their slimmer design and lower energy consumption, LCD monitors are rapidly replacing the venerable CRT (Allan, 124). This paper will examine the history of monitors, explaining how video cards have altered the requirements placed on them.

Advances in monitor technology have facilitated the introduction and adoption of superior monitors, and IBM made some of the most significant contributions. In the 1970s, monitors relied on text-reading technologies essential for deciphering the contents of text-based computers. Over the course of a decade, IBM made substantial changes, and in 1981 the company introduced the first color graphics adapter (CGA). This display could show four colors at a maximum resolution of 320 x 200 pixels, that is, 320 pixels on the horizontal axis and 200 on the vertical. In 1984, IBM made another change and launched the enhanced graphics adapter (EGA) display, which allowed a maximum of 16 colors and an improved resolution of 640 x 350 pixels. The new display was sharper in appearance, which made text easier to read. Later, in 1987, IBM launched the video graphics array (VGA) display model.

