The History of Digital Video

In the early 1990s, a digital video system capable of capturing full-screen video images would have cost several thousand pounds. The biggest cost element was the compression hardware needed to reduce the huge files that result from converting an analogue video signal into digital data to a manageable size. Less powerful video capture cards were available, capable of compressing quarter-screen images – 320×240 pixels – but even these were far too expensive for the average PC user. The consumer end of the market was limited to basic cards that could capture video but had no dedicated hardware compression of their own. These low-cost cards relied on the host PC to handle the raw digital video files they produced, and the only way to keep file sizes manageable was to drastically reduce the image size.

Until the arrival of the Pentium processor in 1993, even the most powerful PCs were limited to capturing images no larger than 160×120 pixels. On a graphics card running at a resolution of 640×480, a 160×120 image filled just one-sixteenth of the screen. As a result, these low-cost video capture cards were generally dismissed as little more than toys, incapable of any worthwhile real-world application.

The turning point for digital video systems came as processors finally exceeded 200MHz. At this speed, PCs could handle images up to 320×240 without the need for expensive compression hardware. The advent of the Pentium II, with its ever greater processing power, made video capture cards offering less than full-screen capability virtually redundant, and by the autumn of 1998 there were several consumer-oriented video capture devices on the market providing full-screen capture for as little as a few hundred pounds.