Computer data is digital: that is, it is a series of zeros and ones, and nothing else. It’s a point easily forgotten.
Computers, we know, have an extraordinary power to manipulate numbers at this digital level, which means (with a little number base work) they are perfect for dealing with any numeric data we throw at them. We now use them routinely to perform phenomenal calculations, but there is a basic problem with a great deal of the data we ask them to handle, and it stems from a simple fact: the world we live in is analogue, not digital.
This was an issue long before computers existed, as people tried to find ways to discuss their analogue world with each other. The trouble is, our analogue world is so subjective that it poses some curious difficulties when it comes to communication. Though it may have certain rhythms or patterns, analogue data is continuous, with no intrinsic calibrations we can use to communicate. For example, we may know that the temperature varies between hot and cold with the time of day and the seasons. But hot and cold are not precisely defined; they are open to subjective judgement, and so can’t be communicated with any real accuracy: today is hot, and so was yesterday – but which was the hotter? When is it warm? Cool? Cold? Is warm to someone in the Scottish hills the same as warm to another person on the Florida coast?
The solution humans adopt for these analogue issues is to introduce a means of measuring. A measuring system must sample consistently, but is otherwise entirely arbitrary. However, this measuring has no value unless it is shared and widely adopted. Then, it becomes an invaluable tool.
For instance, continuing with our temperature example, the Celsius scale of temperature measurement defines the freezing point of water as 0 and the boiling point of water as 100. It then divides the intervening range into 100 equal units, and extrapolates these units for temperatures below 0 and above 100. It is an eminently sensible system, and those using it come to accept it as completely natural. However, beyond being based on water’s freezing and boiling points there is nothing natural about it. Importantly, though, it can be consistently sampled, has been widely adopted, and therefore provides a basis for communication and science.
Interestingly, the measuring is often done in an analogue-to-analogue way, but with a calibration imposed. For instance, when measuring temperature we watch the expansion of mercury up a glass tube. The mercury expansion is an analogue event, but the Celsius measurements are not. A sundial or even a clock hand represents time in an analogue way, but we impose our measures of time on its motion. Speed, sound, mass, gravity, acceleration, voltage, current and so on: all are measured by testing against a second analogue signal which is affected by the first, and applying a calibration.
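The idea of "applying a calibration" can be sketched in a few lines of code. This is a minimal illustration, not a real instrument model: the column heights are hypothetical values, and it assumes (as the Celsius scale does) that the mercury expands linearly between water's freezing and boiling points.

```python
# Hypothetical mercury column heights at the two reference points.
FREEZING_MM = 20.0   # assumed column height at 0 degrees Celsius
BOILING_MM = 220.0   # assumed column height at 100 degrees Celsius

def mercury_to_celsius(height_mm: float) -> float:
    """Map an analogue reading (column height) to degrees Celsius by
    linear interpolation, extrapolating freely below 0 and above 100."""
    fraction = (height_mm - FREEZING_MM) / (BOILING_MM - FREEZING_MM)
    return fraction * 100.0

print(mercury_to_celsius(120.0))  # halfway up the scale -> 50.0
print(mercury_to_celsius(270.0))  # beyond boiling -> extrapolates to 125.0
```

The analogue signal (the mercury) remains continuous; the calibration simply gives us shared, repeatable numbers to attach to it.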
With the digital computer, the calibration is simple and predetermined: binary. By whatever means, everything a computer deals with must ultimately be reduced to zeros and ones. So now, on to the pictures.
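To make the "everything becomes zeros and ones" point concrete, here is a small sketch (the helper name `to_bits` is our own) showing the same binary reduction applied to a number and to a character, the character going via its numeric code:

```python
def to_bits(value: int, width: int = 8) -> str:
    """Render a non-negative integer as a fixed-width string of zeros and ones."""
    return format(value, f"0{width}b")

print(to_bits(100))        # 01100100 -- the number 100
print(to_bits(ord("A")))   # 01000001 -- the character 'A' via its code, 65
```

Whether the original data is a temperature, a letter or, as we shall see, a picture, the computer ultimately stores nothing but such strings of binary digits.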
- How Do You Put Picasso in PCs?
- Analogue and Digital Data
- Converting Images To Digital Computer Data
- Resolutions When Sampling Digital Images with Scanners or Using Digital Cameras
- Making Bitmaps with the RGB Colour System, Palettes and Look Up Tables