A gigapixel, as specified by the SI prefixes, is 10^9 pixels - that is, 1,000,000,000 or one American billion pixels.

A gigapixel is clearly a quite enormous collection of pixels, and until very recently - like the gigabyte just a few years ago - it has had only a few areas of genuine practical use. A good-quality digital camera has, at present, about two megapixels, tops. A single gigapixel, then, is about five hundred times larger than this. However much digital cameras evolve, I find it hard to believe we'll ever be measuring the size of their pictures in gigapixels. Human visual acuity is simply not that sharp at any common scale.
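
For what it's worth, the ratio is simple enough to check (a minimal sketch in Python; the two-megapixel figure is just the camera spec mentioned above):

    # Ratio of one gigapixel to a typical two-megapixel camera image.
    GIGAPIXEL = 10**9              # one American billion pixels
    camera_pixels = 2 * 10**6      # good-quality digital camera of the day

    print(GIGAPIXEL / camera_pixels)   # -> 500.0, about five hundred times larger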

What about digital movie files? We could use gigapixels to sum the total number of pixels in all the frames of a movie. Of course, given the size of some of the movies being tossed about over P2P services these days, even the gigapixel might prove an unwieldy unit, but that's another story. This might have been handy in the days of uncompressed AVI files, but the compression methods of MPEG and DivX now mean that such a figure is neither easily calculable nor very useful. File size is a much better measure of just how much film is being passed around.
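
To get a feel for the numbers involved, here is a rough sketch in Python; the frame size, frame rate and running time are illustrative assumptions, not figures for any particular film:

    # Back-of-the-envelope pixel total for an uncompressed movie.
    # All the figures below are assumed purely for illustration.
    width, height = 640, 480      # assumed frame dimensions
    fps = 24                      # assumed frame rate
    running_time = 90 * 60        # assumed 90-minute running time, in seconds

    total_pixels = width * height * fps * running_time
    print(total_pixels / 10**9)   # -> about 39.8 gigapixels for this hypothetical film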

The next place in which you are likely to encounter gigapixels is in the area of 3D graphics and graphics cards. Since the latest efforts from NVidia and such can sustain resolutions of 1600 x 1200 at 60 FPS, that's a healthy 0.1 gigapixels per second. Given the astonishing rate at which graphics cards have evolved, moving from a rarely-seen luxury seven or eight years ago to the ubiquitous talking point they are today, it surely won't be long before graphics card manufacturers can talk in terms of gigapixels. Whether they are willing to abandon the "big numbers cult", which sees 2 GHz processors labelled as 2000 MHz, remains to be seen.
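
That 0.1 gigapixel figure falls straight out of the resolution and frame rate quoted above (a quick sketch in Python):

    # Pixel throughput at 1600 x 1200, 60 frames per second.
    width, height, fps = 1600, 1200, 60

    pixels_per_second = width * height * fps
    print(pixels_per_second / 10**9)   # -> 0.1152, roughly 0.1 gigapixels per second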
