Remote Sensing

Pixels and Bits

 

Data from Earth-orbiting satellites are transmitted by radio waves on a regular basis to properly equipped ground stations. As the data are received, they are translated into a digital image that can be displayed on a computer screen. Just like the pictures on your television set, satellite imagery is made up of tiny squares, each of a different gray shade or color. These squares are called pixels (short for picture elements), and each one represents the relative reflected light energy recorded for that part of the image.

This weather satellite image of Hurricane Floyd from September 15, 1999, has been magnified to show the individual picture elements (pixels) that form most remote sensing images. (Image derived from NOAA GOES data)

Each pixel represents a square area on an image and is a measure of the sensor's ability to resolve (see) objects of different sizes. For example, the Enhanced Thematic Mapper Plus (ETM+) on the Landsat 7 satellite has a maximum resolution of 15 meters; each pixel therefore represents an area 15 m x 15 m, or 225 m². Higher resolution (a smaller pixel area) means that the sensor can discern smaller objects. Because each pixel corresponds to a known ground area, counting pixels lets you calculate the area of features in a scene. For example, if you count the green pixels in a false color image and multiply by the area each pixel covers, you can calculate the total area covered with vegetation.
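As a rough sketch of this kind of calculation, the Python snippet below counts vegetation pixels in a small made-up classification grid and multiplies by the 15 m ETM+ pixel area; the array values and the use of 1 as the vegetation label are assumptions for illustration only.

    import numpy as np

    # Hypothetical classification grid: 1 = vegetation, 0 = everything else.
    classified = np.array([
        [0, 1, 1, 0],
        [1, 1, 1, 0],
        [0, 0, 1, 0],
    ])

    pixel_size_m = 15                  # ETM+ maximum resolution
    pixel_area_m2 = pixel_size_m ** 2  # 15 m x 15 m = 225 m² per pixel

    vegetation_pixels = np.count_nonzero(classified == 1)
    print(vegetation_pixels * pixel_area_m2, "square meters of vegetation")
    # 6 pixels x 225 m² = 1350 square meters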

How does the computer know which parts of the image should be dark and which should be bright? Computers understand the numeric language of binary numbers: strings of 0s and 1s in which each digit acts as an "on-off" switch. Converting from our decimal system to two-digit binary numbers, 00 = 0, 01 = 1, 10 = 2, and 11 = 3. Note that we cannot use decimal numbers directly, since computers are fussy; they only understand "on" and "off."
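To make the mapping concrete, here is a minimal Python sketch that prints each two-digit binary pattern alongside its decimal equivalent:

    # Two-bit binary codes and their decimal values: 00, 01, 10, 11.
    for value in range(4):
        print(format(value, "02b"), "=", value)

    # Output:
    # 00 = 0
    # 01 = 1
    # 10 = 2
    # 11 = 3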

For example, consider an image that is made up of 8 columns by 5 rows of pixels. In this figure, four shades are present: black, dark gray, light gray, and white. Black is assigned the binary number 00, dark gray 01, light gray 10, and white 11. We therefore have four black pixels (B5, C4, D7, and E2) that the spacecraft reports as 00. There are four dark gray pixels (B3, C2, C6, and E6) assigned the binary number 01, three light gray pixels (D3, D6, and E5) that are binary number 10, and 29 white pixels assigned the binary number 11.

Pixel Diagram
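Assuming the figure's rows are lettered A (top) through E and its columns are numbered 1 through 8 (an inference from the pixel labels in the text), the grid can be reconstructed and the shades tallied in a few lines of Python:

    import numpy as np

    # The 8-column by 5-row example, using the two-bit codes as integers:
    # 0 = black (00), 1 = dark gray (01), 2 = light gray (10), 3 = white (11).
    image = np.full((5, 8), 3)                                 # start all white
    image[1, 4] = image[2, 3] = image[3, 6] = image[4, 1] = 0  # B5, C4, D7, E2
    image[1, 2] = image[2, 1] = image[2, 5] = image[4, 5] = 1  # B3, C2, C6, E6
    image[3, 2] = image[3, 5] = image[4, 4] = 2                # D3, D6, E5

    values, counts = np.unique(image, return_counts=True)
    print(dict(zip(values.tolist(), counts.tolist())))
    # {0: 4, 1: 4, 2: 3, 3: 29}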

Four shades between white and black would produce images with too much contrast, so instead of using two-digit binary numbers between 00 and 11, spacecraft use strings of 8 binary digits (called "8-bit data"), which can range from 00000000 to 11111111. These numbers correspond to 0 through 255 in the decimal system. With 8-bit data, we can assign the darkest point in an image the number 00000000 and the brightest point the number 11111111, producing 256 shades of gray from black to white. It is these binary numbers between 0 and 255 that the spacecraft sends back for each pixel in every row and column, and it takes a computer to keep track of every number for every pixel!
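The short Python sketch below first confirms that 8-bit codes span 0 through 255, then applies a simple linear stretch that assigns the darkest value in a set of raw readings to 0 and the brightest to 255; the raw values are invented for illustration, and a linear stretch is just one common way to make such an assignment for display:

    import numpy as np

    # 8-bit codes run from 00000000 (0) to 11111111 (255).
    print(int("00000000", 2), int("11111111", 2))   # 0 255

    # Linear stretch: darkest raw reading -> 0, brightest -> 255.
    raw = np.array([12.0, 80.0, 143.0, 200.0])
    stretched = np.round((raw - raw.min()) / (raw.max() - raw.min()) * 255)
    print(stretched.astype(np.uint8))               # [  0  92 178 255]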

next: Color Images
back: Absorption Bands and Atmospheric Windows

