In recent years, technological advances have changed the way geographic analyses are done. Increasingly, computers are used to automate aspects of cartography and remote sensing, producing data that are easily integrated into a GIS.
Many GIS systems have the capability of incorporating aerial photography,
satellite data, and radar imagery into their data layers. The process is
simple, as images may be scanned or read off a data tape. However, to use
this technology effectively, it is important to know the strengths and
limitations of remotely sensed data, and to understand which types of imagery
are suited to particular projects. This unit was developed with these concerns
in mind. The information and exercises contained within it are intended
to familiarize you with the interface between remote sensing and GIS.
The USGS defines the electromagnetic spectrum in the following manner: "Electromagnetic radiation is energy propagated through space between electric and magnetic fields. The electromagnetic spectrum is the extent of that energy ranging from cosmic rays, gamma rays, X-rays to ultraviolet, visible, and infrared radiation including microwave energy."
Electromagnetic waves are radiated through space. When the energy encounters an object, even a very tiny one like a molecule of air, one of three interactions occurs: the radiation is reflected off the object, absorbed by the object, or transmitted through the object. The total amount of radiation that strikes an object is referred to as the incident radiation, and is equal to:
reflected radiation + absorbed radiation + transmitted radiation
In remote sensing, we are largely concerned with REFLECTED RADIATION. This is the radiation that causes our eyes to see colors, causes infrared film to record vegetation, and allows radar images of the earth to be created.
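As a simple illustration of the balance above, the short Python sketch below estimates the reflected portion of the incident energy when the absorbed and transmitted portions are known. The variable names and the sample fractions are assumptions chosen for illustration, not measurements.

# Energy balance for incident electromagnetic radiation:
#   incident = reflected + absorbed + transmitted
# Hypothetical values, expressed as fractions of the incident energy.
incident = 1.0      # total incident radiation (normalized)
absorbed = 0.35     # fraction absorbed by the object (assumed)
transmitted = 0.15  # fraction transmitted through the object (assumed)

# The remainder is the reflected radiation that a remote sensor records.
reflected = incident - absorbed - transmitted
print(f"Reflected fraction: {reflected:.2f}")  # -> 0.50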
The electric field and the magnetic field are important concepts that can be used to mathematically describe the physical effects of electromagnetic waves.
The electric field vibrates in a direction transverse (i.e. perpendicular) to the direction of travel of the electromagnetic wave.
The magnetic field vibrates in a direction transverse to the direction of the electromagnetic wave AND transverse to the electric field.
POLARIZATION: Polarization is defined by the orientation of the electrical field E. It is usually described in terms of HORIZONTAL POLARIZATION and VERTICAL POLARIZATION. Polarization is most important when discussing RADAR applications of remote sensing.
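To make the definition concrete, the sketch below classifies a linearly polarized wave from the orientation of its electric field, measured in the plane transverse to the direction of travel. The function name, the component values, and the tolerance are illustrative assumptions, not part of any standard library.

import math

def classify_polarization(e_horizontal, e_vertical, tol_deg=5.0):
    """Classify a linearly polarized wave from its E-field components.

    e_horizontal, e_vertical: amplitudes of E along the horizontal and
    vertical directions (both transverse to the direction of travel).
    """
    angle = math.degrees(math.atan2(abs(e_vertical), abs(e_horizontal)))
    if angle <= tol_deg:
        return "horizontal polarization"
    if angle >= 90.0 - tol_deg:
        return "vertical polarization"
    return f"linear polarization at {angle:.1f} degrees"

print(classify_polarization(1.0, 0.0))   # -> horizontal polarization
print(classify_polarization(0.0, 1.0))   # -> vertical polarization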
Aerial photography has two uses that are of interest within the context of this course: (1) Cartographers and planners take detailed measurements from aerial photos in the preparation of maps. (2) Trained interpreters utilize aerial photos to determine land-use and environmental conditions, among other things.
Although both maps and aerial photos present a "bird's-eye" view of the earth, aerial photographs are NOT maps. Maps are orthogonal representations of the earth's surface, meaning that they are directionally and geometrically accurate (at least within the limitations imposed by projecting a 3-dimensional object onto 2 dimensions). Aerial photos, on the other hand, display a high degree of radial distortion. That is, the topography is distorted, and until corrections are made for the distortion, measurements made from a photograph are not accurate. Nevertheless, aerial photographs are a powerful tool for studying the earth's environment.
Because most GISs can correct for radial distortion, aerial photographs are an excellent data source for many types of projects, especially those that require spatial data from the same location at periodic intervals over a length of time. Typical applications include land-use surveys and habitat analysis.
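One common geometric correction is for relief displacement, the radial shift of elevated features away from the photo's nadir point that produces the distortion described above. A standard approximation from basic photogrammetry (not given in this unit) is d = r × h / H, where r is the radial distance of the image point from the nadir, h is the object's height above the datum, and H is the flying height above the datum. The sketch below applies that relation with assumed values.

def relief_displacement(r, h, H):
    """Approximate relief displacement on a vertical aerial photo.

    r : radial distance from the nadir point to the image point
    h : height of the object above the datum
    H : flying height of the camera above the datum
    h and H must be in the same units; the result is in the units of r.
    """
    return r * h / H

# Hypothetical example: a point 80 mm from the nadir, imaging a 150 m
# hill from a flying height of 3000 m.
d = relief_displacement(r=80.0, h=150.0, H=3000.0)
print(f"Relief displacement: {d:.1f} mm")  # -> 4.0 mm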
This unit discusses benefits of aerial photography, applications, the different types of photography, and the integration of aerial photographs into GISs.
Novice photo interpreters often encounter difficulties when presented
with their first aerial photograph. Aerial photographs are different from
"regular" photos in at least three important ways:
In 1903 or 1904 the first reliable black and white infrared film was developed in Germany. Its emulsion was adjusted from that of regular film to be sensitive to wavelengths of energy just longer than red light, just beyond the range of the human eye. By the 1930s, black and white IR films were being used for landform studies, and from 1930 to 1932 the National Geographic Society sponsored a series of IR photographs taken from hot air balloons.
Throughout the 1930s and 1940s, the military was hard at work developing color infrared film, eager to exploit it for surveillance. By the early 1940s the military was successful in its attempts. It developed a film that was able to distinguish camouflaged equipment from surrounding vegetation. Within months, however, an IR reflecting paint was developed for use on military vehicles, effectively making IR film technology useless to the military. So, they dropped it.
The scientific community, however, has made continuous use of the film technology.
Color infrared film is often called "false-color" film. Objects that are normally red appear green, green objects (except vegetation) appear blue, and "infrared" objects, which normally are not seen at all, appear red.
The primary use of color infrared photography is vegetation studies. This is because healthy green vegetation is a very strong reflector of infrared radiation and appears bright red on color infrared photographs.
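The colour shifts described above amount to reassigning bands to display channels: near-infrared is displayed as red, red as green, and green as blue. The NumPy sketch below builds such a false-colour composite from three hypothetical band arrays; the array names, sizes, and random values are assumptions for illustration only.

import numpy as np

# Hypothetical single-band images (100 x 100 pixel arrays of brightness
# values scaled 0-255). In practice these would come from scanned film
# or a digital sensor rather than random numbers.
nir   = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
red   = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
green = np.random.randint(0, 256, (100, 100), dtype=np.uint8)

# False-colour (colour infrared) assignment:
#   display red   <- near-infrared band
#   display green <- red band
#   display blue  <- green band
false_color = np.dstack([nir, red, green])
print(false_color.shape)  # -> (100, 100, 3)

Because healthy vegetation reflects strongly in the near-infrared band, it dominates the display's red channel and appears bright red in the composite, just as on colour infrared film.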
Land-Use Planning and Mapping
Species Habitat Mapping
Humans are adept at visually interpreting data. We can distinguish millions
of colors, several shades of gray, and have a demonstrated ability to identify
water, vegetation, and urban forms on several types of imagery. Why try
to expand on this?
Sources of Digital Data
LANDSAT refers to a series of satellites put into orbit around the earth to collect environmental data about the earth's surface. The LANDSAT program was initiated by the U.S. Department of the Interior and NASA under the name ERTS, an acronym for Earth Resources Technology Satellites. ERTS-1 was launched on July 23, 1972, and was the first unmanned satellite designed solely to acquire earth resources data on a systematic, repetitive, multispectral basis. Just before the launch of the second ERTS satellite, NASA announced that it was changing the program designation to LANDSAT and that data acquired through the LANDSAT program would be complemented by the planned SEASAT oceanographic observation satellite program. ERTS-1 was retroactively named LANDSAT-1, and all subsequent satellites in the program have carried the LANDSAT designation. Over time, the sensors carried by the LANDSAT satellites have varied as technologies improved and certain types of data proved more useful than others. The table which follows outlines the sensors onboard each satellite, their launch dates, and the dates they were decommissioned.
The various Landsats have had Multispectral Scanners (MSS), Return Beam Vidicon (RBV) scanners, and Thematic Mapper (TM) scanners. Each type has its own spectral range and spatial resolution.
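As a rough aide-mémoire, the sketch below records the nominal ground resolutions most often quoted for these sensor families. The figures are approximations supplied for illustration only (for example, the RBV cameras on later satellites had finer resolution than on the earlier ones); consult the mission documentation for exact values per satellite and band.

# Approximate, commonly quoted ground resolutions for the Landsat
# sensor families. Illustrative values only.
landsat_sensors = {
    "MSS": {"name": "Multispectral Scanner", "resolution_m": 80},
    "RBV": {"name": "Return Beam Vidicon", "resolution_m": 80},  # finer on later satellites
    "TM":  {"name": "Thematic Mapper", "resolution_m": 30},      # coarser thermal band
}

for code, info in landsat_sensors.items():
    print(f"{code}: {info['name']}, ~{info['resolution_m']} m")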
Interpreting Landsat Data
The images discussed in this section are the property of the University of California, Santa Barbara, and are available from the Center for Ecological Health Research Home Page. View each image alongside the discussion that pertains to it. Detailed explanations of the images will be added soon.
NOAA Geostationary and Polar Orbiting Satellites
NOAA GOES mission overview and history. The GOES graphic was prepared by the NASA Goddard Space Flight Center, which provides additional information about the GOES project.
The first visible GOES-8 image. Look carefully and you can make out Baja California on the lower left and Lake Michigan on the upper right.
Applications of Satellite Imagery
Integration of Satellite Imagery into GIS
Thermal infrared radiation refers to electromagnetic waves with wavelengths between 3.5 and 20 micrometers. Most remote sensing applications make use of the 8 to 13 micrometer range. The main difference between THERMAL infrared and the infrared discussed above is that thermal infrared is emitted energy, whereas the near infrared (photographic infrared) is reflected energy.
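One way to see why the 8 to 13 micrometer window is useful is Wien's displacement law (standard physics, not stated in this unit): the wavelength of peak emission is approximately 2898 micrometer-kelvins divided by the object's temperature in kelvin. For earth-surface features near 300 K the peak falls at roughly 9.7 micrometers, inside that window. The sketch below performs this calculation; the constant and the sample temperature are the only inputs.

WIEN_CONSTANT_UM_K = 2898.0  # Wien's displacement constant, micrometer-kelvins

def peak_emission_wavelength_um(temperature_k):
    """Wavelength (micrometers) at which a blackbody at the given
    temperature emits most strongly, from Wien's displacement law."""
    return WIEN_CONSTANT_UM_K / temperature_k

# Typical earth-surface temperature of about 300 K (27 deg C).
print(f"{peak_emission_wavelength_um(300.0):.1f} um")  # -> about 9.7 um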
Interpreting Thermal Scanning Imagery
Limitations of Thermal Infrared Imaging
There are some limitations of thermal imagery you should be aware of if you plan to use it in your GIS:
The following radar images come from sites all over the world. The files at NASA's Jet Propulsion Laboratory have explanations accompanying the images.
Spaceborne Synthetic Aperture Radar, Oetztal, Austria. This file was created by NASA's Jet Propulsion Laboratory in Pasadena, CA.