Basic Concepts

It is necessary to understand a few basic remote-sensing concepts before we begin discussing how remotely sensed imagery can be used in ecological modeling.

EM spectrum

The electromagnetic spectrum (EMS) includes wavelengths of EM radiation ranging from short-wavelength (high-frequency) gamma rays to long-wavelength (low-frequency) radio waves. We focus on the region of the spectrum starting in the ultraviolet and continuing through the microwave wavelengths. Optical sensors are used to measure ultraviolet, visible, and infrared wavelengths, and microwave sensors are used for the microwave portion of the EMS.

A fundamental physical principle that remote sensing relies on is that different features on the Earth's surface interact with specific wavelengths of the EMS in different ways. When working with optical sensors, the most important property used to identify features on the Earth's surface is spectral reflectance: the ratio of the intensity of light reflected from a surface to the intensity of incident light. Different features have different spectral reflectance properties, and we can use this information to identify individual features. For example, white sand reflects most visible and near-infrared light, whereas green vegetation absorbs most red wavelengths and reflects most near-infrared wavelengths. Figure 1 illustrates the spectral properties of different materials.

Figure 1 Spectral signatures for selected features. Materials on the Earth's surface have unique spectral reflectance properties. This figure shows the spectral reflectance curves of six common materials. Many of the methods used in remote sensing are designed to associate the spectral information acquired by a sensor with the spectral qualities of features that are to be identified.
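The red/near-infrared contrast of vegetation described above is the basis of the widely used normalized difference vegetation index (NDVI). The sketch below illustrates the idea; the reflectance values are invented for illustration, not measured spectral signatures.

```python
# Illustrative (assumed) reflectance fractions in the red and
# near-infrared (NIR) regions for two of the surface types discussed.
red_reflectance = {"green_vegetation": 0.05, "white_sand": 0.60}
nir_reflectance = {"green_vegetation": 0.50, "white_sand": 0.65}

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: exploits the strong
    NIR reflectance and strong red absorption of green vegetation."""
    return (nir - red) / (nir + red)

for surface in red_reflectance:
    value = ndvi(nir_reflectance[surface], red_reflectance[surface])
    print(f"{surface}: NDVI = {value:.2f}")
```

Vegetation yields an NDVI near 1 because it reflects far more NIR than red light, while bright sand, which reflects both about equally, yields a value near 0.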

Some remote-sensing instruments also provide information about how EM energy interacts with the surface of a feature or within a three-dimensional feature such as a forest. These will be discussed later in this article.

What is an image?

The most familiar form of remotely sensed data is an image. An image is made up of individual elements that are arranged in a grid of rows and columns. These elements are called pixels. When zooming into an image, individual pixels can be seen (Figure 2).

In addition to rows and columns of pixels, images also have layers, commonly referred to as 'bands' or 'channels'. Throughout this article we use the term 'band' to refer to the layers in an image. These bands correspond to different wavelengths of the EMS. Remote-sensing instruments vary in the number of bands recorded: some record only a single band of data while others record hundreds. By convention, 'hyperspectral' describes images with many bands (usually well over 100) and 'multispectral' describes images with fewer (usually from three to a few dozen) bands.

Figure 2 Zooming to an individual pixel. The image on the left is printed at full resolution; it is a Landsat Enhanced Thematic Mapper Plus (ETM+) image acquired over Burlington, Vermont on 21 August 1999. The image in the top-right is a subset of this image that has been magnified by a factor of 3. In the magnified image individual pixels (the square blocks that make up the image) can be seen. The three black-and-white images in the bottom-right represent the three image bands that are used to create the color image. In this case the red band is from ETM+ band 4 (near-infrared), the green band is from ETM+ band 5 (mid-infrared), and the blue band is from ETM+ band 3 (red). These three bands are combined to make the color image.
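In software, this rows-by-columns-by-bands structure is commonly held as a three-dimensional array. The dimensions and values below are made up purely for illustration.

```python
import numpy as np

# A small, hypothetical 6-band image: rows x columns x bands.
rows, cols, n_bands = 100, 120, 6
image = np.zeros((rows, cols, n_bands), dtype=np.uint8)

# Each band is a 2-D grid of pixels for one wavelength region.
nir_band = image[:, :, 3]          # one band as a 2-D array
pixel_spectrum = image[40, 55, :]  # all band values at a single pixel

print(nir_band.shape)        # (100, 120)
print(pixel_spectrum.shape)  # (6,)
```

Slicing along the third axis extracts a single band; slicing at one row/column position extracts the spectral values recorded for one pixel across all bands.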

With most imagery the individual bands are used to record radiance values at different wavelengths. Radiance is a measurement of the intensity of EM energy; in other words, the sensor is measuring the intensity of light when it hits the detector. The units for this measurement are typically watts per steradian per square meter (W sr⁻¹ m⁻²). It is important to understand that optical sensors measure radiance and not reflectance. Reflectance, which is the ratio of reflected light to incident light, can be estimated using image-processing methods, but the physical property recorded by the sensor is radiance (Figure 3).
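A common image-processing step of this kind is to convert the raw digital numbers (DNs) stored in an image to at-sensor radiance via a linear calibration, and then to estimate top-of-atmosphere reflectance from the radiance, solar geometry, and Earth–Sun distance. The sketch below shows the standard form of these conversions; the gain, offset, and solar irradiance values are placeholder assumptions, not calibration constants for any real sensor.

```python
import math

# Placeholder (assumed) calibration constants for an imaginary band.
GAIN = 0.1      # W sr^-1 m^-2 per DN
OFFSET = -1.0   # W sr^-1 m^-2
ESUN = 1550.0   # mean solar exoatmospheric irradiance, W m^-2

def dn_to_radiance(dn):
    """Linear calibration: at-sensor radiance from a digital number."""
    return GAIN * dn + OFFSET

def radiance_to_toa_reflectance(radiance, sun_elevation_deg, d_au=1.0):
    """Estimate top-of-atmosphere reflectance from radiance, solar
    elevation (degrees), and Earth-Sun distance (astronomical units)."""
    sun_zenith = math.radians(90.0 - sun_elevation_deg)
    return (math.pi * radiance * d_au ** 2) / (ESUN * math.cos(sun_zenith))

L = dn_to_radiance(120)                   # 11.0 W sr^-1 m^-2
rho = radiance_to_toa_reflectance(L, 55.0)
print(f"radiance = {L:.1f}, reflectance = {rho:.3f}")
```

The key point mirrors the text: the sensor records only the DN-to-radiance side of this; reflectance must be computed afterward from additional information about illumination.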

Different platforms and orbits

For local and detailed information the airplane is still often the platform of choice, since it is possible to select which sensors should be mounted for a particular application and to determine when to fly. Aircraft can fly low to acquire imagery with a great deal of detail. For global and systematic coverage, satellites are the standard remote-sensing platform. Most satellite orbits can be classified as either geostationary or polar orbiting. Geostationary satellites orbit the Earth in the equatorial plane with the same period as the Earth's rotation, so the satellite's position remains fixed over a particular point on the Earth and it can continuously view the same area. These satellites are commonly used to monitor the weather but are too far from the Earth's surface (~35 800 km) for detailed environmental monitoring. More common for Earth remote sensing is a near-polar orbit that provides a near-global view of the Earth over a regular time period, for example, every 16 days in the case of Landsat. It is important to note that with a near-polar orbit the polar regions are not viewed from the satellite. For this reason, when people mention global remotely sensed data sets they often mean the data sets are near-global. Polar and near-polar orbiting satellites fly only several hundred kilometers above the Earth's surface.

Figure 3 Reflectance and radiance. Remote-sensing detectors measure radiance, which is the energy of the radiation hitting the detector. Reflectance, which must be calculated, is the ratio of the intensity of reflected radiation over incident radiation.
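The geostationary altitude follows directly from Kepler's third law: an orbit whose period equals one Earth rotation has a fixed semi-major axis. The short check below uses standard physical constants.

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3 s^-2
SIDEREAL_DAY = 86164.1   # one Earth rotation, s
EARTH_RADIUS = 6378.1e3  # equatorial radius, m

# Semi-major axis for an orbital period of one sidereal day:
# T = 2*pi*sqrt(a^3 / mu)  =>  a = (mu * T^2 / (4*pi^2))^(1/3)
a = (MU * SIDEREAL_DAY ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
altitude_km = (a - EARTH_RADIUS) / 1000
print(f"geostationary altitude ~ {altitude_km:.0f} km")  # ~35 786 km
```

This is roughly fifty times higher than the several-hundred-kilometer altitudes typical of polar and near-polar orbiting satellites, which is why geostationary sensors trade spatial detail for a continuous view.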

Passive versus active remote sensing

Remote-sensing instruments are often categorized as having either active or passive sensors. An active sensor generates

Table 1 Active and passive satellite-based remote-sensing instruments

Sensor name      Type     Wavelength range    Resolution (m)
IKONOS           Optical  450-900 nm          1-4
SPOT5            Optical  500-1750 nm         2.5-10
IRS-P6 LISS-4    Optical  520-860 nm          5.8
ALOS AVNIR-2     Optical  420-500 nm          10
ASTER            Optical  520-11 650 nm       15-60
Landsat ETM+     Optical  450-2350 nm         15-30
MODIS            Optical  459-14 385 nm       250-1000
AVHRR            Optical  580-12 500 nm       1000
ENVISAT          Radar    5.7 cm (C band)     25
RADARSAT-1       Radar    5.7 cm (C band)     10-100
RADARSAT-2       Radar    5.6 cm (C band)     3-100
ALOS PALSAR      Radar    23.5 cm (L band)    10-100

its own signal which is subsequently measured when reflected back by the Earth's surface. A passive sensor measures solar energy that is either reflected or emitted from features on the Earth's surface. Table 1 lists a number of different active and passive instruments mounted on satellite platforms.

Although most passive sensors operate in the visible and infrared portions of the EMS, there are also some passive microwave sensors in use that measure a number of parameters such as wind speed, atmospheric and sea surface temperature, soil moisture, rainfall, and atmospheric water vapor.

An advantage of passive sensors is that most rely on the Sun's energy to illuminate the target and therefore do not need their own energy source so in general they are simpler instruments. A limitation for most passive optical sensors is that they require daylight to operate, although there are some sensors that record nighttime lights and clouds at night and others that record energy emitted from the Earth's surface. Since most of these sensors operate in the visible and infrared wavelengths, they are adversely affected by weather and cloud cover. Lastly, since sunlight is primarily reflected from the top of a feature, such as a forest, it is not possible to 'see' under a canopy to measure vegetation structure. To obtain this kind of information it is necessary to use active sensors.

Active sensors, such as radar and lidar, emit their own energy to illuminate a target and consist of a signal generator and a receiver. They measure the strength of the returned signal and the time delay between when the instrument emits the energy and when it receives the returned pulse. These two types of information can be used to describe vegetation structure. Radar is an acronym for 'radio detection and ranging'. Radar systems operate in the long-wavelength microwave portion of the EMS and thus are largely unaffected by clouds and rain; they can be considered all-weather systems. Lidar is an acronym for 'light detection and ranging', and these systems use lasers that emit light in the visible and near-infrared portions of the EMS. In a lidar system a single light pulse can reflect off several features in vertical space, such as different layers in a forest. A single emitted pulse will result in a wave or series of returned pulses that are recorded by the detector. These return pulses can be recorded as a wave (full-waveform lidar) or in discrete pieces that correspond to the peaks in the returned signal. A number of different types of lidar systems have been developed, but most provide the capability to record the first and last return of the light pulse. These returns correspond to the top of an object (e.g., the top of a tree canopy) and the base substrate that the object is resting on (e.g., the ground). This is ideal for measuring the height of features such as trees or buildings.
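The first-return/last-return height measurement described above reduces to simple time-of-flight arithmetic: half the round-trip time difference between the two returns, multiplied by the speed of light, gives the vertical separation. The return times below are invented for illustration.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def height_from_returns(t_first_s, t_last_s):
    """Vertical distance between the surfaces that produced the first
    and last returns of one lidar pulse. The factor of 2 accounts for
    the round trip of the light."""
    return SPEED_OF_LIGHT * (t_last_s - t_first_s) / 2.0

# A ~133.4 ns delay between the canopy-top return and the ground
# return corresponds to roughly 20 m of canopy height.
dt = 133.4e-9
print(f"{height_from_returns(0.0, dt):.1f} m")
```

The same calculation applied to the emitted pulse and the last return yields the range to the ground, which is how lidar also produces the digital elevation data discussed below.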

Radar systems behave differently from optical systems with respect to how they interact with materials. The signal from most radar systems can penetrate well into a forest canopy, and radar systems with especially long wavelengths (e.g., P-band systems) can even penetrate dry ground.

Although lidar is probably best known for its capability of acquiring digital elevation data that can be used to describe topography, these systems show great promise for directly measuring vegetation structure characteristics. Lidar can make direct measurements of vegetation structure and provide vertical information that is largely missing from data collected using passive remote-sensing instruments, which only record light reflected from the top of a canopy.

What qualities determine what can be identified in an image?

There are different characteristics that affect the detail that can be resolved (seen) in a digital image. These are traditionally referred to as the four types of image resolution. Most people think of 'resolution' as being synonymous with spatial resolution but other 'resolution' terms used in the formal literature are as follows:

Spatial resolution. This is often simply referred to as 'resolution' and is the size of a pixel (the smallest discrete scene element and image display unit) in ground dimensions. In most cases an image's resolution is labeled with a single number, such as 30 m, which represents the length of a side of a square pixel if it were projected onto the Earth's surface. If the pixel were rectangular (not very common any more), then the length and width of the pixel would be provided.

Spectral characteristics. This includes bandwidth, band placement, and the number of bands. Spectral bandwidth, or spectral resolution as it is often called, refers to the range of wavelengths that are detected in a particular image band. This is effectively a measure of how precisely an image band measures a portion of the EMS. Band placement defines the portion of the EMS that is used for a particular image band. For example, one band might detect blue wavelengths and another band might detect thermal wavelengths along the EMS. The properties of the features one is interested in sensing indicate which bands are important. The last spectral variable is the number of bands. The more bands that are available, the more precisely the spectral properties of a feature can be measured.

Acquisition dynamics. This has two components. The first is the minimum time in which a particular feature can be recorded twice, often called the repeat frequency or temporal resolution. Some sensors with a very wide field of view can acquire multiple images of the same area in the same day, whereas others have a repeat frequency of several weeks. It should also be reiterated that most remote-sensing satellites have a near-polar orbit and are not able to acquire imagery at the poles since their orbit does not pass over these areas. The other component is the timing of the acquisitions. Dynamic features such as deciduous forests, and events such as flooding, often have an optimum time at which they should be imaged. For example, the identification of deciduous vegetation is aided by acquiring imagery during both leaf-on and leaf-off periods.

Sensitivity of the sensor. This is defined by the dynamic range of the sensor as well as the range of digital numbers that can be used to represent the pixel values. Sensors have lower limits below which a signal is not registered and upper limits above which the sensor saturates and is unable to measure increases in radiance. The detail that can be measured between these extremes is determined by the range between the minimum and maximum digital numbers permitted for a particular data type. For example, Landsat TM data values can range from 0 to 255, whereas IKONOS values range from 0 to 2047. This potential range of values is often referred to as quantization or radiometric resolution.
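The ranges quoted above follow directly from the bit depth of the data type: an n-bit quantization provides 2^n discrete levels.

```python
def quantization_levels(bits):
    """Number of digital-number levels available at a given bit depth."""
    return 2 ** bits

print(quantization_levels(8))   # 256 levels (0-255), as for Landsat TM
print(quantization_levels(11))  # 2048 levels (0-2047), as for IKONOS
```

More levels between the sensor's lower detection limit and its saturation point mean finer distinctions in radiance can be recorded, which is why higher bit depth is described as higher radiometric resolution.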
