Introduction to Remote Sensing

Remote sensing is the process of detecting and monitoring the physical characteristics of an area by measuring its reflected and emitted radiation at a distance from the targeted area.

While remote sensing is commonly used as a synonym for satellite data, the concept of remote sensing can also be applied to aerial photography or lidar from drones. The remote part of remote sensing means that you are gathering data from a distance.

A Brief History of US Satellite Imagery

The first images of the earth from space were captured in 1947 by a camera placed on a sub-orbital German V-2 rocket repurposed by the US after World War II.

First Space Image, 1947 (NASA)

Although the Russians would beat the US into orbit with Sputnik I on 4 October 1957, the United States would be the first to return crude satellite images from a television camera onboard Explorer VI on 14 August 1959.

Explorer VI Satellite Image, 1959 (NASA)

The then-secret Corona defense intelligence satellite project would have beaten Explorer VI into space by a few months, but a string of technical failures delayed the first images until Discoverer 14 on 18 August 1960. The satellites used film cameras to capture high-resolution images that were then returned to earth in a re-entry capsule captured mid-air by a recovery airplane. As befits a Cold War-era project, the first high-resolution image from space was of the Russian Mys Shmidta Airfield on 18 August 1960.

Corona Spy Satellite Image of Mys Shmidta Airfield, 1960 (National Reconnaissance Office)

The first truly functional civilian satellite remote sensing system was the Television Infrared Observation Satellite (TIROS) series, the first of which launched on 1 April 1960. This inaugurated the use of satellites for weather observation and forecasting.

TIROS Weather Image, 1960 (NASA)

Applications of Remote Sensing

Satellite data and imagery have a wide variety of uses in the natural sciences in addition to their military and commercial value.

As an introduction to the wide variety of (perhaps unexpected) uses for remotely sensed data, skim this list of 100 Earth Shattering Remote Sensing Applications and Uses.

Orbits

Satellites travel in an orbit around the earth in which the centrifugal force of the satellite's circular motion counterbalances the pull of gravity, keeping the satellite aloft.

Orbits can have a number of different characteristics that involve different types of movement relative to the earth, such as geostationary, polar, and sun-synchronous orbits.

The type of orbit determines the temporal resolution of a satellite, or how often and for how long the satellite senses any particular location on the surface of the earth. Some satellite systems return to the same location daily, while others that are designed to observe the entire earth may take days or weeks to return to the same location.

Rasters

Satellites almost always capture data as rasters, which are regular grids of rectangular pixels.

Pixels in a Remotely-Sensed Image

Spatial resolution is the amount of area on the surface of the earth covered by one pixel. It is usually measured as the distance in meters between the centers of adjacent pixels in the raster.
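
As a rough, back-of-the-envelope illustration (added here, not part of the original text), the short Python sketch below uses the 30-meter multispectral resolution of Landsat described later in this section, together with the roughly 100-mile scene width mentioned under Landsat, to estimate the ground area of a single pixel and the number of pixels spanning one side of a scene; the values are illustrative assumptions rather than official scene dimensions.

# Back-of-the-envelope illustration of spatial resolution (assumed values).
MILES_TO_METERS = 1609.34

pixel_size_m = 30          # Landsat multispectral resolution, meters per pixel side
scene_width_miles = 100    # approximate width of a Landsat scene

# Ground area covered by a single pixel, in square meters.
pixel_area_m2 = pixel_size_m ** 2

# Approximate number of pixels across one side of the scene.
scene_width_m = scene_width_miles * MILES_TO_METERS
pixels_across = scene_width_m / pixel_size_m

print(f"One pixel covers {pixel_area_m2} square meters")        # 900
print(f"A scene is roughly {pixels_across:.0f} pixels across")  # about 5,364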

Dallas at Medium Spatial Resolution (MODIS)
Dallas at High Spatial Resolution (Landsat)

Swath

Because technical and cost limitations on resolution define how much detail a satellite sensor can capture at any one time, satellite data capture follows a narrow path, or swath, along the ground. The width of this swath varies between satellite systems based on their purpose.

Satellites can scan these swaths in two different ways:

Types of Swath

Electromagnetic Radiation

Remote sensing takes advantage of the emission and reflection of electromagnetic radiation by objects on the surface of the earth to capture what is where.

Objects reflect, absorb, and emit energy in a unique way, and at all times. This energy is called electromagnetic radiation and is emitted in waves that are able to transmit energy from one place to another. These waves originate from billions of vibrating electrons, atoms, and molecules, which emit and absorb electromagnetic radiation.

Different types of electromagnetic radiation are distinguished by the frequency of their vibration. As these waves travel through space at 300,000 kilometers per second (186,000 miles per second), the distance between successive wave crests is called the wavelength and is usually measured in meters or nanometers (one billionth of a meter). The rate of vibration is called the frequency and is usually measured in hertz (1 Hz = one vibration per second).
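
Because wavelength and frequency are linked by the speed of light, you can convert between them directly. The small Python sketch below (an illustration added here, not part of the original text) converts a wavelength of 650 nanometers, a typical red wavelength, into a frequency of about 462 THz, which falls within the red band range of 430-480 THz given later in this section.

# Convert a wavelength to a frequency using the wave relation: speed = wavelength * frequency.
SPEED_OF_LIGHT = 3.0e8         # meters per second (approximate)

wavelength_nm = 650            # a typical red wavelength
wavelength_m = wavelength_nm * 1e-9

frequency_hz = SPEED_OF_LIGHT / wavelength_m
print(f"{wavelength_nm} nm corresponds to about {frequency_hz / 1e12:.0f} THz")  # about 462 THz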

The higher the temperature of an object, the faster its electrons vibrate and the shorter its peak wavelength of emitted radiation. Conversely, the lower the temperature of an object, the slower its electrons vibrate, and the longer its peak wavelength of emitted radiation.

The fundamental unit of electromagnetic phenomena is the photon, the smallest possible amount of electromagnetic energy of a particular wavelength. Photons are units of energy rather than matter, so they have no mass. The energy of a photon determines the frequency (and wavelength) of light that is associated with it. The greater the energy of the photon, the greater the frequency and vice versa.
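
To make the energy-frequency relationship concrete, here is a small worked sketch (added for illustration; Planck's relation E = h * f and the value of the constant are not given in the original text) that computes the energy carried by a single photon of red visible light.

# Photon energy from frequency using Planck's relation: E = h * f.
PLANCK_CONSTANT = 6.626e-34    # joule-seconds

frequency_thz = 430            # roughly red visible light
frequency_hz = frequency_thz * 1e12

energy_joules = PLANCK_CONSTANT * frequency_hz
print(f"A {frequency_thz} THz photon carries about {energy_joules:.2e} joules")  # about 2.85e-19 J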

Electromagnetic radiation is a part of our lives in many ways. Different frequencies of electromagnetic radiation have different propagation characteristics. These characteristics make different frequencies of electromagnetic radiation useful for different types of remote sensing.

The Electromagnetic Spectrum (Lawrence Berkeley National Laboratory)

Electromagnetic radiation is different from the particle radiation associated with radioactive materials like uranium and nuclear power plants. Particle radiation results from subatomic particles being thrown off by nuclear reactions. Particle radiation is often associated with electromagnetic radiation, but the primary health concern with any kind of radiation is ionization, which occurs when radiation pushes electrons out of atoms and leaves them as positively charged ions. In living cells, this ionization damages DNA and can lead to cell death, mutations, and cancer.

Bands

While early satellites captured only panchromatic (grayscale) visible light, contemporary satellites often have sensors that capture different ranges or bands of electromagnetic radiation.

The number and width of the different bands that a satellite's sensors can capture is called the spectral resolution. The appropriate spectral resolution depends on the purpose of the satellite.

For space imagery we are usually most interested in the red (430-480 THz), green (540-580 THz), and blue (610-670 THz) bands, which the three different types of cone cells in our retinas detect as visible colors.

Other bands are useful for analyzing a variety of phenomena. For example, biogeographers commonly use a combination of the red and near-infrared bands, called the normalized difference vegetation index (NDVI), to determine levels of vegetation in a particular area.

NDVI is based on the fact that photosynthetic green plants tend to reflect near-infrared light to avoid overheating. They also reflect green light, which is why they appear green to our eyes. However, they absorb red light to power the process of photosynthesis.

This phenomenon can be used with the Landsat 8 near-infrared band (band 5) and red band (band 4) to calculate an index that is highest in areas with large amounts of vegetation and lower in areas of sparse vegetation.

Normalized Difference Vegetation Index

The index is the difference between the near-infrared and red values divided by their sum, and it ranges from negative one to positive one.

NIR - red
---------
NIR + red

When near infrared is high and red is low, meaning plants are reflecting infrared and absorbing red, NDVI is high and closer to one.

1 - 0
----- = 1
1 + 0

When near infrared is low and red is high, such as with bare ground or water, NDVI is low and closer to negative one.

0 - 1
----- = -1
0 + 1
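
As a minimal sketch of how this calculation looks in practice (added for illustration; the array values are hypothetical and assume the red and near-infrared bands have already been read into NumPy arrays of reflectance values), NDVI can be computed pixel by pixel like this:

import numpy as np

# Hypothetical reflectance values for a tiny scene: near infrared (Landsat 8 band 5)
# and red (Landsat 8 band 4). In practice these would be read from the raster files.
nir = np.array([[0.50, 0.60], [0.10, 0.05]])
red = np.array([[0.10, 0.10], [0.30, 0.40]])

# NDVI = (NIR - red) / (NIR + red), guarding against division by zero.
denominator = nir + red
ndvi = np.divide(nir - red, denominator, out=np.zeros_like(denominator), where=denominator != 0)

print(ndvi)  # values near +1 indicate dense vegetation; values near -1 indicate bare ground or water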

Ground Control and Downlinks

As with GPS, satellites used for remote sensing are controlled through a mission operations center (MOC). All contemporary satellite data is returned to earth via radio signals through downlink stations, which then relay that data to the MOC for processing, storage, and communication. The photo below is of a downlink station used by the Landsat system.

German Remote Sensing Data Center, Neustrelitz (DLR)

Landsat

Of the hundreds of remote-sensing satellites launched in the past half century, the Landsat program has proved especially valuable for civilian use. Landsat 1 was launched on 23 July 1972 and subsequent satellites have provided continuous satellite imagery of the Earth. This is arguably one of the most important scientific enterprises of our time, and if you work with remote sensing, you will probably use Landsat data on multiple occasions.

Landsat 7 and Landsat 8 are currently operational.

Landsat 7 (NASA)

Landsat satellites complete just over 14 orbits a day, covering the entire earth every 16 days. This gives a temporal resolution of 16 days, although if there is cloud cover when the satellite passes over a location, the data may be unusable. Landsat satellites travel in a near-polar, sun-synchronous orbit.
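
As a quick sanity check on those numbers (an illustration added here; the figure of 233 distinct orbital paths per repeat cycle is a commonly cited value for Landsat's Worldwide Reference System and is not stated in the original text), the orbit rate follows directly from the repeat cycle:

# Orbits per day implied by a 16-day repeat cycle covering 233 distinct paths (assumed figure).
orbits_per_cycle = 233
days_per_cycle = 16

orbits_per_day = orbits_per_cycle / days_per_cycle
print(f"About {orbits_per_day:.2f} orbits per day")  # about 14.56, i.e. "just over 14"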

Landsat data is publicly available as scenes, or images that cover an area around 100 miles square. Scenes are designated by a path number and row number.

Landsat Orbit Coverage (NASA)

The scene below was acquired from Landsat 8 on 10 November 2015. It is path 34, row 32, covering North Central Colorado with Denver in the lower right-hand corner.

Landsat Image of Denver, 10 November 2015 (NASA)
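
Path and row designations also appear directly in Landsat file names: the Collection product IDs distributed by the USGS embed the path and row as a six-digit field, along with the acquisition date. The sketch below (an illustrative example; the product ID shown is hypothetical but follows that naming convention, and corresponds to the path 34, row 32 scene above) extracts those values with plain string handling.

# Parse path, row, and acquisition date from a Landsat Collection product ID.
# Simplified format: LXSS_LLLL_PPPRRR_YYYYMMDD_yyyymmdd_CC_TX
product_id = "LC08_L1TP_034032_20151110_20170402_01_T1"  # hypothetical ID for path 34, row 32

fields = product_id.split("_")
path_row = fields[2]       # "034032"
acquired = fields[3]       # acquisition date, "20151110"

path = int(path_row[:3])
row = int(path_row[3:])

print(f"Path {path}, row {row}, acquired {acquired}")  # Path 34, row 32, acquired 20151110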

Landsat 8 captures data in 11 different bands, giving it a spectral resolution that includes the visible light, near-infrared, shortwave infrared, and thermal infrared bands.

Landsat 8 spatial resolution varies from 15 meters for panchromatic (grayscale) data, to 30 meters for multispectral data, to 100 meters for thermal data.

Google Maps / Earth

When using satellite view in Google Maps or Google Earth, you will notice a copyright at the bottom for TerraMetrics, a commercial geospatial data company that Google buys imagery from. Although this is referred to as "satellite" view, what you see at different levels of zoom is "multiple layers of data such as satellite imagery, aerial photography, synthetic ocean imagery, roadways, location names, addresses and more, which come from many different data and imagery providers."

Indeed, rather than giving a purely faithful picture of what you would see from space, this is an artistic representation of the Earth that is used to effectively communicate what is where so that users can interpret that data more clearly and act accordingly.

Threats to Satellite Systems

There are around 1,100 active satellites plus another 2,600 or so that have been decommissioned. This does not include as many as 500,000 pieces of "space junk" the size of a marble or larger that NASA tracks to help safeguard space operations. While most space junk will eventually return to earth, the constant addition of new satellites and launch stages, and the occasional collision of satellites (turning two satellites into multiple pieces of junk) means that the threat to space operations by space junk is increasing and irreversible.

As the list of space-faring nations grows, terrestrial conflicts could extend to actions against spaceborne systems, making military and civilian geospatial technology highly vulnerable to disruption or destruction by state and non-state actors.

All satellite systems are expensive to build, launch, maintain, and renew. As such, they are dependent upon political and economic support that is tenuous in the contemporary American political environment.

If this is an area of interest to you, you might consider browsing the 2001 Report of the Commission to Assess United States National Security Space Management and Organization.

Landsat Raster Visualization in ArcMap
