Remote Sensing

In Remote Sensing, teams use remote sensing imagery, such as photographic and spectroscopic information, to analyze data. The topic for the 2017 season is Climate Change Processes.

Each participant may bring one 8.5" x 11" double-sided Note Sheet, as well as a metric ruler, a protractor, and any kind of (non-graphing) calculator.

Tests this year tend to comprise a mix of image interpretation and questions on the concepts of remote sensing and the hydrosphere. Some ecology/biology background is useful. Knowledge of individual space programs and NASA satellites, as well as the types of sensors they carry, is also useful.

Please note that acronyms are used often in this event. Several times on this page, an acronym is listed alongside a key term; they are included because acronyms can themselves be the subject of questions (e.g., what does RADAR stand for?). There are many of them, but do not let them confuse you: they are symbols of the ideas they represent, not independent entities.

<spoiler text="Remote Sensing 2009">

"Participants will use remote sensing imagery, science and mathematical process skills to complete tasks related to an understanding of the causes and consequences of global warming." - Remote Sensing rules 2009

You may bring five (5) pages of double-sided paper with notes in any form. Each participant may bring any non-graphing calculator.

This event is essentially a test based on identifying satellite imagery. Be prepared to study and memorize the different NASA space programs aimed at imaging Earth from space. Also, learn to identify different human constructions from satellite photos. Test questions are often open-ended, with answers based on analysis of satellite images in visible, infrared, and radio wavelengths. Other images include, but are not limited to, charts of variation in average temperature and measurements of chlorophyll concentration in the ocean. </spoiler>

Satellites

History

While not an integral part of the event, it is expected that a student know the basic history behind Remote Sensing.

The Beginning

Remote sensing began in 1858, when the Frenchman Gaspard-Félix Tournachon (better known as Nadar) took aerial photographs from a balloon. In the 1880s, cameras were mounted on kites and took pictures via remote control mechanisms. Once heavier-than-air flight was invented, it was only logical to take pictures from airplanes as well. During WWI, cameras mounted on planes or held by aviators were used for military reconnaissance.

[Image: Sputnik 1, the first artificial satellite]

Remote Sensing Advances

As early as 1904, rockets were used to launch cameras to heights of 600 meters. But until the 1960s, aerial photographs from airplanes were the only way to take pictures of Earth's landscape. It took the space race between the US and the USSR to start remote sensing from above the atmosphere.

The first artificial satellite was Sputnik 1, launched by the USSR on 4 October 1957. Sputnik helped to identify the density of the upper atmospheric layers and provided data on radio signal propagation in the ionosphere. The US had been working since early 1955 on Project Orbiter, which used a Jupiter-C rocket to launch a satellite. The effort, led by the legendary rocket scientist Wernher von Braun, succeeded when Explorer 1 became the first US satellite on January 31, 1958.

Examples of Instruments

Instruments are instrumental (no pun intended) to the function of satellites and remote sensing. Know what types of instruments will be used for certain applications.

  • RADAR: short for Radio Detection and Ranging. It transmits radio waves, which are scattered and reflected when they come into contact with something. Radio waves can pass through clouds and water droplets, and RADAR is generally used in active remote sensing systems. It is good for locating objects and measuring elevation.
  • LIDAR: short for Light Detection and Ranging. It is similar to RADAR but uses laser pulses instead of radio waves.
  • TM: stands for Thematic Mapper. It was introduced in the Landsat program and involves seven image data bands that scan along a ground track.
  • ETM+: stands for Enhanced Thematic Mapper Plus. It replaced the TM sensor on Landsat 7. Unlike TM, it has eight bands.
  • MSS: stands for Multispectral Scanner. It was also introduced in the Landsat program; each band responds to a different type of radiation, hence the name "multispectral".
  • MODIS: stands for Moderate-resolution Imaging Spectroradiometer. It is on the Terra and Aqua satellites and measures cloud properties and radiative energy flux.
  • CERES: stands for Clouds and the Earth's Radiant Energy System. It is on the Terra and Aqua satellites and measures broadband radiative energy flux.
  • SeaWiFS: stands for Sea-viewing Wide Field-of-view Sensor. It has eight spectral bands of very narrow wavelength ranges and monitors ocean primary production and phytoplankton processes, ocean influences on climate processes (heat storage and aerosol formation), and the cycles of carbon, sulfur, and nitrogen.

Other Instruments

  • ALI: Advanced Land Imager
  • ASTER: Advanced Spaceborne Thermal Emission and Reflection Radiometer
  • ATSR: Along Track Scanning Radiometer
  • AVHRR: Advanced Very High Resolution Radiometer (used with NOAA)
  • AVIRIS: Airborne Visual/Infrared Imaging Spectrometer
  • CCD: Charge Coupled Devices
  • CZCS: Coastal Zone Color Scanner
  • GPS: Global Positioning System
  • HRV: High Resolution Visible sensor
  • LISS-III: Linear Imaging Self-Scanning Sensor
  • MESSR: Multispectral Electronic Self-Scanning Radiometer
  • MISR: Multi-angle Imaging Spectro Radiometer
  • MSR: Microwave Scanning Radiometer
  • RAR: Real Aperture Radar
  • VTIR: Visible and Thermal Infrared Radiometer
  • WiFS: Wide Field Sensor

This is by no means a comprehensive list of the instruments tested in Remote Sensing.

Examples of Satellites

Countless satellites currently orbit Earth, but tests will mostly focus on satellite programs directed by NASA (National Aeronautics and Space Administration). Other agencies of note include NOAA (National Oceanic and Atmospheric Administration) and the Indian Remote Sensing (IRS) program.

EOS

Satellites especially likely to appear on tests are those in the Earth Observing System (EOS), a series of NASA satellites designed to observe the Earth's land, atmosphere, biosphere, and hydrosphere. The first EOS satellite was launched in 1997.

A-Train: an EOS satellite constellation planned to consist of seven satellites flying together in a Sun-synchronous (SS) orbit. Their combined observations have yielded high-resolution images of the Earth's surface. Of the seven satellites originally planned by NASA, only four currently fly in formation, due to launch failures and changes of orbit.

The A-Train satellites:
  • Active:
    • Aqua: studies the water cycle, such as precipitation and evaporation
    • CloudSat: studies the altitude and other properties of clouds
    • CALIPSO: studies clouds and airborne particles (aerosols) and their effect on Earth's climate
    • Aura: studies Earth's ozone layer, air quality, and climate
  • Past:
    • PARASOL: studies radiative and microphysical properties of clouds and air particles; moved to lower orbit
  • Failed:
    • OCO (Orbiting Carbon Observatory): was intended to study atmospheric carbon dioxide
    • Glory: was to study radiative and microphysical properties of air particles

Both OCO and Glory were lost to launch vehicle failures.

Other EOS Satellites

Landsat: a series of eight satellites using multiple spectral bands. Only three are operational today: Landsat 5, Landsat 7, and the Landsat Data Continuity Mission or "LDCM" (launched in February 2013 and now known as Landsat 8). The Landsat satellites, along with those carrying the ASTER sensor, are generally the most commonly tested. The name Landsat is a blend of the words "land" and "satellite".
Terra: Provides global data on the atmosphere, land, and water. Its scientific focus includes atmospheric composition, biogeochemistry, climate change and variability, and the water and energy cycle.

Other Satellites

There are other notable satellites that may appear on exams and are not affiliated with the EOS.

  • GOES (Geostationary Operational Environmental Satellite) system: two weather satellites in geostationary orbit at roughly 36,000 km altitude, operated by NOAA in cooperation with NASA
  • MOS: Marine Observation Satellite
  • SEASAT: SEA SATellite
  • SPOT: Système Pour l'Observation de la Terre

Satellite Imaging

The formal definition of remote sensing is the science of acquiring data about an object without being in physical contact with it. For this reason, a major part of this event involves processing images and analyzing them to reach a conclusion.

Image Processing

Satellite data is sent from the satellite to the ground in a raw digital format. The smallest unit of data is represented by a binary number. These binary numbers are strung together into a digital stream, and each is applied to a single dot, or pixel (short for "picture element"), which receives a value known as a Digital Number (DN). Each DN corresponds to a particular shade of gray that was detected by the satellite. These pixels, when arranged together in the correct order, form an image of the target where the varying shades of gray represent the varying energy levels detected on the target.
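To make this concrete, here is a minimal sketch (in Python with NumPy) of turning a raw digital stream into a grayscale image array. The stream shown is hypothetical, assuming an 8-bit sensor:

<syntaxhighlight lang="python">
import numpy as np

# A hypothetical raw digital stream: each 8-bit binary word is one pixel's
# Digital Number (DN). Six pixels form a tiny 2x3 image.
raw_stream = "00000000 01000000 10000000 10100000 11000000 11111111"

# Convert each binary word to its integer DN (0-255 for an 8-bit sensor).
dns = np.array([int(word, 2) for word in raw_stream.split()], dtype=np.uint8)

# Arranged in the correct order, the pixels form the image; each DN
# corresponds to a shade of gray (0 = black, 255 = white).
image = dns.reshape(2, 3)
print(image)  # [[  0  64 128]  [160 192 255]]
</syntaxhighlight>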

The human eye can only distinguish between about 16 shades of gray in an image, but it is able to distinguish between millions of colors. Thus, a common image enhancement technique is to assign specific DN values to specific colors, increasing the contrast. A true color image is one for which the colors have been assigned to DN values that represent the actual spectral range of the colors used in the image. A photograph is an example of a true color image. False color (FC) is a technique by which colors are assigned to spectral bands that do not equate to the spectral range of the selected color. This allows an analyst to highlight particular features of interest using a color scheme that makes the features stand out.

Composites

Composites are images in which multiple individual satellite images have been combined to produce a new image. This process is used to create more detailed images that take multiple factors into account, and to find patterns that would not be revealed in a single image. It also helps to create images of larger areas than a single pass can capture: each satellite images a specific swath, an area of fixed width, and when these swaths are put together into a composite, a larger area is covered.

To begin understanding composites, we must first understand how they are made. Start with three black-and-white transparencies of the same scene, each representing a different spectral band: blue, green, and red. Shine white light through each one onto a screen, projecting each band through a filter of its own color: the blue band through a blue filter, green through green, red through red. Because blue objects are clear (bright) on the blue-band transparency, they appear blue in the composite. If you line up the three projections, you'll have the natural color (or very close to it) of the scene. You've just made a color composite; this process is called "color additive viewing."

Not all composites have to have natural colors. What would happen if you projected the red band through a green filter? Or the green band through a blue filter? If one of the transparencies is an infrared band and you shine it through the red filter, you can make something called a "False Color Composite" (FCC). You may have seen false color composites in competition. Often, they are used to distinguish healthy vegetation from vegetation in poor health. The two may appear the same in natural color, but false color displays healthy vegetation in a much brighter tone. For example, a false color composite may show a football field of healthy grass as a strong red, while a field of Astroturf or another artificial surface shows up as a duller red.

Common composites:

  • True-color composite- useful for interpreting man-made objects. Simply assign the red, green, and blue bands to the respective color for the image.
  • Blue-near IR-mid IR, where the blue channel uses visible blue, the green channel uses near-infrared (so vegetation stays green), and mid-infrared is shown as red. Such images reveal water depth, vegetation coverage, soil moisture content, and the presence of fires, all in a single image.
    • Near IR is usually assigned to red on the image; thus, vegetation often appears bright red in false color images, rather than green, because healthy vegetation reflects a lot of near-IR radiation.
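The same band-to-channel assignment can be sketched digitally. Assuming you already have the individual band arrays (the band names below are placeholders for whatever data you have), a false color composite is just a matter of stacking bands into display channels:

<syntaxhighlight lang="python">
import numpy as np

def false_color_composite(nir, red, green):
    """Assign NIR to the red display channel, red to green, and green to
    blue. Healthy vegetation, which reflects strongly in NIR, shows up
    bright red in the result."""
    rgb = np.dstack([nir, red, green]).astype(np.float64)
    return rgb / rgb.max()  # scale to 0-1 for display

# Stand-in band arrays for a 100x100 scene (real data would come from
# an actual satellite product).
nir, red, green = (np.random.rand(100, 100) for _ in range(3))
fcc = false_color_composite(nir, red, green)  # shape (100, 100, 3)
</syntaxhighlight>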

Contrast

Contrast refers to the difference in relative brightness between an item and its surroundings as seen in the image. A particular feature is easily detected in an image when the contrast between the item and its background is high. When the contrast is low, however, an item might go undetected.
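One standard way to raise contrast (a common enhancement technique, not specific to any one satellite product) is a linear contrast stretch, which remaps the occupied DN range to the full display range. A minimal sketch, assuming an 8-bit image array:

<syntaxhighlight lang="python">
import numpy as np

def linear_stretch(image):
    """Remap the image's occupied DN range to the full 0-255 display
    range, raising the contrast between targets and their background."""
    lo, hi = image.min(), image.max()
    stretched = (image.astype(np.float64) - lo) / (hi - lo) * 255.0
    return stretched.astype(np.uint8)

# A low-contrast image whose DNs span only 100-140 uses the full
# range after stretching.
dull = np.random.randint(100, 141, size=(50, 50), dtype=np.uint8)
print(linear_stretch(dull).min(), linear_stretch(dull).max())  # 0 255
</syntaxhighlight>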

Resolution

Resolution is a property of an image that describes the level of detail that can be discerned from it. This is important, as images with higher resolution will have higher detail. There are several types of resolution that are important to remote sensing. One of these is spatial resolution, which is the smallest detail a sensor can detect. Since the smallest element in a satellite image is a pixel, spatial resolution describes the area on the Earth's surface represented by each pixel. For example, in a weather satellite image that has a resolution of 1 km, each pixel represents the brightness of an area that is 1 km by 1 km.

Other types of resolution include spectral resolution, the ability of a sensor to distinguish between fine wavelength intervals; radiometric resolution, the ability of a sensor to discriminate very small differences in energy; and temporal resolution, the time it takes for the same area to be viewed twice.

A key thing to keep in mind is that the resolution of a particular satellite sensor must be optimized for the intended use of the data. Weather satellites generally monitor weather patterns that cover hundreds of miles, so there is no need for resolution higher than 0.5 km. Landsat and other land-use satellites need to distinguish between much smaller items, so a higher resolution is required. The trade-off for higher resolution, however, is that the amount of data produced by the satellite is much greater, which increases transmission times and burdens the mission.
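The trade-off can be made concrete with a back-of-the-envelope calculation. The numbers below are illustrative (one band, 8 bits per pixel, a 185 km scene loosely based on a Landsat-style swath width):

<syntaxhighlight lang="python">
def scene_megabytes(scene_km, resolution_km, bits_per_pixel=8):
    """Uncompressed data volume for one band of a square scene:
    pixels per side = scene size / pixel size."""
    pixels = (scene_km / resolution_km) ** 2
    return pixels * bits_per_pixel / 8 / 1e6  # bits -> megabytes

# A 185 km scene at coarse vs. fine spatial resolution:
print(scene_megabytes(185, 1.0))   # ~0.03 MB at 1 km pixels
print(scene_megabytes(185, 0.03))  # ~38 MB at 30 m pixels - over 1000x more
</syntaxhighlight>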

Active Sensing vs. Passive Sensing

Active sensing occurs when the satellite produces radiation on its own, and then senses the backscatter. This is useful since it does not depend on outside radiation, but it uses up energy more quickly. Examples of active sensors are a laser fluorosensor and synthetic aperture radar (SAR). Passive sensing, on the other hand, senses naturally available radiation to create a picture. It does not need to use energy to produce radiation, but it is dependent on the outside radiation's existence. If there is little or no outside radiation, the satellite cannot function well.

The Electromagnetic Spectrum

Electromagnetic radiation (EMR) is the most common energy source for remote sensing. It consists of an electric field and a magnetic field, perpendicular to each other and to the direction of travel, propagating at the speed of light. This is important to remote sensing because sensors gather information about a target by detecting the EMR it reflects or emits.

Radiation is an important part of remote sensing: different materials respond to radiation in different ways, which can be used to identify objects. One example is scattering (or atmospheric scattering), in which particles in the atmosphere redirect radiation. There are three types: Rayleigh, Mie, and non-selective. Scattering can be used to identify the presence and quantity of certain gases in the atmosphere. Transmission, by contrast, occurs when radiation passes through a target without being absorbed or scattered.

There are several types of electromagnetic energy that can be emitted, depending on their wavelength. All of them are found in the electromagnetic spectrum (EMS), represented in the image below:

[Image: the electromagnetic spectrum]

It's important to know which types of energy are useful for what.

  • Gamma rays and X-rays cannot be used for remote sensing because they are absorbed by the Earth's atmosphere; in general, the shorter the wavelength (and the greater the frequency), the more absorption occurs.
  • Ultraviolet radiation is not useful either, because it is blocked by the ozone layer.
  • Visible light allows satellites to detect the colors a human eye would see. Some of these sensors are panchromatic, meaning they are sensitive to all wavelengths of visible light.
  • Infrared (IR) is divided into categories: near infrared, reflected infrared, and thermal infrared. Near infrared (NIR) is useful for studying vegetation, while thermal infrared (TIR) is sensed as emitted heat rather than reflected light.
  • Microwaves are used in radar (see the Examples of Instruments section).

Image Interpretation

Image interpretation and analysis is a huge part of the Remote Sensing event. It involves locating, identifying, or measuring objects in images acquired through remote sensing. This is not as straightforward as it may seem: plenty of features can throw you off in each image. However, some features are common to every image. There will always be a "target" to look for, which contrasts with the rest of the image, making it distinguishable.

According to the Canada Centre for Remote Sensing, whose tutorial you can find in the external links section, there are several things to look for to assist in image interpretation. These are tone, shape, size, pattern, texture, shadow, and association.

  • Tone is the brightness or color of an object. It's the main way to distinguish targets from backgrounds.
  • Shape refers to the form or outline of an object. A straight-edged shape is usually man-made, such as an agricultural or urban structure; irregular edges usually indicate a natural origin.
  • Size, relative or absolute, can be determined by finding common objects in images, such as trees or roads. (see Finding Area section, below)
  • Pattern refers to the arrangement of objects in an image, such as the spacing between buildings in an urban setting.
  • Texture is the arrangement of tone variation throughout the image.
  • Shadow can help determine size and distinguish objects.
  • Association refers to things that are associated with one another in photographs, which can assist interpretation, i.e. boats on a lake, etc.
[Image: an example of splitting an irregular shape into more workable shapes]

Finding Area

Another major part of image interpretation is determining the surface area of a particular area of interest. You will often be asked on a test to find the area of some piece of land, but this piece of land is usually not regularly shaped, like a rectangle. It'll have lots of different curves, and at first, it may seem difficult to find the exact area. However, an easy way to estimate area is to split up this irregular shape into smaller, easier shapes, like rectangles or circles. Then, you can add up the areas of the individual shapes to get the total area of the piece of land.

Before doing this, though, you need to take scale into consideration. Scale is the ratio of size on the image to real-life size. For example, if the scale on an image is 1 inch : 25 miles, each inch on the image represents 25 miles in real life. To find the area of one of your shapes, measure its dimensions with your ruler in inches (or centimeters, if the scale uses them) and multiply each measurement by the scale to find how long that dimension is in real life. Do this for all of your smaller, more regular shapes, then find their areas and add them together. Your answer should be approximately the area of the piece of land. It will not be exact, nor does it need to be: test graders should have a range of values that they will accept as correct.
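Here is a worked sketch of that procedure, assuming the 1 inch : 25 miles scale above and made-up measurements for two rectangles:

<syntaxhighlight lang="python">
# Scale: 1 inch on the image represents 25 miles on the ground.
SCALE_MILES_PER_INCH = 25

# Measured dimensions (in inches) of the simpler shapes the irregular
# region was split into - two rectangles in this hypothetical example.
rectangles = [(1.2, 0.8), (0.5, 0.4)]  # (width, height) in inches

total_sq_miles = 0.0
for width_in, height_in in rectangles:
    # Convert each dimension to real-world miles before multiplying.
    total_sq_miles += (width_in * SCALE_MILES_PER_INCH) * \
                      (height_in * SCALE_MILES_PER_INCH)

print(f"Estimated area: {total_sq_miles:.0f} square miles")  # 725
</syntaxhighlight>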

NDVI

[Image: an NDVI map of Europe]

During the competition, you may be asked to analyze a picture's NDVI values. NDVI stands for "Normalized Difference Vegetation Index" and is used to describe various land types, usually to determine whether or not the image contains vegetation. The equation provided by USGS for NDVI is as follows:

NDVI = (Channel 2 - Channel 1) / (Channel 2 + Channel 1)

Channel 1 is in the red part of the electromagnetic spectrum, a region in which chlorophyll absorbs much of the incoming sunlight. Channel 2 is in the near infrared part of the spectrum, where a plant's mesophyll leaf structure reflects strongly. You may also see the equation written like so:

[math]\displaystyle{ \mbox{NDVI}=\frac{(\mbox{NIR}-\mbox{VIS})}{(\mbox{NIR}+\mbox{VIS})} }[/math]

(where NIR is near infrared and VIS is visible (red) light)

So, healthy vegetation has a low red light reflectance (Channel 1) and a high infrared reflectance (Channel 2). This would produce a high NDVI value. As the amount of vegetation decreases, so too do the NDVI values. The range of NDVI values is -1 to +1.
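A minimal sketch of the calculation per pixel, using the equation above with stand-in reflectance values:

<syntaxhighlight lang="python">
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - RED) / (NIR + RED), per pixel; ranges from -1 to +1."""
    nir, red = nir.astype(np.float64), red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

# Stand-in reflectances: vegetation reflects far more NIR than red.
nir = np.array([[0.50, 0.30], [0.10, 0.05]])
red = np.array([[0.08, 0.10], [0.09, 0.06]])
print(ndvi(nir, red).round(2))
# [[ 0.72  0.5 ]   <- vegetation-like pixels
#  [ 0.05 -0.09]]  <- soil- and water-like pixels
</syntaxhighlight>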

Generally, areas rich in vegetation have higher positive values. Soil tends to produce NDVI values somewhat lower than vegetation: small positive amounts. Bodies of water, such as lakes or oceans, have even lower values, near zero or negative.

There are some factors that may affect NDVI values. Atmospheric conditions can have an effect on NDVI, as well as the water content of soil. Clouds sometimes produce NDVI values of their own, but if they aren't thick enough to do so, they may throw off measurements considerably.

EVI

EVI, or the Enhanced Vegetation Index, was created to improve on NDVI and eliminate some of its errors. It offers improved sensitivity in high-biomass regions and better removal of the canopy background signal. The equation for EVI is as follows: [math]\displaystyle{ EVI= G \times \frac{(NIR-RED)}{(NIR+C1 \times RED-C2 \times Blue+L)} }[/math]

Where NIR is again near infrared, and Red and Blue are those colors' bands; all three are at least partially atmospherically corrected surface reflectances. The L term adjusts for the canopy background, C1 and C2 correct for aerosol influences using the blue band, and G is a gain factor; the MODIS EVI product uses L = 1, C1 = 6, C2 = 7.5, and G = 2.5.
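A minimal sketch of the calculation with those MODIS coefficients as defaults (the reflectance values below are illustrative, not real data):

<syntaxhighlight lang="python">
def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """EVI = G * (NIR - RED) / (NIR + C1*RED - C2*BLUE + L), with the
    coefficient values used by the MODIS EVI product as defaults."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Illustrative surface reflectances for a well-vegetated pixel:
print(round(evi(nir=0.45, red=0.08, blue=0.04), 2))  # 0.57
</syntaxhighlight>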

EVI has been adopted as a standard product of the MODIS instrument aboard NASA's Terra and Aqua satellites. Because it factors out background noise, it is often preferred over NDVI.

Ecology aspect

Generally, Remote Sensing contains an ecological focus, which the satellite portion is meant to reflect. For the 2012 season, the ecology subject was Human Interaction on the Hydrosphere.

Human Interaction

Human Interaction with the Earth is a large part of Remote Sensing. It emphasizes how humans affect the ecosystem on scales detectable by remote sensing. This interaction will often be represented in tests as deforestation, ozone layer changes, changes in land use, retreat of glaciers, and loss of sea ice, among others.

The Keeling Curve is a graph derived from continuous measurements of carbon dioxide taken atop Mauna Loa since 1958. It shows the increasing level of carbon dioxide in the atmosphere as well as its annual cycles, and is credited as the first evidence to bring attention to the problem of greenhouse gases. Charles Keeling began the research, and his son, Ralph Keeling, took over after his death in 2005.

Global Warming

When human impact on the environment is mentioned, one of the main ideas it entails is global warming. Global warming is defined as "the increase in the average temperature of Earth's near-surface air and oceans since the mid-20th century and its projected continuation". The causes of global warming are debated, but the consensus is that the primary cause is the increase in concentrations of greenhouse gases. Possible results of global warming include a rise in sea levels, changes in weather patterns, the retreat of glaciers and sea ice, species extinctions, and an increased frequency of extreme weather.

The greenhouse effect is caused by certain greenhouse gases that trap heat in the Earth's atmosphere. The main gases, along with their percent contributions to the greenhouse effect, are water vapor (36-70%), carbon dioxide (9-26%), methane (4-9%), and ozone (3-7%). Of these, carbon dioxide is perhaps the gas most scrutinized as a human impact on the environment; thus, it is the most likely to appear on tests. Humans have increased the amounts of these and other greenhouse gases in the atmosphere during periods of industrialization such as the Industrial Revolution. CFCs and nitrous oxide are among the greenhouse gases now present in the atmosphere in quantities that were not there before.

Carbon Cycle

The carbon cycle is the process through which carbon atoms are cycled through the environment. It cycles through the atmosphere as carbon dioxide, and some carbon is dissolved into the hydrosphere. It is also taken in by plants during photosynthesis and released when the plants die. When animals feed on plants, they also take in carbon.

However, the burning of fossil fuels, which come from biomatter, releases excess carbon into the atmosphere, increasing the concentration of carbon dioxide. Carbon can be stored for long periods of time in the trees and soil of forest biomes, so altering this balance affects the carbon cycle and can contribute to global warming and climate change. The resulting warming can then affect plant growth, since slight changes in temperature or other abiotic factors can kill off certain species of plants. With fewer plants remaining alive, more carbon dioxide stays in the atmosphere rather than being taken in.

Hydrological Cycle

The hydrologic cycle, more commonly known as the water cycle, describes the cycle through which water travels. At its base is the familiar cycle of evaporation, condensation, and precipitation. Among the smaller parts of the water cycle, water is stored as ice and snow in cold climates. Water also enters the ground through infiltration, although some simply flows over it as surface runoff. Groundwater flow then carries infiltrated water to the oceans, where it reenters the main cycle.

Finally, some evaporation occurs as evapotranspiration from plants. Fewer plants would mean less carbon taken in, and thus more carbon dioxide remaining in the atmosphere to contribute to the greenhouse effect.

Albedo

Albedo is a very important quantity in using remote sensing to detect climate change. Albedo is the fraction of incoming sunlight that a surface reflects. It is useful for determining Earth's energy balance and how much energy and heat the ground absorbs. Because ice and snow are very reflective and thus have high albedos, their presence is important: it keeps the Earth from getting too warm. As albedo decreases, the Earth absorbs more energy and warms up.
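As a simple sketch of the idea (the flux values below are illustrative, not measurements), albedo is just reflected shortwave flux divided by incoming flux:

<syntaxhighlight lang="python">
def albedo(reflected_flux, incoming_flux):
    """Albedo = reflected / incoming shortwave flux, between 0 and 1."""
    return reflected_flux / incoming_flux

# Illustrative fluxes in W/m^2: fresh snow reflects most incoming sunlight,
# open ocean very little, so losing ice lowers albedo and raises absorption.
incoming = 340.0
print(round(albedo(290.0, incoming), 2))  # ~0.85, snow-like surface
print(round(albedo(20.0, incoming), 2))   # ~0.06, ocean-like surface
</syntaxhighlight>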

Resources

See Also

Practice Remote Sensing Test
