Remote Sensing
From Wiki - Scioly.org
 
{{EventLinksBox
|active=
|type=Earth Science
|cat=Study
|2009thread=[http://www.scioly.org/phpBB3/viewtopic.php?f=17&t=88 2009]
|2009tests=[http://scioly.org/wiki/2009_Test_Exchange#Remote_Sensing 2009]
|2010thread=[http://www.scioly.org/phpBB3/viewtopic.php?f=67&t=1390 2010]
|2010tests=[http://scioly.org/wiki/2010_Test_Exchange#Remote_Sensing 2010]
|2011thread=[http://scioly.org/phpBB3/viewtopic.php?f=93&t=2204 2011]
|2011tests=[http://scioly.org/wiki/2011_Test_Exchange#Remote_Sensing 2011]
|2012thread=[http://scioly.org/phpBB3/viewtopic.php?f=121&t=2968 2012]
|2013thread=[http://scioly.org/phpBB3/viewtopic.php?f=144&t=3707 2013]
|2017thread=[http://scioly.org/phpBB3/viewtopic.php?f=227&t=9299 2017]
|2017tests=2017
|2017questions=[http://scioly.org/phpBB3/viewtopic.php?f=228&t=9653 2017]
|2018thread=[https://scioly.org/forums/viewtopic.php?f=265&t=10875 2018]
|2018tests=2018
|2018questions=[https://scioly.org/forums/viewtopic.php?f=266&t=10961 2018]
|testsArchive=true
|C Champion=[[Seven Lakes High School]]
}}
In '''Remote Sensing''', a [[Division C]] event, teams use remote sensing imagery, such as photographic and spectroscopic information, to analyze data and/or make climate models.

Each team may bring four 8.5" x 11" double-sided [[Note Sheet]]s, as well as a metric ruler, a protractor, and any kind of non-graphing [[Calculators|calculator]].

Remote Sensing is in rotation for the [[2017]] and [[2018]] seasons, and was previously an event for roughly ten years, through [[2013]].

Tests tend to comprise a mix of image interpretation and questions on remote sensing concepts and climate change processes (the carbon cycle, aerosols, ozone depletion, etc.). Some ecology/biology background is useful, as is knowledge of meteorology and basic physics. Familiarity with individual space programs and NASA satellites, in addition to the types of sensors used, is recommended.

'''''Please note''''' that acronyms are used often in this event. Several times on this page, an acronym is listed alongside a key term; they are included because acronyms can themselves be the subject of questions (e.g., what does RADAR stand for?). There are many of them, but do not let them confuse you: they are shorthand for the ideas they represent, not independent entities.
== Topics ==
Remote Sensing rotates between topics occasionally. In addition, the specific aspects of remote sensing, including the satellites and types of data focused on, often change between years. '''The topic for the 2018 season is the same as it was in 2017, Climate Change Processes.'''

{|class="wikitable"
|-
! Season
! Topic
|-
! [[2018]]
| Climate Change Processes
|-
! [[2017]]
| Climate Change Processes
|-
! [[2013]]
| Earth's Hydrosphere
|-
! [[2012]]
| Earth's Hydrosphere
|-
! [[2011]]
| Human Impact on Earth
|-
! [[2010]]
| Human Interactions with Forest Biomes
|-
! [[2009]]
| Human Land Use Patterns?
|-
! [[2008]]
| Mars
|-
! [[2007]]
| Mars
|-
! [[2006]]
|
|-
! [[2005]]
|
|-
! [[2004]]
|
|-
! [[2003]]
|
|-
! [[2002]]
|
|}
  
== Satellites ==
=== History ===
While not an integral part of the event, students are expected to know the basic history behind remote sensing.

==== The Beginning ====
Remote sensing began in the 1860s, when Gaspard-Felix "Nadar" Tournachon, a Frenchman, took aerial photographs from a balloon. In the 1880s, cameras were mounted on kites and took pictures via a remote control mechanism. When heavier-than-air flight was invented, it was only logical to take pictures from airplanes as well. During WWI, cameras mounted on planes or held by aviators were used in military reconnaissance.
[[File:Sputnik.gif|300px|thumb|left|Sputnik 1, the first artificial satellite]]

==== Remote Sensing Advances ====
As early as 1904, rockets were used to launch cameras to heights of 600 meters. But until the 1960s, aerial photographs from airplanes were the only way to take pictures of Earth's landscape. It took the space race between the US and the USSR to start remote sensing from above the atmosphere.

The first satellite was Sputnik 1, launched by the USSR on 4 October 1957. Sputnik helped to identify the density of high atmospheric layers and provided data on radio signals in the ionosphere. In early 1955 the US was working on Project Orbiter, which used a Jupiter-C rocket to launch a satellite. The project, led by legendary rocket scientist Wernher von Braun, succeeded, and Explorer 1 became the US' first satellite on January 31, 1958.
=== Examples of Instruments ===
Instruments are instrumental (no pun intended) to the function of satellites and remote sensing. Know what types of instruments are used for certain applications.

:'''RADAR''': short for Radio Detection and Ranging. The name comes from WWII, when radio waves were actually used in radar. The waves are scattered and reflected when they come into contact with something. Modern-day radars actually use microwaves, which can pass through water droplets and are generally used with active remote sensing systems. Radar is good for locating objects and measuring elevation.
:'''LIDAR''': short for Light Detection and Ranging. It is similar to RADAR but uses laser pulses instead of radio waves.
:'''TM''': stands for Thematic Mapper. It was introduced in the Landsat program and involves seven image data bands that scan along a ground track.
:'''ETM+''': stands for Enhanced Thematic Mapper Plus. It replaced the TM sensor on Landsat 7. Unlike TM, it has eight bands.
:'''MSS''': stands for Multispectral Scanner. It was also introduced in the Landsat program, and each band responds to a different type of radiation, hence the name "multispectral".
:'''MODIS''': stands for Moderate-resolution Imaging Spectroradiometer. It is on the Terra and Aqua satellites and measures cloud properties and radiative energy flux.
:'''CERES''': stands for Clouds and the Earth's Radiant Energy System. It is on the Terra and Aqua satellites and measures broadband radiative energy flux.
:'''SeaWiFS (Sea-viewing Wide Field-of-view Sensor)''': has eight spectral bands of very narrow wavelength ranges. It monitors ocean primary production and phytoplankton processes, ocean influences on climate processes (heat storage and aerosol formation), and the cycles of carbon, sulfur, and nitrogen.
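Both ranging instruments above work on the same timing principle: a pulse travels to the target and back, so the distance is the speed of light times half the round-trip delay. A minimal Python sketch (the 10 microsecond delay is an arbitrary example, not from any real sensor):

```python
# Sketch: how a ranging instrument (RADAR or LIDAR) converts the echo delay
# of a pulse into a distance. The division by 2 accounts for the round trip.
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(delay_s: float) -> float:
    """Distance to a target given the round-trip echo delay in seconds."""
    return C * delay_s / 2.0

# A pulse that returns after 10 microseconds came from roughly 1.5 km away.
print(round(range_from_delay(10e-6)))  # ~1499 m
```
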
'''Other Instruments'''
* '''ALI''': Advanced Land Imager
* '''ASTER''': Advanced Spaceborne Thermal Emission and Reflection Radiometer
* '''ATSR''': Along Track Scanning Radiometer
* '''AVHRR''': Advanced Very High Resolution Radiometer ''(used with NOAA)''
* '''AVIRIS''': Airborne Visual/Infrared Imaging Spectrometer
* '''CCD''': Charge Coupled Device
* '''CZCS''': Coastal Zone Color Scanner
* '''GPS''': Global Positioning System
* '''HRV''': High Resolution Visible sensor
* '''LISS-III''': Linear Imaging Self-Scanning Sensor
* '''MESSR''': Multispectral Electronic Self-Scanning Radiometer
* '''MISR''': Multi-angle Imaging SpectroRadiometer
* '''MSR''': Microwave Scanning Radiometer
* '''RAR''': Real Aperture Radar
* '''VTIR''': Visible and Thermal Infrared Radiometer
* '''WiFS''': Wide Field Sensor

This is in no way a comprehensive list of the instruments tested on in Remote Sensing. Participants are encouraged to at least demonstrate basic knowledge of how these instruments work.

For those who are interested, a large list of acronyms used by NASA can be found [https://earthdata.nasa.gov/user-resources/acronym-list here].

==== Active Sensing vs. Passive Sensing ====
'''Active sensing''' occurs when the satellite produces radiation on its own and then senses the backscatter. This is useful because it does not depend on outside radiation, but it uses up energy more quickly. Examples of active sensors are the laser fluorosensor and synthetic aperture radar (SAR). These instruments operate even at night, because they do not rely on reflected radiation (which is usually solar in origin). '''Passive sensing''', on the other hand, senses naturally available radiation to create a picture. It does not need to use energy to produce radiation, but it is dependent on that outside radiation's existence; if there is little or no outside radiation, the satellite cannot function well. One exception is thermal infrared (TIR) sensors, which actually obtain better information at night because of the natural emission of thermal energy.

=== Examples of Satellites ===
There are countless satellites currently orbiting Earth, but tests will mostly focus on satellite programs directed by NASA (National Aeronautics and Space Administration). Some other agencies of note include '''NOAA''' (National Oceanic and Atmospheric Administration), the '''CSA''' (Canadian Space Agency), '''JAXA''' (Japan Aerospace Exploration Agency), the '''ESA''' (European Space Agency), and '''IRS''' (Indian Remote Sensing).
==== EOS ====
Satellites especially likely to appear on tests are those that come from the Earth Observing System (EOS). The EOS is a series of NASA satellites designed to observe the Earth's land, atmosphere, biosphere, and hydrosphere. The first EOS satellite was launched in 1997.

'''A-Train''', or '''EOS-PM''' (also known as the '''Afternoon Constellation''' because of its south-to-north equatorial crossing time of 1:30 PM): an EOS satellite constellation planned to include seven satellites working together in Sun-synchronous (SS) orbit. Five satellites currently fly in the constellation, and their combined observations have produced high-resolution images of the Earth's surface.
[[File:Earth_Observing_Sys_2017.jpeg|450px|thumb|right|The NASA Earth Observatories]]

* Active:
** '''OCO-2''': studies global atmospheric carbon dioxide. It is a replacement for the failed OCO.
** '''GCOM-W1 "SHIZUKU"''': studies the water cycle.
** '''Aqua''': studies the water cycle, such as precipitation and evaporation.
** '''CALIPSO''': studies clouds and air particles and their effect on Earth's climate.
** '''Aura''': studies Earth's ozone layer, air quality, and climate.
* Past:
** '''PARASOL''': studied radiative and microphysical properties of clouds and air particles; moved to a lower orbit and deactivated in 2013.
** '''CloudSat''': studies the altitude and other properties of clouds; moved to a lower orbit following a partial mechanical failure in February 2018.
* Failed to achieve orbit:
** '''OCO (Orbiting Carbon Observatory)''': was intended to study atmospheric carbon dioxide.
** '''Glory''': was to study radiative and microphysical properties of air particles.
Both losses were caused by launch vehicle failures.

'''Morning Constellation''', or '''EOS-AM''', is a second constellation in the EOS. It is so named because of its 10:30 AM north-to-south equatorial crossing. Currently there are three satellites in this constellation; the fourth, EO-1, was decommissioned in March 2017.

:'''Landsat''': a series of 8 satellites using multiple spectral bands. Only two are operational today: Landsat 7 and Landsat 8 (launched in February 2013). These are some of the most commonly tested satellites. The name Landsat is a blend of the words "land" and "satellite".
:'''Terra''': provides global data on the atmosphere, land, and water. Its scientific focus includes atmospheric composition, biogeochemistry, climate change and variability, and the water and energy cycle.

Remember that the aforementioned satellites hardly comprise an exhaustive list - the EOS is a very large collection of satellites. It is almost impossible to know all of them thoroughly, so participants should familiarize themselves with the most important ones.

==== Other Satellites ====
There are other notable satellites that may appear on exams and are not affiliated with the EOS.

* '''GOES (Geostationary Operational Environmental Satellite) System''': a system of weather satellites in geostationary orbit at roughly 36,000 km. It is partially organized by NASA, in cooperation with NOAA.
* '''MOS''': Marine Observation Satellite
* '''SeaSat''': SEA SATellite. This satellite is especially significant as the first satellite focused on the oceans, as well as the first satellite to carry synthetic aperture radar (SAR).
* '''SPOT''': Système Pour l'Observation de la Terre. This is a series of 7 CNES satellites similar to the Landsat program, with a more commercial focus.

== Satellite Imaging ==
The formal definition of ''remote sensing'' is the science of acquiring data about an object without being in physical contact with it. For this reason, a major part of this event involves processing images and analyzing them to reach a conclusion.

=== Image Processing ===
Satellite data is sent from the satellite to the ground in a raw digital format. The smallest unit of data is represented by a binary number. This data is strung together into a digital stream and applied to a single dot, or pixel (short for "picture element"), which gets a value known as a Digital Number (DN). Each DN corresponds to a particular shade of gray that was detected by the satellite. These pixels, when arranged together in the correct order, form an image of the target in which the varying shades of gray represent the varying energy levels detected on the target. The pixels are arranged into a matrix. This data is then either stored in the remote sensing platform temporarily or transmitted to the ground, where it can be manipulated mathematically for various reasons, discussed below.
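The stream-of-DNs idea can be sketched in a few lines of Python. The byte values, the 8-bit sensor depth, and the scan-line width below are all hypothetical, chosen only to show the slicing:

```python
# Sketch: turning a raw downlinked byte stream into a matrix of Digital
# Numbers (DNs). An 8-bit sensor and a known scan-line width are assumed.
raw_stream = bytes([0, 64, 128, 192, 255, 32, 96, 160, 224, 16, 80, 144])
width = 4  # pixels per scan line

# Slice the stream into rows of `width` pixels each.
dn_matrix = [list(raw_stream[i:i + width])
             for i in range(0, len(raw_stream), width)]

print(len(dn_matrix))  # 3 scan lines
print(dn_matrix[0])    # first line of DNs: [0, 64, 128, 192]
```
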
The human eye can only distinguish between about 16 shades of gray in an image, but it is able to distinguish between millions of colors. Thus, a common image enhancement technique is to assign specific DN values to specific colors, increasing the contrast. A ''true color'' image is one for which the colors have been assigned to DN values that represent the actual spectral range of the colors used in the image. A photograph is an example of a true color image. ''False color'' (FC) is a technique by which colors are assigned to spectral bands that do not equate to the spectral range of the selected color. This allows an analyst to highlight particular features of interest using a color scheme that makes the features stand out.

Additionally, remote sensing instruments often operate continuously for very long times, which means that they are prone to instrument error and malfunction. Different processing techniques can remedy this.

==== Composites ====
'''Composites''' are images in which multiple individual satellite images have been combined to produce a new image. This process is used to create more detailed images that take multiple factors into account, as well as to find patterns that would not be revealed in a single image. It also helps to create larger images than the satellite itself can make: each satellite covers a specific ''swath'', an area imaged with a fixed width, and when these swaths are put together into a composite, a larger area is imaged.

Traditionally, composites were made by merging colors. Three black and white transparencies of the same image are first made, each representing a different spectral band - blue, green, and red. White light is shone through each one onto a screen, and each band is projected through a filter of the same color - the blue band through a blue filter, green through a green, red through a red. Because the blue features are clear on the blue spectral band image, they appear blue on the composite. When the three images are aligned, the resulting image has the natural color (or very close) of the original. This process is called "color additive viewing", and red, green, and blue are often known as ''additive colors'' because they add together to create new colors. The earliest color films would record multispectral scenes on multicolored films, which were then developed and merged into one colored film image.

In satellite data, this crude method is replaced by assigning color gradient values to DNs. When the three colorized images are merged, a true color image is formed.
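Done digitally, this amounts to placing each band's DN matrix into the matching color channel of the output image. A minimal Python sketch with made-up 2x2 band values:

```python
# Sketch of digital "color additive viewing": three single-band DN matrices
# (red, green, blue) are merged so each pixel becomes one (R, G, B) triple.
# The band values here are invented for illustration.
red   = [[ 15, 220], [ 35, 180]]
green = [[ 20, 210], [130,  50]]
blue  = [[ 10, 200], [ 30,  40]]

def merge_bands(r, g, b):
    """Stack three gray-scale bands into one true-color pixel matrix."""
    return [[(r[i][j], g[i][j], b[i][j]) for j in range(len(r[0]))]
            for i in range(len(r))]

true_color = merge_bands(red, green, blue)
print(true_color[0][1])  # brightest pixel: (220, 210, 200)
```
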
Not all composites are true colored. A ''False Color Composite (FCC)'' results when one band is assigned color gradient values for another color, such as if the blue band DNs were set to correspond to shades of red. One extremely popular FCC combination assigns red colors to the NIR image. Since healthy vegetation reflects strongly in the NIR region of the EM spectrum, an FCC using this combination displays areas with healthy vegetation as red. This also differentiates natural features from artificial ones: a football field made up of healthy grass has a strong red color, but a football field composed of Astroturf or other artificial substances shows up as a duller red, or even dark brown.
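A sketch of that NIR-to-red assignment in Python, with made-up DNs for one healthy-grass pixel and one artificial-turf pixel (the values are purely illustrative):

```python
# Sketch of the popular NIR false-color combination: near-infrared is
# displayed as red, red as green, and green as blue, so strong NIR
# reflectors such as healthy vegetation appear bright red.
nir   = [240,  60]  # grass reflects much NIR; turf does not
red   = [ 40,  70]
green = [ 60,  65]

false_color = list(zip(nir, red, green))  # displayed as (R, G, B)
print(false_color[0])  # (240, 40, 60) -> strong red: healthy grass
print(false_color[1])  # (60, 70, 65)  -> dull tone: artificial turf
```
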
  
 
Common composites:
* True-color composite - useful for interpreting man-made objects. Simply assign the red, green, and blue bands to the respective colors for the image.
* Blue-near IR-mid IR, where the blue channel uses visible blue, green uses near-infrared (so vegetation stays green), and mid-infrared is shown as red. Such images allow seeing water depth, vegetation coverage, soil moisture content, and the presence of fires, all in a single image.
** Near IR is usually assigned to red on the image; thus, vegetation often appears bright red in false color images, rather than green, because healthy vegetation reflects a lot of near IR radiation. This can also be used in identifying urban/artificial areas.
==== Contrast ====
'''Contrast''' refers to the difference in relative brightness between an item and its surroundings as seen in the image. A particular feature is easily detected in an image when the contrast between the item and its background is high. However, when the contrast is low, an item might go undetected in an image.

==== Resolution ====
'''Resolution''' is a property of an image that describes the level of detail that can be discerned from it; images with higher resolution show finer detail. There are several types of resolution that are important to remote sensing. One of these is ''spatial resolution'', the smallest detail a sensor can detect. Since the smallest element in a satellite image is a pixel, spatial resolution describes the area on the Earth's surface represented by each pixel. For example, in a weather satellite image that has a resolution of 1 km, each pixel represents the brightness of an area that is 1 km by 1 km.

Other types of resolution include ''spectral resolution'', the ability of a sensor to distinguish between fine wavelength intervals; ''radiometric resolution'', the ability of a sensor to discriminate very small differences in energy; and ''temporal resolution'', the time between which the same area is viewed twice.

A key thing to keep in mind is that the resolution of a particular satellite sensor must be optimized for the intended use of the data. Weather satellites generally monitor weather patterns that cover hundreds of miles, so there is no need for resolution finer than 0.5 km. Landsat and other land-use satellites need to distinguish between much smaller items, so a higher resolution is required. The trade-off for higher resolution, however, is that the amount of data produced by the satellite is much greater, which increases transmission times and burdens the mission. In addition, smaller areas contain less radiometric output, so spectral resolution generally decreases as spatial resolution increases.
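The trade-off can be made concrete with a little arithmetic. The sketch below uses a hypothetical 185 km by 185 km scene (loosely Landsat-sized) and counts the pixels needed at two spatial resolutions:

```python
# Sketch: finer spatial resolution (smaller pixels) means many more pixels,
# and therefore much more data, for the same ground area.
def pixels_for_swath(swath_km: float, length_km: float, res_km: float) -> int:
    """Pixel count needed to image a swath at a given spatial resolution."""
    return int(swath_km / res_km) * int(length_km / res_km)

coarse = pixels_for_swath(185, 185, 0.5)   # 0.5 km pixels (weather-like)
fine   = pixels_for_swath(185, 185, 0.03)  # 30 m pixels (Landsat-like)
print(coarse)          # 136900 pixels
print(fine)            # 38019556 pixels
print(fine // coarse)  # the fine image carries ~277x as much data
```
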
==== Image Enhancement ====
Images often need to be enhanced to facilitate easier interpretation. These enhancements include, but are not limited to, contrast enhancement, haze/noise corrections, instrument corrections, and smoothing operations.
* Contrast enhancement: usually, the two extreme DN values are resampled to the minimum value, 0, and the maximum value that the sensor supports. All values in the middle are recalculated either by a line of best fit (linear contrast stretching) or by the frequency of the values in the image (histogram stretching).
* Haze corrections reflect the fact that haze usually causes a uniform increase across all DNs in an area. The sensor is calibrated over a source with a known radiative output (such as a body of water, which would have a value approaching 0) and all DNs in the area are resampled by subtracting that value.
* Noise corrections involve the use of ''kernels'', matrices of DNs surrounding a central DN. A weighted average of the kernel is computed, and a threshold for the difference between the central DN and this average is set. If the difference exceeds the threshold, the central DN is reassigned the average value; if the difference is less than the threshold, the DN is kept.
* Instrument corrections generally account for malfunctioning sensors on the CCD. Corrections are similar to those used in haze corrections.
* Smoothing operations simply create smoother images with less detail, which serves a variety of purposes.
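Two of these corrections are easy to sketch in Python. The DN values and threshold below are invented for illustration; real processing chains are considerably more sophisticated:

```python
# Sketch of two enhancements on made-up 8-bit DNs:
# 1) linear contrast stretching: resample DNs so extremes map to 0 and 255;
# 2) kernel-based noise correction: replace a central DN with the average
#    of its 8 neighbors when it deviates by more than a threshold.

def linear_stretch(dn_rows):
    """Linearly rescale DNs so min -> 0 and max -> 255."""
    flat = [v for row in dn_rows for v in row]
    lo, hi = min(flat), max(flat)
    return [[int((v - lo) * 255 / (hi - lo)) for v in row] for row in dn_rows]

def despike(dn_rows, threshold):
    """Replace interior DNs that differ from their neighbor average by
    more than `threshold` with that average."""
    out = [row[:] for row in dn_rows]
    for i in range(1, len(dn_rows) - 1):
        for j in range(1, len(dn_rows[0]) - 1):
            neighbors = [dn_rows[i + di][j + dj]
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di, dj) != (0, 0)]
            avg = sum(neighbors) / 8
            if abs(dn_rows[i][j] - avg) > threshold:
                out[i][j] = avg
    return out

noisy = [[60, 62, 61], [63, 250, 60], [62, 61, 63]]
print(despike(noisy, threshold=50)[1][1])      # spike replaced by 61.5
print(linear_stretch([[60, 160], [110, 60]]))  # [[0, 255], [127, 0]]
```
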
=== The Electromagnetic Spectrum ===
'''Electromagnetic radiation''' (EMR) is the most common energy source for remote sensing. It consists of an electric and a magnetic field, perpendicular to each other and to the direction of travel, moving at the speed of light. This is important to remote sensing because it is how sensors detect certain data about the objects a satellite is studying.

Radiation is an important part of remote sensing because different materials respond to radiation in different ways, which can be used to identify objects. One example of this is ''scattering'' (or atmospheric scattering), where particles in the atmosphere redirect radiation. There are three types: Rayleigh, Mie, and non-selective. Scattering is used to identify the presence and quantity of certain gases in the atmosphere. ''Transmission'' occurs when radiation passes through a target, indicating the target is unaffected by that particular wavelength.

There are several types of electromagnetic energy, distinguished by their wavelength. All of them are found on the electromagnetic spectrum (EMS).
[[File:em_spect.jpeg|450px|thumb|left|The electromagnetic spectrum]]

It's important to know which types of energy are useful for what:
:''Gamma rays'' and ''x-rays'' cannot be used for remote sensing because they are absorbed by the Earth's atmosphere: in general, the shorter the wavelength (and the greater the frequency), the more absorption occurs.
:''Ultraviolet radiation'' is not useful either because it is blocked by the ozone layer.
:''Visible light'' allows satellites to detect the colors a human eye would see. Some of these satellites are ''panchromatic'', meaning they are sensitive to all wavelengths of visible light.
:''Near Infrared (NIR)'', the region just past the visible portion of the spectrum, is useful for monitoring vegetation, as healthy vegetation reflects much NIR.
:''Short Wave Infrared (SWIR)'', the region just beyond NIR, is useful for determining the spectral signature of objects. A spectral signature is the telltale reflectance of radiation by a material across the spectrum; each object has a unique spectral signature.
:''Thermal Infrared (TIR)'' is the kind of IR we perceive as "heat". The Earth naturally emits TIR, so TIR remote sensing usually involves passively detecting radiation in this region of the spectrum. This is useful for determining the temperatures of objects.
:''Microwaves'' are used in radar. Different radars utilize different wavelengths, which range from relatively shortwave regions, such as the W-band, to longwave regions such as the L-band. Radars are very good at penetrating foliage and, in the case of longwave radars, the ground, to a certain extent. Microwaves also reveal a lot about the properties of a surface, such as its dielectric constant, moisture, etc.

Since the atmosphere's components selectively absorb certain wavelengths, only certain regions of the spectrum can actually be used in remote sensing. These regions are known as ''atmospheric windows''.
+
==== Blackbody Radiation ====
 +
In physics, a ''blackbody'' is an ideal object which absorbs and re-emits 100% of all incident radiation. The spectral signature for a blackbody is modeled by a blackbody curve, determined by the ''Planck Function''. The blackbody curve is dependent on temperature. In practice, blackbodies do not exist; instead, most objects are ''graybodies'', which emit a certain percentage of the radiation absorbed. This percentage is known as ''emissivity''.
  
Visible light allows satellites to detect colors a human eye would see.
+
[[File:Blackbodycurves.png|300px|thumb|right|Select blackbody curves]]
Integrating the Planck function over all wavelengths gives the total ''radiant exitance'', or power emitted per unit area, of an object. Radiant exitance is specified by the ''Stefan-Boltzmann Law''. For more information on this law, see [[Reach for the Stars#Stefan-Boltzmann's Law]] and [[Climate Notes#Radiation equations]].

The ''dominant wavelength'' of an object is the wavelength that largely determines the object's appearance. The Sun appears yellow because its dominant wavelength lies in the yellow portion of the visible region of the EM spectrum. The dominant wavelength of an object can be determined by ''Wien's Displacement Law''.
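Both laws reduce to one-line formulas. A minimal sketch with rounded constants follows; the Sun and Earth temperatures are illustrative round numbers:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W * m^-2 * K^-4
WIEN_B = 2.898e-3  # Wien's displacement constant, m * K

def radiant_exitance(temp_k, emissivity=1.0):
    """Stefan-Boltzmann law: total power emitted per unit area (W/m^2)."""
    return emissivity * SIGMA * temp_k ** 4

def dominant_wavelength_m(temp_k):
    """Wien's displacement law: wavelength of peak emission, in metres."""
    return WIEN_B / temp_k

# The Sun (~5778 K) peaks in visible light; Earth (~288 K) peaks in thermal IR.
print(round(dominant_wavelength_m(5778) * 1e9))    # → 502 (nm, green-yellow)
print(round(dominant_wavelength_m(288) * 1e6, 1))  # → 10.1 (um, thermal IR)
```

Note how the Earth's peak near 10 µm lands squarely in the TIR region discussed above, which is why thermal sensors operate there.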
==== Albedo ====
''Albedo'' is, simply put, the percentage of incident radiation that is reflected off of an object. Albedo and emissivity are important concepts to understand and differentiate: albedo concerns incident light that is reflected from a surface, whereas emissivity concerns blackbody radiation that is emitted by an object. In real-life applications, white objects, which reflect most incident wavelengths, have high albedos. This reduces the amount of radiation absorbed, which in turn reduces the amount of radiation re-emitted.
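A standard back-of-the-envelope use of albedo (a sketch with rounded constants, not a calculation taken from the event rules) is the planetary equilibrium temperature, which balances absorbed sunlight against Stefan-Boltzmann emission:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W * m^-2 * K^-4

def equilibrium_temp_k(solar_constant, albedo):
    """Balance absorbed solar power with emitted blackbody power.
    A sphere intercepts sunlight over pi*r^2 but radiates over 4*pi*r^2,
    hence the factor of 4; (1 - albedo) is the absorbed fraction."""
    return (solar_constant * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

# Earth: S ~ 1361 W/m^2, albedo ~ 0.30 -> about 255 K. The ~33 K gap to the
# observed ~288 K mean surface temperature is due to the greenhouse effect.
print(round(equilibrium_temp_k(1361, 0.30)))  # → 255
```

Raising the albedo in this sketch lowers the equilibrium temperature, which is exactly the cooling mechanism behind global dimming discussed later on this page.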
=== Image Interpretation ===
''For the basics of image interpretation, see [[Road Scholar#Satellite Images]].''
Image interpretation and analysis is a huge part of the Remote Sensing event. It involves locating, identifying, or measuring certain objects in remotely sensed images. This isn't as straightforward as it may seem: plenty of features in each image can throw you off. Some features are common to every image, however. There will always be a "target" to look for, which will always contrast with other parts of the image, making it distinguishable.
 
According to the Canada Centre for Remote Sensing, whose tutorial is listed in the links section below, there are several elements to look for to assist in image interpretation: tone, shape, size, pattern, texture, shadow, and association.
* '''Tone''' is the brightness or color of an object. It is the main way to distinguish targets from backgrounds.
* '''Shape''' refers to the general form or outline of an object. A straight-edged shape is usually man-made, such as an agricultural or urban structure; irregular-edged shapes are usually natural.
* '''Size''', relative or absolute, can be determined by finding common objects in images, such as trees or roads (see the Finding Area section below).
* '''Pattern''' refers to the arrangement of objects in an image, such as the spacing between buildings in an urban setting.
* '''Texture''' is the arrangement of tone variation throughout the image.
* '''Shadow''' can help determine size and distinguish objects.
* '''Association''' refers to things that tend to appear together in photographs, which can assist interpretation, e.g. boats on a lake.
[[File:Irregular shape.jpg|300px|thumb|right|An example of splitting an irregular shape into more workable shapes]]
==== Finding Area ====
Another major part of image interpretation is determining the surface area of a particular area of interest. You will often be asked on a test to find the area of some piece of land, but this piece of land is usually not regularly shaped, like a rectangle. It'll have lots of different curves, and at first, it may seem difficult to find the exact area. However, an easy way to estimate area is to split up this irregular shape into smaller, easier shapes, like rectangles or circles. Then, you can add up the areas of the individual shapes to get the total area of the piece of land.
Before doing this, though, you need to take scale into consideration. '''Scale''' is the ratio of size on the image to real-life size. For example, if the scale on an image is 1 inch : 25 miles, each inch on the image represents 25 miles in real life. To find the area of one of your shapes, measure its dimensions with your ruler in inches (or centimeters, if the scale uses them) and multiply each measurement by the scale factor to get the real-life dimensions. Do this for all of your smaller, regular shapes, then find their areas and add them together. Your answer should be approximately the area of the piece of land. It will not be exact, nor does it need to be: test graders should accept a range of values as correct.
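The procedure above can be sketched in a few lines; the scale and shape measurements here are invented for illustration:

```python
import math

def to_real(length_on_image, scale_factor):
    """Convert an image measurement to a real-world length.
    scale_factor = real-world units per image unit (e.g. 25 miles per inch)."""
    return length_on_image * scale_factor

# Scale 1 inch : 25 miles. An irregular lake split into a rectangle
# (2.0 x 1.5 inches on the image) plus a half-circle (radius 0.5 inches).
# Convert each linear dimension first, THEN compute the areas.
rect_area = to_real(2.0, 25) * to_real(1.5, 25)
half_circle = 0.5 * math.pi * to_real(0.5, 25) ** 2
total_sq_miles = rect_area + half_circle
print(round(total_sq_miles))  # → 2120
```

Converting lengths before computing areas matters: area scales with the ''square'' of the scale factor, so multiplying an image-measured area by the scale only once is a common mistake.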
==== NDVI ====
[[File:NDVI.gif|300px|thumb|left|A NDVI map of Europe]]
  
During the competition, you may be asked to analyze a picture's NDVI values. NDVI stands for "Normalized Difference Vegetation Index" and is used to describe various land types, usually to determine whether or not the image contains vegetation. The equation provided by USGS for NDVI is as follows:
  
<div align="center">''NDVI = (Channel 2 - Channel 1) / (Channel 2 + Channel 1)''</div>
  
 
Channel 1 is in the red light part of the electromagnetic spectrum. In this region, the chlorophyll absorbs much of the incoming sunlight. Channel 2 is in the Near Infrared part of the spectrum, where the plant's mesophyll leaf structure can cause reflectance. You may also see the equation given like so:
 
  
<div align="center"><math>\mbox{NDVI}=\frac{(\mbox{NIR}-\mbox{VIS})}{(\mbox{NIR}+\mbox{VIS})}</math></div>
  
 
(where NIR is near-infrared and VIS is visible (red) light)
  
So, healthy vegetation has a low red light reflectance (Channel 1) and a high infrared reflectance (Channel 2). This would produce a high NDVI value. As the amount of vegetation decreases, so too do the NDVI values. The range of NDVI values is -1 to +1.
  
Generally, areas rich in vegetation will have higher positive values. Soil tends to produce NDVI values somewhat lower than vegetation: small positive amounts. Bodies of water, such as lakes or oceans, will have even lower positive (or, in some cases, negative) values.
  
There are some factors that may affect NDVI values. Atmospheric conditions can have an effect on NDVI, as well as the water content of soil. Clouds sometimes produce NDVI values of their own, but if they aren't thick enough to do so, they may throw off measurements considerably.
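The NDVI formula itself is straightforward to compute. In the minimal sketch below, the reflectance values are illustrative rather than taken from any particular sensor:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR (Channel 2) and
    red (Channel 1) reflectances; the result lies in [-1, +1]."""
    if nir + red == 0:
        return 0.0  # undefined for zero total reflectance; conventions vary
    return (nir - red) / (nir + red)

# Healthy vegetation: high NIR reflectance, strong red absorption -> high NDVI
print(round(ndvi(0.50, 0.08), 2))  # → 0.72
# Water: low NIR reflectance -> NDVI near or below zero
print(round(ndvi(0.02, 0.05), 2))  # → -0.43
```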
  
==== EVI ====
 
EVI, or the Enhanced Vegetation Index, was created to improve on NDVI and eliminate some of its errors. It has improved sensitivity in high-biomass regions and corrects for canopy background signals. The equation for EVI is as follows:
<div align="center"><math>EVI= G \times \frac{(NIR-RED)}{(NIR+C1 \times RED-C2 \times Blue+L)}</math></div>
  
 
where NIR is again near-infrared, and Red and Blue are those colors' bands. All three are at least partially atmospherically corrected surface reflectances. The L term filters out canopy background noise.
 
EVI has been adopted as a standard product for NASA's two MODIS satellites, Terra and Aqua. Because it factors out background noise, it is often preferred over NDVI.
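For comparison with the NDVI sketch above, EVI adds the blue band and several coefficients. The defaults in this sketch are the commonly cited MODIS values (G = 2.5, C1 = 6, C2 = 7.5, L = 1); the reflectances are again illustrative:

```python
def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, l=1.0):
    """Enhanced Vegetation Index with MODIS-style coefficients:
    g = gain, c1/c2 = aerosol resistance terms (red/blue band),
    l = canopy background adjustment."""
    return g * (nir - red) / (nir + c1 * red - c2 * blue + l)

# Same vegetated pixel as in the NDVI example, with a blue reflectance added
print(round(evi(0.50, 0.08, 0.04), 3))  # → 0.625
```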
  
== Climate Change Processes ==
The 2018 Remote Sensing topic is '''Climate Change Processes'''. Participants are encouraged to study both climatology and the remote sensing technology involved.
  
=== Human Interaction ===
Human Interaction with the Earth is a large part of Remote Sensing. It emphasizes how humans affect the Earth on scales detectable by remote sensing. This interaction has previously been represented in tests as deforestation, ozone layer changes, changes in land use, retreat of glaciers, and loss of sea ice, among others. It is important to remember, however, that not all climate change processes are anthropogenic, and other climatology events may be tested as well.
  
[[File:Keeling.jpg|350px|thumb|right|The Keeling Curve is a graph derived from a series of measurements of carbon dioxide atop Mauna Loa since 1958. It showed the increasing levels of carbon dioxide in the atmosphere as well as annual cycles. It is credited as being the first evidence to bring attention to the problem of greenhouse gases. Charles Keeling began the research and his son, Richard Keeling, took over after his death in 2005.]]
==== Global Warming ====
When human impact on the environment is mentioned, one of the main ideas it entails is global warming. Global warming is defined as "the increase in the average temperature of Earth's near-surface air and oceans since the mid-20th century and its projected continuation". Although the causes of global warming have been debated, the consensus is that the primary cause is the increase in concentrations of greenhouse gases. Possible results of global warming include a rise in sea levels, a change in weather patterns, the retreat of glaciers and sea ice, species extinctions, and an increased frequency of extreme weather.
  
The greenhouse effect is caused by certain ''greenhouse gases'' that trap heat in the Earth's atmosphere. The main gases, along with their percent contribution to the greenhouse effect, are water vapor (36-70%), carbon dioxide (9-26%), methane (4-9%), and ozone (3-7%). Of these, carbon dioxide is perhaps the gas most scrutinized as a human impact on the environment; thus, it is the most likely to appear on tests. Humans have increased the amounts of these and other greenhouse gases in the atmosphere during periods of industrialization such as the Industrial Revolution. CFCs and nitrous oxide are among the greenhouse gases now present in the atmosphere that were not before.
==== Radiative Forcing ====
Different greenhouse gases are more or less powerful in forcing the greenhouse effect. This ability to force the greenhouse effect is known as '''radiative forcing''' and is generally given as a value or a percent.
  
==== Global Dimming ====
'''Global dimming''' refers to a natural process that cools the earth. Generally speaking, global dimming is caused not by gases but by aerosols, such as sulfates and chlorates. These aerosols occur naturally as well as artificially. For example, large-scale volcanic eruptions emit a lot of sulfate into the atmosphere. The 1991 eruption of Mt. Pinatubo dropped the average global temperature by 0.5 degrees C due to the global dimming effect. This is because these aerosols reflect incoming solar radiation very effectively.
  
Furthermore, sulfates and chlorate aerosols in the atmosphere serve as '''cloud condensation nuclei''', or CCNs. CCNs are particles upon which water vapor can more favorably condense, forming clouds. These clouds reflect significantly more incoming radiation than normal clouds due to the '''Twomey effect'''. This results in cooling.
Chlorates, and to some degree sulfates, deplete ozone. In addition, sulfates react with water to form sulfuric acid, which precipitates as '''acid rain'''.
  
=== Climate Cycles ===
A number of cycles are integral to the study of climatology. The two most important cycles are the '''carbon cycle''' and the '''hydrological (water) cycle'''.
==== Carbon Cycle ====
The carbon cycle is the process through which carbon atoms are cycled through the environment. It cycles through the atmosphere as carbon dioxide, and some carbon is dissolved into the hydrosphere. It is also taken in by plants during photosynthesis and released when the plants die. When animals feed on plants, they also take in carbon.
  
However, the burning of fossil fuels, which come from biomatter, releases excess carbon into the atmosphere, increasing the concentration of carbon dioxide. Carbon can be stored for long periods of time in the trees and soil of forest biomes, so altering this balance affects the carbon cycle and can contribute to global warming and climate change. The resulting warming can in turn affect plant growth, since slight changes in temperature or other abiotic factors can kill off certain species of plants. With fewer plants remaining alive, more carbon dioxide stays in the atmosphere rather than being taken in by plants.
The carbon cycle indicates the presence of what are known as carbon '''sources''' and '''sinks'''. A source emits carbon into the atmosphere or the biosphere, and a sink absorbs it from the atmosphere, storing that carbon somewhere inaccessible. Oceans, for example, are carbon sinks. In addition, phytoplankton in the oceans photosynthesize and utilize dissolved carbon dioxide to produce energy. Plants also absorb carbon; tropical rainforests, and other biomes containing a very rich and diverse population of flora, are also carbon sinks. However, deciduous plants shed leaves and suspend photosynthesis in the winter, leading to the seasonal fluctuations observed in the Keeling Curve.
Natural sources of carbon include wildfires and volcanic eruptions. However, the sources of carbon that are of most interest to scientists are usually anthropogenic - the burning of fossil fuels such as coal or petroleum, for example.
==== Hydrological Cycle ====
The hydrologic cycle, more commonly known as the water cycle, describes the cycle through which water travels. Its core is the familiar sequence of evaporation, condensation, and precipitation. In smaller branches of the cycle, water is stored as ice and snow in cold climates. Water also enters the ground through infiltration, although some simply flows over it as surface runoff. Groundwater flow then carries this water to the oceans, where it reenters the main cycle.
  
Finally, some evaporation occurs as evapotranspiration in plants. Fewer plants would result in less carbon taken in, and thus more carbon dioxide in the atmosphere contributing to the greenhouse effect.
Water vapor is the most prevalent greenhouse gas, and so monitoring all aspects of the water cycle also provides valuable insight and data concerning climate change and the validity of the greenhouse effect.
  
=== Key Climate Concepts ===

==== The Ozone Layer ====

The '''ozone layer''' is a naturally occurring layer of ozone in the stratosphere. Ozone blocks harmful UV from reaching the surface. Ozone can interact with various gases in the atmosphere, some natural, others artificial. These gases may destroy ozone, leading to ''ozone depletion''. Ozone depletion may be seasonal or anthropogenic.

Ozone is formed from the reaction of a free oxygen atom with a molecule of oxygen. The resulting molecule is relatively unstable: when a photon carrying sufficient energy, such as that of UV, hits the molecule, the molecule absorbs the energy and breaks back into its reactants. In this way, ozone shields the surface of the Earth from ionizing radiation, which would otherwise be very detrimental to life.

==== Ozone Depletion ====

Ozone depletion has been greatly accelerated by pollutants in the atmosphere. Before the "ozone hole" was discovered, many propellants and refrigerants used chlorofluorocarbons (CFCs). In the atmosphere, chlorine atoms break off from the CFC molecule. Chlorine is a very efficient catalyst in the breakdown of ozone: one atom of chlorine can degrade up to ten thousand molecules of ozone. CFCs have been phased out since the Montreal Protocol went into effect in 1989. It is estimated that recovery may last until the mid-21st century.
CFCs have been replaced with hydrofluorocarbons (HFCs), which do not contain chlorine and therefore do not deplete ozone.
  
==== Ocean Acidification ====
Previously, it was mentioned that oceans dissolve carbon dioxide. This is not the entire explanation. In reality, carbon dioxide reacts with water to form carbonic acid, which can dissociate into carbonate and bicarbonate anions.
  
Carbon dioxide, carbonic acid, carbonate and bicarbonate exist in a temperature-dependent equilibrium. The higher the temperature, the more the equilibrium shifts towards the production of carbonate and bicarbonate. The H+ cations dissociate and accumulate, lowering the pH of the water. This process is known as '''ocean acidification'''.
  
Ocean acidification is detrimental to marine life. Organisms particularly sensitive to pH changes include coral and phytoplankton. Corals support some of the most diverse ecosystems in the ocean, and phytoplankton form the staple of many food chains. Coral is made of a calcium carbonate (aragonite/calcite) skeleton, which dissolves in acidic water. In addition, corals produce a natural sunscreen that protects them from ionizing radiation, a process that is slowed in warmer environments. The biodiversity of many coral reefs, including the Great Barrier Reef off Australia, is threatened by what is known as ''coral bleaching'' (dead coral appears bleached).
  
Scientists measure these processes either by directly measuring the sea surface temperature (SST) using thermal infrared remote sensing, or by measuring chlorophyll concentrations and phytoplankton presence/health by proxy. MODIS, for example, has bands dedicated to measuring chlorophyll concentrations.
  
== Sample Questions ==
* The ______________ ______________ is a naturally occurring process that aids in heating the Earth's surface and atmosphere.
* The term _________ is used to describe the total mass of organic matter.
* _______ _______ refers to any fuel that is created from decomposed carbon-based plant and animal organisms.
* A _____________ is the smallest element that can be displayed on a satellite image or computer monitor.
* Define Albedo.
* Earth is in a __________ orbit.
* The geometric shape of satellite orbital paths around the Earth is called what?
* Define Eccentricity.
* Define Obliquity.
* Define Precession.
* What is the name of the scientist who first proposed the astronomical theory for climate change?
* According to the theory, how might changes in the eccentricity, obliquity, and precession result in an ice age?
  
== Resources ==
 
  
=== Textbooks ===
 
  
* [http://www.amazon.com/Remote-Sensing-Interpretation-Thomas-Lillesand/dp/0470052457/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1236147330&sr=8-1 Remote Sensing and Image Interpretation]
* [http://www.amazon.com/Remote-Sensing-Interpretation-Floyd-Sabins/dp/1577665074/ref=pd_bbs_sr_4?ie=UTF8&s=books&qid=1236147330&sr=8-4 Remote Sensing: Principles and Interpretation]
=== Links ===
* [http://soinc.org/remote_sensing_c Official Science Olympiad remote sensing page]
* [http://www.nrcan.gc.ca/sites/www.nrcan.gc.ca/files/earthsciences/pdf/resource/tutor/fundam/pdf/fundamentals_e.pdf This remote sensing tutorial] written by the Canada Centre for Remote Sensing is very useful. It covers the basic concepts of remote sensing, sensor types, image interpretation and analysis, and use of data.
* [https://web.archive.org/web/20120212051856/http://rst.gsfc.nasa.gov:80/Front/tofc.html The NASA tutorial]<sup>archived from [http://rst.gsfc.nasa.gov/Front/tofc.html original]</sup> is more advanced than the Canadian one and is recommended reading after it. Reading both may be difficult due to time constraints, and most of this tutorial's material will not be necessary on most tests. Good if time permits.
* [http://www.physicalgeography.net/fundamentals/2e.html A very brief overview of the topic of remote sensing]
* [http://geo.arc.nasa.gov/sge/health/sensor/cfsensor.html A good source for the bands of the major satellites]
  
==== Older links ====
**The other link in the rules has probably moved here instead. There is a great online course dedicated to Remote Sensing and a great topographic map.
+
'''''NOTE:''' These links are not relevant to the 2018 event''
 
*http://www.tx.ncsu.edu/science_olympiad/Tournament_information/Event_rules_nc/remote_sensing.cfm
 
**Usually had good event resources.
 
  
*http://www.tufts.edu/as/wright_center/products/sci_olympiad/upload_1_15_05/pdf/remote_sensing_2005.pdf
+
* [http://cmex.ihmc.us/CMEX/index.html OR http://mars.jpl.nasa.gov Mars Topographic Map], as referenced by the official rules. No longer applicable due to rule changes.
**This is a good document for Remote Sensing in general, without any focus on Mars. There are two pages of links at the end for you to use.  
+
* [http://pubs.usgs.gov/imap/i2782/i2782_sh1.pdf File 1] and [http://pubs.usgs.gov/imap/i2782/i2782_sh2.pdf File 2]
 +
** Direct links to the Mars Topographic Maps from pubs.usgs.gov - note they are large in file size. No longer applicable due to rule changes.
  
*http://www.scioly.org/obb/board.php?FID=35
+
{{Earth and Space Event}}
**Feel free to ask any additional questions you might have about Remote Sensing here, as long as you follow the rules.
 
  
*http://newyorkscioly.org/SOPages/Events/Remote.html
+
[[Category:Events]]
**New York Coaches Conference
+
[[Category:Study events]]
[[Category:Event Pages]]
+
[[Category:Earth and Space Science Events]]
[[Category:Study Event Pages]]
 

Latest revision as of 19:40, 24 April 2021

In Remote Sensing, a Division C event, teams use remote sensing imagery, such as photographic and spectroscopic information, to analyze data and/or make climate models.

Each team may bring four 8.5" x 11" double-sided note sheets, as well as a metric ruler, a protractor, and any kind of non-graphing calculator.

Remote Sensing is in rotation for the 2017 and 2018 seasons; it was previously an event for roughly ten years, through 2013.

Tests tend to be composed of a mix of image interpretation and questions on remote sensing concepts and climate change processes (the carbon cycle, aerosols, ozone depletion, etc.). Some ecology/biology background is useful, as are meteorology and basic physics concepts. Knowledge of individual space programs and NASA satellites, along with the types of sensors they carry, is recommended.

Please note that acronyms are used often in this event. Several times on this page, an acronym is listed alongside a key term, since acronyms can themselves appear as questions (e.g., "What does RADAR stand for?"). There are many of them, but do not be confused by them: they are shorthand for the ideas they represent, not independent entities.

Topics

Remote Sensing rotates between topics occasionally. In addition, the specific aspects of remote sensing, including the satellites and types of data focused on, often change between years. The topic for the 2018 season is the same as it was in 2017, Climate Change Processes.

Season Topic
2018 Climate Change Processes
2017 Climate Change Processes
2013 Earth's Hydrosphere
2012 Earth's Hydrosphere
2011 Human Impact on Earth
2010 Human Interactions with Forest Biomes
2009 Human Land Use Patterns?
2008 Mars
2007 Mars
2006
2005
2004
2003
2002

Satellites

History

While not an integral part of the event, students are expected to know the basic history behind remote sensing.

The Beginning

Remote sensing began in the 1860s, when the Frenchman Gaspard-Félix "Nadar" Tournachon took aerial photographs from a balloon. In the 1880s, cameras were mounted on kites and took pictures via remote control mechanisms. Once heavier-than-air flight was invented, it was only logical to take pictures from airplanes as well. During WWI, cameras mounted on planes or held by aviators were used for military reconnaissance.

Sputnik 1, the first artificial satellite

Remote Sensing Advances

As early as 1904, rockets were used to launch cameras to heights of 600 meters. But until the 1960s, aerial photographs from airplanes were the only way to take pictures of Earth's landscape. It took the space race between the US and the USSR to start remote sensing from above the atmosphere.

The first satellite was Sputnik 1, launched by the USSR on 4 October 1957. Sputnik helped to identify the density of high atmospheric layers and provided data on radio signals in the ionosphere. In early 1955 the US was working on Project Orbiter, which used a Jupiter C rocket to launch a satellite. The project, led by legendary rocket scientist Wernher von Braun, succeeded, and Explorer 1 became the US’ first satellite on January 31, 1958.

Examples of Instruments

Instruments are instrumental (no pun intended) to the function of satellites and remote sensing. Know what types of instruments will be used for certain applications.

RADAR: short for Radio Detection and Ranging. The name comes from WWII, when radio waves were actually used; modern-day radars use microwaves. The waves are scattered and reflected when they come into contact with an object. Microwaves can penetrate clouds, and radar is generally used in active remote sensing systems. Radar is good for locating objects and measuring elevation.
LIDAR: short for Light Detection and Ranging. It is similar to RADAR but uses laser pulses instead of radio waves.
TM: stands for Thematic Mapper. It was introduced in the Landsat program and involves seven image data bands that scan along a ground track.
ETM+: stands for Enhanced Thematic Mapper Plus. It replaced the TM sensor on Landsat 7. Unlike TM, it has eight bands.
MSS: stands for Multispectral Scanner. It was introduced in the Landsat program also, and each band responds to a different type of radiation, thus the name “multispectral”.
MODIS: stands for Moderate-resolution Imaging Spectroradiometer. It is on the Terra and Aqua satellites. It measures cloud properties and radiative energy flux.
CERES: stands for Clouds and the Earth's Radiant Energy System. It is on the Terra and Aqua satellites. It measures broadband radiative energy flux.
SeaWiFS (Sea-viewing Wide Field-of-View Sensor): Eight spectral bands of very narrow wavelength ranges, monitors ocean primary production and phytoplankton processes, ocean influences on climate processes (heat storage and aerosol formation), and monitors the cycles of carbon, sulfur, and nitrogen.

Other Instruments

  • ALI: Advanced Land Imager
  • ASTER: Advanced Spaceborne Thermal Emission and Reflection Radiometer
  • ATSR: Along Track Scanning Radiometer
  • AVHRR: Advanced Very High Resolution Radiometer (used with NOAA)
  • AVIRIS: Airborne Visual/Infrared Imaging Spectrometer
  • CCD: Charge Coupled Devices
  • CZCS: Coastal Zone Color Scanner
  • GPS: Global Positioning System
  • HRV: High Resolution Visible sensor
  • LISS-III: Linear Imaging Self-Scanning Sensor
  • MESSR: Multispectral Electronic Self-Scanning Radiometer
  • MISR: Multi-angle Imaging Spectro Radiometer
  • MSR: Microwave Scanning Radiometer
  • RAR: Real Aperture Radar
  • VTIR: Visible and Thermal Infrared Radiometer
  • WiFS: Wide Field Sensor

This is in no way a comprehensive list of the instruments tested on in Remote Sensing. Participants are encouraged to at least demonstrate basic knowledge about how these instruments work.

For those who are interested, a large list of acronyms used by NASA can be found here

Active Sensing vs. Passive Sensing

Active sensing occurs when the satellite produces radiation on its own, and then senses the backscatter. This is useful since it does not depend on outside radiation, but it uses up energy more quickly. Examples of active sensors are a laser fluorosensor and synthetic aperture radar (SAR). These instruments operate even at night, because they do not rely on reflected radiation (which is usually solar in origin). Passive sensing, on the other hand, senses naturally available radiation to create a picture. It does not need to use energy to produce radiation, but it is dependent on the outside radiation's existence. If there is little or no outside radiation, the satellite cannot function well. One exception includes thermal infrared (TIR) sensors, which actually obtain better information at night, because of natural emission of thermal energy.

Examples of Satellites

There are countless numbers of satellites currently orbiting Earth, but tests will mostly focus on satellite programs directed by NASA (National Aeronautics and Space Administration). Some other agencies of note include the NOAA (National Oceanic and Atmospheric Administration), the CSA (Canadian Space Agency), JAXA (Japanese Aerospace Exploration Agency), the ESA (European Space Agency) and the IRS (Indian Remote Sensing).

EOS

Satellites especially likely to appear on tests are those that come from the Earth Observing System (EOS). The EOS is a series of NASA satellites designed to observe the Earth's land, atmosphere, biosphere, and hydrosphere. The first EOS satellite was launched in 1997.

A-Train, or EOS-PM: an EOS constellation of satellites flying together in Sun-synchronous (SS) orbit. Their compiled data have resulted in high-resolution images of the Earth's surface.

The A-Train (also known as the Afternoon Constellation because of its south-to-north equatorial crossing time of about 1:30 PM) was planned to have seven satellites. Five satellites currently fly in the constellation.

The NASA Earth Observatories
  • Active:
    • OCO-2: studies global atmospheric carbon dioxide. It is a replacement for the failed OCO.
    • GCOM-W1 "SHIZUKU": studies the water cycle.
    • Aqua: studies the water cycle, such as precipitation and evaporation.
    • CALIPSO: studies clouds and air particles and their effect on Earth's climate.
    • Aura: studies Earth's ozone layer, air quality, and climate.
  • Past:
    • PARASOL: studied radiative and microphysical properties of clouds and air particles; moved to lower orbit and deactivated in 2013.
    • CloudSat: studies the altitude and other properties of clouds; moved to lower orbit following partial mechanical failure in February 2018.
  • Failed to achieve orbit:
    • OCO (Orbiting Carbon Observatory): was intended to study atmospheric carbon dioxide.
    • Glory: was to study radiative and microphysical properties of air particles.

Both failures occurred because of launch vehicle failure.

Morning Constellation, or EOS-AM, is a second constellation in the EOS. It is so named because of its 10:30 AM north-to-south equatorial crossing. Currently there are three satellites in this constellation - the fourth, EO-1, was decommissioned in March 2017.

Landsat: A series of 8 satellites using multiple spectral bands. Only two are operational today: Landsat 7 and Landsat 8 (launched in February 2013). These are some of the most commonly tested satellites. The name Landsat is a mixture of the two words "land" and "satellite".
Terra: Provides global data on the atmosphere, land, and water. Its scientific focus includes atmospheric composition, biogeochemistry, climate change and variability, and the water and energy cycle.

Remember that the aforementioned satellites hardly comprise an exhaustive list - the EOS is a very large collection of satellites. It is almost impossible to know all of them thoroughly, so it is suggested that participants familiarize themselves with the most important ones.

Other Satellites

There are other notable satellites that may appear on exams and are not affiliated with the EOS.

  • GOES (Geostationary Operational Environmental Satellite) System: two weather satellites in geostationary orbit at roughly 36,000 km altitude, operated by NOAA in cooperation with NASA.
  • MOS: Marine Observation Satellite
  • SeaSat: SEA SATellite. This satellite is especially significant as the first satellite focused on the oceans, as well as the first satellite to carry synthetic aperture radar (SAR).
  • SPOT: Système Pour l'Observation de la Terre. This is a series of 7 CNES satellites similar to the Landsat program, with a more commercial focus.

Satellite Imaging

The formal definition of remote sensing is the science of acquiring data about an object without being in physical contact with it. For this reason, a major part of this event involves processing images and analyzing them to come to a conclusion.

Image Processing

Satellite data is sent from the satellite to the ground in a raw digital format. The smallest unit of data is represented by a binary number. This data will be strung together into a digital stream and applied to a single dot, or pixel (short for "picture element") which gets a value known as a Digital Number (DN). Each DN corresponds to a particular shade of gray that was detected by the satellite. These pixels, when arranged together in the correct order, form an image of the target where the varying shades of gray represent the varying energy levels detected on the target. The pixels are arranged into a matrix. This data is then either stored in the remote sensing platform temporarily, or transmitted to the ground. The data can then be manipulated mathematically for various reasons, which will be discussed below.
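The DN-to-gray mapping described above can be sketched in a few lines of Python (the 8-bit range and the sample matrix are illustrative assumptions, not data from any particular sensor):

```python
# A tiny 3x3 matrix of Digital Numbers (DNs), as a satellite might transmit.
# With 8-bit data, each DN ranges from 0 (black) to 255 (white).
dn_matrix = [
    [0, 128, 255],
    [64, 128, 192],
    [255, 128, 0],
]

def dn_to_gray(dn, bits=8):
    """Map a DN to a fractional gray level: 0.0 = black, 1.0 = white."""
    max_dn = 2 ** bits - 1
    return dn / max_dn

# Arranging the converted pixels in the same matrix order reproduces the image.
gray = [[dn_to_gray(dn) for dn in row] for row in dn_matrix]
print(gray[0])  # first row of gray levels
```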

The human eye can only distinguish between about 16 shades of gray in an image, but it is able to distinguish between millions of colors. Thus, a common image enhancement technique is to assign specific DN values to specific colors, increasing the contrast. A true color image is one for which the colors have been assigned to DN values that represent the actual spectral range of the colors used in the image. A photograph is an example of a true color image. False color (FC) is a technique by which colors are assigned to spectral bands that do not equate to the spectral range of the selected color. This allows an analyst to highlight particular features of interest using a color scheme that makes the features stand out.

Additionally, remote sensing instruments often operate continuously for very long times, which means that they are prone to instrument error and malfunction. Different processing techniques can remedy this.

Composites

Composites are images where multiple individual satellite images have been combined to produce a new image. This process is used to create more detailed images that take multiple factors into account, as well as to find patterns that would not have been revealed in a single image. It also helps to create larger images than the satellite itself can make. This is because each satellite covers a specific swath, or area imaged by a satellite with a fixed width. When these swaths are put together into a composite, a larger area is imaged.

Traditionally, composites were made by merging colors. Three black and white transparencies of the same image are first made, each representing a different spectral band - blue, green, and red. Each transparency is then projected onto a screen through a filter of the same color as its band: the blue band through a blue filter, the green through a green filter, the red through a red filter. Because features that reflect blue light appear bright on the blue-band transparency, they show up blue in the composite. When the three projections are aligned, the resulting image has the natural color (or very close to it) of the original scene. This process is called "color additive viewing", and red, green, and blue are often known as additive colors because they add together to create new colors. The earliest color films recorded multispectral scenes on multiple films, which were then developed and merged into one color film image.

In satellite data, this crude method is replaced by assigning color gradient values to DNs. When the three colorized images are merged, a true color image is formed.

Not all composites are true colored. A False Color Composite (FCC) results when one band is assigned color gradient values for another color, such as if the blue band DNs were set to correspond to shades of red. One extremely popular FCC combination assigns red colors to the NIR image. Since healthy vegetation reflects strongly in the NIR region of the EM spectrum, a FCC using this combination displays areas with healthy vegetation as red. This also differentiates natural features from artificial ones: a football field made up of healthy grass has a strong red color, but a football field composed of Astroturf or other artificial substances will show up as a duller red, or even dark brown.

Common composites:

  • True-color composite- useful for interpreting man-made objects. Simply assign the red, green, and blue bands to the respective color for the image.
  • Blue-near IR-mid IR, where blue channel uses visible blue, green uses near-infrared (so vegetation stays green), and mid-infrared is shown as red. Such images allow seeing the water depth, vegetation coverage, soil moisture content, and presence of fires, all in a single image.
    • NearIR is usually assigned to red on the image; thus, vegetation often appears bright red in false color images, rather than green, because healthy vegetation reflects a lot of nearIR radiation. This can also be used in identifying urban/artificial areas.
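As a sketch of the idea, the common NIR-to-red band assignment might look like this in Python (the reflectance values and the 0-255 display scaling are assumptions for illustration):

```python
# One pixel's reflectance in three spectral bands (illustrative values).
pixel = {"green": 0.10, "red": 0.08, "nir": 0.55}  # healthy vegetation reflects much NIR

def false_color(p):
    """Common false-color assignment: NIR -> red, red -> green, green -> blue.
    Returns an (R, G, B) display triple scaled to 0-255."""
    def to_byte(v):
        return round(min(max(v, 0.0), 1.0) * 255)
    return (to_byte(p["nir"]), to_byte(p["red"]), to_byte(p["green"]))

print(false_color(pixel))  # the red channel dominates, so vegetation displays bright red
```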

Contrast

Contrast refers to the difference in relative brightness between an item and its surroundings as seen in the image. A particular feature is easily detected in an image when contrast between an item and its background are high. However, when the contrast is low, an item might go undetected in an image.

Resolution

Resolution is a property of an image that describes the level of detail that can be discerned from it. This is important, as images with higher resolution will have higher detail. There are several types of resolution that are important to remote sensing. One of these is spatial resolution, which is the smallest detail a sensor can detect. Since the smallest element in a satellite image is a pixel, spatial resolution describes the area on the Earth's surface represented by each pixel. For example, in a weather satellite image that has a resolution of 1 km, each pixel represents the brightness of an area that is 1 km by 1 km.

Other types of resolution include spectral resolution, the ability of a sensor to distinguish between fine wavelength intervals; radiometric resolution, the ability of a sensor to discriminate very small differences in energy; and temporal resolution, the time between successive views of the same area.

A key thing to keep in mind is that the resolution of a particular satellite sensor must be optimized for the intended use of the data. Weather satellites generally monitor weather patterns that cover hundreds of miles, so there is no need for resolution higher than 0.5 km. Landsat and other land-use satellites need to distinguish between much smaller items, so a higher resolution is required. The trade-off for higher resolution, however, is that the amount of data produced by the satellite is much greater, which increases transmission times and burdens the mission. In addition, smaller areas contain less radiometric output, so spectral resolution generally decreases for increased spatial resolution.
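A back-of-the-envelope calculation shows the data-volume trade-off (the scene size and resolutions are arbitrary example values):

```python
def pixels_per_scene(scene_km, resolution_km):
    """Pixels needed to cover a square scene at a given spatial resolution."""
    per_side = scene_km / resolution_km
    return per_side ** 2

# Covering a 100 km x 100 km scene:
coarse = pixels_per_scene(100, 1.0)   # 1 km pixels: 10,000 pixels
fine = pixels_per_scene(100, 0.03)    # 30 m (Landsat-like) pixels: ~11.1 million
print(coarse, fine)  # each halving of pixel size quadruples the data volume
```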

Image Enhancement

Images often need to be enhanced to facilitate easier interpretation. These enhancements include, but are not limited to, contrast enhancement, haze/noise corrections, instrument corrections, and smoothing operations.

  • Contrast enhancement: usually, the two extreme DN values are resampled to the minimum value, 0, and the maximum value that the sensor supports. All values in the middle are recalculated either by a line of best fit (linear contrast stretching) or by frequency of the values in the image (histogram stretching).
  • Haze corrections reflect the fact that haze usually causes a uniform increase across all DNs in an area. The sensor is calibrated over a source with a known radiative output (such as a body of water, which would have a value approaching 0) and all DNs in the area are resampled by subtracting that value.
  • Noise corrections involve the use of kernels, matrices of DNs surrounding a central DN. A weighted average of the kernel is determined, and a threshold value for the difference between the central DN and this average is set. If the difference exceeds the threshold, the central DN is reassigned the average value; if the difference is less than the threshold, the DN is conserved.
  • Instrument corrections generally account for malfunctioning sensors on the CCD. Corrections are similar to those used in haze corrections.
  • Smoothing operations simply create smoother images with less detail, which serves a variety of purposes.
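Linear contrast stretching, the first technique above, can be sketched as follows (assuming an 8-bit sensor whose maximum DN is 255):

```python
def linear_stretch(dns, new_max=255):
    """Linearly rescale DNs so the smallest becomes 0 and the largest new_max."""
    lo, hi = min(dns), max(dns)
    if hi == lo:              # a flat image cannot be stretched
        return list(dns)
    return [round((dn - lo) / (hi - lo) * new_max) for dn in dns]

# A low-contrast strip of DNs clustered between 90 and 120:
print(linear_stretch([90, 100, 110, 120]))  # -> [0, 85, 170, 255]
```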

The Electromagnetic Spectrum

Electromagnetic radiation (EMR) is the most common energy source for remote sensing. It consists of an electric and magnetic field perpendicular to each other and the direction of travel while traveling at the speed of light. This is important to remote sensing because that's how sensors detect certain data about the objects a satellite is studying.

Radiation is an important part of remote sensing: different materials respond to radiation in different ways, so this can be used to identify objects. One example of this is scattering (or atmospheric scattering), where particles in the atmosphere redirect radiation. There are three types: Rayleigh, Mie, and non-selective. This scattering is used to identify the presence and quantity of certain gases in the atmosphere. Also, transmission occurs when radiation passes through a target, indicating that the target does not interact with that particular wavelength.

There are several types of electromagnetic energy that can be emitted, depending on their wavelength. All of them are found in the electromagnetic spectrum (EMS).

The electromagnetic spectrum

It's important to know which types of energy are useful for what:

Gamma rays and x-rays cannot be used for remote sensing because they are absorbed by the Earth's atmosphere: in general, the shorter the wavelength (and the greater the frequency), the more absorption occurs.
Ultraviolet radiation is not useful either because it is blocked by the ozone layer.
Visible light allows satellites to detect colors a human eye would see. Some of these satellites are panchromatic, meaning they are sensitive to all wavelengths of visible light.
Near Infrared (NIR), the region just past the visible portion of the spectrum, is useful for monitoring vegetation, as healthy vegetation reflects much NIR.
Short Wave Infrared (SWIR), the region just beyond NIR, which is useful for determining the spectral signature of objects. A spectral signature is the telltale reflectance of radiation by a material across the spectrum. Each object has a unique spectral signature.
Thermal Infrared (TIR), the kind of IR we perceive as "heat". The Earth naturally emits TIR, so TIR remote sensing usually involves passively detecting radiation in this region of the spectrum. This is useful for determining temperatures of objects.
Microwaves are used in radar. Different radars utilize different wavelengths, which range from relatively shortwave regions, such as the W-band, to longwave regions such as the L-band. Radars are very good at penetrating foliage, and in the case of longwave radars, the ground, to a certain extent. Microwaves also reveal a lot about the properties of a surface, such as its dielectric constant, moisture, etc.

Since the atmosphere's components selectively absorb certain wavelengths, only certain regions of the spectrum can actually be used in remote sensing. These regions are known as atmospheric windows.

Blackbody Radiation

In physics, a blackbody is an ideal object which absorbs and re-emits 100% of all incident radiation. The spectral signature for a blackbody is modeled by a blackbody curve, determined by the Planck Function. The blackbody curve is dependent on temperature. In practice, blackbodies do not exist; instead, most objects are graybodies, which emit a certain percentage of the radiation absorbed. This percentage is known as emissivity.

Select blackbody curves

Integrating a Planck function blackbody curve gives the total radiant exitance, or power emitted per unit area, of an object. Radiant exitance is specified by the Stefan-Boltzmann Law. For more information on this law, see Reach for the Stars#Stefan-Boltzmann's Law and Climate Notes#Radiation equations.

The dominant wavelength of an object is the wavelength that to a large extent determines the appearance of an object. The sun appears yellow because its dominant wavelength is in the yellow portion of the visible light region of the EM spectrum. The dominant wavelength of an object can be determined by Wien's Displacement Law.
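Both laws lend themselves to quick calculations; here is a minimal Python sketch using standard values of the constants (the example temperatures for the Sun and Earth are approximate):

```python
STEFAN_BOLTZMANN = 5.670e-8   # W m^-2 K^-4
WIEN_B = 2.898e-3             # m K, Wien's displacement constant

def radiant_exitance(temp_k, emissivity=1.0):
    """Stefan-Boltzmann law: power emitted per unit area, in W/m^2."""
    return emissivity * STEFAN_BOLTZMANN * temp_k ** 4

def peak_wavelength(temp_k):
    """Wien's displacement law: the dominant wavelength, in meters."""
    return WIEN_B / temp_k

print(peak_wavelength(5778))  # Sun: ~5.0e-7 m (~500 nm, visible light)
print(peak_wavelength(288))   # Earth: ~1.0e-5 m (~10 um, thermal infrared)
print(radiant_exitance(288))  # ~390 W/m^2 for a blackbody at Earth's mean temperature
```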

Albedo

Albedo is, simply put, the percentage of radiation that is reflected off of an object. Albedo and emissivity are important concepts to understand and differentiate. Whereas albedo concerns incident light that is reflected from a surface, emissivity concerns blackbody radiation that is emitted from an object. In real life applications, white objects, which reflect more wavelengths, have higher albedo. This reduces the amount of radiation that is absorbed, which in turn reduces the amount of radiation emitted.

Image Interpretation

For the basics of image interpretation, see Road Scholar#Satellite Images.

Image interpretation and analysis is a huge part of the Remote Sensing event. It involves locating, identifying or measuring certain objects in images acquired using Remote Sensing. This isn't as straightforward as it may seem. There are plenty of features that can throw you off in each image. However, some features are the same in each image as well. There will always be a "target" to look for, which will always contrast with other parts of the image- making it "distinguishable".

According to the Canada Centre for Remote Sensing, whose tutorial you can find in the external links section, there are several things to look for to assist in image interpretation. These are tone, shape, size, pattern, texture, shadow, and association.

  • Tone is the brightness or color of an object. It's the main way to distinguish targets from backgrounds.
  • Shape refers to the form or outline of an object. Note that a straight-edged shape is usually man-made, such as agricultural or urban structures, while irregular-edged shapes are usually formed naturally.
  • Size, relative or absolute, can be determined by finding common objects in images, such as trees or roads. (see Finding Area section, below)
  • Pattern refers to the arrangement of objects in an image, such as the spacing between buildings in an urban setting.
  • Texture is the arrangement of tone variation throughout the image.
  • Shadow can help determine size and distinguish objects.
  • Association refers to things that are associated with one another in photographs, which can assist interpretation, i.e. boats on a lake, etc.
An example of splitting an irregular shape into more workable shapes

Finding Area

Another major part of image interpretation is determining the surface area of a particular area of interest. You will often be asked on a test to find the area of some piece of land, but this piece of land is usually not regularly shaped, like a rectangle. It'll have lots of different curves, and at first, it may seem difficult to find the exact area. However, an easy way to estimate area is to split up this irregular shape into smaller, easier shapes, like rectangles or circles. Then, you can add up the areas of the individual shapes to get the total area of the piece of land.

Before doing this, though, you need to take scale into consideration. Scale is the ratio of size on image to real-life size. For example, if the scale on an image is 1 inch:25 miles, each inch on the image represents 25 miles in real life. To find the area of one of your shapes, measure its dimensions with your ruler in inches (or centimeters, if the scale says so) and then multiply that number by the scale to find how long each of your dimensions is in real life. Do this for all of your smaller, more regular shapes. Then, just find the areas of all of them and add them together. Your answer should be approximately the area of the piece of land. It will not be exact, nor will it need to be, as test graders should have a range of values that they will accept as being correct.
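The scale arithmetic described above can be worked through in code (the scale, shapes, and measurements are invented for illustration):

```python
def real_area(measured_in2, miles_per_inch):
    """Convert an area measured on the image (square inches) to square miles.
    Note that the scale factor must be squared when converting areas."""
    return measured_in2 * miles_per_inch ** 2

# Scale is 1 inch : 25 miles. An irregular lake is split into two simpler shapes:
rectangle = 2.0 * 1.5   # 3.0 in^2 measured on the image
square = 0.5 * 0.5      # 0.25 in^2
print(real_area(rectangle + square, 25))  # 3.25 in^2 * (25 mi/in)^2 = 2031.25 mi^2
```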

NDVI

A NDVI map of Europe

During the competition, you may be asked to analyze a picture's NDVI values. NDVI stands for "Normalized Difference Vegetation Index" and is used to describe various land types, usually to determine whether or not the image contains vegetation. The equation provided by USGS for NDVI is as follows:

NDVI = (Channel 2 - Channel 1) / (Channel 2 + Channel 1)

Channel 1 is in the red light part of the electromagnetic spectrum. In this region, the chlorophyll absorbs much of the incoming sunlight. Channel 2 is in the Near Infrared part of the spectrum, where the plant's mesophyll leaf structure can cause reflectance. You may also see the equation given like so:

[math]\mbox{NDVI}=\frac{(\mbox{NIR}-\mbox{VIS})}{(\mbox{NIR}+\mbox{VIS})}[/math]

(Where NIR is Near Infrared and VIS is Visual (Red) Light)

So, healthy vegetation has a low red light reflectance (Channel 1) and a high infrared reflectance (Channel 2). This would produce a high NDVI value. As the amount of vegetation decreases, so too do the NDVI values. The range of NDVI values is -1 to +1.

Generally, areas rich in vegetation have higher positive values. Soil tends to produce NDVI values somewhat lower than vegetation - small positive amounts. Bodies of water, such as lakes or oceans, have even lower values, often negative.

There are some factors that may affect NDVI values. Atmospheric conditions can have an effect on NDVI, as well as the water content of soil. Clouds sometimes produce NDVI values of their own, but if they aren't thick enough to do so, they may throw off measurements considerably.
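A minimal sketch of the USGS NDVI formula in Python (the reflectance values are typical textbook figures, not measurements):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    if nir + red == 0:
        return 0.0   # guard against division by zero on completely dark pixels
    return (nir - red) / (nir + red)

print(round(ndvi(0.50, 0.08), 2))   # healthy vegetation: ~0.72
print(round(ndvi(0.30, 0.25), 2))   # bare soil: small positive value
print(round(ndvi(0.02, 0.05), 2))   # open water: negative value
```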

EVI

EVI, or the Enhanced Vegetation Index, was created to improve on NDVI and eliminate some of its errors. It has improved sensitivity in regions high in biomass and better removes the canopy background signal. The equation for EVI is as follows:

[math]EVI= G \times \frac{(NIR-RED)}{(NIR+C1 \times RED-C2 \times Blue+L)}[/math]

Where NIR is again near infrared, and RED and Blue are those colors' bands; all three are at least partially atmospherically-corrected surface reflectances. L is a canopy background adjustment, C1 and C2 are coefficients that use the blue band to correct for aerosol influences on the red band, and G is a gain factor.

EVI has been adopted as a standard product for the MODIS instruments aboard two of NASA's satellites, Terra and Aqua. As it factors out background noise, it is often preferred over NDVI.
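As a sketch, here is the EVI formula in Python using the coefficient values adopted for the MODIS product (G = 2.5, C1 = 6, C2 = 7.5, L = 1); the input reflectances are illustrative assumptions:

```python
def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index with MODIS-style coefficients.
    L adjusts for canopy background; C1 and C2 use the blue band to
    correct for aerosol influences on the red band; G is a gain factor."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Illustrative reflectances for dense vegetation:
print(evi(0.50, 0.08, 0.04))
```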

Climate Change Processes

The 2018 Remote Sensing topic is Climate Change Processes. Participants are encouraged to study climatology aspects as well as technology.

Human Interaction

Human Interaction with the Earth is a large part of Remote Sensing. It emphasizes how humans affect the Earth on scales detectable by remote sensing. This interaction has previously been represented in tests as deforestation, ozone layer changes, changes in land use, retreat of glaciers, and loss of sea ice, among others. It is important to remember, however, that not all climate change processes are anthropogenic, and other climatology events may be tested as well.

The Keeling Curve is a graph derived from a series of measurements of carbon dioxide taken atop Mauna Loa since 1958. It shows the increasing level of carbon dioxide in the atmosphere as well as annual cycles, and it is credited as the first evidence to bring attention to the problem of greenhouse gases. Charles Keeling began the research, and his son, Ralph Keeling, took over after his death in 2005.

Global Warming

When human impact on the environment is mentioned, one of the main ideas it entails is global warming. Global warming is defined as “the increase in the average temperature of Earth's near-surface air and oceans since the mid-20th century and its projected continuation”. The causes of global warming are debated, but the main consensus is that the main cause is the increase in concentrations of greenhouse gases. Possible results of global warming include a rise in sea levels, a change in weather patterns, the retreat of glaciers and sea ice, species extinctions, and an increased frequency of extreme weather.

The greenhouse effect is caused by certain gases that trap heat in the Earth's atmosphere. The main greenhouse gases, along with their percent contributions to the greenhouse effect, are water vapor (36-70%), carbon dioxide (9-26%), methane (4-9%), and ozone (3-7%). Of these, carbon dioxide receives the most attention as a human impact on the environment, so it is the most likely to appear on tests. Humans have increased the atmospheric concentrations of these and other greenhouse gases, particularly since the Industrial Revolution. CFCs are among the greenhouse gases now present in the atmosphere that did not exist there before, and nitrous oxide concentrations have also risen.

Radiative Forcing

Different greenhouse gases are more or less powerful in driving the greenhouse effect. A gas's ability to alter the Earth's energy balance is known as radiative forcing and is generally given in watts per square meter (W/m²).
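For carbon dioxide, a widely used simplified expression (Myhre et al. 1998, adopted in IPCC reports) relates radiative forcing to the ratio of the current concentration to a reference concentration:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified IPCC expression for CO2 radiative forcing in W/m^2,
    relative to a pre-industrial reference concentration c0_ppm.
    The 5.35 W/m^2 coefficient is from Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 from the pre-industrial level of ~280 ppm:
print(round(co2_forcing(560.0), 2))  # 3.71 W/m^2
```

The logarithm captures the fact that each additional increment of CO2 produces less forcing than the last, because the strongest absorption bands are already saturated.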

Global Dimming

Global dimming refers to a process that cools the Earth. Generally speaking, global dimming is caused not by gases but by aerosols, such as sulfates and chlorates, which occur both naturally and artificially. For example, large-scale volcanic eruptions inject large amounts of sulfate into the atmosphere; the 1991 eruption of Mt. Pinatubo lowered the average global temperature by about 0.5 °C because these aerosols reflect incoming solar radiation very effectively.

Furthermore, sulfate and chlorate aerosols in the atmosphere serve as cloud condensation nuclei (CCNs), particles upon which water vapor can more readily condense to form clouds. Clouds seeded by abundant CCNs contain more numerous, smaller droplets and therefore reflect more incoming radiation than ordinary clouds (the Twomey effect). This results in cooling.

Chlorates, and to some degree sulfates, deplete ozone. In addition, sulfates react with water to form sulfuric acid, which precipitates as acid rain.

Climate Cycles

A number of cycles are integral to the study of climatology. The two most important cycles are the carbon cycle and the hydrological (water) cycle.

Carbon Cycle

The carbon cycle is the process through which carbon atoms are cycled through the environment. It cycles through the atmosphere as carbon dioxide, and some carbon is dissolved into the hydrosphere. It is also taken in by plants during photosynthesis and released when the plants die. When animals feed on plants, they also take in carbon.

However, the burning of fossil fuels, which formed from ancient biomatter, releases excess carbon into the atmosphere, increasing the concentration of carbon dioxide. Carbon can be stored for long periods in the trees and soil of forest biomes, so altering this balance affects the carbon cycle and can contribute to global warming and climate change. The resulting warming can in turn affect plant growth, since slight changes in temperature or other abiotic factors can kill off certain species of plants. With fewer plants alive, more carbon dioxide stays in the atmosphere rather than being taken in by photosynthesis.

The carbon cycle involves what are known as carbon sources and sinks. A source emits carbon into the atmosphere or the biosphere; a sink absorbs it from the atmosphere and stores it somewhere inaccessible. Oceans, for example, are carbon sinks: phytoplankton take up dissolved carbon dioxide through photosynthesis. Plants on land also absorb carbon, so tropical rainforests and other biomes with rich, diverse flora are carbon sinks as well. However, deciduous plants shed their leaves and suspend photosynthesis in the winter, which produces the seasonal fluctuations observed in the Keeling Curve.

Natural sources of carbon include wildfires and volcanic eruptions. However, the sources of carbon that are of most interest to scientists are usually anthropogenic - the burning of fossil fuels such as coal or petroleum, for example.
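The source/sink balance described above can be sketched as a toy one-box model, with illustrative (not measured) numbers: each year anthropogenic emissions are added to the atmosphere, and a fixed fraction of them is taken up by land and ocean sinks.

```python
def step_atmosphere(atm_gtc, emissions=10.0, sink_fraction=0.55):
    """One-year toy update of the atmospheric carbon stock (GtC).
    `emissions` is annual anthropogenic output; `sink_fraction` is
    the share absorbed by ocean and land sinks. All values here are
    rough illustrations, not measurements."""
    return atm_gtc + emissions * (1 - sink_fraction)

atm = 870.0  # illustrative present-day atmospheric carbon stock, GtC
for _ in range(10):
    atm = step_atmosphere(atm)
print(round(atm, 1))  # 915.0 after ten illustrative years
```

The fraction of emissions that stays airborne (here 45%) is the reason atmospheric CO2 rises more slowly than emissions alone would suggest.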

Hydrological Cycle

The hydrologic cycle, more commonly known as the water cycle, describes the path water takes through the environment. At its core is the familiar sequence of evaporation, condensation, and precipitation. Among the smaller parts of the cycle, water is stored as ice and snow in cold climates. Water also enters the ground through infiltration, although some simply flows over it as surface runoff. Groundwater flow eventually carries infiltrated water to the oceans, where it reenters the main cycle.

Finally, some evaporation occurs through plants as evapotranspiration. Fewer plants would mean not only less evapotranspiration but also less carbon taken in, and thus more carbon dioxide in the atmosphere contributing to the greenhouse effect.

Water vapor is the most prevalent greenhouse gas, and so monitoring all aspects of the water cycle also provides valuable insight and data concerning climate change and the validity of the greenhouse effect.

Key Climate Concepts

The Ozone Layer

The ozone layer is a naturally occurring layer of ozone in the stratosphere. Ozone blocks harmful UV from reaching the surface. Ozone can interact with various gases in the atmosphere, some natural, others artificial. These gases may destroy ozone, leading to ozone depletion. Ozone depletion may be seasonal or anthropogenic.

Ozone is formed from the reaction of a free oxygen atom with a molecule of oxygen. The resulting molecule is relatively unstable: when a photon carrying sufficient energy, such as an ultraviolet photon, strikes it, the molecule absorbs the energy and breaks back into its reactants. In this way, ozone shields the surface of the Earth from ultraviolet radiation that would otherwise be very damaging to life.
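As a back-of-envelope check on this photolysis, the energy of a single photon is E = hc/λ; comparing a UV photon in ozone's absorption band with the roughly 1.1 eV needed to split ozone into O2 and a free oxygen atom (an approximate literature value) shows that UV photons carry energy to spare:

```python
# Physical constants (SI units).
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

# A UV photon near the middle of ozone's absorption band (~250 nm)
# versus the ~1.1 eV bond-breaking threshold:
print(round(photon_energy_ev(250), 2))  # 4.96 eV
```

Longer-wavelength visible light carries much less energy per photon, which is why ozone absorbs selectively in the ultraviolet.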

Ozone Depletion

Ozone depletion has been greatly accelerated by pollutants in the atmosphere. Before the "ozone hole" was discovered, many propellants and refrigerants used chlorofluorocarbons (CFCs). In the stratosphere, chlorine atoms break off from CFC molecules, and chlorine is a very efficient catalyst in the breakdown of ozone: a single chlorine atom can destroy up to one hundred thousand molecules of ozone before it is removed from the cycle. CFCs have been phased out since the Montreal Protocol took effect in 1989, but it is estimated that full recovery of the ozone layer may take until the mid-21st century.

CFCs have been replaced with hydrofluorocarbons (HFCs), which do not contain chlorine and therefore do not deplete ozone, although HFCs are themselves potent greenhouse gases.

Ocean Acidification

Previously, it was mentioned that oceans dissolve carbon dioxide. This is not the entire explanation. In reality, carbon dioxide reacts with water to form carbonic acid, which can dissociate into carbonate and bicarbonate anions.

Carbon dioxide, carbonic acid, carbonate, and bicarbonate exist in a temperature-dependent equilibrium. As more carbon dioxide dissolves, carbonic acid forms and dissociates, and the released H+ cations accumulate, lowering the pH of the water. This process is known as ocean acidification.
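Because pH is a logarithmic scale (pH = -log10 of the hydrogen-ion concentration), even a small drop represents a large relative increase in acidity. Using the commonly cited approximate decline of surface-ocean pH from about 8.2 to 8.1:

```python
def h_ion(pH):
    """Hydrogen-ion concentration (mol/L) from pH = -log10([H+])."""
    return 10 ** (-pH)

# A 0.1-unit drop in pH (approximate pre-industrial vs. modern
# surface-ocean values) in terms of H+ concentration:
increase = h_ion(8.1) / h_ion(8.2) - 1
print(round(increase * 100, 1))  # 25.9 (% more H+)
```

So a seemingly tiny 0.1-unit pH change corresponds to roughly a quarter more hydrogen ions in the water, which is why the effect on shell-forming organisms is significant.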

Ocean acidification is detrimental to marine life. Organisms particularly sensitive to pH changes include corals and phytoplankton: corals support some of the most diverse ecosystems in the ocean, and phytoplankton form the base of many marine food chains. Coral skeletons are made of calcium carbonate (aragonite/calcite), which dissolves in more acidic water. In addition, corals produce natural sunscreen compounds that protect them from ultraviolet radiation, a process that slows in warmer water. The biodiversity of many coral reefs, including Australia's Great Barrier Reef, is threatened by coral bleaching, in which stressed corals turn white.

Scientists measure these processes either by directly measuring the sea surface temperature (SST) using thermal infrared remote sensing, or by measuring chlorophyll concentrations and phytoplankton presence/health by proxy. MODIS, for example, has bands dedicated to measuring chlorophyll concentrations.

Sample Questions

  • The ______________ ______________ is a naturally occurring process that aids in heating the Earth's surface and atmosphere.
  • The term _________ is used to describe the total mass of organic matter.
  • _______ _______ refers to any fuel that is created from decomposed carbon-based plant and animal organisms.
  • A _____________ is the smallest element that can be displayed on a satellite image or computer monitor.
  • Define Albedo.
  • Earth is in a __________ orbit.
  • The geometric shape of satellite orbital paths around the Earth is called what?
  • Define Eccentricity.
  • Define Obliquity.
  • Define Precession.
  • What is the name of the scientist who first proposed the astronomical theory for climate change?
  • According to the theory, how might changes in the eccentricity, obliquity, and precession result in an ice age?

Resources

Textbooks

Links

Older links

NOTE: These links are not relevant to the 2018 event