Unmanned Aircraft Systems Used for Environmental Applications

This paper describes current environmental applications of unmanned aircraft systems (UAS) and dispels the preconception that UAS are useful only for martial purposes (military and police) or the spying portrayed in movies and espionage.

The topics covered are farming, fire, hurricane studies, flooding, oil and gas exploration, wildlife management, land management, mapping, glacier monitoring, atmospheric monitoring, and radiation measurement. The paper details some of the payloads used in each area of study and gives specific examples of UAS working in these fields. It is organized so that a general audience with little UAS background can read from the most familiar topics down to the least familiar.

Introduction:

Over the past ten years, the field of unmanned aircraft systems (UAS) has become far more publicly known than it was before and has expanded at an exponential rate. Through the last two wars, people have come to accept UAS as almost commonplace. Still, few people outside the field imagine UAS performing anything other than intelligence, surveillance, and reconnaissance (ISR) missions. Many still call them drones, assuming they are all expendable, one-shot aircraft. Others who do know about the commercial side of UAS remain unsure how just a camera can help some fields of study, especially the Earth Sciences (ES), and even less is known about UAS payloads for Earth Science beyond cameras.
This paper covers many different ES mission types. For each, it gives a broad overview, meant to familiarize the reader rather than make them an expert, along with details on the payloads and other testing parameters UAS use to complete the mission. It starts with the areas readers are likely to know best, which should make the technology in the applications that follow less daunting to understand.
Farming:

Farming is actually a widespread use of UAS, although the general public is largely unaware of it. UAS camera technologies are very useful, but the main draw of UAS in farming and plant observation is that a person no longer has to do the tedious work. UAS can stay on station for long hours, fly fully autonomously, and conduct their work in the dead of night. Some UAS can even carry extremely heavy loads, which is particularly useful in crop dusting.
While crop dusting seems like an obvious choice, because manned planes already do it, it is not used as much as cameras are. Japan, however, has used UAS for pesticide application and other chemical dispersal on rice farms since 1983 (Austin, 2010). Its primary UAS of choice is the RMAX helicopter, which can carry heavy loads of up to 65 lbs. and fertilize a 5.5 NM square field. UAS are also used to watch for animals destroying crops.

Camera work on UAS is a simple concept once the basics are known. The main camera types used for remote sensing of vegetation are electro-optical (essentially a visible-light digital camera), infrared (IR), near-infrared, multispectral, and thermal. Any of these can fly on fixed-wing or rotary craft; fixed-wing platforms are more prevalent outside Japan. Each camera type brings its own unique advantage to a given situation.

Multispectral cameras are useful at almost any time of day, as long as the reflectance scale has been set properly (Campbell, 2007). They capture many different wavelengths of light and other radiation. Each material reflects or emits its own characteristic wavelengths, most of which the human eye cannot see but these sensors can; the radiation is then rendered in a form that makes sense to our eyes. Multispectral imagery does include the infrared spectrum, but where an infrared camera is built specifically for those wavelengths, a multispectral camera can display far more wavelengths in one picture.

Depending on the range of wavelengths a camera can capture, an analyst can tell what chemicals something is made of, how much of a certain substance is in an area, or differentiate objects from terrain by their signature, that is, the different ways their surfaces reflect and emit radiation. These cameras generally just collect emitted and reflected energy to form an image rather than radiating their own, as a visual camera does with a flash. For specific studies, a multispectral camera can also concentrate on a single band of light, such as true-color blue (used for the atmosphere and for seeing underwater to about 150 ft.) or green (used for vegetation, so that plants appear as they actually are rather than as an artificial representation).

In farming, multispectral cameras are used to see the chlorophyll level in plants (Berni, Zarco-Tejada, Suárez, González-Dugo, & Fereres, 2010). They are also used to see water stress on the chlorophyll via the photochemical reflectance index (PRI). Seeing different chemicals also lets a farmer map where different fertilizers have been applied, and therefore where to lay more or cut back elsewhere. The biochemical cycle of plants has been refined using satellite pictures and multispectral analysis, though most such studies are ground based; they culminated in several indexes for interpreting imagery, which will be covered shortly. Levels of chemicals given off by decomposing plants can also tell farmers what stage their crops are in, or whether they are contaminated.
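
As a concrete illustration, PRI is just a normalized difference of two narrow reflectance bands centered near 531 nm and 570 nm. The following minimal sketch computes it with Python and NumPy; the sample reflectance values and array shape are placeholders, not data from any study cited here.

```python
import numpy as np

# Placeholder reflectance rasters for the 531 nm and 570 nm bands
# (values invented for illustration).
r531 = np.array([[0.08, 0.09],
                 [0.10, 0.07]])
r570 = np.array([[0.10, 0.10],
                 [0.11, 0.10]])

# Photochemical Reflectance Index: a normalized band difference.
pri = (r531 - r570) / (r531 + r570)
print(pri)  # lower (more negative) values generally indicate more stress
```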

Thanks to satellite remote sensing programs using multispectral pictures, along with ground samples and other methods, we know far more about the Earth's different biomes (desert, arctic, etc.) than we did before, and we have been able to divide those biomes into smaller groupings based on the plant-life data collected. We are also more aware of how plants respond to different radiation levels, such as at lower latitudes (high solar radiation) compared to higher latitudes. Pioneer was NASA's first project in this field, photographing coffee bean fields to see how they progressed through the season. Indexes later built on this data are discussed at the end of this farming segment.

Infrared and near-infrared are probably a farmer's most valuable telemetry types. They look at wavelengths just beyond the red end of our visual band. With them an observer can judge plant health via indexes (Berni, Zarco-Tejada, Suárez, González-Dugo, & Fereres, 2010), tell different plant types apart, and see which plots of land are over-saturated with water. Near-IR imagery also captures the visual spectrum in the same image, giving the analyst both sets of information to use, though not all of the infrared data will be there. Since plants give off a very strong, bright IR signature, each plant can be distinguished in the picture, and the size of its crown (the head of the plant) can be measured with the crown leaf area index (CLAI) to see how well it is growing and absorbing sunlight.

Near-IR imaging is achieved by taking the IR-blocking filter off the camera's sensor so that some IR data reaches the exposure. Most cameras carry the filter because, in pictures of people, blood vessels would otherwise show up prominently. In a picture of plants, the image looks as if it were taken in a very sunny area: the plants appear extra bright, making some details hard to see. Another problem with IR technology (and with thermal, since its information comes from even longer infrared wavelengths) arises in the evening, when the Earth emits infrared radiation it collected from the sun during the day. This oversaturates the image, making any distinction of targets nearly impossible. IR is therefore usually used at night, so that the sensitivity does not need to be turned down too far.

The visual spectrum is good for obvious reasons: surveying, surveillance, and planning are just a few uses, and scouting new land is another. Its only limitation is daytime operation, when the sun provides a light source to reflect off the targets in question.

Thermal imaging in farming shows which plants are getting enough water. The more water a plant's chlorophyll absorbs, the cooler the plant appears; a warm plant is getting too little water, and an unusually cool one may be getting too much.
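
One common way to turn those canopy temperatures into a watering decision is a crop water stress index, which scales each pixel between a fully watered and a fully stressed reference temperature. The sketch below is a simplified illustration; the reference temperatures and the alarm threshold are assumptions, not values from the cited studies.

```python
import numpy as np

# Canopy temperatures (deg C) pulled from a thermal image; placeholders.
t_canopy = np.array([[24.0, 26.5],
                     [31.0, 29.0]])

t_wet = 23.0   # assumed temperature of a fully watered, transpiring canopy
t_dry = 33.0   # assumed temperature of a non-transpiring (dry) canopy

# 0 = well watered, 1 = fully stressed.
cwsi = np.clip((t_canopy - t_wet) / (t_dry - t_wet), 0.0, 1.0)
needs_water = cwsi > 0.5   # illustrative threshold for flagging plots
```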

Decomposing plants also give off heat. A higher heat signature at the center of a pile of potatoes tells a farmer to harvest the pile soon or lose the batch to decomposition. Farmers also use thermal imaging to watch crops at night and make sure animals are not getting to them. A disadvantage of thermal imaging is that it is time sensitive: during the day, unless the sensor has been desensitized, the sun's heat on everything limits its usefulness, and in the evening, radiation from the ground interferes.

Video: https://youtube.com/watch?v=9lGbNGNdhpA

Before any of these cameras can be used, they must go through extensive calibration. They must be calibrated to the spectral range they will use; altitude must be pre-planned; and atmospheric interference must be adjusted for or brought within limits. The sensor must also be geometrically calibrated so the picture is accurate with regard to size and location. GPS provides where an image was actually taken, but if the image is skewed or unfocused it will be far less useful.

To know a plant's varying health, or what stage it is in, indexes must be referenced once the image is taken. For water, depth scales for color are studied to find the depth of the water or the height of a wave (IR is more useful for waves, visual for depth). Using radiometer images, both visual and IR, NASA produced the Global Vegetation Index (GVI) in 1982-1985, which showed where different plant families were on the planet and which examples were healthiest. Other examples are CLAI, the canopy-level Forest LIGHT Interaction Model (FLIGHT), the Normalized Difference Vegetation Index (NDVI), the Transformed Chlorophyll Absorption in Reflectance Index (TCARI), and the Optimized Soil-Adjusted Vegetation Index (OSAVI). These vary in use, ranging from finding the density of trees in an area relative to their height to finding how well plants are adjusting to a new soil type in a different biome.
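
To make two of these indexes concrete, NDVI and OSAVI are both simple ratios of red and near-IR reflectance. Below is a minimal Python sketch; the reflectance arrays are placeholders, while the 0.16 soil term follows the published OSAVI definition.

```python
import numpy as np

# Placeholder red and near-IR reflectance rasters.
red = np.array([[0.10, 0.30],
                [0.08, 0.25]])
nir = np.array([[0.50, 0.35],
                [0.60, 0.30]])

# NDVI: healthy, dense vegetation pushes values toward 1.
ndvi = (nir - red) / (nir + red)

# OSAVI: the 0.16 term damps the brightness of bare soil showing
# between plants, at the cost of a slightly compressed range.
osavi = 1.16 * (nir - red) / (nir + red + 0.16)
```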

In 2004, NASA also used the Pathfinder UAS to watch coffee ripeness through a season. NASA's present aim is to shift remote sensing work toward UAS and away from satellites, mainly for their lower cost and greater flexibility. A 2008 study combined still more ground-based indices with color pictures and near-IR information to create more accurate indices (Jensen, Zeller, & Apan, 2008).

One legal hurdle that must be cleared before any UAS work is the Certificate of Authorization (COA). This FAA document, granted to public entities such as state governments and schools, allows deviations from the FARs. It is needed because the current FARs do not acknowledge UAS.

Fire:

In 2007, NASA used its MQ-9 Reaper, named Ikhana (Native American for "aware/conscious"), to take multispectral images (mostly infrared/thermal) of the massive fires that burned that summer across the US, mainly in California. With the telemetry gathered, crews could be directed to fires still burning underground. In October of that same year, Northrop Grumman's Global Hawk flew more than 14 hours surveying the Poomacha Fire near San Diego, taking over 700 images. Global Hawk was also supplemented by smaller helicopter UAS that guided the ground crews fighting the fire (Northrop, 2008).

NASA foresees future fire UAS not only finding hot spots and guiding crews safely, but keeping constant vigilance over forests to supplement watch towers. UAS such as Ikhana can also provide vital information on the weather systems that fires create, giving far more insight into the mechanics of fire-related weather. NASA envisions "flocks" of small UAS (MLB Bats, as a specific example) being used to find the safest paths through a fire (Bigelow, 2005). Distant-future technology might enable dropping fire retardant, but the legal and payload-weight ramifications put that day far off for now.

The Global Hawk example was particularly effective because, while IR was used to see through the smoke, Synthetic Aperture Radar (SAR) images gave a more accurate view of what was really happening 65,000 feet below the aircraft. The 303rd UAS wing at Wright-Patterson, Ohio was able to map the terrain, not only to assess damage but to plot safer routes out. SAR imaging told the commanders about road debris or logs that might be hidden under the ashes, and overlaid with the IR and thermal data it could show spots where logs had burned away underground, leaving a hole ready to collapse.
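
The overlay logic described above can be sketched as a simple raster intersection: flag cells that are both hot in the thermal data and sunken in the SAR-derived elevation. Everything below, from the grid values to both thresholds, is an invented illustration rather than the 303rd's actual processing.

```python
import numpy as np

# Placeholder co-registered rasters: thermal brightness (K) and elevation
# change since a pre-fire survey (m, negative = ground has dropped).
thermal = np.array([[310.0, 395.0],
                    [320.0, 400.0]])
dz = np.array([[ 0.0, -0.4],
               [-0.1, -0.5]])

hot = thermal > 350.0         # assumed hotspot threshold
sunken = dz < -0.3            # assumed subsidence threshold
collapse_risk = hot & sunken  # cells to flag for ground crews
```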

Hurricane Studies:

The study of hurricanes by manned planes is well known, and satellites have been doing it for decades. While many people know about the UAS that were sacrificed to learn the inner workings of a hurricane, many do not believe retrieval is even possible. It is very possible, though in the past some UAS were indeed treated as expendable because the data they transmitted back seemed more valuable than the aircraft (the Aerosonde UAS was used in such a manner). That UAS can get inside a hurricane without risking a human life gives scientists a tool many thought would never come.

Nor are only small UAS used. NASA has flown Global Hawk and Ikhana to monitor hurricanes from above, warning boats and coastal areas of a storm's path. Aside from obvious observational methods, pressure differences can be monitored, even from Global Hawk's high altitude of 65,000 feet, giving a more accurate assessment of the hurricane's strength. These high-altitude UAS can also look for survivors after landfall, monitor flood levels for evacuation warnings, and guide rescue workers to buildings that are about to fail.
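
To compare pressure readings taken at different heights (for example, from sondes dropped through the storm), each one is usually reduced to a common level first. A minimal sketch of that reduction using the hypsometric relation follows; the pressure, altitude, and layer temperature are placeholder numbers.

```python
import math

g = 9.80665      # gravity, m/s^2
r_dry = 287.05   # gas constant for dry air, J/(kg*K)

p_measured = 700.0   # hPa, static pressure at the sensor (placeholder)
altitude = 3000.0    # m, GPS altitude of the reading (placeholder)
t_mean = 280.0       # K, assumed mean temperature of the layer below

# Sea-level-equivalent pressure for apples-to-apples comparison.
p_sea_level = p_measured * math.exp(g * altitude / (r_dry * t_mean))
```
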
Aside from monitoring an already formed hurricane, UAS are instrumental in learning how hurricanes form and predicting where they will go. According to NASA's Civil UAV Capability Assessment, UAS can "gather data on precipitation, clouds, electrical phenomenon, microphysics, and dust." Combined with data from water buoys and with surface-wave and wind imaging, this greatly improves modeling of how a hurricane will build. The electrical fields inside a hurricane are of great interest, as many believe they, along with ocean temperatures, hold the key to development. Better precipitation predictions can save many lives on land as well.

Flooding:

Recently, UAS have been used heavily in flooding situations. The main idea of remote sensing for floods actually dates to 1972, when Buffalo Creek, West Virginia and Rapid City, South Dakota both sustained tremendous flooding from a failed dam and heavy rain (Campbell, 2007). NASA then used a density-slicing technique, along with Philipson & Hafker's Landsat MSS data and visual interpretation of maps, to judge how well dams would hold up in flooding areas. This information came from satellites. Density slicing also helped point out weak points in dams that inspectors might have missed.
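
Density slicing itself is a very simple operation: the brightness range of a single band is cut into intervals, and every pixel is assigned the class of the interval it falls in. The sketch below uses invented pixel values and slice boundaries.

```python
import numpy as np

# A single-band image (placeholder 8-bit brightness values).
band = np.array([[ 12,  87],
                 [140, 201]], dtype=np.uint8)

# Assumed boundaries separating, say, deep water / shallow water /
# wet soil / dry land. np.digitize returns the class index per pixel.
classes = np.digitize(band, bins=[50, 100, 160])
```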

In 2010, Grand Forks, ND provided a prime example of this technology at Oslo, MN. A ScanEagle UAS with an electro-optical camera with 36x zoom was used to monitor flood conditions near Oslo. The first use for flood monitoring was accidental; once the full scope of the telemetry coming back was understood, the ScanEagle was sent up regularly to take a look. Ice break-up was monitored, as well as the daily progression of the water spreading over farmland. The typical standoff distance from the area of heaviest ice blockage was four miles.

While Grand Forks was a great example of UAS using visual cameras to monitor flooding, visual cameras are actually not the main source of telemetry elsewhere (Campbell, 2007). Near-IR is particularly useful because, along with the visual information, it captures the near-IR signatures of plants, which show up very bright in infrared photography, while water stays mostly dark. Visual sensors often absorb too much green from the water's color, making the water hard to distinguish. This matters most in wetland areas, where plant life is thick and distinguishing water underneath the foliage is difficult. Plants near water are also brighter, giving even more contrast, because the water sustains them so well.
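
That bright-vegetation/dark-water contrast is exactly what a normalized difference water index exploits. The sketch below follows the common green-minus-near-IR form; the reflectance values are placeholders.

```python
import numpy as np

# Placeholder green and near-IR reflectance rasters.
green = np.array([[0.12, 0.05],
                  [0.15, 0.04]])
nir = np.array([[0.04, 0.40],
                [0.05, 0.45]])

# Water is dark in near-IR and relatively bright in green,
# so water pixels come out positive.
ndwi = (green - nir) / (green + nir)
water_mask = ndwi > 0.0
```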

Waves show up slightly brighter in the IR due to refraction (Campbell, 2007), letting researchers gauge the roughness of the water, which could be important in tsunami situations. Water depth can also be read against scales in the visual and infrared spectrums when no sonar from other platforms is available to gather depth information.

Other useful information that UAS can provide, historically supplied by satellites, is bathymetry (Campbell, 2007). Here a scientist uses synthetic aperture radar (SAR) to map a location in very high detail as a 3D image (discussed more fully under land management and mapping); acoustic sonar can do the same, though with less fidelity. With this, erosion of the land or degradation of dikes and dams becomes visible. During Australia's 2011 flooding, this information was used to warn towns of impending floods by predicting where water would most likely flow or destroy earthworks. Farmers can later use the data to determine where the richer sediment settled, although that takes multispectral analysis as well; the SAR only maps terrain, so it shows how much material moved, while current flows indicate where it went.

Oil and Gas:

Oil work puts UAS in two very distinct settings, although more certainly exist: oil spills and observing pipelines for leaks. While this is not strictly an earth science, it obviously helps the environment if oil is contained, and the same measures used to contain oil can be used to find it and learn more about it.

For oil spills, infrared information is the most useful. The wavelengths oil reflects off water make it show up far more readily than in a visual image, so it is easier to see where the oil is spreading (Campbell, 2007).

For pipelines, multispectral imaging is used to see leaks in gas lines and stress points in the pipes; normally the gas would not be detected visually. The same method can find gas seeping up from the ground, or possibly locate a forming volcano by its venting. UAS can also follow gas plumes to give warnings of the extent of a leak. Other sensors are used even more widely to detect unknown gases: emission detectors, filters processed on the ground to see which elements they collected as the UAS flew, and spectroradiometers. These also detect what effects the gases are having on the atmosphere.
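
A plausible back-end for such a pipeline patrol is to compare each sample against the background level along the flight line and flag anything far above it. The sample readings, units, and 3x-background alarm threshold below are all assumptions for illustration.

```python
import numpy as np

# Gas-sensor samples taken along a pipeline flight line:
# distance along the pipe (m) and measured concentration (ppm).
distance = np.array([0, 100, 200, 300, 400])
ppm = np.array([1.9, 2.0, 9.5, 2.1, 2.0])

background = np.median(ppm)
leak_locations = distance[ppm > 3 * background]  # -> [200]
```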

Wildlife Management:

The most useful tools in this field are visual and thermal cameras. Animals can also be tagged, then located by a radio-signal tracker on the UAS. The uses go well beyond watching for animals near farms: herds can be found, constant guard can be kept over them, and new areas can be explored without disturbing the animals. Most UAS are fairly quiet and very hard to see once they are 1,000 feet above the ground. ScanEagle, for instance, has a "hush kit" on its muffler specifically so that it cannot be detected once it has some altitude.
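
For thermal cameras, a simple way to count warm bodies is to threshold the frame and label the connected warm regions. The frame values and threshold below are invented for illustration; SciPy's ndimage.label does the connected-component work.

```python
import numpy as np
from scipy import ndimage

# Placeholder thermal frame (deg C): warm animals on cool ground.
frame = np.array([[10, 10, 31, 32],
                  [10, 10, 31, 10],
                  [30, 10, 10, 10],
                  [30, 31, 10, 10]])

warm = frame > 25                     # assumed body-heat threshold
labeled, count = ndimage.label(warm)  # each connected region ~ one animal
print(count)                          # -> 2
```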

Two great examples in this field involve Insitu's ScanEagle UAS. Insitu and the University of Alaska used ScanEagle to find rare ice seals on the northern Alaskan coast in 2009 ("Wildlife Monitoring," 2009). Insitu, the University of Alaska, and the University of North Dakota also used ScanEagle in a study finding whales for an oil company, so that the company could avoid harming them. During one test flight it found not only whales but also the seals from the earlier study. Both studies used a visual camera.

Other animal-related uses for UAS are traffic monitoring near reefs and watching fisheries and fishing grounds to prevent poaching. Both are hard to regulate with surveillance alone; however, UAS can guide the coast guard and other authorities to the proper positions (Austin, 2010).

Land Management (road work, archeology, timber theft, and volcanoes):

SAR lets people see objects through ground clutter that might not show up to the regular eye. Sometimes patterns become more apparent in the 3D model, and the strength of a return may show that an object is more solid or reflective than the surrounding material. With very precise data, deformations in the ground over time are also far more visible than they would be to the naked eye.
Other mapping technologies UAS use are mosaic pictures (where a 3D model is derived mathematically from the sizes of objects in the stitched pictures) and LIDAR (Oczipka et al., 2008). Both offer fairly good accuracy, though the mosaic 3D modeling is a little less accurate and takes time. Its advantage is that a person usually stitches the pictures together, so the images have been scrutinized more.

SAR is basically regular RADAR, but in motion and with far more sensitive wavelengths being sent out. It gets its more precise image, and the 3D nature of its telemetry, through that motion ("Synthetic Aperture RADAR," 2011). A single RADAR return gives back a 2D image with some valleys and spikes, but only from one orientation. By moving with the aircraft, the SAR not only sends a pulse down on its new target, but also a pulse back for another view of the target it just passed, so all views are seen. Since height is measured more readily from a RADAR return than by estimating an object's height in a picture, and the multiple views of each target confirm one another, the stitched-together picture is very highly detailed.

Phased-array RADAR uses the same principles, but from fixed positions. Multiple antenna elements find objects via standard RADAR, then use differentiation to find speed, altitude, and distance. GPS works similarly, except that the time and relevant data are sent in a message instead of waiting for a return to bounce back as RADAR does. Both SAR and phased arrays, however, cost a great deal.
LIDAR uses much the same technique, but with light. Lasers find the distance from the sensor to the target area and essentially make a 3D etch-a-sketch of the area. Laser range finders are relatively cheap as well.
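
The geometry behind that "3D etch-a-sketch" is just range and angle resolved against the aircraft's position. A flat-earth sketch of one return follows; the pose, scan angle, and range are placeholder values, and a real system would also correct for aircraft attitude.

```python
import math

# Aircraft position in a local frame (m), placeholder values.
ac_x, ac_y, ac_z = 500.0, 1200.0, 300.0
scan_angle = math.radians(15.0)  # beam angle off nadir, in the x-z plane
rng = 310.0                      # measured laser range (m)

# Where the beam hit the ground.
pt_x = ac_x + rng * math.sin(scan_angle)
pt_y = ac_y
pt_z = ac_z - rng * math.cos(scan_angle)  # surface elevation at the hit
```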

While making roads and regular maps is an obvious use of these methods, the changes to an area over time are undervalued except in a few fields. Volcanology uses them to see whether the Earth is swelling, possibly a sign of a coming rupture. The same methods have been used to notice jams forming in rivers and trees stolen from timber sales ("Missouri River GIS," 2007).

Reading the smallest details (which usually means flying lower) is not really needed for maps, since each twig won't change the overall shape of a road. In archeology, however, that twig could be important, and even the shape of a group of fallen twigs could show that something crucial lies underneath.
Archeologists also find UAS useful because many run on batteries and therefore harm the environment less (Oczipka et al., 2008). They are quiet, so they do not disturb the local wildlife. UAS images give a cheap aerial perspective compared to manned planes, and a site can be instantly catalogued through video or still images. Mapping with visual images, while not as accurate, is also much cheaper than either LIDAR or SAR, and complemented with GPS its accuracy can still be fairly good.

Glacier Monitoring:

Glaciers are key to our planet's well-being. They affect the ocean's temperature and overall health, which in turn affects animal life, tide levels, oceanic travel routes, and even the weather. Careful monitoring of glaciers is therefore essential.
UAS using SAR and visual pictures have been used to assess glacier size, shrink rate, volume, and density. They are also used to track ice that has broken off glaciers, so that shipping traffic can avoid it (Cox, Nagy, Skoog, & Somers, 2004).
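
Shrink rate, at its simplest, falls out of two surveys of the same area: classify each pixel as ice or not, difference the ice areas, and divide by the time between flights. The masks and pixel size below are placeholders.

```python
import numpy as np

# Binary ice masks from two surveys one year apart (placeholders).
ice_2009 = np.array([[1, 1, 1],
                     [1, 1, 1],
                     [0, 1, 1]], dtype=bool)
ice_2010 = np.array([[1, 1, 0],
                     [1, 1, 0],
                     [0, 1, 0]], dtype=bool)

pixel_area = 25.0 * 25.0  # assumed ground footprint per pixel, m^2
loss = (ice_2009.sum() - ice_2010.sum()) * pixel_area
print(loss, "m^2 of ice lost in one year")
```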

Atmospheric and Aerosol Measurements:

Though many people do not think of it this way, this is actually the oldest use of unmanned aircraft. Weather balloons go up with barometric sensors every day to give us data aloft to predict weather, locate the jet stream, and see what is in our atmosphere. Modern UAS can still gather this information, with the advantages that they carry more sensors and are not expended like balloons. UAS such as Helios are designed to stay up for months at a time observing our skies, then land, without any human input. That Helios and many other craft use lean methods of flight, such as solar energy, also ensures the environment is not harmed by the emissions manned planes would produce doing the same work (Davis, 2010). The UAS that do use fuel are very efficient with it, staying in the air up to 36 hours on only about 5-6 kg.
While this old method of gathering weather knowledge is the most prevalent, other missions have come up recently. The BAM research institute in Germany has used odor-source localization, plume tracking, and a gas sensor to find gases in volcanoes (Bartholmai & Neumann, 2008). The same sensors, along with other pollutant sniffers, can find areas with high concentrations of chemicals harmful to the environment. Some power plants currently use UAS flown into their smoke-stack emissions to check that their units are up to code, and France in particular is looking at UAS as a way to "eco-spy" on power plants (Watson, 2009); many governments are considering the same. Northrop Grumman is also using CH4 and CO2 filters that are then analyzed on the ground (Carone, 2010). Spectral imaging can also map these areas, with LIDAR supplying range information so 3D maps can be drawn; large areas of 1,000 km can be covered this way. The very same detectors, adapted for other elements, can be used in HAZMAT situations to know which places are safe and to verify that everything has been cleaned up.
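
Turning point samples like these into the maps mentioned above can be as simple as averaging every sample into the grid cell it falls in. The coordinates, concentrations, and cell size below are invented for illustration.

```python
import numpy as np

# Samples: position (m) and measured concentration (ppm); placeholders.
x = np.array([10.0, 220.0, 230.0, 410.0])
y = np.array([15.0, 30.0, 35.0, 40.0])
ppm = np.array([2.0, 8.0, 9.0, 2.1])

cell = 100.0  # assumed grid spacing, m
totals = np.zeros((5, 5))
counts = np.zeros((5, 5))
for xi, yi, ci in zip(x, y, ppm):
    row, col = int(yi // cell), int(xi // cell)
    totals[row, col] += ci
    counts[row, col] += 1

# Mean concentration per cell; empty cells stay zero.
conc_map = np.divide(totals, counts,
                     out=np.zeros_like(totals), where=counts > 0)
```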

Radiation (measurements and uses) and Magnetic Measurements:

Harnessing radiation for power is another area where UAS are at the cutting edge. UAS is one of the leading fields experimenting with IR panels, instead of solar panels, for energy. While still experimental and very expensive, these can collect the IR energy the Earth gives off at night, providing 24 hours of clean power (Davis, 2010).
As covered above in the HAZMAT discussion, UAS have been used to sense radiation levels in certain areas; Germany is especially interested in this field. Measurements of electromagnetic fields can also give lifesaving information (such as on hurricanes and earthquakes). The movement of the magnetic poles can even be studied with simple compass measurements, with the advantage that a person does not have to take them.
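
The compass-based pole measurement mentioned above amounts to comparing a true heading (from the GPS ground track) with a magnetic heading (from the onboard compass); the difference is the local declination, and its drift over years traces the pole's movement. Both headings below are placeholders, and wind-free level flight is assumed so that track equals heading.

```python
# Placeholder headings in degrees.
gps_track = 92.5        # true course from GPS
compass_heading = 85.0  # magnetic heading from the magnetometer

# Wrap into (-180, 180]; positive = easterly declination.
declination = (gps_track - compass_heading + 180.0) % 360.0 - 180.0
print(declination)  # -> 7.5
```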

Conclusion:

This paper has now covered several civilian-only uses for UAS. With the knowledge given here, people may think up entirely new ways to reach civilian markets with the wide array of sensors and packages available to UAS. The opportunities are limited only by our imagination and willingness to pursue them. Hopefully this paper has taught readers something about the different camera types available to get a job done, and will give someone a good idea of how to use a UAS in their own field using methods similar to the ones provided here.

Bibliography
• Austin, R. (Ed.). (2010). Unmanned aircraft systems (1st ed.). Chippenham, Wiltshire, UK: John Wiley & Sons Ltd.
• Bartholmai, M., & Neumann, P. (2008). Micro-drone for gas measurement in hazardous scenarios via remote sensing. Berlin: BAM Federal Institute for Materials Research and Testing.
• Berni, J., Zarco-Tejada, P., Suárez, L., González-Dugo, V., & Fereres, E. (2010). Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Inter-Commission WG I/V, 6.
• Bigelow, B. V. (2005, August 22). NASA research team envisions flock of robot aircraft monitoring wildfires. San Diego Union-Tribune.
• Campbell, J. (Ed.). (2007). Introduction to remote sensing. New York, NY: Guilford Press.
• Carone, T. (2010). Monitoring and verification of carbon capture and storage. Global climate monitoring. Northrop Grumman.
• Cione, J., Dunion, J., Holland, G., Mulac, B., & Uhlhorn, E. (2005). Using the Aerosonde UAV during the 2005 Atlantic hurricane season. NOAA Hurricane Research Division.
• Cox, T. H., Nagy, C. J., Skoog, M. A., & Somers, I. A. (2004). Civil UAV capability assessment. NASA.
• Davis, B. (2010, July). Juice where you can find it. Unmanned Systems, 28(7), 24.
• Donaldson, P. (2010, September). Hybrid progress. Unmanned Vehicles, 15(4), 20-24.
• Northrop Grumman. (2008, December). Global Hawk to the rescue. Technical Services Magazine, 7.
• Gourley, S. (2010, September). Mapping the future. Unmanned Vehicles, 15(4), 33-36.
• Horcher, A., & Visser, R. (n.d.). Unmanned aerial vehicles: Applications for natural resource management and monitoring. US Forest Service, Technology and Development; Virginia Tech.
• Jensen, T., Zeller, L., & Apan, A. (2008). The use of a UAV as a remote sensing platform in agriculture. 8. Print.
• Missouri River GIS. (2007). Solutions: The Publication of Western Air Maps, Inc., 7(2), 1-3.
• Nickol, C., Guynn, M., Kohout, L., & Ozoroski, T. (2007). High altitude long endurance UAV analysis of alternatives and technology requirements development. NASA.
• Oczipka, M., Bemmann, J., Piezonka, H., Munkhabayar, J., Ahrens, B., Achtelik, M., & Lehmann, F. (2008). Small drones for geo-archaeology in the steppe: Locating and documenting the archaeological heritage of the Orkhon Valley in Mongolia. Berlin: Bonn University and German Aerospace Center.
• Pratt, K. S., Murphy, R., Stover, S., & Griffin, C. (2007). CONOPS and autonomy recommendations for VTOL MAVs based on observations of Hurricane Katrina UAV operations. Autonomous Robots, 1-11.
• Synthetic Aperture RADAR. (2011). Wikipedia. Retrieved March 7, 2011, from http://en.wikipedia.org/wiki/Synthetic_Aperture_Radar
• Watson, P. J. (2009, December 21). Spy drones to enforce CO2 regulations. Retrieved February 22, 2011, from http://www.infowars.com/spy-drones-to-enforce-co2-regulations/
• Wildlife Monitoring. (2009). Insitu. Retrieved March 7, 2011, from http://www.insitu.com/wildlife_monitoring
• Willis, M. (2009, November 1). 3D aerial archaeology – anyone help with the right IMU [Online forum comment]. Retrieved February 22, 2011, from http://diydrones.com/forum/topics/3d-aerial-archaeology-anyone

Brett Whalin

US Commercial Multi-engine, Instrument Airplane Pilot; UAS/RPV/RPA/UAV Pilot; Sensor Operator