Abstract:
This dataset comprises imagery collected from an unmanned aerial system (DJI Matrice 600) over oyster reefs in the Bay of St. Louis near Gulfport, Mississippi from 2018-05-07 to 2018-05-11. The dataset was collected with the goal of observing water quality from a remote platform. In particular, the data were collected to assess how well they correlate with the hyperspectral imagery collected on the same flights, and to determine whether the five bands provided supply enough information to observe relevant phenomena. The imagery was collected using a MicaSense RedEdge sensor, a five-band sensor that produces a separate image for each band. The bands are blue, green, red, red edge, and near-infrared, with center wavelengths of 475 nm, 560 nm, 668 nm, 717 nm, and 840 nm and bandwidths of 20, 20, 10, 10, and 40 nm, respectively. The RedEdge field of view (FOV) is 47.9 degrees horizontal by 36.9 degrees vertical. All imagery contains latitude, longitude, and altitude information, along with other camera parameters that allow for imagery post-processing. The processed data have been converted into reflectance values, nominally ranging from 0.0 to 1.0. Due to limitations in the collection and in the RedEdge's downwelling light sensor, these values will be incorrect if the lighting changed significantly during a flight, generally because of cloud cover. Solar irradiance also changes slightly over the course of a flight, although that change should not heavily impact the data for its intended use. To achieve the reflectance corrections, all images are adjusted for gain, exposure time, vignetting effects, and lens distortion. This code is based on MicaSense's processing tutorials and is available on request. Images are also individually orthorectified. As there are no tie points with which to mosaic the images, a Python script is used to project each image individually onto a flat surface using the position, altitude, and attitude of the sensor at the time of image capture. 
This code can also be supplied on request. The related dataset (processed hyperspectral) is available under GRIIDC Unique Dataset Identifier (UDI) MS.x839.000:0017 (DOI: 10.7266/MW2PRASF).
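As a rough illustration of the flat-surface projection described above, the ground footprint of a single image can be estimated from the sensor altitude and the RedEdge FOV. The sketch below is not the dataset's projection script: it assumes a level, nadir-pointing camera (the actual script also applies the sensor attitude), and the function name and example altitude are illustrative.

```python
import math

def ground_footprint(altitude_m, fov_h_deg=47.9, fov_v_deg=36.9):
    """Approximate ground extent (m) covered by one RedEdge image.

    Assumes a flat surface and a level, nadir-pointing camera; the
    dataset's actual orthorectification also uses the sensor attitude.
    """
    width = 2.0 * altitude_m * math.tan(math.radians(fov_h_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(fov_v_deg) / 2.0)
    return width, height

# Example: from 100 m above the water, one frame covers roughly 89 m x 67 m.
print(ground_footprint(100.0))
```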
Suggested Citation:
Robert Moorhead, Lee Hathcock. 2020. Multispectral reflectance imagery collected from an unmanned aerial system over oyster reefs in the Bay of St. Louis near Gulfport, Mississippi from 2018-05-07 to 2018-05-11. Distributed by: GRIIDC, Harte Research Institute, Texas A&M University–Corpus Christi. doi:10.7266/n7-gcx0-4e51
Data Parameters and Units:
Reflectance (unitless fraction, 0.0-1.0). Reflectance panel used: RP02-1603028-SC
Band - Average Reflectance: Blue - 0.64; Green - 0.67; Red - 0.67; NIR - 0.62; Red Edge - 0.66
For each band and file, imagery data have been converted into reflectance values ranging from 0.0 to 1.0.
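The panel values above are applied as a simple ratio when scaling a band's radiance to reflectance. This is a minimal sketch of that panel-based scaling step, assuming a hypothetical measured panel radiance; the function name and example numbers are illustrative only.

```python
def radiance_to_reflectance(scene_radiance, panel_radiance, panel_reflectance):
    """Scale scene radiance to reflectance using a panel of known reflectance.

    panel_radiance is a hypothetical radiance measured over the panel in the
    same band; panel_reflectance is that band's average panel reflectance
    listed above (e.g. 0.64 for blue).
    """
    return scene_radiance * panel_reflectance / panel_radiance

# Example with illustrative numbers for the blue band:
print(radiance_to_reflectance(0.32, 1.0, 0.64))  # 0.2048
```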
File names are kept consistent with their raw counterparts. Each band is produced as a TIFF file with the naming format IMG_XXXX_B, where XXXX is the image number, starting from 0000 and increasing with each successive image. Note that an older version of the camera firmware stores images in subfolders of 200 images each, with the image number increasing to 0199 and then resetting to 0000 for the next subfolder. The subfolder names start at 000 and increase as each subfolder is added. Newer firmware versions keep the subfolder structure but maintain the running image count, so the last file in subfolder 000 will be 0199 and the first file in subfolder 001 will be 0200, continuing in this manner. The "B" is the band number: "1" is blue, "2" is green, "3" is red, "4" is near-infrared, and "5" is red edge.
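The naming convention can be decoded programmatically. The helper below is a hypothetical example (not part of the dataset's supplied code) that splits a name of the form IMG_XXXX_B.tif into its image number and band.

```python
import re

# Band numbering as described above (note NIR is band 4, red edge is band 5).
BAND_NAMES = {1: "blue", 2: "green", 3: "red", 4: "near-infrared", 5: "red edge"}

def parse_rededge_name(filename):
    """Return (image_number, band_number, band_name) for e.g. 'IMG_0042_3.tif'."""
    m = re.match(r"IMG_(\d{4})_(\d)\.tiff?$", filename, re.IGNORECASE)
    if not m:
        raise ValueError(f"Not a RedEdge image name: {filename}")
    image_number = int(m.group(1))
    band_number = int(m.group(2))
    return image_number, band_number, BAND_NAMES[band_number]

print(parse_rededge_name("IMG_0042_3.tif"))  # (42, 3, 'red')
```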
Methods:
Conversion from raw digital numbers (DNs) to reflectance follows the process set forth by MicaSense at https://github.com/micasense/imageprocessing.
The processing chain is as follows:
1) Remove dark pixel offset (imager noise)
2) Correct for vignetting effects
3) Correct for row gradient
4) Normalize by exposure and gain settings
5) Convert to radiance values
6) Further convert to reflectance by using panel values
7) Remove lens distortion effects
8) Orthorectify image by using various GPS/IMU sources (hyperspectral IMU and UAS IMU values)
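Steps 1-6 of the chain above can be sketched as follows. This mirrors the structure of MicaSense's public tutorial but is a simplified illustration, not the dataset's processing code: all parameter names are placeholders, and the row-gradient correction (step 3) is omitted for brevity.

```python
import numpy as np

def dn_to_reflectance(raw_dn, dark_level, vignette_map, exposure_s, gain,
                      radiometric_cal, panel_radiance, panel_reflectance):
    """Simplified sketch of steps 1-6 (step 3, row gradient, omitted)."""
    # 1) Remove dark pixel offset (imager noise)
    dn = raw_dn.astype(np.float64) - dark_level
    # 2) Correct for vignetting effects (per-pixel multiplicative map)
    dn = dn * vignette_map
    # 4) Normalize by exposure time and gain
    dn = dn / (exposure_s * gain)
    # 5) Convert to radiance with a sensor calibration coefficient
    radiance = radiometric_cal * dn
    # 6) Convert to reflectance using a panel image of known reflectance
    return radiance * (panel_reflectance / panel_radiance)
```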
Error Analysis:
Errors may be present in the data if lighting changed during a flight. Lighting varies somewhat during a normal flight, but not by a large amount. Cloud cover, however, dramatically affects the amount of incoming solar energy, and applying reflectance corrections based on a single panel image will then produce incorrect values.