Colony Island Network Design and Implementation (CINDI) Project to recover waterbirds in the Gulf of Mexico: Full Colony Island Orthomosaics 2024
Funded By:
National Oceanic and Atmospheric Administration
Research Group:
Conservation and Biodiversity
Dale Gawlik
Texas A&M University-Corpus Christi / The Harte Research Institute for Gulf of Mexico Studies
dale.gawlik@tamucc.edu
Keywords:
Colony Island Network Design and Implementation, Waterbirds, Rookery Islands, Drone Imagery, Orthomosaic
Abstract:
The degradation of critical nesting islands for colonial waterbirds has prompted a move to rehabilitate eroding islands in Texas. In the last decade, large investments (>$27 million) have been, and will continue to be, committed to colony island projects along the Gulf Coast as recognition grows that existing islands face threats that could leave entire areas without nesting habitat. To date, island creation and rehabilitation projects have been largely the product of opportunity, with location and design driven chiefly by engineering feasibility and available funding. The “build it and they will come” approach has yielded some good outcomes for birds, but predictions of the degree to which species will benefit from islands in particular locations and with certain physical features have rested on expert opinion, with high uncertainty and no understanding of the contribution of a rehabilitated island to the regional population of birds. A prioritization tool is therefore needed that reduces uncertainty in management decisions through a data-driven model incorporating bio-geophysical constraints as well as economic considerations, in order to identify the islands with the greatest potential to increase waterbird populations on the Texas coast and, by extension, in the Gulf of Mexico region. The objective of the Colony Island Network Design and Implementation (CINDI) Project is to develop a Geographic Information System (GIS)-based prioritization model from stakeholder input and long-term bird nesting data and to calibrate the model with stakeholder knowledge and field data on foraging habitat and colony characteristics. The end result will be a tool to prioritize a network of colony islands and sites on the Texas coast with the highest potential for enhancing waterbird nesting and the maximum conservation benefit for colonial waterbirds, accounting for bio-geophysical constraints, relative sea level rise, economic feasibility, and social factors. Pursuant to these objectives, we will produce datasets of long-term nesting productivity for five focal species (Great Egret [Ardea alba], Tricolored Heron [Egretta tricolor], Reddish Egret [Egretta rufescens], Caspian Tern [Hydroprogne caspia], and Black Skimmer [Rhynchops niger]) using drone-derived imagery of transects collected weekly at up to 20 colony islands during the breeding seasons of 2024, 2025, and 2026. Imagery will be stitched together in Agisoft Metashape, and the resulting orthomosaic will be scanned for nests, which will be identified to species and tracked weekly until fledging or failure occurs. Daily nest survival and overall nest survival for each focal species will then be calculated for each island. This dataset includes orthomosaic GeoTIFFs (.tif) of full colony island surveys performed at least once during the breeding season.
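The abstract does not name the survival estimator. As a point of reference only, the classic Mayfield calculation of daily survival rate (DSR) and overall nest survival can be sketched as follows; all numbers below are invented, and the project may use a different estimator (e.g., logistic-exposure models).

# Hedged sketch of a Mayfield-style nest survival calculation.
def daily_survival_rate(failed_nests: int, exposure_days: float) -> float:
    """Mayfield DSR: 1 - (number of failed nests / total nest exposure days)."""
    return 1.0 - failed_nests / exposure_days

def overall_nest_survival(dsr: float, nest_period_days: int) -> float:
    """Probability a nest survives the full nesting period."""
    return dsr ** nest_period_days

# Hypothetical island-level numbers for one focal species:
dsr = daily_survival_rate(failed_nests=12, exposure_days=640.0)
print(f"DSR = {dsr:.3f}, overall survival = {overall_nest_survival(dsr, 25):.3f}")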
Suggested Citation:
Gawlik, Dale; Mirzadi, Rostam; Wolff, Liam. Colony Island Network Design and Implementation (CINDI) Project to recover waterbirds in the Gulf of Mexico: Full Colony Island Orthomosaics 2024. Distributed by: GRIIDC, Harte Research Institute, Texas A&M University–Corpus Christi. doi:10.7266/6ghezv9x
Purpose:
The objective of collecting these data is to generate a dataset of drone-derived imagery of waterbird colonies on the Texas, USA coast that can be used to track nesting progress and productivity of five focal waterbird species (Great Egret, Tricolored Heron, Reddish Egret, Caspian Tern, and Black Skimmer) to support the production of a tool that assists stakeholders in prioritizing colony islands for rehabilitation.
Data Parameters and Units:
Latitude and longitude (WGS84), date and time (UTC), altitude (m), camera make, camera model, f-stop, exposure time (s), ISO speed, exposure bias, focal length (mm), max aperture, metering mode, flash mode, and 35 mm-equivalent focal length (mm). We also include ancillary data: dense clouds, depth maps, elevation (m), and sparse point clouds.
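The per-image parameters above are standard EXIF tags. A minimal sketch of reading a few of them with the Pillow library (Pillow >= 8; the file name is a placeholder, not a path in this dataset):

# Sketch: read a handful of the EXIF parameters listed above from one image.
from PIL import Image
from PIL.ExifTags import TAGS

with Image.open("survey_image.jpg") as img:    # hypothetical file name
    exif = img.getexif()
    camera_ifd = exif.get_ifd(0x8769)          # Exif sub-IFD: exposure settings

base = {TAGS.get(t, t): v for t, v in exif.items()}
settings = {TAGS.get(t, t): v for t, v in camera_ifd.items()}

print(base.get("Make"), base.get("Model"), base.get("DateTime"))
print("f-stop:", settings.get("FNumber"),
      "exposure (s):", settings.get("ExposureTime"),
      "ISO:", settings.get("ISOSpeedRatings"),
      "focal length (mm):", settings.get("FocalLength"))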
Methods:
Image Collection (1). Drone transect surveys were conducted with a commercially available quadcopter drone (DJI Matrice 300 + DJI D-RTK 2 or Freefly Astro) paired with a high-resolution RGB camera (DJI Zenmuse P1 or Sony A7R IVA). Quadcopter drones were operated by a single researcher (REM), launched from a boat >100 m from the nearest nest, and flown using a remote controller. To ensure the safety of researchers and the drone, we only operated the drone when wind speeds at mission altitude (50 m) were below 33 mph. The drone was programmed to photograph the transect using a drone mapping app created specifically for the quadcopter used in this study or a third-party app (Auterion Mission Control). The drone was flown to an altitude of 50 m and captured images of the transect at intervals providing the appropriate overlap (88% front overlap and 88% side overlap), continuing until the entire transect had been recorded. Each image captured with the drone included the following metadata: the date and time of capture; the make, model, and focal length of the camera; the onboard GPS latitude and longitude; the yaw angle of the drone; and the yaw, pitch, and roll angles of the camera gimbal.
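For context, the flight parameters above determine the ground sample distance (GSD) and the image spacing needed to achieve the stated overlap. A minimal sketch, assuming the Zenmuse P1 with a 35 mm lens (the lens and sensor values below are assumptions for illustration; this record does not state them):

# Sketch: GSD and photo spacing for a 50 m flight with 88%/88% overlap.
SENSOR_WIDTH_M = 0.0359    # full-frame sensor width (m), assumed
IMAGE_WIDTH_PX = 8192      # Zenmuse P1 image width (px), assumed
IMAGE_HEIGHT_PX = 5460     # Zenmuse P1 image height (px), assumed
FOCAL_LENGTH_M = 0.035     # assumed 35 mm lens
ALTITUDE_M = 50.0          # stated in Methods
FRONT_OVERLAP = 0.88       # stated in Methods
SIDE_OVERLAP = 0.88        # stated in Methods

def gsd(altitude_m: float) -> float:
    """Ground sample distance in metres per pixel."""
    pixel_pitch = SENSOR_WIDTH_M / IMAGE_WIDTH_PX
    return altitude_m * pixel_pitch / FOCAL_LENGTH_M

g = gsd(ALTITUDE_M)
footprint_along = IMAGE_HEIGHT_PX * g               # ground footprint along track (m)
footprint_across = IMAGE_WIDTH_PX * g               # ground footprint across track (m)
shutter_spacing = footprint_along * (1 - FRONT_OVERLAP)
line_spacing = footprint_across * (1 - SIDE_OVERLAP)

print(f"GSD: {g * 100:.2f} cm/px")
print(f"Trigger every {shutter_spacing:.1f} m; flight lines {line_spacing:.1f} m apart")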
Post-Processing Images (2). Upon completion of the survey, images were imported from the camera SD cards into Agisoft Metashape. Once imported, REM scanned all images captured during the survey and removed those that were blurry or might otherwise have compromised the resolution of processed images. Images were then aligned. Camera alignment finds the camera position and orientation for each photo and builds a sparse point cloud model. Researchers specified in Agisoft Metashape that images were aligned at the highest attainable resolution (“Highest”). The key point limit, which sets the maximum number of feature points considered as the software matches photos, was set to 40,000, and the tie point limit, the maximum number of points the software will match between photos, was set to 4,000; these are the values Agisoft recommends for projects focused on landscape imagery. Images that failed to align, often because they lacked points that could be “tied” to neighboring images (common for images of the water surrounding waterbird breeding colonies), were rerun through alignment until they aligned successfully; those that still failed after repeated attempts were removed from further analysis. After camera alignment, the sparse cloud and image tie points were processed into a dense cloud and depth maps. Dense cloud points were calculated by rectifying image tie points using the distortion parameters and camera positions determined during formation of the sparse cloud, such that pixels were matched in image pairs; dense cloud points were then used to render depth maps for every image. Agisoft allows researchers to set the “Quality” of the dense point cloud, which controls image resolution scaling such that images are matched using a set fraction of the image-pair tie points found during alignment, and the “Depth Filtering” mode used to construct depth maps, which ranges from “Disabled” to “Aggressive”. Because of factors such as noisy or badly focused images, there were outliers among the dense points that, if used, would have blurred portions of subsequent processed images. Given the high canopy cover on some waterbird breeding colonies, researchers set “Quality” to Medium and “Depth Filtering” to Mild to reduce blurriness resulting from uncertain image tie points. Depth maps were then rectified into a Digital Elevation Model (DEM) and “stitched” into orthomosaic images.
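The workflow above can also be run headlessly. A minimal sketch against the Agisoft Metashape Python API (1.x naming; Metashape 2.x renames buildDenseCloud to buildPointCloud), mirroring the settings reported above, with placeholder paths; this is illustrative, not the project's actual processing script:

# Sketch of the reported Metashape settings: "Highest" alignment (downscale=0),
# 40,000 key points, 4,000 tie points, Medium-quality depth maps (downscale=4)
# with Mild filtering, then DEM and orthomosaic export.
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(["/path/to/survey/IMG_0001.JPG"])  # placeholder image list

# Align photos at the highest resolution with the recommended point limits.
chunk.matchPhotos(downscale=0, keypoint_limit=40000, tiepoint_limit=4000)
chunk.alignCameras()

# Medium-quality depth maps with mild filtering, then the dense cloud.
chunk.buildDepthMaps(downscale=4, filter_mode=Metashape.MildFiltering)
chunk.buildDenseCloud()

# Rectify into a DEM, stitch the orthomosaic, and export a GeoTIFF.
chunk.buildDem(source_data=Metashape.DenseCloudData)
chunk.buildOrthomosaic(surface_data=Metashape.ElevationData)
chunk.exportRaster("/path/to/output/colony_orthomosaic.tif",
                   source_data=Metashape.OrthomosaicData)
doc.save("/path/to/output/colony.psx")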
Instruments:
DJI Matrice 300 RTK + DJI D-RTK 2 + DJI Zenmuse P1; Freefly Astro + Sony A7R IVA