WheatCAP UAS Survey Summary

Texas A&M AgriLife Research scientists, in collaboration with Purdue University, will deploy their expertise in unmanned aircraft systems (UAS) to bolster a comprehensive digital agriculture initiative aimed at improving wheat crop production across the country. The UAS component is part of a multi-institutional Coordinated Agricultural Project known as the WheatCAP, one of several CAPs previously funded by the U.S. Department of Agriculture National Institute of Food and Agriculture. The WheatCAP plans to submit a new grant proposal in 2021 with UAS as an integral component.

A primary goal of the WheatCAP is to improve wheat production by developing tools and sharing large datasets among breeders. One proposed initiative is a centralized database of UAS-based phenotypic data collected from multiple breeding programs. This database will support the development of phenomics tools that can be used to screen genotypes across multiple environments.

AgriLife Research’s first major step in the UAS portion of the initiative, to which the agency will contribute broad UAS expertise, was to survey the collaborating institutions, represented by 19 individuals, to determine the capabilities of each.

The survey revealed that, of the 19 participants in the project, 15 have already established a UAS component. Institutions with UAS were asked to rank their programs’ experience levels: four identified as advanced, six as intermediate, and five as beginners. The three institutions without a UAS component said they planned to establish one within the next two years.

When asked in which year of the grant they would start sending data to the Texas/Purdue UAS Data Analysis Center, 13 responded year one and five said year two of the project. Various drone models, sensors, equipment, and processing software are in use across the programs, and half of the 19 programs have certified pilots on staff.

When asked how they georeference and align UAS data products over the growing season, most programs reported using Ground Control Points (GCPs) surveyed with the Global Positioning System (GPS), in some cases with Real-Time Kinematic (RTK) GPS.
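As a rough illustration of this workflow, the sketch below derives an affine georeferencing transform from surveyed GCPs using the rasterio library. It is a minimal example, not any program's actual pipeline; all file names and coordinates are hypothetical.

```python
# Minimal sketch: derive an affine georeferencing transform from
# surveyed Ground Control Points (GCPs) using rasterio.
# All file names and coordinates below are hypothetical examples.
import rasterio
from rasterio.control import GroundControlPoint
from rasterio.transform import from_gcps

# Each GCP ties a pixel location (row, col) in the image to a surveyed
# map coordinate (x, y, z), e.g. from an RTK GPS unit.
gcps = [
    GroundControlPoint(row=100, col=150, x=-96.3401, y=30.6210, z=98.2),
    GroundControlPoint(row=120, col=4020, x=-96.3310, y=30.6208, z=98.5),
    GroundControlPoint(row=3980, col=140, x=-96.3400, y=30.6120, z=97.9),
    GroundControlPoint(row=4000, col=4000, x=-96.3312, y=30.6118, z=98.1),
]

transform = from_gcps(gcps)  # least-squares affine fit to the GCPs

# Attach the transform and CRS when writing the aligned product.
with rasterio.open("ortho_unreferenced.tif") as src:
    profile = src.profile
    data = src.read()

profile.update(transform=transform, crs="EPSG:4326")
with rasterio.open("ortho_georeferenced.tif", "w", **profile) as dst:
    dst.write(data)
```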

Additionally, 13 of the 19 programs had experience capturing raw images using sensors mounted on the UAS, but only 11 had experience sharing the data, such as raw images and geospatial data products, generated from the UAS.

11 programs indicated experience analyzing geospatial data products such as orthomosaics, digital surface models, or 3D point clouds generated from UAS data.

13 programs indicated experience analyzing vegetation indices derived from these data products, such as the normalized difference vegetation index (NDVI), the normalized difference red edge index (NDRE), and soil-adjusted vegetation indices, as well as structural phenotypic information like canopy height, canopy volume, and canopy cover.
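For readers unfamiliar with these metrics, the following is a minimal sketch of how such indices and a canopy height layer can be computed with numpy and rasterio. Band order, file names, and the assumption that the DSM and DTM share the same grid are all hypothetical; adjust to your sensor and products.

```python
# Sketch of common UAS-derived phenotypic metrics using numpy and rasterio.
# Band indices and file names are hypothetical; adjust to your sensor.
import numpy as np
import rasterio

with rasterio.open("multispectral_ortho.tif") as src:
    red = src.read(3).astype("float64")      # assumed band order
    rededge = src.read(4).astype("float64")
    nir = src.read(5).astype("float64")

eps = 1e-10  # avoid division by zero

ndvi = (nir - red) / (nir + red + eps)          # normalized difference vegetation index
ndre = (nir - rededge) / (nir + rededge + eps)  # normalized difference red edge index
L = 0.5                                         # SAVI soil-brightness correction factor
savi = (1 + L) * (nir - red) / (nir + red + L + eps)

# Structural trait: canopy height as surface elevation minus terrain
# elevation (assumes the DSM and DTM are on the same grid).
with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    canopy_height = dsm_src.read(1) - dtm_src.read(1)

print(f"mean NDVI: {np.nanmean(ndvi):.3f}, "
      f"mean canopy height: {np.nanmean(canopy_height):.2f} m")
```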

12 programs indicated experience in multi-temporal analysis using the data products mentioned above.
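Multi-temporal analysis here means tracking such data products across repeated flights over a season. The sketch below stacks per-date NDVI rasters into a time series; the file naming and dates are hypothetical, and in practice means would be taken within each breeding plot polygon rather than over the whole raster.

```python
# Sketch: multi-temporal analysis of per-date NDVI rasters.
# File names and dates are hypothetical examples.
import numpy as np
import rasterio
import matplotlib.pyplot as plt

dates = ["2021-03-15", "2021-04-01", "2021-04-20", "2021-05-10"]
stack = []
for d in dates:
    with rasterio.open(f"ndvi_{d}.tif") as src:  # one co-registered raster per flight date
        stack.append(src.read(1))
stack = np.stack(stack)  # shape: (dates, rows, cols)

# Whole-raster mean NDVI trajectory over the season.
mean_ndvi = np.nanmean(stack, axis=(1, 2))

plt.plot(dates, mean_ndvi, marker="o")
plt.xlabel("Flight date")
plt.ylabel("Mean NDVI")
plt.title("Seasonal NDVI trajectory")
plt.show()
```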

10 of the 19 programs conduct their own UAS data collection, processing, storage, and analysis. Overall, those who described their programs as advanced or intermediate indicated minimal training needs. However, they believed training could still help them enhance their ability in one or all phases of UAS data collection, processing, and storage, and help ensure that UAS data collected in multiple environments can be standardized for comparison. Most of the stated training needs were in big data analytics for algorithm development.

Texas A&M AgriLife Research and Purdue will respond to the collaborators' needs by providing their considerable capacity to conduct UAS training and to collect, process, and store the massive datasets required by the overall effort. AgriLife Research will focus its effort on supporting institutions at the beginner stage of working with UAS.

You can visit http://tinyurl.com/wheatCAPUASSummary for the full summary spreadsheet of partner responses on UAS capability.

This article was developed by Amir Ibrahim 1), Jackie Rudd 1), Gabe Saldana 1), Mahendra Bhandari 1), Jinha Jung 3), Juan Landivar 1), Shannon Baker 1), and Anjin Chang 2)

1) Texas A&M University; 2) Texas A&M University-Corpus Christi; 3) Purdue University


DJI M300 RTK + Zenmuse P1 maiden flight

We had a maiden flight of the DJI M300 RTK + Zenmuse P1 today (3/3/2021) at Martell Forest. Everything went smoothly, and the aircraft successfully completed a 21-minute autonomous flight mission.

DJI M300 RTK + Zenmuse P1 maiden flight at Martell Forest

Image quality from the P1 was impressive. A picture paints a thousand words, so a comparison of images from the DJI Mavic Pro (12MP, 1/2.3″ sensor) and the Zenmuse P1 (45MP, full-frame sensor) at 120m altitude follows.

Left: Mavic Pro; right: Zenmuse P1 (images acquired at 120m altitude).
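A back-of-the-envelope ground sample distance (GSD) calculation suggests why the P1 resolves so much finer detail at the same altitude. The sensor and lens figures below are approximate public specs, not measured values, so treat the outputs as rough estimates.

```python
# Back-of-the-envelope ground sample distance (GSD) comparison.
# Sensor/lens figures below are approximate public specs, not measured values.
def gsd_cm(sensor_width_mm, focal_length_mm, image_width_px, altitude_m):
    """GSD (cm/pixel) = sensor width * altitude / (focal length * image width)."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# DJI Mavic Pro: 1/2.3" sensor (~6.17 mm wide), ~4.7 mm lens, 4000 px wide
print(f"Mavic Pro @120 m: {gsd_cm(6.17, 4.7, 4000, 120):.1f} cm/px")   # ~3.9 cm/px
# Zenmuse P1: full-frame sensor (36 mm wide), 35 mm lens, 8192 px wide
print(f"Zenmuse P1 @120 m: {gsd_cm(36.0, 35.0, 8192, 120):.1f} cm/px")  # ~1.5 cm/px
```

Roughly a 2.5x finer pixel footprint on the ground, which matches the visible difference in the image comparison above.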

Online Data Portal for Indiana 2017-2019 3DEP LiDAR Database

Our team launched the Online Data Portal for the Indiana 2017-2019 3DEP LiDAR Database (lidar.jinha.org) to help users easily and effectively find statewide LiDAR point clouds and data products. Click on a county name on the portal to navigate to its visualization/download page. Currently, we provide the following data products (a short loading sketch follows the list):

Point cloud, digital terrain model (DTM), and normalized digital height model (NDHM), from left to right
  • Point Cloud (.laz): sets of points (x, y, z, intensity, etc.) that describe an object or surface
  • Digital Surface Model (DSM, .tif): 3D representation of surface elevations in raster image format, 5ft by 5ft
  • Digital Terrain Model (DTM, .img): 3D representation of underlying terrain elevations in raster image format, 2.5ft by 2.5ft
  • Normalized Digital Height Model (NDHM, .tif): 3D representation of heights above underlying topography of man-made objects, vegetation, etc. in raster image format, 5ft by 5ft
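As a starting point, here is a minimal sketch of loading downloaded products with laspy and rasterio. The file names are hypothetical, and reading .laz files requires laspy with a LAZ backend installed (e.g. pip install "laspy[lazrs]").

```python
# Sketch: loading data products downloaded from the portal.
# File names are hypothetical examples.
import laspy
import rasterio

# Point cloud (.laz): x, y, z, intensity, etc. per point
las = laspy.read("tippecanoe_tile_001.laz")
print(len(las.points), "points; z range:", las.z.min(), "-", las.z.max())

# Raster products: DSM (.tif), DTM (.img), NDHM (.tif)
with rasterio.open("tippecanoe_dsm.tif") as dsm:
    print("DSM resolution:", dsm.res, "CRS:", dsm.crs)

# The NDHM is conceptually the DSM minus the DTM: heights of objects and
# vegetation above the bare ground. (Note the published grids differ in
# cell size, so resampling would be needed to difference them directly.)
with rasterio.open("tippecanoe_ndhm.tif") as ndhm:
    heights = ndhm.read(1)
    print("max above-ground height:", heights.max())
```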