
Single plant detection from field-based drone images using deep learning
  • Piyush Pandy, USDA-ARS
  • Daniel R Kick, USDA-ARS
  • Norman Best, USDA-ARS
  • Jacob D Washburn, USDA-ARS

Corresponding Author:

Abstract

Plant phenotyping using images from unoccupied aerial systems (UAS) is typically limited to trait acquisition at the plot level, generally accompanied by the manual delineation of plot boundaries. However, phenotyping of natural populations, mutant populations, and segregating germplasm is difficult or impossible without plant-level phenotypes. Plant-level boundary delineation is complicated by many factors, and manual delineation is impossible even in relatively small field experiments. We present a method that applies object detection models to raw UAS images and then "projects" the resulting bounding boxes onto the orthophoto for geolocation. The method detects individual maize plants in UAS images, creates accurate bounding boxes around these plants on the orthophoto via differential orthorectification, and extracts individual plant heights from the Digital Surface Model (DSM) as well as spectral indices such as Normalized Difference Vegetation Index (NDVI) values from orthophotos. For verification, we compared the heights obtained from this method to heights obtained by manually drawing a bounding box around each plant and calculated the precision. The method achieves accuracies as high as 0.9, depending on growth stage and other factors. We expect this accuracy can be increased with larger training sets and further methods development.
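
To illustrate the trait-extraction step described above, the following minimal sketch shows how plant height and mean NDVI might be sampled from a geolocated bounding box. This is not the authors' implementation: the use of the rasterio library, the file names (dsm.tif, ortho.tif), the band indices, and the percentile-based ground-level estimate are all illustrative assumptions.

# Minimal sketch (not the authors' code): given a bounding box projected onto
# the orthomosaic, sample the DSM for plant height and the orthophoto for NDVI.
import numpy as np
import rasterio
from rasterio.windows import from_bounds

def plant_height_and_ndvi(bbox, dsm_path="dsm.tif", ortho_path="ortho.tif",
                          red_band=3, nir_band=5):
    """bbox = (left, bottom, right, top) in the rasters' coordinate system."""
    with rasterio.open(dsm_path) as dsm:
        win = from_bounds(*bbox, transform=dsm.transform)
        elev = dsm.read(1, window=win, masked=True).compressed()
        # Assumption: the low percentile within the box approximates ground
        # level and the high percentile the plant apex; a DTM would be more robust.
        height = float(np.percentile(elev, 99) - np.percentile(elev, 1))

    with rasterio.open(ortho_path) as ortho:
        win = from_bounds(*bbox, transform=ortho.transform)
        red = ortho.read(red_band, window=win).astype("float32")
        nir = ortho.read(nir_band, window=win).astype("float32")
        # NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero.
        ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)

    return height, float(np.nanmean(ndvi))

In practice the bounding boxes would come from the detection and orthorectification steps, and masking to vegetation pixels before averaging NDVI would likely give cleaner per-plant values.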
Submitted to NAPPN 2024: 30 Oct 2023
Published in NAPPN 2024: 30 Oct 2023