Farah Saeed

Plant architecture is an important contributing factor to yield and quality. Architectural traits are analyzed for crop health monitoring and for genetic manipulation aimed at generating high-yielding varieties. Computer vision methods applied to 3D point clouds allow more accurate extraction of architectural traits but consume more time and memory than 2D images. This study aims to design a lightweight 3D deep network for cotton plant part segmentation and to derive seven architectural traits: mainstem height, mainstem diameter, branch inclination angle, branch diameter, and the numbers of branches, nodes, and cotton bolls. The point cloud data are collected using a FARO LiDAR scanner, and the mainstem, branches, and cotton bolls are manually annotated using Open3D. Preprocessing steps of denoising, normalization, and downsampling are applied. The 3D deep network samples 1,024, 512, and 256 points, with neighborhood aggregation performed at radius levels of 1 cm, 5 cm, and 30 cm, respectively; features for the remaining points are interpolated. The features from each radius level are concatenated and passed to a multi-layer perceptron for pointwise classification. Results indicate that a mean IoU of 84% and an accuracy of 94% are achieved, along with a 6.5-times speedup in inference time and a 2.4-times reduction in memory consumption compared to PointNet++. After postprocessing the part segments, an R-squared value of more than 0.8 and a mean absolute percentage error of less than 11% are achieved for all derived architectural traits. These trait extraction results indicate the potential utility of this process in plant physiology and breeding programs.
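As a rough illustration of the multi-radius design described above, the following PyTorch-style sketch samples 1,024, 512, and 256 points, aggregates neighbors within 1 cm, 5 cm, and 30 cm, propagates each level's features back to every input point, and classifies each point from the concatenated features. The random sampling, grouping MLP widths, inverse-distance interpolation, and the `RadiusLevel`/`PartSegNet` names are assumptions made for illustration, not the paper's implementation.

```python
# Minimal sketch of the multi-radius part-segmentation idea (assumed details noted above).
# Coordinates are assumed to be in meters, so the radii are 0.01, 0.05, and 0.30.
import torch
import torch.nn as nn


def ball_group(xyz, centers, radius, max_neighbors=32):
    """For each center, gather up to `max_neighbors` points within `radius` (local coords)."""
    dist = torch.cdist(centers, xyz)                        # (S, N) pairwise distances
    mask = dist <= radius
    dist = dist.masked_fill(~mask, float("inf"))            # push out-of-radius points to the end
    idx = dist.topk(max_neighbors, largest=False).indices   # (S, K) neighbor indices
    grouped = xyz[idx] - centers.unsqueeze(1)               # (S, K, 3) coords relative to center
    valid = mask.gather(1, idx).unsqueeze(-1)               # zero out padded neighbors
    return grouped * valid


class RadiusLevel(nn.Module):
    """Sample `n_samples` centers, aggregate neighbors within `radius`,
    then interpolate the resulting features back to every input point."""

    def __init__(self, n_samples, radius, out_dim=64):
        super().__init__()
        self.n_samples, self.radius = n_samples, radius
        self.mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, out_dim))

    def forward(self, xyz):                                  # xyz: (N, 3)
        # Random sampling as a simple stand-in for farthest-point sampling.
        centers = xyz[torch.randperm(xyz.shape[0])[: self.n_samples]]
        grouped = ball_group(xyz, centers, self.radius)      # (S, K, 3)
        feats = self.mlp(grouped).max(dim=1).values          # (S, out_dim), max-pooled over neighbors
        # Inverse-distance interpolation from the 3 nearest centers to all N points.
        dist = torch.cdist(xyz, centers)                     # (N, S)
        knn_dist, knn_idx = dist.topk(3, largest=False)
        w = 1.0 / (knn_dist + 1e-8)
        w = w / w.sum(dim=1, keepdim=True)
        return (feats[knn_idx] * w.unsqueeze(-1)).sum(dim=1)  # (N, out_dim)


class PartSegNet(nn.Module):
    """Concatenate per-point features from the three radius levels and classify
    each point as mainstem, branch, or cotton boll."""

    def __init__(self, num_classes=3, out_dim=64):
        super().__init__()
        self.levels = nn.ModuleList([
            RadiusLevel(1024, 0.01, out_dim),   # 1 cm
            RadiusLevel(512, 0.05, out_dim),    # 5 cm
            RadiusLevel(256, 0.30, out_dim),    # 30 cm
        ])
        self.head = nn.Sequential(
            nn.Linear(3 * out_dim, 128), nn.ReLU(), nn.Linear(128, num_classes)
        )

    def forward(self, xyz):                                  # (N, 3) preprocessed plant scan
        per_level = [level(xyz) for level in self.levels]
        return self.head(torch.cat(per_level, dim=-1))       # (N, num_classes) point-wise logits


if __name__ == "__main__":
    cloud = torch.rand(4096, 3)          # stand-in for a denoised, normalized, downsampled scan
    print(PartSegNet()(cloud).shape)     # torch.Size([4096, 3])
```

The per-point class labels produced this way are what the postprocessing step would consume to derive the seven architectural traits.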
To assist plant scientists, geneticists, and growers in understanding crop-environment interactions, plant phenotyping is a powerful tool for improving crop cultivars and developing decision support systems in farm management. Recent trends use LiDAR to capture three-dimensional (3D) information from plants to analyze traits vital to plant growth and development. However, current terrestrial-based 3D analysis methodologies are time- and labor-intensive and can become a bottleneck when large agricultural fields need to be analyzed. Robotic technologies can be used to accelerate field-based measurements of relevant plant features and to optimize the high-throughput phenotyping process. In this paper, we present a robotic system with a 3D LiDAR and a data processing pipeline for efficient, high-throughput field phenotyping of cotton crops. The robotic system consists of a Husky robotic platform equipped with a FARO Focus 3D laser scanner. The components of the system are integrated under the ROS framework to ensure interoperability and data integrity and availability at any given time. The data processing pipeline involves the data collection, registration, and analysis tasks for measuring crop traits at the plot level (canopy height, volume, and light interception) and estimating yield. This work demonstrates a crop phenotyping platform that leverages two pieces of off-the-shelf equipment for the quantitative assessment of cotton plant traits in the field. This methodology can be extended to other agricultural crops, contributing to the advancement of plant phenomics.
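As a small illustration of the plot-level analysis step, the sketch below estimates canopy height and canopy volume from an already registered scan using Open3D. The function name, file name, plot bounds, percentile-based ground and top-of-canopy estimates, and voxel size are all illustrative assumptions rather than the paper's pipeline, and the light interception and yield estimation steps are not shown.

```python
# Sketch of plot-level canopy trait extraction from a registered point cloud (assumptions above).
import numpy as np
import open3d as o3d


def plot_canopy_traits(cloud_path, plot_min, plot_max, voxel_size=0.05):
    """Return (canopy_height_m, canopy_volume_m3) for one plot.

    `plot_min`/`plot_max` are the (x, y, z) corners of the plot's bounding box
    in the registered scan's coordinate frame, in meters.
    """
    pcd = o3d.io.read_point_cloud(cloud_path)

    # Keep only the points that fall inside the plot footprint.
    box = o3d.geometry.AxisAlignedBoundingBox(
        np.asarray(plot_min, dtype=float), np.asarray(plot_max, dtype=float)
    )
    plot = pcd.crop(box)
    pts = np.asarray(plot.points)

    # Canopy height: robust top-of-canopy percentile minus an estimated ground level.
    ground_z = np.percentile(pts[:, 2], 1)
    top_z = np.percentile(pts[:, 2], 99)
    canopy_height = top_z - ground_z

    # Canopy volume: count occupied voxels and multiply by the volume of one voxel.
    voxels = o3d.geometry.VoxelGrid.create_from_point_cloud(plot, voxel_size)
    canopy_volume = len(voxels.get_voxels()) * voxel_size ** 3

    return canopy_height, canopy_volume


if __name__ == "__main__":
    # "registered_plot.pcd" and the plot bounds are placeholders for a real scan.
    h, v = plot_canopy_traits("registered_plot.pcd", (0.0, 0.0, -1.0), (4.0, 1.0, 3.0))
    print(f"canopy height: {h:.2f} m, canopy volume: {v:.3f} m^3")
```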