In the quest to enhance citrus yield estimation, this study leverages multi-temporal Unoccupied Aerial Systems (UAS) datasets to develop a refined tree-level yield estimation model. Conducted within a 2.22-acre orchard in Weslaco, South Texas, the research integrates data from 56 citrus trees, utilizing RGB and multispectral UAS imagery collected over critical growth months from June to December (excluding July and August) 2022. The methodology encompasses UAS data collection, processing with custom-developed algorithms and automated workflows, and the application of three machine learning algorithms: multiple linear regression, gradient boosting regression, and random forest regression. The study extracts 11 key phenotypic features that combine tree canopy structural and spectral information to estimate yield with increased accuracy. A significant advancement is proposed in the form of an improved individual tree boundary delineation method. This method addresses the inaccuracies of previous solid shape-based approaches, contributing to more precise feature calculation and improved model performance. Our experimental results identify which single month's data is most predictive, with the random forest model demonstrating robustness and consistency across temporal datasets. The multi-temporal approach confirms that comprehensive data integration yields superior estimation models. This ongoing research promises to bridge the yield estimation gap and set a new standard in precision agriculture methodologies.
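The abstract does not detail the improved boundary delineation method itself; as a point of reference, one common alternative to solid shape-based crown approximations is marker-controlled watershed segmentation on a canopy height model (CHM). The sketch below is a minimal, hypothetical illustration of that general technique, not the method developed in this study; the function name, smoothing, and thresholds are all assumptions.

```python
# Hypothetical crown delineation via marker-controlled watershed on a CHM.
# NOT the study's method; parameter values are illustrative assumptions.
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def delineate_tree_crowns(chm, min_height=1.0, min_distance=15):
    """Segment individual tree crowns from a CHM raster (2-D array, meters)."""
    canopy_mask = chm > min_height                       # drop ground / low vegetation
    smoothed = ndimage.gaussian_filter(chm, sigma=1.0)   # suppress spurious local maxima
    # Treetops = local maxima of the smoothed CHM, used as watershed markers
    peaks = peak_local_max(smoothed, min_distance=min_distance,
                           labels=canopy_mask.astype(int))
    markers = np.zeros_like(chm, dtype=np.int32)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Flood "downhill" on the inverted CHM so each crown grows from its apex
    labels = watershed(-smoothed, markers, mask=canopy_mask)
    return labels  # 0 = background, 1..N = individual crowns
```

Per-crown structural features (e.g., canopy area or volume) can then be computed from the pixels carrying each crown label.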
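To make the modeling step concrete, the following sketch shows how the three named regressors could be compared on a per-tree feature table using scikit-learn. The file name, column names, and hyperparameters are illustrative assumptions, not taken from the study.

```python
# Minimal sketch comparing the three regressors named in the abstract.
# "citrus_features.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("citrus_features.csv")           # hypothetical per-tree table
X = df.drop(columns=["tree_id", "yield_kg"])      # 11 structural + spectral features
y = df["yield_kg"]                                # measured per-tree yield

models = {
    "multiple linear regression": LinearRegression(),
    "gradient boosting regression": GradientBoostingRegressor(random_state=0),
    "random forest regression": RandomForestRegressor(n_estimators=300, random_state=0),
}
for name, model in models.items():
    # Cross-validation matters with only 56 trees in the sample
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.2f}")
```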
Plant phenotyping is an essential aspect of crop science, tasked with providing critical information about plants' genetics, traits, productivity, and other intricate details to gain insight into their survival under specific conditions and environments. A range of methods quantify this information using models derived from several kinds of datasets. In this study, we extract phenotypic information about soybean from UAS-based images captured over the selected experimental field throughout the growing season. A DJI M300 unmanned aerial system was equipped with Zenmuse P1 and L1 sensors to capture RGB and LiDAR data, respectively; in addition, a DJI P4 Multispectral UAS collected multispectral imagery over these fields at various date intervals. The captured data are being processed using custom-developed algorithms and automated workflows to obtain biomass, vegetation indices, canopy cover, canopy height, and canopy volume. These metrics capture variation in the traits of the soybean crop under study, and the resulting phenotypic information will be compared against field measurements for validation.
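As an illustration of two of the listed metrics, the sketch below computes NDVI from multispectral bands and canopy cover/volume from a CHM. The height threshold and array layout are assumptions; the study's automated workflows may compute these quantities differently.

```python
# Illustrative computation of NDVI and plot-level canopy metrics from
# already-gridded rasters; thresholds and inputs are assumed, not the
# study's actual workflow.
import numpy as np

def ndvi(red, nir):
    """Per-pixel NDVI from red and near-infrared reflectance arrays."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard divide-by-zero

def canopy_metrics(chm, pixel_area_m2, height_threshold=0.2):
    """Canopy cover (fraction) and volume (m^3) from a plot-level CHM."""
    canopy = chm > height_threshold              # pixels counted as canopy
    cover = canopy.mean()                        # fraction of plot covered
    volume = chm[canopy].sum() * pixel_area_m2   # sum of per-pixel height columns
    return cover, volume
```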