Plant reproduction is sensitive to heat stress. Pollen tube growth can be accelerated or arrested by high temperatures, leading to unstable tubes, failed sperm cell delivery, and ultimately crop yield loss. Pollen growth dynamics have historically been observed at the scale of individual pollen grains, but only a few studies have surveyed pollen populations across genotypes and environmental conditions. Here we describe a phenotyping system that quantifies tomato pollen characteristics on a large scale and under varied heat stress conditions. In this system, we combined high-throughput bright-field microscopy with automated object detection and tracking to investigate the lives of growing pollen tubes. We used this method to survey pollen from a diverse panel of 220 tomato and close wild relative accessions under different temperatures. This method can be readily adapted to pollen from different species, providing a rapid way to characterize heat stress responses and molecular functions in flowering plants.
This study evaluates a hyperspectral imaging (HSI) technique to identify herbicide-resistant kochia (Bassia scoparia) biotypes to support weed management in cropping systems. The experiment was conducted under controlled-environment conditions, where glyphosate was applied to six different kochia populations. For each population (a 72-cell tray of plants), half of the plants were sprayed with glyphosate at 900 g ae ha-1, while the other half remained an untreated control. Hyperspectral images were acquired over five time points spanning from glyphosate treatment to 15 days after treatment (DAT) using a proximal HSI system (Specim-IQ) with 204 spectral bands from 397 nm to 1003 nm. Average reflectances were extracted from plants characterized as glyphosate-resistant or -susceptible. We first analyzed the temporal variation of the spectra with and without herbicide application; the spectral profiles highlight the value of temporal features for biotype discrimination. Random forest algorithms were used to classify the glyphosate-resistant and -susceptible populations, using reflectance at optimal wavelengths (near-infrared) and vegetation indices with high correlations to visual ratings. Based on classification accuracy, the most important wavebands and vegetation indices for classifying the weed biotypes were determined. Preliminary results show that: 1) for the untreated plants, reflectance from the red edge to the near-infrared reached its highest level at 8 DAT, indicating the highest leaf chlorophyll content, and then declined until 15 DAT; 2) in contrast, strong glyphosate effects were captured at 8 DAT for the three herbicide-susceptible populations, while for the three glyphosate-resistant populations, red-edge to near-infrared reflectance did not increase from 1 to 8 DAT, opposite to the untreated control plants.
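As a minimal sketch of the index-based features used in the classification step, a vegetation index such as NDVI can be derived from a plant's mean reflectance spectrum. The band centers and function below are illustrative assumptions, not the study's selected features:

```python
import numpy as np

def ndvi(reflectance, wavelengths, red=670.0, nir=800.0):
    """Compute NDVI from a per-plant mean reflectance spectrum.

    reflectance: 1-D array of mean reflectance values
    wavelengths: 1-D array of matching band-center wavelengths (nm)
    red, nir: illustrative band centers; the optimal bands in the study
    were selected via the random forest, not fixed a priori.
    """
    r = reflectance[np.argmin(np.abs(wavelengths - red))]
    n = reflectance[np.argmin(np.abs(wavelengths - nir))]
    return (n - r) / (n + r)

# Toy spectrum: low red, high NIR reflectance (idealized healthy leaf)
wl = np.linspace(397, 1003, 204)          # Specim-IQ band layout
spec = np.where(wl < 700, 0.05, 0.50)     # step spectrum for illustration
print(round(ndvi(spec, wl), 3))
```

In practice such indices, computed per plant and per DAT, would form the feature table passed to the random forest alongside the raw reflectance at selected wavebands.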
Irrigation of crops accounts for a significant portion of fresh water consumption. To use this resource more efficiently, crops that use water more efficiently must be developed. Water use efficiency, defined as the ratio of plant growth to water used, is a complex property of plants affected by many different factors. Despite this complexity, genetic variability for this trait has been identified in a number of crops. The C4 model species Setaria viridis remains under-studied in this regard, and we consequently sought to identify promising genetic loci contributing to variation in water use efficiency. To accomplish this goal, we leveraged the high-throughput phenotyping platform at the Donald Danforth Plant Science Center to grow S. viridis under well-watered and water-limited conditions. This automated system enables strict control of watering regimes as well as measurement of plant traits extracted from photographs using computer vision. Combining these two data sets allows direct measurement of whole-plant water-use efficiency on a daily basis, which was used as a response variable in a genome-wide association study. Significant associations were found for water-use efficiency and related traits. These loci were then prioritized further by pooling information across each day of an experiment and across multiple experiments to zero in on the most likely locations of genes driving water-use efficiency in S. viridis.
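The daily water-use efficiency measure described above, growth gained divided by water used, can be sketched in a few lines; the series names and units below are illustrative, not the platform's actual outputs:

```python
def daily_wue(growth_series, water_series):
    """Whole-plant water-use efficiency per day: growth gained / water used.

    growth_series: daily plant-size estimates from computer vision (e.g. area)
    water_series: daily water applied by the automated watering system
    Names and units are illustrative assumptions, not the platform's outputs.
    """
    wue = []
    for i in range(1, len(growth_series)):
        gained = growth_series[i] - growth_series[i - 1]
        used = water_series[i]
        wue.append(gained / used if used > 0 else float("nan"))
    return wue

print(daily_wue([10.0, 14.0, 20.0], [0.0, 2.0, 3.0]))
```

Each daily WUE value would then serve as a response variable in the genome-wide association analysis.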
ORCiD: Jaebum Park [0000-0001-6459-909X], Max Feldman [0000-0002-5415-4326]
Tuber size and shape, colorimetric characteristics of tuber skin and flesh, and tuber defect susceptibility are all factors that influence the adoption of potato cultivars. Despite the importance of these characteristics, our understanding of their inheritance is limited by our inability to precisely measure these features on the scale needed to evaluate breeding populations. To alleviate this bottleneck, we have developed a low-cost, semi-automated workflow to capture data and quantify each of these characteristics using machine vision. This workflow was applied to assess the phenotypic variation present within 189 F1 progeny of the A08241 breeding population and map the genetic basis of tuber characteristics. Several medium-to-large-effect quantitative trait loci (QTL) were found to be associated with different measurements of tuber shape. These results indicate that quantitative measurements acquired using machine vision methods are reliable, heritable, and can be used to map and select on multiple traits simultaneously in structured potato breeding populations.
Cover crops, plants grown during fallow periods between cash crops, are a promising solution for mitigating soil degradation induced by conventional agricultural practices and improving soil health. Cover crops can provide several beneficial ecosystem functions, such as soil structure remediation, soil microbial diversification, and nutrient recycling, depending on the plant species. Interactions between plant roots and the surrounding soil are key to plants' ability to perform these ecosystem functions. The lack of data on cover crop roots inhibits our understanding of cover crop phenotype-ecosystem function relationships. We combine aboveground and belowground phenotyping measurements with physicochemical soil measurements to evaluate the field performance of 19 different plant species in monocultures and polycultures as winter cover crops in Missouri. Canopy cover imaging reveals significant differences in winter hardiness and weed suppression among cover crop varieties. Root biomass and root length density measured at depths up to 1 meter indicate differences in rooting behavior between cultivars, suggesting the potential to breed cover crop varieties with improved root system architecture. I will also highlight our collaborative efforts utilizing remote sensing technologies (aerial RGB and hyperspectral imaging) to model carbon and nitrogen cycling in cover crop systems at a field scale. Finally, we have begun to characterize 3D root system architecture traits at the seedling stage using a gel-imaging system. A better understanding of cover crop rooting behavior will allow us to breed varieties with enhanced performance of beneficial ecosystem functions for sustainable agricultural systems.
Unmanned aerial vehicle (UAV)-based imagery has become widely used for collecting agronomic traits, enabling a much greater volume of data to be generated in a time-series manner. As one of the cutting-edge imagery analysis tools, machine learning-based object detection provides automated techniques for analyzing these imagery data. In our previous study, UAVs were used to collect aerial photography for field trials of 233 diverse inbred lines grown under different nitrogen treatments. Images were collected at different plant developmental stages throughout the growing season. This image dataset was used here to develop machine learning techniques for automated tassel counts at the plot level through the season. To improve detection accuracy, we developed an image segmentation method to remove non-tassel pixels before feeding the filtered images into the machine learning algorithms. As a result, our method showed a significant improvement in the accuracy of maize tassel detection. This method can be used in future research to produce time-series counts of tassels at the plot level, and will allow accurate estimates of flowering-related traits, such as the earliest detected flowering date and the duration of each plot's flowering period. This phenotypic data and the trait-associated genes provide new opportunities for crop improvement and for facilitating future plant breeding.
A significant portion of plant phenotyping research involves development of new instruments and methodology. Thus, a common experiment is to compare a new method to an established one in order to assess the suitability of the new method. Pearson's correlation coefficient, r, is commonly calculated from measurements made by the two methods on the same subjects, and it is interpreted to assess whether the new method is a suitable replacement for the established one. However, r (and, in this context, R and R²) is not an appropriate statistic for this purpose and provides no meaningful information for comparing the quality of methods. This is well established, and alternatives are known. Here we present quantification and statistical tests of the bias and variances of two methods, which provide a well-founded approach to method comparison. Comparing newly developed lidar-based methods for measuring height and leaf area index (LAI) to established methods, we find that lidar estimates of height are more precise than established methods, while lidar estimates of LAI are equivalent or slightly worse. Using r alone, these interpretations would not be possible. Such statements follow from clear, objective approaches to method comparison, which should be the standard for assessing new phenotyping methods.
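A minimal sketch of the kind of bias-and-variance summary advocated above, in the style of Bland-Altman limits of agreement for paired measurements on the same subjects. The numbers are toy values, and this is one common instantiation of such statistics rather than the authors' exact procedure:

```python
import numpy as np

def method_comparison(new, ref):
    """Bias and precision summary for paired measurements by two methods.

    Returns the mean bias (new - ref), a normal-approximation 95% CI
    half-width on the bias, and Bland-Altman-style limits of agreement.
    """
    d = np.asarray(new, float) - np.asarray(ref, float)
    bias = d.mean()
    sd = d.std(ddof=1)                      # SD of paired differences
    ci_half = 1.96 * sd / np.sqrt(d.size)   # 95% CI half-width on bias
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    return bias, ci_half, loa

# Toy paired data standing in for lidar vs. established height measurements
lidar = [1.02, 0.98, 1.50, 1.21, 0.80]
manual = [1.00, 1.00, 1.45, 1.25, 0.82]
bias, ci, loa = method_comparison(lidar, manual)
print(round(bias, 3))
```

A bias CI excluding zero indicates systematic disagreement, and the width of the limits of agreement reflects relative precision, neither of which r can reveal.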
Minirhizotron imagery can be used to assess plant root health, and the volume of data available for analysis motivates automating root detection with neural networks. Building on previous work, we show that transfer learning from our PRMI dataset can be used to assess root health across twelve classes of a new dataset, addressing questions about how root health is affected by large-herbivore access to a tree, site infestation by Pheidole megacephala, and tree location. This dataset was collected from three paired sites at the Ol Pejeta Conservancy in Laikipia, Kenya, and consists of 20,000 images collected between September 2021 and May 2022. Each paired site represents four locations based on all four possible combinations of site infestation by Pheidole megacephala for at least 20 years and the presence of a herbivore-exclusion fence to keep large herbivores out. A total of 1,332 images across all twelve site-and-treatment classes were labeled with ground truth for model training. Our work uses the U-Net architecture with pretrained weights on the network encoder and decoder, obtained in 2019 from work that achieved over 99% accuracy on a dataset of peanut and switchgrass imagery. We found that training the model on our new dataset resulted in consistent performance across all classes, with over 99% accuracy for each class.
High night air temperature (HNT) stress challenges rice production. Findings indicate a 10% yield reduction for every 1 °C increase in night air temperature. The responses of rice to HNT stress have been analyzed in a limited number of genotypes, mostly under greenhouse conditions. One limitation of such studies under field conditions is imposing HNT stress at critical rice growth stages. The physiological and metabolic responses of rice to HNT stress under field conditions are not fully understood; thus, field studies are needed. Field-based phenotyping infrastructure that can house rice germplasm and impose stress using a computer-based system driven by ambient temperature did not previously exist. In this study, six high-tunnel greenhouses were built at a field experimental station in Harrisburg, AR, in a split-plot design. These movable structures housed 310 rice accessions from the Rice Diversity Panel 1 (RDP1) and 10 hybrids from RiceTec. Each high-tunnel greenhouse had heating and a cyber-physical system that recorded ambient air temperature and increased night air temperature relative to ambient at the flowering stage. The system successfully imposed HNT stress of 4.01 °C and 3.94 °C, as recorded by Raspberry Pi sensors, for two weeks in the 2019 and 2020 cropping seasons, respectively. The greenhouses endured constant flooding and withstood heavy rain and 40-50 mile/h winds. Grain quality and other biochemical assays are ongoing to fully assess the effects of HNT on the rice accessions and hybrids.
The structure of roots plays an essential role in plant growth, development, and stress responses. Minirhizotron imaging is one of the most widely used approaches to capture and analyze root systems. After segmenting minirhizotron images, every individual root is separated from the others and from the background. Root traits, such as root length and diameter distribution, can provide information about plant health. Current methods for analyzing minirhizotron images usually rely on manually annotated labels and commercial software tools, which are time-consuming and labor-intensive. Moreover, these methods usually generate a statistical summary of the input image rather than the features of each root. In this work, we propose a pipeline that automatically segments roots from the background using deep neural networks and then extracts root features such as length and diameter distribution from each individual segmented root. In detail, we first use a pre-trained U-Net to segment the roots in the minirhizotron images. Then, we separate each individual root using connected component analysis. Finally, we extract features such as the diameter distribution and length of every individual root with morphological operations such as skeletonization. For evaluation, we conduct experiments on synthetic roots made of strings and threads, as well as on a benchmark root dataset (PRMI) of real switchgrass roots, and compare the estimated results with existing commercial software.
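The connected-component step of the pipeline above can be sketched with a simple flood fill over a binary mask. In the actual pipeline the mask would come from the U-Net and skeletonization would follow for per-root length and diameter; the toy mask here is purely illustrative:

```python
from collections import deque

def label_components(mask):
    """4-connected component labeling of a binary root mask (BFS flood fill).

    Each connected component corresponds to one candidate root; per-root
    traits would then be extracted from each component, e.g. via
    skeletonization and distance transforms.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                current += 1                      # start a new root label
                q = deque([(y, x)])
                labels[y][x] = current
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current

mask = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1]]
labels, n_roots = label_components(mask)
print(n_roots)  # two separate components in this toy mask
```

Production pipelines would typically use an optimized labeling routine, but the logic, one label per connected region of root pixels, is the same.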
Most current plant phenotyping research focuses primarily on above-ground traits, such as leaves and flowers. Roots often receive comparatively less attention because they are challenging to examine and image. Minirhizotron (MR) systems are one imaging approach for studying plant roots underground: a tube is inserted into the ground, allowing a camera to be inserted to capture images of the root system. Unlike minirhizotron imaging, X-ray computed tomography (CT) captures three-dimensional (3D) information from extracted soil cores. For better root analysis, the first step is always to segment the roots from the background in the images or image sequences. The results of root segmentation play an essential role in further analyses such as root diameter and length estimation. Current fully supervised segmentation methods mainly use pixel/point-level annotated labels, which require substantial manual effort and time. In this work, we propose a weakly supervised root segmentation approach based on graph convolutional networks. Our model requires only image-level annotations to segment roots from images or image sequences. In detail, our model first constructs graphs over neighboring pixels/points and then learns distinguishable features, used as hints for segmentation, by training a classifier on the image-level annotations. Finally, post-processing procedures such as principal component analysis (PCA) are applied to refine the final segmentation results. We conduct experiments on the challenging 2D PRMI minirhizotron benchmark and on 3D switchgrass root X-ray CT datasets for evaluation.
Hyperspectral imaging is a non-destructive imaging technique used in plant phenotyping to collect and analyze an array of electromagnetic information in the visible (380-700 nm) and near-infrared (700-2,500 nm) wavelength regions. Hyperspectral imaging can provide information on plant responses to various biotic and abiotic stresses, e.g., drought, rising temperature, disease, and nutrient deficiency. We present a hyperspectral data processing pipeline designed for data collected at the Ag Alumni Seed Phenotyping Facility (AAPF) at Purdue University, USA. The procedure consists of initializing a processing session, radiometric calibration with white and dark references, geometric calibration (registration) of visible and near-infrared (VNIR) and shortwave infrared (SWIR) images, vegetation and non-vegetation classification, calculation of vegetation indices over the plant area, exporting data products, and quality control. Given the large size of hyperspectral data, we highlight the need to limit memory usage during computation and to conserve disk space for data products. We also address the need for human-interpretable images among the hyperspectral data products, for plant scientists without experience in hyperspectral imaging. We expect the developed procedure to improve the robustness of large-scale hyperspectral data processing and to promote the use of hyperspectral data by increasing interpretability.
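The radiometric calibration step follows the standard flat-field correction, (raw - dark) / (white - dark), applied per band. A minimal sketch, with illustrative array names that may differ from the AAPF pipeline's actual interfaces:

```python
import numpy as np

def radiometric_calibration(raw, white, dark):
    """Per-band reflectance from raw digital numbers using white/dark refs.

    Standard flat-field correction: (raw - dark) / (white - dark).
    Shapes and names are illustrative assumptions; in practice raw would
    be a full (lines, samples, bands) cube and the references would be
    averaged over many reference pixels.
    """
    raw, white, dark = (np.asarray(a, float) for a in (raw, white, dark))
    denom = np.clip(white - dark, 1e-9, None)   # guard against division by zero
    return np.clip((raw - dark) / denom, 0.0, 1.0)

refl = radiometric_calibration(raw=[500, 900], white=[1000, 1000], dark=[100, 100])
print(refl.tolist())
```

Clipping to [0, 1] keeps downstream vegetation-index calculations well behaved when sensor noise pushes values outside the reference range.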
Global wheat production needs to increase by 60% to ensure future food security. Radiation use efficiency (RUE), defined as dry matter production per unit of intercepted light energy, is an important trait contributing to wheat yield potential. Traditionally, RUE is estimated through sequential biomass cuts evaluated against cumulative light interception, which is imprecise and non-specific to genotypes. 3D models used with ray-tracing algorithms have recently shown promise in estimating light interception, though mostly for single plants, while light interception at the canopy level remains to be explored. In this study, a mobile robotic phenotyping platform equipped with dual multispectral laser sensors was used to generate canopy 3D data. Using this platform, 100 spring wheat genotypes were scanned at heading stage to assess the genetic variation in RUE and its associated traits under field conditions. Ray-tracing algorithms were used to estimate the fraction of intercepted photosynthetically active radiation (FIPAR) for all genotypes, validated against a hand-held light ceptometer. Genotype-specific RUE was calculated as the slope between dry biomass and accumulated PAR. 3D model-based FIPAR was in close agreement with ceptometer-derived FIPAR. 3D model-derived RUE showed large genetic variation across the 100 wheat genotypes and explained more of the variation in grain yield than ceptometer-derived RUE. These results indicate that canopy 3D models can serve as a rapid method for estimating canopy RUE in wheat, and are potentially extendable to other cereals.
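The genotype-specific RUE computation, the slope of dry biomass against accumulated PAR, can be sketched as an ordinary least-squares slope. The values below are toy numbers, not the study's data:

```python
import numpy as np

def rue_slope(cum_par, dry_biomass):
    """Genotype RUE as the least-squares slope of biomass vs. accumulated PAR.

    cum_par: cumulative intercepted PAR, e.g. from FIPAR x incident PAR
    dry_biomass: matching dry-matter measurements
    Units are illustrative; this sketches only the slope computation, not
    the ray-tracing or FIPAR estimation steps.
    """
    x = np.asarray(cum_par, float)
    y = np.asarray(dry_biomass, float)
    # ordinary least-squares slope through the data
    return ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()

print(round(rue_slope([0, 100, 200, 300], [0, 150, 300, 450]), 3))
```

Repeating this per genotype yields the RUE values whose genetic variation and relationship to grain yield the study examines.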
Grain and seed properties can be evaluated using near-infrared spectroscopy and other methods for post-harvest quality assessment. Hyperspectral imaging combines spectroscopy with spatial information, which provides additional features that may improve predictive models of seed traits. To assess the ability of deep learning models to use hyperspectral data for predicting phenotypes, we first aimed to predict the genotype of maize seeds. Previous work achieved high identification accuracy among a small set of genotypes using either RGB images or hyperspectral data, and we hypothesized that high-spectral-resolution (350-1000 nm) hyperspectral data would outperform simple RGB data in our study. Our dataset consisted of hyperspectral images of maize seeds from 47 inbred lines, including the 26 NAM lines, with 96 individual seeds per genotype. We evaluated the difference in genotype identification accuracy using three representations of the individual seed data: 1) the whole scan, containing reflectance at 580 different wavelengths; 2) a subset containing reflectance at 3 wavelengths corresponding to a pseudo-RGB image; and 3) a gray-scale image derived from the pseudo-RGB image. We fine-tuned VGG11, a popular convolutional neural network, on 85% of the individual seed data for each representation. We obtained around 90% genotype prediction accuracy on the unseen data for both the whole-scan and pseudo-RGB representations, and 72% accuracy using the gray-scale data. These results indicate that the shape and color information contained in RGB images might be sufficient for the task of maize seed genotype identification.
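The pseudo-RGB representation in 2) amounts to selecting three bands from the reflectance cube. A minimal sketch, where the R/G/B band centers are assumptions rather than the study's exact choices:

```python
import numpy as np

def pseudo_rgb(cube, wavelengths, rgb_nm=(640.0, 550.0, 460.0)):
    """Extract a 3-band pseudo-RGB image from a hyperspectral seed scan.

    cube: (H, W, B) reflectance cube; wavelengths: (B,) band centers in nm.
    The R/G/B band centers are illustrative assumptions, not the study's
    exact selections.
    """
    wavelengths = np.asarray(wavelengths)
    idx = [int(np.argmin(np.abs(wavelengths - nm))) for nm in rgb_nm]
    return cube[:, :, idx]

wl = np.linspace(350, 1000, 580)                 # 580 bands, as in the scans
cube = np.random.default_rng(0).random((4, 4, 580))
img = pseudo_rgb(cube, wl)
print(img.shape)
```

The gray-scale representation in 3) would then follow by averaging or luminance-weighting the three pseudo-RGB channels.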
Holistic assessment of fruit quality is an essential component of producing strawberry varieties that will succeed in the marketplace and improve consumer satisfaction. However, several key quantitative traits are notoriously slow and expensive to assess using standard procedures, namely acidity and aroma, which require titration and gas chromatography-mass spectrometry, in contrast to brix, anthocyanins, and vitamin C, which are measured by refractometer and parallelized plate-reader assays. Scaling up evaluations of acidity and aroma has been difficult: the techniques require 5 and 40 min/sample, respectively, and sample preparation is equally intensive, requiring multiple trained hands working 10-hour sessions to create the sample series for 100 entries. We evaluated the ability (R², RMSE) of a handheld near-infrared (NIR) spectrometer, measuring 125 wavelengths between 800 and 1600 nm, and an electronic nose, measuring the response of 32 electrochemical sensors to various compounds in gas samples, on 4,000 diverse strawberry accessions, to determine whether the 5 and 40 min/sample assays can be replaced with a 1 sec/sample NIR assay (0.33% of the original time) and a 2 min/sample E-nose assay (5% of the original time), neither requiring additional sample preparation. We also assess the NIR spectrometer's ability to predict brix, anthocyanins, and vitamin C. With these two sensors, we will be able to increase the scale of early-generation evaluation from hundreds to thousands of samples, produce full datasets before breeding-program deadlines, and make more reliable genetic gains for quality traits affecting marketability and consumer acceptance.
The California strawberry industry generated more than 2 billion dollars in revenue in 2020 (USDA-ERS). Strawberry breeders develop new varieties to increase productivity in the face of shifting biotic and abiotic stresses. The University of California, Davis maintains a strawberry breeding program that evaluates >10,000 entries yearly to meet the demand for new improved varieties, focusing on plant productivity, fruit quality, and resistance to soil-borne pathogens. One challenge for a breeding program of this scale is efficiently scoring and collecting detailed information on cultivar performance. Traits like plant size and growth rate are rarely collected. It takes a crew of 4 people 20-25 hours to score fruit count, so it is currently done once per week. Correlated traits, e.g., plant size and vigor, assessed by drone imagery could provide high-quality information and replace labor-intensive assessments of phenotypic traits and yield. In 2022 we deployed drones to generate research-grade imagery of nearly 10,000 entries at Wolfskill Experimental Orchard in Winters, CA and another 3,000 entries under induced disease pressure to determine the best predictors of productivity and disease severity from drone imagery. We applied image analytics tools developed by HIPHEN to extract ground coverage, plant height, biovolume, and a range of visual indices from multiple sensors to assess cultivar performance. The extracted traits were then used as independent variables to predict either yield or visual disease severity. We report our initial findings, examine the successes and lessons learned, and propose solutions to ongoing challenges in strawberry breeding.
ORCiD: [https://orcid.org/0000-0001-6665-6094] Plant phenotyping is an essential aspect of crop science analytics, providing critical information about plants' genetics, traits, productivity, and other details needed to understand their performance under specific conditions and in specific environments. This information has been quantified with a variety of models built from several kinds of datasets. In this study, we extract phenotypic information about soybean from UAS-based images captured over the growing fields within the selected experimental site. A DJI M300 unmanned aerial system was equipped with Zenmuse P1 and L1 sensors to capture RGB and LiDAR imagery, respectively. In addition, a DJI P4 multispectral UAS was used to collect multispectral information over these fields at various date intervals. The captured data are being processed using custom-developed algorithms and automated workflows to obtain biomass, vegetation indices, canopy cover, canopy height, and canopy volume. These indices are expected to reveal variation in the traits of the soybean crop under study. The phenotypic information will be compared against field measurements for validation.
Interactive annotation for object delineation can be considered a semi-supervised few-shot learning problem in which machine learning models learn from a small set of annotated pixels and generalize to the entire picture to extract the object of interest. One aim of interactive annotation is to reduce the effort of manually labeling data. Some existing works have addressed this problem with deep metric learning, so that the encoding layers of the network extract features that boost discriminability among pixels belonging to different classes. To preserve the data structure in the embedding space, a metric loss with prototypes has been proposed. In our work, we improve on existing methods by developing a new objective function that updates the network and the prototypes simultaneously. The prototypes are optimized with a loss that enhances their dissimilarity, rather than being obtained by clustering or by sampling from the dataset. Moreover, we designed a GUI implementing the proposed method for interdisciplinary collaboration in image-supported plant phenotyping studies.
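The inference side of a prototype-based approach assigns each pixel embedding to its nearest class prototype. The sketch below covers only that step, under the assumption of Euclidean distances in the embedding space; the joint optimization of network and prototypes described above is the paper's contribution and is not shown:

```python
import numpy as np

def prototype_predict(features, prototypes):
    """Assign each pixel embedding to the nearest class prototype.

    features: (N, D) pixel embeddings from the encoder
    prototypes: (C, D) class prototypes
    Euclidean nearest-prototype assignment is an illustrative assumption;
    the actual method learns prototypes jointly with the network.
    """
    f = np.asarray(features, float)
    p = np.asarray(prototypes, float)
    # (N, C) squared distances via broadcasting, then nearest class per pixel
    d = ((f[:, None, :] - p[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

protos = [[0.0, 0.0], [1.0, 1.0]]            # two toy class prototypes
pix = [[0.1, -0.1], [0.9, 1.2], [0.6, 0.6]]  # three toy pixel embeddings
print(prototype_predict(pix, protos).tolist())
```

In the interactive setting, the handful of user-annotated pixels supplies the labels against which the embeddings and prototypes are trained, and this assignment step then propagates labels to the rest of the image.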