Webinar: Artificial intelligence and hyperspectral imaging for high-throughput plant phenotyping
Ali Moghimi from the University of California, Davis will present a webinar, "Artificial intelligence and hyperspectral imaging for high-throughput plant phenotyping," on Tuesday, September 24, 2019, at 8:30 am PDT (UTC -7) as part of the IEEE RAS Technical Committee on Agricultural Robotics and Automation's (AgRA) webinar series.
Please register for this webinar here using Zoom:
This webinar will also be broadcast with YouTube Live, which is an option from Zoom (registration not necessary if you are joining via YouTube Live).
Commenting and asking questions are still possible via YouTube Live. The link is not available beforehand, so check Twitter about 15 minutes before the webinar, where I will post it publicly:
All details, including a time zone converter, technical details, and the presenter's biography, are available here:
Title: Artificial intelligence and hyperspectral imaging for high-throughput plant phenotyping
Abstract: Artificial intelligence (AI) is becoming an increasingly important tool for sustainable crop production in the era of digital agriculture. In this talk, I present my Ph.D. research, in which I utilized AI to leverage the unique advantages of hyperspectral imaging for investigating desired phenotypic traits in wheat with both indoor and field setups.
For our indoor setup, we developed a sensor-based framework for analyzing hyperspectral images to quantify differences in salt tolerance among four wheat lines. We were able to attain a quantitative ranking as early as one day after applying the salt treatment. In addition, we developed an ensemble feature selection pipeline to identify the spectral bands most informative of a desired phenotyping trait. I present the results of testing this pipeline in finding the most prominent bands for salt stress assessment and Fusarium head blight detection in wheat.
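To illustrate the idea of ensemble feature selection over spectral bands, here is a minimal sketch: several base selectors each rank the bands, and the per-band ranks are averaged to pick a consensus top-k. The synthetic data, the choice of base selectors, and the rank-averaging rule are all assumptions for illustration, not the presenter's actual pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif, f_classif

def ensemble_band_ranking(X, y, top_k=5, seed=0):
    """Aggregate per-band rankings from several selectors into one ranking."""
    scores = []
    # Selector 1: mutual information between each band and the class label.
    scores.append(mutual_info_classif(X, y, random_state=seed))
    # Selector 2: per-band ANOVA F-statistic.
    scores.append(f_classif(X, y)[0])
    # Selector 3: random-forest impurity-based importances.
    rf = RandomForestClassifier(n_estimators=100, random_state=seed).fit(X, y)
    scores.append(rf.feature_importances_)
    # Convert each score vector to ranks (0 = best band) and average them.
    ranks = np.vstack([np.argsort(np.argsort(-s)) for s in scores])
    return np.argsort(ranks.mean(axis=0))[:top_k]  # indices of top-k bands

# Synthetic stand-in for per-pixel spectra: 200 samples x 20 "bands",
# with class labels standing in for stressed vs. healthy plants.
X, y = make_classification(n_samples=200, n_features=20, n_informative=4,
                           random_state=0)
top_bands = ensemble_band_ranking(X, y, top_k=5)
print("Top-ranked band indices:", top_bands)
```

Averaging ranks rather than raw scores keeps selectors with very different score scales (mutual information vs. F-statistics vs. importances) on an equal footing.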
For our field setup, we mounted the hyperspectral camera on an unmanned aerial vehicle to collect aerial imagery over two consecutive growing seasons from three experimental yield fields comprising hundreds of experimental wheat lines. We trained a deep neural network with fully connected layers for yield prediction. While conventional harvesting of plots for yield measurement is extremely laborious and time-consuming, our automated framework can predict the yield of wheat plots in a fast, cost-effective manner. In addition, our framework offers breeders insight into yield variation at the sub-plot scale, a valuable new index in breeding programs for nominating high-yielding cultivars that produce a uniform yield across the plot. The results revealed that the proposed framework can also serve as a valuable tool for remote visual inspection of the plots and for optimizing plot size so that more lines can be investigated in a dedicated field each year.
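As a rough sketch of the yield-prediction step, the snippet below fits a small fully connected regressor to plot-level spectral summaries. The feature layout (one mean reflectance per band per plot), the synthetic yields, and the network size are assumptions for illustration; the architecture presented in the talk may differ.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in: 300 plots, each summarized by 30 band-wise mean
# reflectances, with yield a noisy function of a few bands.
X = rng.uniform(0.0, 1.0, size=(300, 30))
y = 2.0 * X[:, 3] - 1.5 * X[:, 10] + 0.5 * X[:, 20] + rng.normal(0, 0.05, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
# Two fully connected hidden layers; sizes are illustrative.
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)
print(f"Held-out R^2: {r2:.2f}")
```

For sub-plot analysis as described above, the same model could be applied to spectra aggregated over sub-plot regions instead of whole plots, yielding a per-region prediction map.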