Low-Cost Sensors and Vectors for Plant Phenotyping (Session)


 

Speakers:  

1) Antoine Fournier: Towards Low-Cost Hyperspectral Single-Pixel Imaging for Plant Phenotyping
2) Olivier Pieters: Gloxinia—An Open-Source Sensing Platform to Monitor the Dynamic Responses of Plants
3) Salma Samiei: Toward Joint Acquisition-Annotation of Images with Egocentric Devices for a Lower-Cost Machine Learning Application to Apple Detection

 

Date: October 30th, 2020 / Time: 14:00 (Berlin Time) / 7:00 AM (CDT)

 

Watch the recorded webinar

 
Abstracts:

1) Hyperspectral imaging techniques have expanded considerably in recent years. The cost of current solutions is decreasing, but these high-end technologies are not yet available for moderate- to low-cost outdoor and indoor applications. We combined some of the latest compressive sensing methods with a single-pixel imaging setup. Projected patterns were generated on a Fourier basis, which is well known for its properties and for reducing acquisition and computation times. A low-cost, moderate-throughput prototype was developed and studied in the laboratory, making it possible to obtain metrologically validated reflectance measurements with a minimal computational workload. From these measurements, it was possible to discriminate plant species from the rest of a scene and to identify biologically contrasted areas within a leaf. This prototype provides easy-to-use phenotyping and teaching tools at very low cost.
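The Fourier-basis single-pixel scheme the abstract describes can be illustrated with a short simulation (a toy sketch, not the authors' prototype code; the function name and array sizes are invented for illustration). Each Fourier coefficient of the scene is recovered from four phase-shifted fringe patterns, each summed by a single "photodiode" measurement, and an inverse FFT reconstructs the image; a compressive acquisition would measure only a subset of low frequencies.

```python
import numpy as np

def fourier_single_pixel(scene, freqs):
    """Simulate single-pixel acquisition with phase-shifted Fourier
    fringe patterns, then reconstruct the scene via an inverse FFT."""
    M, N = scene.shape
    rows, cols = np.mgrid[0:M, 0:N]
    spectrum = np.zeros((M, N), dtype=complex)
    for u, v in freqs:
        theta = 2 * np.pi * (u * rows / M + v * cols / N)
        # Four projected fringe patterns per spatial frequency; each
        # measurement D is the single scalar a photodiode would record.
        D = [np.sum((0.5 + 0.5 * np.cos(theta + phi)) * scene)
             for phi in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
        # Four-step phase shifting recovers the complex Fourier coefficient.
        spectrum[u, v] = (D[0] - D[2]) + 1j * (D[1] - D[3])
    return np.real(np.fft.ifft2(spectrum))

# Fully sampled example: measuring every frequency recovers the scene
# exactly; keeping only low frequencies gives a compressed approximation.
scene = np.random.rand(8, 8)
all_freqs = [(u, v) for u in range(8) for v in range(8)]
recon = fourier_single_pixel(scene, all_freqs)
print(np.allclose(recon, scene))  # True
```

The appeal for low-cost hardware is that the detector is a single photodiode (or spectrometer, for the hyperspectral case) rather than a 2-D sensor array, and the FFT keeps the reconstruction workload minimal.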

 

2) The study of the dynamic responses of plants to short-term environmental changes is becoming increasingly important in basic plant science, phenotyping, breeding, crop management, and modelling. These short-term variations are crucial in plant adaptation to new environments and, consequently, in plant fitness and productivity. Scalable, versatile, accurate, and low-cost data-logging solutions are necessary to advance these fields and complement existing sensing platforms such as high-throughput phenotyping. However, current data-logging and sensing platforms do not meet the requirements to monitor these responses. Therefore, a new modular data-logging platform was designed, named Gloxinia. Different sensor boards are interconnected depending on the needs, with the potential to scale to hundreds of sensors in a distributed sensor system. To demonstrate the architecture, two sensor boards were designed—one for single-ended measurements and one for lock-in-amplifier-based measurements, named Sylvatica and Planalta, respectively. To evaluate the performance of the system in small setups, a small-scale trial was conducted in a growth chamber. Expected plant dynamics were successfully captured, indicating proper operation of the system. Though a large-scale trial was not performed, we expect the system to scale very well to larger setups. Additionally, the platform is open-source, enabling other users to easily build upon our work and perform application-specific optimisations.
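For readers unfamiliar with lock-in measurements such as those performed by the Planalta board, the underlying principle can be sketched numerically (an illustrative simulation, not the Gloxinia firmware; the signal parameters and function name are invented). Mixing the input with quadrature references at the excitation frequency and low-pass filtering (here, simple averaging over whole periods) recovers the amplitude and phase of a tone buried well below the noise floor:

```python
import numpy as np

def lock_in(signal, t, f_ref):
    """Digital lock-in demodulation: mix with quadrature references at
    f_ref and average, isolating that frequency component from noise."""
    x = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase
    y = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature
    amplitude = 2 * np.hypot(x, y)
    phase = np.arctan2(x, y)
    return amplitude, phase

# A 5 Hz tone of amplitude 0.1, buried in noise 5x stronger,
# sampled at 1 kHz for 10 s (an integer number of periods).
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1e-3)
signal = 0.1 * np.sin(2 * np.pi * 5 * t + 0.3) + rng.normal(0, 0.5, t.size)
amp, ph = lock_in(signal, t, 5.0)  # close to (0.1, 0.3)
```

This noise rejection is what makes lock-in sensing attractive for weak plant signals, where the quantity of interest is a small modulated response riding on much larger ambient fluctuations.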

 

3) Since most computer vision approaches are now driven by machine learning, the current bottleneck is the annotation of images. This time-consuming task is usually performed manually after the acquisition of images. In this article, we assess the value of various egocentric vision approaches for performing joint acquisition and automatic image annotation, rather than the conventional two-step process of acquisition followed by manual annotation. The approach is illustrated with apple detection in challenging field conditions. We demonstrate the possibility of high performance in automatic apple segmentation (Dice 0.85), apple counting (an 88% probability of good detection and a 0.09 true-negative rate), and apple localization (a shift error of less than 3 pixels) with eye-tracking systems. This is obtained by simply applying the areas of interest captured by the egocentric devices to standard, non-supervised image segmentation. We especially stress the time savings from using such eye-tracking devices on head-mounted systems to jointly perform image acquisition and automatic annotation. A more than 10-fold gain in time compared with classical image acquisition followed by manual image annotation is demonstrated.
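The core idea of seeding a standard, non-supervised segmentation with an egocentric area of interest can be sketched on synthetic data (a toy illustration, not the authors' pipeline; the gaze coordinates, threshold, and scene are invented). Here a simple thresholding-plus-connected-components segmentation produces candidate regions, the gaze fixation selects which region is the object, and the Dice coefficient scores the result:

```python
import numpy as np
from scipy import ndimage

def dice(a, b):
    """Dice overlap between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2 * inter / (a.sum() + b.sum())

def gaze_seeded_mask(image, gaze_rc, threshold):
    """Non-supervised segmentation (thresholding + connected components);
    the gaze fixation picks which component is the object of interest."""
    components, _ = ndimage.label(image > threshold)
    label_at_gaze = components[gaze_rc]
    if label_at_gaze == 0:                     # gaze fell on background
        return np.zeros(image.shape, dtype=bool)
    return components == label_at_gaze

# Synthetic scene: one bright "apple" blob on a dark background, with
# the (hypothetical) gaze fixation landing inside it.
rr, cc = np.mgrid[0:64, 0:64]
truth = (rr - 30) ** 2 + (cc - 40) ** 2 < 80
image = np.where(truth, 1.0, 0.0)
pred = gaze_seeded_mask(image, (30, 40), 0.5)
print(dice(pred, truth))  # 1.0
```

On real field images the segmentation is imperfect, hence scores such as the Dice 0.85 reported above; the point is that no manual outlining is needed once the gaze supplies the seed.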

 
Speaker Bios:

1) 


Antoine Fournier is an engineer in photonics and holds a PhD in remote sensing of vegetation. After conducting studies with INRAe on remote-sensing unmixing algorithms, he works for ARVALIS-institut du vegetal, supporting the deployment of phenotyping tools and platforms among technical teams and agronomic analysts. Since 2018, he has led an AgroPhotonics partnership with Photonics-Bretagne, which aims to speed up access to breakthrough photonics technologies for plant sciences, agronomy, agriculture, and environmental challenges.

Phenotyping Tools at ARVALIS

 

2)

 

Olivier Pieters is a PhD researcher at IDLab-AIRO, Ghent University, and ILVO, with a background in information and communication technology. He has experience designing distributed online systems for both phenotyping and micro-climate sensing. During his PhD, he is investigating the dynamic responses of plants to external influences. These dynamics can provide a new conceptual framework for understanding plants' reactions to environmental changes and lead to earlier stress detection and alleviation in more controlled settings such as greenhouses.

 

3)

Salma Samiei is a postdoctoral researcher at Angers University, working on developing and applying computer vision and deep learning approaches in the context of plant phenotyping. During her PhD, she contributed to low-cost imaging and machine learning for plant phenotyping, focusing on reducing the cost of annotation and computation in deep learning. She works with large-scale, real-world data to build intelligent systems with real-world impact.