Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions held on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

This is just the initial draft of the program. The complete program will be available soon.

Precision Agriculture (PA)
Agriculture Engineering (AE)
Food and Bioprocessing (F&B)
Irrigation (Irri)
Aquaculture/Aquaponics (Aqua)
Greenhouse (GH)
Bioenergy (BioE)
Environment (ENV)
Climate Change (CC)
Water and Soil management (W&SM)
Waste Management (WASM)
Knowledge Transfer, Society and Economics (KTSE)
Other

Sections

FBWK: F&B+BioE+WASM+KTSE

IAWGO: Irri+Aqua+W&SM+GH+Other

AP: AE+PA

CE: CC+ENV

Session Overview
Session: CE2
Time: Monday, 24/July/2023, 10:40am - 12:00pm
Session Chair: Grant Clark
Location: Room TT1941, Trades, Technology & Innovation Facility

Presentations
10:40am - 11:00am

Fusion of thermal and RGB-D imagery for 3D Plant Structure and Temperature Characterization

Xintong Jiang, Shangpeng Sun

McGill University, Canada

Temperature is one of the most widely acknowledged abiotic environmental factors, affecting almost all aspects of plant function. Thermal infrared (TIR) imaging has been employed to measure plant canopy temperature and assess physiological characteristics. Compared to RGB point clouds, accurate thermal point clouds are difficult to obtain directly due to the low resolution of thermal sensors. This study will compare two photogrammetric approaches for generating high-density thermal point clouds via image fusion, characterizing surface temperature at both the organ and whole-plant scales. A FLIR Boson 640 thermal camera (with a resolution of 640×480 pixels) and an Intel RealSense Depth Camera D455 (with a resolution of 1280×800 pixels for RGB) will be calibrated for the acquisition of TIR images and RGB-D images of single plants. The first approach will register the TIR images with the RGB and depth images; the TIR images will then be projected onto the RGB point clouds for thermal texture mapping. The second approach will first construct thermal and RGB point clouds separately using structure from motion, followed by registration of the two point clouds. The point clouds generated using the two methods will be compared based on their point density, resolution, completeness, and interpolation. The method of obtaining high-resolution thermal point clouds will provide novel 3D analytical tools for analyzing plant temperature patterns and their relationships to structure and function at both the organ and whole-plant levels.
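For illustration only, the following is a minimal sketch of the thermal texture mapping idea behind the first approach, assuming the TIR image has already been registered pixel-to-pixel to the depth image and that the pinhole intrinsics of the depth camera are known; the function name and arguments are illustrative and are not the authors' code.

```python
# Minimal sketch, not the authors' implementation: attach a per-point temperature
# to a point cloud by back-projecting the depth map with pinhole intrinsics.
import numpy as np

def thermal_point_cloud(depth, thermal, fx, fy, cx, cy):
    """depth, thermal: (H, W) arrays registered pixel-to-pixel;
    fx, fy, cx, cy: pinhole intrinsics of the depth camera."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    valid = z > 0                                               # drop pixels with no depth reading
    points = np.stack([x[valid], y[valid], z[valid]], axis=1)   # (N, 3) XYZ coordinates
    temps = thermal[valid]                                      # temperature carried by each point
    return points, temps
```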



11:00am - 11:20am

Crop row detection by UAV images based on an improved Mask-RCNN

Kunwei Sun1, Shangpeng Sun1, Ana Julia Righetto2, Breno Rachid2

1McGill University, Canada; 2Alvaz Agritech do Brasil, Brazil

Unmanned aerial vehicle (UAV)-based technologies have been widely applied to capture images for monitoring crop growth and development. In this study, we propose a Mask-RCNN-based method for crop row detection from RGB images captured using a UAV platform. Instead of operating on regular dense tensors, the proposed network decomposes and represents image regions as a quadtree, detecting error-prone tree nodes and self-correcting errors in parallel, which leads to low computational costs and highly accurate instance masks. The method will be able to process images with shadows and weeds and will also address the problem of curved row detection. The detection results can provide growers with computer vision-based navigation for agricultural machinery.
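As a rough point of reference, the sketch below runs a stock, pre-trained Mask R-CNN from torchvision (assuming torchvision 0.13 or later) on a single UAV image; the quadtree decomposition and parallel self-correction described above are the proposed improvements and are not reproduced here. The image path is a placeholder.

```python
# Minimal sketch using a stock pre-trained Mask R-CNN from torchvision;
# the quadtree-based improvements described in the abstract are not included.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = to_tensor(Image.open("uav_field.jpg").convert("RGB"))  # placeholder path
with torch.no_grad():
    prediction = model([image])[0]

keep = prediction["scores"] > 0.5            # keep confident detections only
masks = prediction["masks"][keep] > 0.5      # binarize per-instance soft masks
boxes = prediction["boxes"][keep]            # corresponding bounding boxes
```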



11:20am - 11:40am

Generative Adversarial Networks and RGB Cameras: A Cost-Effective Approach for Vegetation Index Analysis in Agricultural Crop Monitoring

Hassan Afzaal1, Aitazaz Farooque1,2, Kuljeet Grewal1

1Faculty of Sustainable Design Engineering, University of Prince Edward Island, Charlottetown, Canada; 2School of Climate Change and Adaptation, University of Prince Edward Island, Charlottetown, Canada

Vegetation indices (VIs) have become an important tool in agriculture for monitoring crop growth, health, and yield potential. These indices provide valuable information that can guide informed decisions in crop management, including fertilizer application, irrigation scheduling, and crop harvesting. In this study, we investigated the potential of generative adversarial networks (GANs) to translate RGB images into VIs, including the normalized difference vegetation index, simple ratio, and green vegetation index.
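For reference, two of the indices named above follow standard definitions; the sketch below computes NDVI and the simple ratio from co-registered NIR and red reflectance bands. The green vegetation index has several definitions in the literature and is omitted here; this is not the authors' processing pipeline.

```python
# Standard index definitions applied to co-registered reflectance bands;
# a sketch for reference, not the authors' processing pipeline.
import numpy as np

def vegetation_indices(nir, red, eps=1e-6):
    """Return (NDVI, simple ratio) for equally shaped NIR and red arrays."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    ndvi = (nir - red) / (nir + red + eps)   # normalized difference vegetation index
    sr = nir / (red + eps)                   # simple ratio
    return ndvi, sr
```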

To evaluate the performance of GANs, extensive drone imagery was collected over the 2021 growing season for three potato fields in Prince Edward Island. The dataset was divided into three stages, early, mid, and late season, to capture the vegetation variations throughout the growing season. The Pix2PixHD deep learning architecture was used to train the GANs on over 500 diverse field images, enabling the translation of RGB images into useful VIs.

We employed statistical indicators such as histogram comparison, the structural similarity index, and the root mean square error to compare the GAN-generated images with ground-truth images. The results demonstrated the potential of GANs in VI monitoring. Specifically, the GAN-generated VIs exhibited a high degree of similarity to the ground-truth VIs, indicating the efficacy of the GAN-based approach.
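For illustration, a minimal sketch of computing the structural similarity index and root mean square error between a generated index map and its ground-truth counterpart, assuming single-band floating-point arrays of equal shape; this is not the authors' evaluation code.

```python
# Sketch of the comparison metrics named above, assuming single-band
# floating-point arrays of equal shape; not the authors' evaluation code.
import numpy as np
from skimage.metrics import structural_similarity

def compare_index_maps(generated, ground_truth):
    """Return (SSIM, RMSE) between a generated and a ground-truth index map."""
    data_range = float(ground_truth.max() - ground_truth.min())
    ssim = structural_similarity(generated, ground_truth, data_range=data_range)
    rmse = float(np.sqrt(np.mean((generated - ground_truth) ** 2)))
    return ssim, rmse
```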

Our findings hold significant implications for the future of precision agriculture, where GANs can play an important role in improving crop management and yield potential. The detailed results of this study will be presented at the CSBE 2023 conference.



 