Plant Leaf Position Estimation with Computer Vision
Autonomous analysis of plants, such as for phenotyping and health monitoring, often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives may include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision-based solution capable of estimating the three-dimensional location of all leaves of a subject plant with the use of a single digital camera autonomously positioned by a three-axis linear robot. A custom-trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth, and from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
Main Authors: James Beadle, C. James Taylor, Kirsti Ashworth, David Cheneler
Format: Article
Language: English
Published: MDPI AG, 2020-10-01
Series: Sensors
Subjects: neural network; computer vision; depth estimation; position estimation; parallax
Online Access: https://www.mdpi.com/1424-8220/20/20/5933
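The abstract above describes recovering leaf depth from parallax between images taken at known camera offsets on a three-axis linear robot. As a rough illustration of that geometric relation only, and not the authors' implementation, a leaf that shifts by d pixels between two views separated by a baseline b lies at a depth of roughly Z = f·b/d for a pinhole camera with focal length f expressed in pixels. The focal length, baseline, and disparity values below are hypothetical placeholders:

```python
def parallax_depth(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Pinhole-camera parallax: depth = focal length (px) * baseline (m) / disparity (px).

    baseline_m   -- camera translation between the two images, in metres
    focal_px     -- focal length expressed in pixels
    disparity_px -- shift of the leaf centre between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; the leaf must shift between views")
    return focal_px * baseline_m / disparity_px


# Hypothetical numbers: a 50 mm camera move and a 120 px shift of a detected
# leaf centre, with a ~1400 px focal length, give a depth estimate of ~0.58 m.
if __name__ == "__main__":
    depth_m = parallax_depth(baseline_m=0.05, focal_px=1400.0, disparity_px=120.0)
    print(f"estimated leaf depth: {depth_m:.3f} m")
```

In the pipeline described by the article, the disparity would come from matching neural-network leaf detections across the multiple images rather than from hand-supplied values.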
id
doaj-ce03b13292c24aa1aebb405eb8f73ce4
doi
10.3390/s20205933
citation
Sensors, Vol. 20, Iss. 20, Art. 5933 (2020-10-01)
author_affiliations
James Beadle, C. James Taylor, David Cheneler: Engineering Department, Lancaster University, Lancaster LA1 4YW, UK
Kirsti Ashworth: Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YW, UK
collection
DOAJ
sources
DOAJ
issn
1424-8220