WSEAS Transactions on Systems

Print ISSN: 1109-2777
E-ISSN: 2224-2678

Volume 17, 2018



Geometrical Stem Detection from Image Data for Precision Agriculture

AUTHORS: Ferdinand Langer, Leonard Mandtler, Andres Milioto, Emanuele Palazzolo, Cyrill Stachniss


ABSTRACT: High efficiency in precision farming depends on accurate tools for weed detection and crop mapping. Such tools enable the precise removal of harmful weeds with a smaller amount of pesticides, and can increase the harvest's yield by providing the farmer with valuable information. In this paper, we address the problem of fully automatic stem detection from image data for this purpose. Our approach runs on mobile agricultural robots taking RGB images. After processing the images to obtain a vegetation mask, our approach separates each plant into its individual leaves and then estimates a precise stem position. This allows an upstream mapping algorithm to add the high-resolution stem positions as a semantic aggregate to the robot's global map, which can be used for weeding and for analyzing crop statistics. We implemented our approach and thoroughly tested it on three different datasets with vegetation masks and stem-position ground truth. The experiments presented in this paper show that our module is able to detect leaves and estimate stem positions at a rate of 56 Hz on a single CPU. We furthermore provide the software to the community.
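The abstract outlines a pipeline of vegetation masking, per-plant separation, and stem-position estimation. The toy sketch below is not the authors' implementation: it assumes an excess-green (ExG) threshold for the vegetation mask and uses a plain blob centroid as a stand-in for the paper's geometric, leaf-based stem estimate; the function names and thresholds are hypothetical.

```python
# Illustrative sketch only, not the paper's method: ExG thresholding for the
# vegetation mask, then a 4-connected flood fill per plant blob with the blob
# centroid as a crude stem-position proxy (the paper uses leaf geometry).
from collections import deque

import numpy as np


def vegetation_mask(rgb, thresh=0.1):
    """ExG vegetation mask from an HxWx3 float RGB image with values in [0, 1]."""
    s = rgb.sum(axis=2) + 1e-8                 # avoid division by zero on dark pixels
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    exg = 2.0 * g - r - b                      # classic excess-green index
    return exg > thresh


def stem_candidates(mask, min_area=20):
    """Return one (row, col) centroid per 4-connected vegetation blob."""
    h, w = mask.shape
    visited = np.zeros((h, w), dtype=bool)
    stems = []
    for sy in range(h):
        for sx in range(w):
            if not mask[sy, sx] or visited[sy, sx]:
                continue
            # BFS flood fill of one blob
            queue = deque([(sy, sx)])
            visited[sy, sx] = True
            pixels = []
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w \
                            and mask[ny, nx] and not visited[ny, nx]:
                        visited[ny, nx] = True
                        queue.append((ny, nx))
            if len(pixels) >= min_area:        # drop tiny noise blobs
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                stems.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return stems
```

On a synthetic image with a single green square, `stem_candidates` returns one centroid at the square's center; the real system replaces the centroid with the leaf-geometry-based estimate evaluated in the paper.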

KEYWORDS: Stem Detection, Sugar Beets, Weeding, Robotics


[1] W. Guo, U.K. Rage, and S. Ninomiya. Illumination invariant segmentation of vegetation for time series wheat images based on decision tree model. Computers and Electronics in Agriculture, 96:58–66, 2013.

[2] D. Hall, C.S. McCool, F. Dayoub, N. Sunderhauf, and B. Upcroft. Evaluation of features for leaf classification in challenging conditions. In Proc. of the IEEE Winter Conf. on Applications of Computer Vision (WACV), pages 797–804, Jan 2015.

[3] E. Hamuda, M. Glavin, and E. Jones. A survey of image processing techniques for plant extraction and segmentation in the field. Computers and Electronics in Agriculture, 125:184–199, 2016.

[4] S. Haug, P. Biber, A. Michaels, and J. Ostermann. Plant stem detection and position estimation using machine vision. In Proc. of the Intl. Workshop on Recent Advances in Agricultural Robotics, 2014.

[5] J. Hemming, E.J. van Henten, B.A.J. van Tuijl, and J. Bontsema. A leaf detection method using image sequences and leaf movement. Acta Horticulturae, 691:877 – 884, 2005.

[6] S. Kiani and A. Jafari. Crop detection and positioning in the field using discriminant analysis and neural networks based on shape features. Journal of Agricultural Science and Technology, 14:755–765, 07 2012.

[7] F. Kraemer, A. Schaefer, A. Eitel, J. Vertens, and W. Burgard. From Plants to Landmarks: Time-invariant Plant Localization that uses Deep Pose Regression in Agricultural Fields. In IROS Workshop on Agri-Food Robotics, 2017.

[8] P. Lottes, M. Höferlin, S. Sander, M. Müter, P. Schulze-Lammers, and C. Stachniss. An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications. In Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2016.

[9] P. Lottes and C. Stachniss. Semi-supervised online visual crop and weed classification in precision farming exploiting plant arrangement. In Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2017.

[10] H.S. Midtiby, T.M. Giselsson, and R.N. Jørgensen. Estimating the plant stem emerging points (PSEPs) of sugar beets at early growth stages. Biosystems Engineering, 111(1):83–90, 2012.

[11] H.S. Midtiby, T.M. Giselsson, and R.N. Jørgensen. Location of individual leaves in images of sugar beets in early growth stages. In Proc. of the Intl. Conf. of Agricultural Engineering, pages 1–6, 2012.

[12] A. Milioto, P. Lottes, and C. Stachniss. Real-time blob-wise sugar beets vs weeds classification for monitoring fields using convolutional neural networks. In ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2017.

[13] A. Milioto, P. Lottes, and C. Stachniss. Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs. In Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2018.

[14] M. Müter, P. Schulze Lammers, and L. Damerow. Development of an intra-row weeding system using electric servo drives and machine vision for plant detection. In Proc. of the Agricultural Engineering Conference, 2013.

[15] A.T. Nieuwenhuizen. Automated detection and control of volunteer potato plants. PhD thesis, Wageningen University, 2009.

[16] J. Torres-Sánchez, F. López-Granados, and J.M. Peña. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Computers and Electronics in Agriculture, 114:43–52, 2015.

[17] X.-F. Wang, D. Huang, J. Du, H. Xu, and L. Heutte. Classification of plant leaf images with complicated background. Applied Mathematics and Computation, 205:916–926, 2008.

WSEAS Transactions on Systems, ISSN / E-ISSN: 1109-2777 / 2224-2678, Volume 17, 2018, Art. #28, pp. 253-258

Copyright © 2018 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0
