Vision-Based Perception for Localization of Autonomous Agricultural Robots
University of Skövde, School of Engineering Science. University of Skövde, The Virtual Systems Research Centre. (Produktion och Automatiseringsteknik)
2017 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In this thesis Stefan investigates how cameras can be used for localization of an agricultural mobile robot. He focuses on relative measurements that can be used to determine where a weeding tool is operating relative to a weed-detection sensor. The work incorporates downward-facing perspective cameras, forward-facing perspective cameras and omnidirectional cameras. Stefan shows how the camera's ego-motion can be estimated to obtain not only the position in 3D but also the orientation. He also shows how line structures in the field can be used to navigate a robot along the rows.
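
The ego-motion estimation the summary refers to can be illustrated with a standard rigid-alignment (Kabsch) step: recovering rotation and translation from matched 3D points, such as those a stereo front end might triangulate. This is a generic sketch with invented names, not code from the thesis:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with Q ~ R @ P + t
    (Kabsch/Procrustes), from matched 3D point sets of shape (3, N)."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T            # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: recover a known 30-degree yaw and a small translation
a = np.deg2rad(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([[0.5], [0.1], [0.0]])
P = np.random.default_rng(1).random((3, 10))
Q = R_true @ P + t_true
R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

Chaining such frame-to-frame transforms gives both the 3D position and the orientation of the camera over time.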

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2017. 164 p.
Series
Dissertation Series, 16 (2017)
National Category
Robotics
Identifiers
URN: urn:nbn:se:his:diva-13408
ISBN: 978-91-982690-7-9 (print)
OAI: oai:DiVA.org:his-13408
DiVA: diva2:1077579
Available from: 2017-02-28. Created: 2017-02-28. Last updated: 2017-03-06. Bibliographically approved.
List of papers
1. Stereo Visual Odometry for Mobile Robots on Uneven Terrain
2008 (English). In: Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008 / [ed] Sio-Iong Ao, IEEE Computer Society, 2008, 150-157 p. Conference paper (Refereed)
Abstract [en]

In this paper we present a stereo visual odometry system for mobile robots that is not sensitive to uneven terrain. Two cameras are mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross-correlation. A method for evaluating the system is developed, where flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of traveled distance on surfaces where the maximum height variation is measured to 96 mm. The variance is measured for eight test runs, 5.6 m in total, to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.
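
The normalized cross-correlation step can be sketched as follows. This is an illustrative pure-NumPy re-implementation, not the authors' code; the function names, window sizes and the synthetic ground texture are all chosen here:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation score between two equal-size patches."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def estimate_shift(prev, curr, tmpl_size=16, search=8):
    """Integer pixel shift of the ground between two downward-facing frames:
    slide a central template from `prev` over a search window in `curr`
    and keep the offset with the highest NCC score."""
    h, w = prev.shape
    cy, cx = h // 2, w // 2
    t = prev[cy:cy + tmpl_size, cx:cx + tmpl_size]
    best, best_off = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            patch = curr[cy + dy:cy + dy + tmpl_size,
                         cx + dx:cx + dx + tmpl_size]
            score = ncc(patch, t)
            if score > best:
                best, best_off = score, (dy, dx)
    return best_off

# Synthetic ground texture; the second frame sees it shifted by (3, 5) px
rng = np.random.default_rng(0)
ground = rng.random((80, 80))
prev = ground[10:60, 10:60]
curr = ground[7:57, 5:55]
print(estimate_shift(prev, curr))  # (3, 5)
```

With a known camera height and focal length, the pixel shift converts to metric traveled distance; the paper's stereo setup additionally measures that height to stay accurate on uneven terrain.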

Place, publisher, year, edition, pages
IEEE Computer Society, 2008
Keyword
Agricultural applications, Image processing, Mobile robot localization, Visual odometry
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3950 (URN)
10.1109/WCECS.2008.26 (DOI)
000275915300018 ()
2-s2.0-70350527326 (Scopus ID)
978-0-7695-3555-5 (ISBN)
978-1-4244-3545-6 (ISBN)
Conference
Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS 2008, 22-24 October 2008, San Francisco, California, USA
Available from: 2010-05-20. Created: 2010-05-20. Last updated: 2017-03-08. Bibliographically approved.
2. A vision-guided mobile robot for precision agriculture
2009 (English). In: Proceedings of 7th European Conference on Precision Agriculture / [ed] Eldert J. van Henten, D. Goense and C. Lokhorst, Wageningen Academic Publishers, 2009, 623-630 p. Conference paper (Refereed)
Abstract [en]

In this paper we have developed a mobile robot that is able to perform crop-scale operations using vision as its only sensor. The system consists of a row-following system and a visual odometry system. The row-following system captures images from a front-looking camera on the robot, and the crop rows are extracted using the Hough transform. Both the distance to the rows and the heading angle are provided, and both are used to control the steering. The visual odometry system uses two cameras in a stereo setup pointing perpendicular to the ground. This system measures the travelled distance by measuring the ground movement and compensates for height variation. Experiments are performed on an artificial field due to the season. The results show that the visual odometry has an accuracy better than 2.1% of travelled distance.
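
The Hough-transform row extraction can be sketched with a minimal (rho, theta) accumulator over detected plant positions. This is an illustrative pure-NumPy sketch, not the paper's implementation; the function names, accumulator resolution and synthetic row are chosen here:

```python
import numpy as np

def hough_peak(points, n_theta=180, rho_res=1.0, max_rho=200.0):
    """Vote each (x, y) point into a (rho, theta) accumulator using
    rho = x*cos(theta) + y*sin(theta), and return the dominant line."""
    thetas = np.deg2rad(np.arange(n_theta))          # 0..179 degrees
    n_rho = int(2 * max_rho / rho_res) + 1
    acc = np.zeros((n_rho, n_theta), dtype=int)
    cols = np.arange(n_theta)
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + max_rho) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[idx[ok], cols[ok]] += 1                  # one vote per theta
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r * rho_res - max_rho, np.rad2deg(thetas[t])

# Plants roughly along a vertical row at x = 50 (so theta = 0, rho = 50)
pts = [(50.0, float(y)) for y in range(0, 100, 10)]
rho, theta_deg = hough_peak(pts)
print(round(rho), round(theta_deg))  # 50 0
```

In the row-following system, the peak's theta corresponds to the heading angle relative to the row and rho to the lateral offset, both of which feed the steering controller.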

Place, publisher, year, edition, pages
Wageningen Academic Publishers, 2009
Keyword
visual odometry, row following
National Category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3427 (URN)
2-s2.0-84893371202 (Scopus ID)
978-90-8686-113-2 (ISBN)
978-90-8686-664-9 (ISBN)
Conference
Precision agriculture '09 : papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6 - 8 June 2009
Available from: 2009-10-15. Created: 2009-10-15. Last updated: 2017-03-13. Bibliographically approved.
3. Row-detection on an agricultural field using omnidirectional camera
2010 (English). In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings, IEEE conference proceedings, 2010, 4982-4987 p. Conference paper (Refereed)
Abstract [en]

This paper describes a method of detecting parallel rows on an agricultural field using an omnidirectional camera. The method works both on cameras with a fisheye lens and on cameras with a catadioptric lens. A combination of an edge-based method and a Hough transform method is suggested to find the rows. The vanishing point of several parallel rows is estimated using a second Hough transform. The method is evaluated on synthetic images generated with calibration data from real lenses. Scenes with several rows are produced, where each plant is positioned with a specified error. Experiments are performed on these synthetic images and on real field images. The results show that good accuracy is obtained on the vanishing point once it is detected correctly. Further, they show that the edge-based method works best when the rows consist of solid lines, and the Hough method works best when the rows consist of individual plants. The experiments also show that the combined method provides better detection than using either method separately.
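
The idea of locating the common vanishing point of several detected rows can be illustrated as follows. The paper uses a second Hough transform for this; the sketch below instead uses a simpler least-squares intersection of lines given in (rho, theta) form, with all names invented here:

```python
import numpy as np

def vanishing_point(lines):
    """Least-squares intersection of lines given as (rho, theta),
    where each line satisfies x*cos(theta) + y*sin(theta) = rho."""
    A = np.array([[np.cos(t), np.sin(t)] for _, t in lines])
    b = np.array([r for r, _ in lines])
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(sol[0]), float(sol[1])

# Synthetic rows all passing through the point (120, 40) in the image
px, py = 120.0, 40.0
lines = [(px * np.cos(t) + py * np.sin(t), t) for t in (0.2, 0.6, 1.0, 1.4)]
x, y = vanishing_point(lines)
print(round(x), round(y))  # 120 40
```

With noisy detections the overdetermined system no longer intersects exactly, and the least-squares solution (or, as in the paper, a second Hough accumulation) picks the most consistent vanishing point.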

Place, publisher, year, edition, pages
IEEE conference proceedings, 2010
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858
National Category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-4597 (URN)
10.1109/IROS.2010.5650964 (DOI)
000287672004089 ()
2-s2.0-78651477189 (Scopus ID)
978-1-4244-6676-4 (ISBN)
978-1-4244-6675-7 (ISBN)
978-1-4244-6674-0 (ISBN)
Conference
23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010; Taipei; 18 October 2010 through 22 October 2010
Available from: 2011-01-20. Created: 2011-01-20. Last updated: 2017-02-28. Bibliographically approved.

Open Access in DiVA

fulltext (18184 kB), 202 downloads
File information
File name: FULLTEXT01.pdf
File size: 18184 kB
Checksum: SHA-512
d0be3c9a71e417ea9365f5667328970d105319c9fcada2e828818e0cb95fdf6289ec05fe399fc239cd759aa6088c2c36112452fc8114538b7bb3dd63b44ba017
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Ericson, Stefan
By organisation
School of Engineering Science, The Virtual Systems Research Centre
Robotics
