his.se Publications
Ericson, Stefan
Publications (9 of 9)
Huang, R. & Ericson, S. (2018). An Efficient Way to Estimate the Focus of Expansion. In: 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018). Paper presented at 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), Chongqing, China, June 27-29, 2018 (pp. 691-695). IEEE
An Efficient Way to Estimate the Focus of Expansion
2018 (English). In: 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), IEEE, 2018, pp. 691-695. Conference paper, Published paper (Refereed)
Abstract [en]

Detecting independent motion from a single camera is a difficult task in computer vision, because the captured image sequences combine the objects' movements with the camera's ego-motion. One major line of work instead takes finding the focus of expansion (FOE) as the goal. This suits the situation commonly seen in UAV camera systems, where translation dominates the camera's motion and rotation is relatively small. To separate ego-motion from scene structure, many researchers have used the directional flow as the theoretical basis and extracted its properties related to the FOE. In this paper, we formulate finding the FOE as an optimization problem: the FOE is the position with minimal standard deviation of the directional flow over all directions, subject to an introduced constraint. The experiments show that the proposed method outperforms the previous method.
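The optimisation view described in the abstract can be illustrated with a small sketch: pick the FOE candidate that minimises the spread of angular deviations between the observed flow vectors and the rays from the candidate to the image positions where the flow was measured. This is a simplified illustration of the general idea, not the authors' exact formulation or constraint; the function names and the grid-search strategy are illustrative.

```python
import numpy as np

def foe_cost(foe, points, flows):
    """Spread of angular deviation between flow vectors and the rays
    from a candidate FOE to the points where the flow was measured.
    Under pure translation the flow is radial from the FOE, so the
    standard deviation of these deviations should be near zero."""
    rays = points - foe
    ray_ang = np.arctan2(rays[:, 1], rays[:, 0])
    flow_ang = np.arctan2(flows[:, 1], flows[:, 0])
    dev = np.angle(np.exp(1j * (flow_ang - ray_ang)))  # wrap to [-pi, pi]
    return np.std(dev)

def estimate_foe(points, flows, candidates):
    """Brute-force search: return the candidate with minimal cost."""
    costs = [foe_cost(np.asarray(c, dtype=float), points, flows)
             for c in candidates]
    return candidates[int(np.argmin(costs))]
```

A real implementation would replace the grid search with a continuous optimiser, but the cost function captures the core idea of the standard-deviation criterion.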

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
focus of expansion, directional flow, independent motion detection
National Category
Robotics
Research subject
Production and Automation Engineering; INF201 Virtual Production Development
Identifiers
urn:nbn:se:his:diva-16399 (URN); 10.1109/ICIVC.2018.8492881 (DOI); 000448170000136 (); 2-s2.0-85056554769 (Scopus ID); 978-1-5386-4992-3 (ISBN); 978-1-5386-4991-6 (ISBN); 978-1-5386-4990-9 (ISBN)
Conference
2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), Chongqing, China, June 27-29, 2018
Available from: 2018-11-15. Created: 2018-11-15. Last updated: 2019-02-05. Bibliographically approved.
Ericson, S. K. & Åstrand, B. S. (2018). Analysis of two visual odometry systems for use in an agricultural field environment. Biosystems Engineering, 166, 116-125
Analysis of two visual odometry systems for use in an agricultural field environment
2018 (English). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 166, pp. 116-125. Article in journal (Refereed). Published.
Abstract [en]

This paper analyses two visual odometry systems for use in an agricultural field environment. The impact of various design parameters and camera setups is evaluated in a simulation environment. Four real field experiments were conducted using a mobile robot operating in an agricultural field. The robot was controlled to travel in a regular back-and-forth pattern with headland turns. The experimental runs were 1.8–3.1 km long and consisted of 32–63,000 frames. The results indicate that a camera angle of 75° gives the best results with the least error. An increased camera resolution only improves the result slightly. The algorithm must be able to reduce error accumulation by adapting the frame rate to minimise error. The results also illustrate the difficulties of estimating roll and pitch using a downward-facing camera. The best results for full 6-DOF position estimation were obtained on a 1.8-km run using 6680 frames captured from the forward-facing cameras. The translation error (x, y, z) is 3.76% and the rotational error (i.e., roll, pitch, and yaw) is 0.0482 deg m−1. The main contributions of this paper are an analysis of design option impacts on visual odometry results and a comparison of two state-of-the-art visual odometry algorithms, applied to agricultural field data.
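The two error figures quoted above (translation error as a percentage of travelled distance, rotational error in degrees per metre) can be computed in one common way; a minimal sketch with a hypothetical interface, not taken from the paper, which may define the metrics differently:

```python
import numpy as np

def vo_errors(est_xyz, gt_xyz, est_rpy_rad, gt_rpy_rad, path_len_m):
    """Final-position translation error as a percentage of the total
    path length, and summed roll/pitch/yaw error in degrees per metre.
    All pose inputs are final-pose estimates vs. ground truth."""
    t_err_pct = 100.0 * np.linalg.norm(est_xyz - gt_xyz) / path_len_m
    r_err_deg_per_m = np.degrees(np.abs(est_rpy_rad - gt_rpy_rad)).sum() / path_len_m
    return t_err_pct, r_err_deg_per_m
```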

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Visual odometry, Agricultural field robots, Visual navigation
National Category
Robotics
Research subject
Production and Automation Engineering
Identifiers
urn:nbn:se:his:diva-14585 (URN); 10.1016/j.biosystemseng.2017.11.009 (DOI); 000424726400009 (); 2-s2.0-85037985130 (Scopus ID)
Available from: 2017-12-15. Created: 2017-12-15. Last updated: 2018-03-02. Bibliographically approved.
Ericson, S. (2017). Vision-Based Perception for Localization of Autonomous Agricultural Robots. (Doctoral dissertation). Skövde: University of Skövde
Vision-Based Perception for Localization of Autonomous Agricultural Robots
2017 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In this thesis, Stefan investigates how cameras can be used for localization of an agricultural mobile robot. He focuses on relative measurements that can be used to determine where a weeding tool is operating relative to a weed-detection sensor. The work incorporates downward-facing perspective cameras, forward-facing perspective cameras, and omnidirectional cameras. Stefan shows how the camera's ego-motion can be estimated to obtain not only the position in 3D but also the orientation, and how line structures in the field can be used to navigate a robot along the rows.

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2017. p. 164
Series
Dissertation Series ; 16 (2017)
National Category
Robotics
Research subject
Production and Automation Engineering; INF201 Virtual Production Development
Identifiers
urn:nbn:se:his:diva-13408 (URN); 978-91-982690-7-9 (ISBN)
Available from: 2017-02-28. Created: 2017-02-28. Last updated: 2019-01-24. Bibliographically approved.
Ericson, S. & Åstrand, B. (2010). Row-detection on an agricultural field using omnidirectional camera. In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings. Paper presented at 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010; Taipei; 18 October 2010 through 22 October 2010 (pp. 4982-4987). IEEE conference proceedings
Row-detection on an agricultural field using omnidirectional camera
2010 (English). In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings, IEEE conference proceedings, 2010, pp. 4982-4987. Conference paper, Published paper (Refereed)
Abstract [en]

This paper describes a method of detecting parallel rows on an agricultural field using an omnidirectional camera. The method works both on cameras with a fisheye lens and cameras with a catadioptric lens. A combination of an edge-based method and a Hough-transform method is suggested to find the rows. The vanishing point of several parallel rows is estimated using a second Hough transform. The method is evaluated on synthetic images generated with calibration data from real lenses. Scenes with several rows are produced, where each plant is positioned with a specified error. Experiments are performed on these synthetic images and on real field images. The results show that good accuracy is obtained for the vanishing point once it is detected correctly. Furthermore, the edge-based method works best when the rows consist of solid lines, and the Hough method works best when the rows consist of individual plants. The experiments also show that the combined method provides better detection than using the methods separately.
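The Hough step named in the abstract can be sketched as a plain accumulator over (theta, rho) line parameters. This is a generic textbook Hough transform for illustration only, not the paper's combined edge/Hough method or its second, vanishing-point Hough stage; parameter names and resolutions are assumptions.

```python
import numpy as np

def hough_lines(points, n_theta=180, rho_res=1.0, rho_max=200.0):
    """Minimal Hough transform over a set of (x, y) plant pixels:
    each point votes for every line rho = x*cos(t) + y*sin(t)
    passing through it; the accumulator peak is the dominant row."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += 1
    ti, ri = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[ti], ri * rho_res - rho_max  # (theta, rho) of strongest line
```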

Place, publisher, year, edition, pages
IEEE conference proceedings, 2010
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858
National Category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-4597 (URN); 10.1109/IROS.2010.5650964 (DOI); 000287672004089 (); 2-s2.0-78651477189 (Scopus ID); 978-1-4244-6676-4 (ISBN); 978-1-4244-6675-7 (ISBN); 978-1-4244-6674-0 (ISBN)
Conference
23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010; Taipei; 18 October 2010 through 22 October 2010
Available from: 2011-01-20. Created: 2011-01-20. Last updated: 2017-11-27. Bibliographically approved.
Ericson, S. & Åstrand, B. (2009). A vision-guided mobile robot for precision agriculture. In: Eldert J. van Henten, D. Goense and C. Lokhorst (Ed.), Proceedings of 7th European Conference on Precision Agriculture. Paper presented at Precision agriculture '09: papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6-8 June 2009 (pp. 623-630). Wageningen Academic Publishers
A vision-guided mobile robot for precision agriculture
2009 (English). In: Proceedings of 7th European Conference on Precision Agriculture / [ed] Eldert J. van Henten, D. Goense and C. Lokhorst, Wageningen Academic Publishers, 2009, pp. 623-630. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a mobile robot that is able to perform crop-scale operations using vision as its only sensor. The system consists of a row-following system and a visual odometry system. The row-following system captures images from a forward-looking camera on the robot, and the crop rows are extracted using the Hough transform. Both the distance to the rows and the heading angle are provided, and both are used to control the steering. The visual odometry system uses two cameras in a stereo setup pointing perpendicular to the ground. This system measures the travelled distance by measuring the ground movement and compensates for height variation. Experiments were performed on an artificial field due to the season. The results show that the visual odometry has an accuracy better than 2.1% of travelled distance.

Place, publisher, year, edition, pages
Wageningen Academic Publishers, 2009
Keywords
visual odometry, row following
National Category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3427 (URN); 2-s2.0-84893371202 (Scopus ID); 978-90-8686-113-2 (ISBN); 978-90-8686-664-9 (ISBN)
Conference
Precision agriculture '09 : papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6 - 8 June 2009
Available from: 2009-10-15. Created: 2009-10-15. Last updated: 2017-11-27. Bibliographically approved.
Ericson, S., Hedenberg, K. & Johansson, R. (2009). Information Fusion for Autonomous Robotic Weeding. In: Stefan Fischer, Erik Maehle, Rüdiger Reischuk (Ed.), INFORMATIK 2009: Im Focus das Leben. Paper presented at INFORMATIK 2009, the 39th Annual Meeting of the German Informatics Society (GI): Focus on Life (39. Jahrestagung der Gesellschaft für Informatik e.V.), Lübeck, Germany, 28 September - 2 October 2009 (pp. 2461-2473). Köllen Druck + Verlag GmbH
Information Fusion for Autonomous Robotic Weeding
2009 (English). In: INFORMATIK 2009: Im Focus das Leben / [ed] Stefan Fischer, Erik Maehle, Rüdiger Reischuk, Köllen Druck + Verlag GmbH, 2009, pp. 2461-2473. Conference paper, Published paper (Refereed)
Abstract [en]

Information fusion has a potential applicability to a multitude of different applications. Still, the JDL model is mostly used to describe defense applications. This paper describes the information fusion process for a robot removing weeds in a field. We analyze the robotic system by relating it to the JDL model functions. The civilian application we consider here has some properties which differ from the typical defense applications: (1) an indifferent environment and (2) a predictable and structured process to achieve its objectives. As a consequence, situation estimates tend to deal with internal properties of the robot and its mission progress (through mission state transitions) rather than external entities and their relations. Nevertheless, the JDL model appears useful for describing the fusion activities of the weeding robot system. We provide an example of how state transitions may be detected and exploited using information fusion and report on some initial results. An additional finding is that process refinement for this type of application can be expressed in terms of a finite state machine.
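The finite-state-machine view of process refinement mentioned at the end of the abstract can be sketched as a simple transition table. The state and event names below are purely illustrative, not taken from the paper:

```python
# Hypothetical mission states and events for a weeding robot, showing
# how process refinement can be expressed as a finite state machine.
TRANSITIONS = {
    ("follow_row", "row_end_detected"): "headland_turn",
    ("headland_turn", "new_row_detected"): "follow_row",
    ("follow_row", "weed_detected"): "remove_weed",
    ("remove_weed", "weed_removed"): "follow_row",
}

def step(state, event):
    """Advance the mission FSM; events with no defined transition
    leave the robot in its current state."""
    return TRANSITIONS.get((state, event), state)
```

Detected state transitions of this kind are what the fusion process would exploit to track mission progress.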

Place, publisher, year, edition, pages
Köllen Druck + Verlag GmbH, 2009
Series
Lecture Notes in Informatics, ISSN 1617-5468 ; Vol. P-154
National Category
Computer and Information Sciences
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3525 (URN); 2-s2.0-84874333160 (Scopus ID); 978-3-88579-248-2 (ISBN)
Conference
INFORMATIK 2009, the 39th Annual Meeting of the German Informatics Society (GI): Focus on Life (39. Jahrestagung der Gesellschaft für Informatik e.V.), Lübeck, Germany, 28 September - 2 October 2009
Available from: 2009-12-08. Created: 2009-12-08. Last updated: 2018-01-12. Bibliographically approved.
Ericson, S. & Åstrand, B. (2008). Stereo Visual Odometry for Mobile Robots on Uneven Terrain. In: Sio-Iong Ao (Ed.), Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008. Paper presented at Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS 2008, 22-24 October 2008, San Francisco, California, USA (pp. 150-157). IEEE Computer Society
Stereo Visual Odometry for Mobile Robots on Uneven Terrain
2008 (English). In: Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008 / [ed] Sio-Iong Ao, IEEE Computer Society, 2008, pp. 150-157. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a stereo visual odometry system for mobile robots that is not sensitive to uneven terrain. Two cameras are mounted perpendicular to the ground, and height and travelled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, where flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of travelled distance on surfaces where the maximum height variation was measured to be 96 mm. The variance over eight test runs (5.6 m in total) is 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.
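Normalized cross correlation, the matching step named in the abstract, can be sketched as generic template matching between consecutive frames: the pixel shift with the highest correlation score is the apparent ground motion, which scales to metric distance given the camera height and intrinsics. The function names and the exhaustive search are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross correlation of two equal-sized patches."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_shift(prev, curr, tpl_size, search):
    """Slide a template cut from the centre of the previous frame over
    the current frame and return the (dx, dy) shift with the highest
    NCC score, i.e. the apparent ground motion in pixels."""
    h, w = prev.shape
    ty, tx = (h - tpl_size) // 2, (w - tpl_size) // 2
    tpl = prev[ty:ty + tpl_size, tx:tx + tpl_size]
    best, best_shift = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = curr[ty + dy:ty + dy + tpl_size, tx + dx:tx + dx + tpl_size]
            if win.shape != tpl.shape:
                continue  # window fell outside the image
            score = ncc(win, tpl)
            if score > best:
                best, best_shift = score, (dx, dy)
    return best_shift
```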

Place, publisher, year, edition, pages
IEEE Computer Society, 2008
Keywords
Agricultural applications, Image processing, Mobile robot localization, Visual odometry
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3950 (URN); 10.1109/WCECS.2008.26 (DOI); 000275915300018 (); 2-s2.0-70350527326 (Scopus ID); 978-0-7695-3555-5 (ISBN); 978-1-4244-3545-6 (ISBN)
Conference
Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS 2008, 22-24 October 2008, San Francisco, California, USA
Available from: 2010-05-20. Created: 2010-05-20. Last updated: 2017-11-27. Bibliographically approved.
Ericson, S. & Åstrand, B. (2008). Visual Odometry System for Agricultural Field Robots. In: World Congress on Engineering and Computer Science 2008. Paper presented at World Congress on Engineering and Computer Science (WCECS 2008) Location: San Francisco, CA Date: OCT 11-24, 2008 (pp. 619-624). IAENG
Visual Odometry System for Agricultural Field Robots
2008 (English). In: World Congress on Engineering and Computer Science 2008, IAENG, 2008, pp. 619-624. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a visual odometry system for agricultural field robots that is not sensitive to uneven terrain. A stereo camera system is mounted perpendicular to the ground, and height and travelled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, where flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of travelled distance on surfaces where the maximum height variation was measured to be 96 mm. The variance over eight test runs (5.6 m in total) is 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.

Place, publisher, year, edition, pages
IAENG, 2008
Series
Lecture Notes in Engineering and Computer Science, ISSN 2078-0958
Keywords
Agricultural applications, Image processing, Mobile robot localization, Visual odometry
Identifiers
urn:nbn:se:his:diva-2410 (URN); 000263417100117 (); 978-988-98671-0-2 (ISBN)
Conference
World Congress on Engineering and Computer Science (WCECS 2008) Location: San Francisco, CA Date: OCT 11-24, 2008
Available from: 2008-12-05. Created: 2008-12-02. Last updated: 2017-11-27.
Ericson, S. & Åstrand, B. (2007). Proceedings of the 13th IASTED International Conference on Robotics and Applications, RA 2007 and Proceedings of the IASTED International Conference on Telematics. In: Robotics and Applications and Telematics. Paper presented at 13th IASTED International Conference on Robotics and Applications, RA 2007 and Proceedings of the IASTED International Conference on Telematics; Würzburg; 29 August 2007 through 31 August 2007 (pp. 287-292). ACTA Press
Proceedings of the 13th IASTED International Conference on Robotics and Applications, RA 2007 and Proceedings of the IASTED International Conference on Telematics
2007 (English). In: Robotics and Applications and Telematics, ACTA Press, 2007, pp. 287-292. Conference paper, Published paper (Refereed)
Place, publisher, year, edition, pages
ACTA Press, 2007
Series
IASTED International Conference on Robotics and Applications, ISSN 1027-264X
Identifiers
urn:nbn:se:his:diva-2150 (URN); 000251420100048 (); 2-s2.0-56149093160 (Scopus ID); 978-0-88986-686-7 (ISBN)
Conference
13th IASTED International Conference on Robotics and Applications, RA 2007 and Proceedings of the IASTED International Conference on Telematics; Würzburg; 29 August 2007 through 31 August 2007
Available from: 2008-06-09. Created: 2008-06-09. Last updated: 2017-11-27.