Högskolan i Skövde

Ericson, Stefan
Publications (10 of 10)
Lidell, A., Ericson, S. & Ng, A. H. C. (2022). The Current and Future Challenges for Virtual Commissioning and Digital Twins of Production Lines. In: Amos H. C. Ng; Anna Syberfeldt; Dan Högberg; Magnus Holm (Ed.), SPS2022: Proceedings of the 10th Swedish Production Symposium. Paper presented at 10th Swedish Production Symposium (SPS2022), Skövde, April 26–29 2022 (pp. 508-519). Amsterdam; Berlin; Washington, DC: IOS Press
The Current and Future Challenges for Virtual Commissioning and Digital Twins of Production Lines
2022 (English). In: SPS2022: Proceedings of the 10th Swedish Production Symposium / [ed] Amos H. C. Ng; Anna Syberfeldt; Dan Högberg; Magnus Holm, Amsterdam; Berlin; Washington, DC: IOS Press, 2022, pp. 508-519. Conference paper, Published paper (Refereed)
Abstract [en]

The use of virtual commissioning has increased in the last decade, but challenges remain before this software-validation method is in widespread use. One extension of virtual commissioning is digital twin technology, which allows for further improved accuracy. The aim of this paper is to review existing standards and approaches to developing virtual commissioning, through a literature review and interviews with experts in industry. First, the definitions and classifications related to virtual commissioning and digital twins are reviewed; then the approaches to developing virtual commissioning and digital twins reported in the literature are explored. Next, in three interviews with experts of varying backgrounds and competencies, views on the virtual technologies are assessed to provide new insight for industry. Among the findings of the literature review and interviews are the apparent need for standardisation in the field and the fact that a sought-after standard, ISO 23247-1, is underway. The key finding of this paper is that the digital twin is a concept with a promising future in combination with other Industry 4.0 technologies. We also outline the challenges and possibilities of virtual commissioning and the digital twin, which could be used as a starting point for further research on standardisation and on improvements sprung from the new standard.

Place, publisher, year, edition, pages
Amsterdam; Berlin; Washington, DC: IOS Press, 2022
Series
Advances in Transdisciplinary Engineering, ISSN 2352-751X, E-ISSN 2352-7528 ; 21
Keywords
Virtual commissioning, digital twin, simulation, production system, literature review, interview
National subject category
Production Engineering, Human Work Science and Ergonomics; Robotics and Automation; Computer Systems
Research subject
Production and Automation Engineering; VF-KDO
Identifiers
urn:nbn:se:his:diva-21108 (URN); 10.3233/ATDE220169 (DOI); 2-s2.0-85132830658 (Scopus ID); 978-1-64368-268-6 (ISBN); 978-1-64368-269-3 (ISBN)
Conference
10th Swedish Production Symposium (SPS2022), Skövde, April 26–29, 2022
Note

CC BY-NC 4.0

Corresponding author: Anton Lidell, University of Skövde, Sweden. E-mail: antonlidell@live.se

Available from: 2022-05-02. Created: 2022-05-02. Last updated: 2023-02-24. Bibliographically reviewed.
Huang, R. & Ericson, S. (2018). An Efficient Way to Estimate the Focus of Expansion. In: 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018). Paper presented at 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), Chongqing, China, June 27-29, 2018 (pp. 691-695). IEEE
An Efficient Way to Estimate the Focus of Expansion
2018 (English). In: 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), IEEE, 2018, pp. 691-695. Conference paper, Published paper (Refereed)
Abstract [en]

Detecting independent motion from a single camera is a difficult task in computer vision, because the captured image sequences combine the objects' movements with the camera's ego-motion. One major branch of work instead takes finding the focus of expansion (FOE) as its goal. This suits the situation commonly seen in a UAV's camera system, where translation dominates the camera's motion and rotation is relatively small. To separate ego-motion from scene structure, many researchers have used the directional flow as the theoretical basis and extracted its properties related to the FOE. In this paper, we formulate finding the FOE as an optimization problem: the FOE is the position that minimizes the standard deviation of the directional flow over all directions, subject to the introduced constraint. The experiments show that the proposed method outperforms the previous method.
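To make the optimization formulation concrete, the following is a minimal sketch of the underlying idea, not the paper's implementation: under dominant translation every flow vector should point radially away from the FOE, so one can grid-search for the image position that minimizes the non-radial flow component (a simple stand-in for the paper's standard-deviation criterion and constraint). All names and parameters are illustrative.

```python
# Illustrative FOE estimation as an optimization problem (assumption: not the
# paper's exact method). Under pure translation, flow radiates from the FOE.
import numpy as np

def foe_residual(foe, pts, flow):
    """Mean squared non-radial flow component at a candidate FOE."""
    radial = pts - foe                          # expected flow directions
    norm = np.linalg.norm(radial, axis=1) + 1e-9
    radial = radial / norm[:, None]
    # cross-product magnitude = flow component perpendicular to radial direction
    cross = flow[:, 0] * radial[:, 1] - flow[:, 1] * radial[:, 0]
    return np.mean(cross ** 2)

def estimate_foe(pts, flow, width, height, step=8):
    """Coarse grid search over image positions for the best-fitting FOE."""
    best, best_err = None, np.inf
    for x in range(0, width, step):
        for y in range(0, height, step):
            err = foe_residual(np.array([x, y], dtype=float), pts, flow)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Synthetic check: a flow field radiating from a known FOE should be recovered.
rng = np.random.default_rng(0)
true_foe = np.array([320.0, 240.0])
pts = rng.uniform([0, 0], [640, 480], size=(200, 2))
flow = (pts - true_foe) * 0.05                  # pure expansion field
print(estimate_foe(pts, flow, 640, 480))        # ≈ (320, 240)
```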

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
focus of expansion, directional flow, independent motion detection
National subject category
Robotics and Automation
Research subject
Production and Automation Engineering; INF201 Virtual Production Development
Identifiers
urn:nbn:se:his:diva-16399 (URN); 10.1109/ICIVC.2018.8492881 (DOI); 000448170000136 (); 2-s2.0-85056554769 (Scopus ID); 978-1-5386-4992-3 (ISBN); 978-1-5386-4991-6 (ISBN); 978-1-5386-4990-9 (ISBN)
Conference
2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), Chongqing, China, June 27-29, 2018
Available from: 2018-11-15. Created: 2018-11-15. Last updated: 2019-02-05. Bibliographically reviewed.
Ericson, S. K. & Åstrand, B. S. (2018). Analysis of two visual odometry systems for use in an agricultural field environment. Biosystems Engineering, 166, 116-125
Analysis of two visual odometry systems for use in an agricultural field environment
2018 (English). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 166, pp. 116-125. Journal article (Refereed) Published
Abstract [en]

This paper analyses two visual odometry systems for use in an agricultural field environment. The impact of various design parameters and camera setups are evaluated in a simulation environment. Four real field experiments were conducted using a mobile robot operating in an agricultural field. The robot was controlled to travel in a regular back-and-forth pattern with headland turns. The experimental runs were 1.8–3.1 km long and consisted of 32–63,000 frames. The results indicate that a camera angle of 75° gives the best results with the least error. An increased camera resolution only improves the result slightly. The algorithm must be able to reduce error accumulation by adapting the frame rate to minimise error. The results also illustrate the difficulties of estimating roll and pitch using a downward-facing camera. The best results for full 6-DOF position estimation were obtained on a 1.8-km run using 6680 frames captured from the forward-facing cameras. The translation error (x, y, z) is 3.76% and the rotational error (i.e., roll, pitch, and yaw) is 0.0482 deg m⁻¹. The main contributions of this paper are an analysis of design option impacts on visual odometry results and a comparison of two state-of-the-art visual odometry algorithms, applied to agricultural field data.
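As a concrete reading of these error metrics, the sketch below shows one plausible way to compute translation drift as a percentage of path length and rotational drift per metre from estimated and ground-truth trajectories; the exact definitions used in the paper may differ.

```python
# Sketch of drift metrics (assumed definitions, not the paper's evaluation code):
# end-point translation error as % of path length, and rotation error per metre.
import numpy as np

def drift_metrics(gt_xyz, est_xyz, gt_rpy, est_rpy):
    """gt_xyz/est_xyz: (N,3) positions [m]; gt_rpy/est_rpy: (N,3) angles [deg]."""
    path_len = np.sum(np.linalg.norm(np.diff(gt_xyz, axis=0), axis=1))
    trans_err = np.linalg.norm(gt_xyz[-1] - est_xyz[-1])      # end-point error [m]
    rot_err = np.abs(gt_rpy[-1] - est_rpy[-1]).sum()          # end-point angle error [deg]
    return 100.0 * trans_err / path_len, rot_err / path_len   # [%], [deg/m]
```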

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Visual odometry, Agricultural field robots, Visual navigation
National subject category
Robotics and Automation
Research subject
Production and Automation Engineering
Identifiers
urn:nbn:se:his:diva-14585 (URN); 10.1016/j.biosystemseng.2017.11.009 (DOI); 000424726400009 (); 2-s2.0-85037985130 (Scopus ID)
Available from: 2017-12-15. Created: 2017-12-15. Last updated: 2021-01-05. Bibliographically reviewed.
Ericson, S. (2017). Vision-Based Perception for Localization of Autonomous Agricultural Robots. (Doctoral dissertation). Skövde: University of Skövde
Vision-Based Perception for Localization of Autonomous Agricultural Robots
2017 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In this thesis, Stefan investigates how cameras can be used for localization of an agricultural mobile robot. He focuses on relative measurements that can be used to determine where a weeding tool is operating relative to a weed-detection sensor. The work incorporates downward-facing perspective cameras, forward-facing perspective cameras, and omnidirectional cameras. Stefan shows how the camera's ego-motion can be estimated to obtain not only the position in 3D but also the orientation. He also shows how line structures in the field can be used to navigate a robot along the rows.

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2017. 164 pp.
Series
Dissertation Series ; 16 (2017)
National subject category
Robotics and Automation
Research subject
Production and Automation Engineering; INF201 Virtual Production Development
Identifiers
urn:nbn:se:his:diva-13408 (URN); 978-91-982690-7-9 (ISBN)
Available from: 2017-02-28. Created: 2017-02-28. Last updated: 2019-01-24. Bibliographically reviewed.
Ericson, S. & Åstrand, B. (2010). Row-detection on an agricultural field using omnidirectional camera. In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings. Paper presented at 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010; Taipei; 18 October 2010 through 22 October 2010 (pp. 4982-4987). IEEE conference proceedings
Row-detection on an agricultural field using omnidirectional camera
2010 (English). In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings, IEEE conference proceedings, 2010, pp. 4982-4987. Conference paper, Published paper (Refereed)
Abstract [en]

This paper describes a method of detecting parallel rows on an agricultural field using an omnidirectional camera. The method works both on cameras with a fisheye lens and on cameras with a catadioptric lens. A combination of an edge-based method and a Hough transform method is suggested to find the rows. The vanishing point of several parallel rows is estimated using a second Hough transform. The method is evaluated on synthetic images generated with calibration data from real lenses. Scenes with several rows are produced, where each plant is positioned with a specified error. Experiments are performed on these synthetic images and on real field images. The results show that good accuracy is obtained for the vanishing point once it is detected correctly. Furthermore, the edge-based method works best when the rows consist of solid lines, and the Hough method works best when the rows consist of individual plants. The experiments also show that the combined method provides better detection than using the methods separately.
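An illustrative sketch of the two-stage idea follows. It is simplified in two ways: it assumes a conventional rectified image rather than raw fisheye or catadioptric input, and it replaces the paper's second Hough transform with a median over pairwise line intersections. The input filename is hypothetical.

```python
# Simplified row-line + vanishing-point sketch (not the paper's pipeline):
# Hough lines on an edge image, then a vanishing point from line intersections.
import cv2
import numpy as np
from itertools import combinations

img = cv2.imread("field.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)    # (rho, theta) row candidates
assert lines is not None and len(lines) >= 2, "too few row candidates"

def intersect(l1, l2):
    """Intersection of two (rho, theta) lines, or None if near-parallel."""
    (r1, t1), (r2, t2) = l1[0], l2[0]
    A = np.array([[np.cos(t1), np.sin(t1)], [np.cos(t2), np.sin(t2)]])
    if abs(np.linalg.det(A)) < 1e-6:
        return None
    return np.linalg.solve(A, np.array([r1, r2]))

# Median of pairwise intersections as a crude stand-in for the second Hough vote.
pts = [p for a, b in combinations(lines, 2) if (p := intersect(a, b)) is not None]
vp = np.median(np.array(pts), axis=0)
print("vanishing point estimate:", vp)
```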

Place, publisher, year, edition, pages
IEEE conference proceedings, 2010
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858
National subject category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-4597 (URN); 10.1109/IROS.2010.5650964 (DOI); 000287672004089 (); 2-s2.0-78651477189 (Scopus ID); 978-1-4244-6676-4 (ISBN); 978-1-4244-6675-7 (ISBN); 978-1-4244-6674-0 (ISBN)
Conference
23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010, Taipei, 18-22 October 2010
Available from: 2011-01-20. Created: 2011-01-20. Last updated: 2017-11-27. Bibliographically reviewed.
Ericson, S. & Åstrand, B. (2009). A vision-guided mobile robot for precision agriculture. In: Eldert J. van Henten, D. Goense and C. Lokhorst (Ed.), Proceedings of 7th European Conference on Precision Agriculture. Paper presented at Precision Agriculture '09: papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6-8 June 2009 (pp. 623-630). Wageningen Academic Publishers
A vision-guided mobile robot for precision agriculture
2009 (English). In: Proceedings of 7th European Conference on Precision Agriculture / [ed] Eldert J. van Henten, D. Goense and C. Lokhorst, Wageningen Academic Publishers, 2009, pp. 623-630. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a mobile robot that is able to perform crop-scale operations using vision as its only sensor. The system consists of a row-following system and a visual odometry system. The row-following system captures images from a forward-looking camera on the robot, and the crop rows are extracted using the Hough transform. Both the distance to the rows and the heading angle are provided, and both are used to control the steering. The visual odometry system uses two cameras in a stereo setup pointing perpendicular to the ground. This system measures the travelled distance by measuring the ground movement and compensates for height variation. Experiments are performed on an artificial field due to the season. The results show that the visual odometry has an accuracy better than 2.1% of travelled distance.
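The abstract does not spell out the steering law, but a minimal proportional controller over the two vision outputs might look like the sketch below; the structure, gains, sign conventions, and limits are invented for illustration and are not the paper's controller.

```python
# Hypothetical row-following control law (assumed structure and gains):
# steer from the lateral offset and heading error reported by the vision system.
def steering_command(lateral_offset_m, heading_error_rad,
                     k_offset=0.8, k_heading=1.5, max_steer_rad=0.35):
    """Proportional steering from row-relative pose, saturated to actuator limits."""
    u = -k_offset * lateral_offset_m - k_heading * heading_error_rad
    return max(-max_steer_rad, min(max_steer_rad, u))
```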

Place, publisher, year, edition, pages
Wageningen Academic Publishers, 2009
Keywords
visual odometry, row following
National subject category
Engineering and Technology
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3427 (URN); 2-s2.0-84893371202 (Scopus ID); 978-90-8686-113-2 (ISBN); 978-90-8686-664-9 (ISBN)
Conference
Precision Agriculture '09: papers presented at the 7th European Conference on Precision Agriculture, Wageningen, the Netherlands, 6-8 June 2009
Available from: 2009-10-15. Created: 2009-10-15. Last updated: 2017-11-27. Bibliographically reviewed.
Ericson, S., Hedenberg, K. & Johansson, R. (2009). Information Fusion for Autonomous Robotic Weeding. In: Stefan Fischer, Erik Maehle, Rüdiger Reischuk (Ed.), INFORMATIK 2009: Im Focus das Leben. Paper presented at 39. Jahrestagung der Gesellschaft für Informatik e.V. (GI): Im Focus das Leben, INFORMATIK 2009 (39th Annual Meeting of the German Informatics Society (GI): Focus on Life), Lübeck, Germany, 28 September - 2 October 2009 (pp. 2461-2473). Köllen Druck + Verlag GmbH
Information Fusion for Autonomous Robotic Weeding
2009 (English). In: INFORMATIK 2009: Im Focus das Leben / [ed] Stefan Fischer, Erik Maehle, Rüdiger Reischuk, Köllen Druck + Verlag GmbH, 2009, pp. 2461-2473. Conference paper, Published paper (Refereed)
Abstract [en]

Information fusion has a potential applicability to a multitude of different applications. Still, the JDL model is mostly used to describe defense applications. This paper describes the information fusion process for a robot removing weed in a field. We analyze the robotic system by relating it to the JDL model functions. The civilian application we consider here has some properties which differ from the typical defense applications: (1) indifferent environment and (2) a predictable and structured process to achieve its objectives. As a consequence, situation estimates tend to deal with internal properties of the robot and its mission progress (through mission state transition) rather than external entities and their relations. Nevertheless, the JDL model appears useful for describing the fusion activities of the weeding robot system. We provide an example of how state transitions may be detected and exploited using information fusion and report on some initial results. An additional finding is that process refinement for this type of application can be expressed in terms of a finite state machine.
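The closing observation, that process refinement can be expressed as a finite state machine, invites a sketch like the one below; the states and events are invented for illustration and are not taken from the paper.

```python
# Hypothetical mission FSM for a weeding robot (states/transitions invented;
# fused sensor estimates would emit the events that drive the transitions).
from enum import Enum, auto

class State(Enum):
    FOLLOW_ROW = auto()
    WEEDING = auto()
    HEADLAND_TURN = auto()
    DONE = auto()

TRANSITIONS = {
    (State.FOLLOW_ROW, "weed_detected"): State.WEEDING,
    (State.WEEDING, "weed_removed"): State.FOLLOW_ROW,
    (State.FOLLOW_ROW, "row_end"): State.HEADLAND_TURN,
    (State.HEADLAND_TURN, "row_acquired"): State.FOLLOW_ROW,
    (State.FOLLOW_ROW, "field_done"): State.DONE,
}

def step(state, event):
    """Advance the mission FSM; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```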

Place, publisher, year, edition, pages
Köllen Druck + Verlag GmbH, 2009
Series
Lecture Notes in Informatics, ISSN 1617-5468 ; Vol. P-154
National subject category
Computer and Information Sciences
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3525 (URN); 2-s2.0-84874333160 (Scopus ID); 978-3-88579-248-2 (ISBN)
Conference
39. Jahrestagung der Gesellschaft für Informatik e.V. (GI): Im Focus das Leben, INFORMATIK 2009 (39th Annual Meeting of the German Informatics Society (GI): Focus on Life), Lübeck, Germany, 28 September - 2 October 2009
Available from: 2009-12-08. Created: 2009-12-08. Last updated: 2018-01-12. Bibliographically reviewed.
Ericson, S. & Åstrand, B. (2008). Stereo Visual Odometry for Mobile Robots on Uneven Terrain. In: Sio-Iong Ao (Ed.), Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008. Paper presented at Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS 2008, 22-24 October 2008, San Francisco, California, USA (pp. 150-157). IEEE Computer Society
Stereo Visual Odometry for Mobile Robots on Uneven Terrain
2008 (English). In: Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008 / [ed] Sio-Iong Ao, IEEE Computer Society, 2008, pp. 150-157. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a stereo visual odometry system for mobile robots that is not sensitive to uneven terrain. Two cameras are mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, in which flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of traveled distance on surfaces where the maximum height variation is measured to 96 mm. The variance over eight test runs (5.6 m in total) is measured to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.
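A minimal sketch of distance measurement by normalized cross correlation between consecutive ground images follows, using OpenCV's template matching as a stand-in for the authors' implementation; the patch size and layout are illustrative assumptions.

```python
# Sketch of frame-to-frame displacement via normalized cross correlation
# (illustrative; crop size and matching variant are assumptions, not the paper's).
import cv2
import numpy as np

def displacement(prev_img, next_img, patch=128):
    """Pixel shift of a central patch of prev_img located in next_img.

    Images are expected as single-channel uint8 arrays. TM_CCOEFF_NORMED is the
    zero-mean NCC variant, chosen here for robustness to brightness changes.
    """
    h, w = prev_img.shape
    y0, x0 = (h - patch) // 2, (w - patch) // 2
    template = prev_img[y0:y0 + patch, x0:x0 + patch]
    score = cv2.matchTemplate(next_img, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (bx, by) = cv2.minMaxLoc(score)      # best-match top-left corner
    return bx - x0, by - y0   # (dx, dy) in pixels; scale by m/pixel for distance
```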

Place, publisher, year, edition, pages
IEEE Computer Society, 2008
Keywords
Agricultural applications, Image processing, Mobile robot localization, Visual odometry
Research subject
Technology
Identifiers
urn:nbn:se:his:diva-3950 (URN); 10.1109/WCECS.2008.26 (DOI); 000275915300018 (); 2-s2.0-70350527326 (Scopus ID); 978-0-7695-3555-5 (ISBN); 978-1-4244-3545-6 (ISBN)
Conference
Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS 2008, 22-24 October 2008, San Francisco, California, USA
Available from: 2010-05-20. Created: 2010-05-20. Last updated: 2017-11-27. Bibliographically reviewed.
Ericson, S. & Åstrand, B. (2008). Visual Odometry System for Agricultural Field Robots. In: S. I. Ao, Craig Douglas, W. S. Grundfest, Lee Schruben, Jon Burgstone (Ed.), Proceedings of the World Congress on Engineering and Computer Science 2008: WCECS 2008, October 22 - 24, 2008, San Francisco, USA. Paper presented at World Congress on Engineering and Computer Science (WCECS 2008), October 22 - 24, 2008, San Francisco, USA (pp. 619-624). International Association of Engineers
Visual Odometry System for Agricultural Field Robots
2008 (English). In: Proceedings of the World Congress on Engineering and Computer Science 2008: WCECS 2008, October 22-24, 2008, San Francisco, USA / [ed] S. I. Ao, Craig Douglas, W. S. Grundfest, Lee Schruben, Jon Burgstone, International Association of Engineers, 2008, pp. 619-624. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a visual odometry system for agricultural field robots that is not sensitive to uneven terrain. A stereo camera system is mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, in which flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of traveled distance on surfaces where the maximum height variation is measured to 96 mm. The variance over eight test runs (5.6 m in total) is measured to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.
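For the height measurement that both this and the preceding paper rely on, a back-of-envelope stereo calculation looks as follows; the focal length and baseline are assumed values for illustration, not the actual rig's.

```python
# Back-of-envelope stereo height calculation (calibration values assumed for
# illustration): depth from disparity for a downward-facing stereo pair.
def height_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.10):
    """Z = f * B / d: distance to the ground plane from stereo disparity."""
    return focal_px * baseline_m / disparity_px

# e.g. a 100-pixel disparity with the assumed rig puts the ground at 0.7 m:
print(height_from_disparity(100.0))  # -> 0.7
```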

Place, publisher, year, edition, pages
International Association of Engineers, 2008
Series
Lecture Notes in Engineering and Computer Science, ISSN 2078-0958, E-ISSN 2078-0966
Keywords
Agricultural applications, Image processing, Mobile robot localization, Visual odometry
Identifiers
urn:nbn:se:his:diva-2410 (URN); 000263417100117 (); 978-988-98671-0-2 (ISBN)
Conference
World Congress on Engineering and Computer Science (WCECS 2008), October 22-24, 2008, San Francisco, USA
Available from: 2008-12-05. Created: 2008-12-02. Last updated: 2020-12-07. Bibliographically reviewed.
Ericson, S. & Åstrand, B. (2007). Algorithms for visual odometry in outdoor field environment. In: Klaus Schilling (Ed.), RA '07: Proceedings of the 13th IASTED International Conference on Robotics and Applications. Paper presented at 13th IASTED International Conference on Robotics and Applications, RA 2007 and IASTED International Conference on Telematics, Würzburg, 29 August 2007 through 31 August 2007 (pp. 414-419). ACTA Press
Algorithms for visual odometry in outdoor field environment
2007 (English). In: RA '07: Proceedings of the 13th IASTED International Conference on Robotics and Applications / [ed] Klaus Schilling, ACTA Press, 2007, pp. 414-419. Conference paper, Published paper (Refereed)
Place, publisher, year, edition, pages
ACTA Press, 2007
Series
Proceedings of the IASTED International Conference on Robotics and Applications, ISSN 1027-264X
National subject category
Computer Vision and Robotics (Autonomous Systems); Computer Sciences; Signal Processing
Identifiers
urn:nbn:se:his:diva-2150 (URN); 000251420100048 (); 2-s2.0-56149093160 (Scopus ID); 978-0-88986-686-7 (ISBN); 978-0-88986-685-0 (ISBN)
Conference
13th IASTED International Conference on Robotics and Applications, RA 2007 and IASTED International Conference on Telematics, Würzburg, 29-31 August 2007
Available from: 2008-06-09. Created: 2008-06-09. Last updated: 2021-05-04. Bibliographically reviewed.