Högskolan i Skövde

1 - 10 of 10
  • 1.
    Ericson, Stefan
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap. Högskolan i Skövde, Forskningscentrum för Virtuella system.
    Vision-Based Perception for Localization of Autonomous Agricultural Robots, 2017. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    In this thesis Stefan investigates how cameras can be used for localization of an agricultural mobile robot. He focuses on relative measurements that can be used to determine where a weeding tool is operating relative to a weed detection sensor. The work covers downward-facing perspective cameras, forward-facing perspective cameras and omnidirectional cameras. Stefan shows how the camera's ego-motion can be estimated to obtain not only the position in 3D but also the orientation. He also shows how line structures in the field can be used to navigate a robot along the rows.

    Full text (pdf)
    fulltext
  • 2.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Hedenberg, Klas
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Johansson, Ronnie
    Högskolan i Skövde, Institutionen för kommunikation och information. Högskolan i Skövde, Forskningscentrum för Informationsteknologi.
    Information Fusion for Autonomous Robotic Weeding, 2009. In: INFORMATIK 2009: Im Focus das Leben / [ed] Stefan Fischer, Erik Maehle, Rüdiger Reischuk, Köllen Druck + Verlag GmbH, 2009, pp. 2461-2473. Conference paper (Refereed)
    Abstract [en]

    Information fusion has a potential applicability to a multitude of different applications. Still, the JDL model is mostly used to describe defense applications. This paper describes the information fusion process for a robot removing weed in a field. We analyze the robotic system by relating it to the JDL model functions. The civilian application we consider here has some properties which differ from the typical defense applications: (1) an indifferent environment and (2) a predictable and structured process to achieve its objectives. As a consequence, situation estimates tend to deal with internal properties of the robot and its mission progress (through mission state transitions) rather than external entities and their relations. Nevertheless, the JDL model appears useful for describing the fusion activities of the weeding robot system. We provide an example of how state transitions may be detected and exploited using information fusion and report on some initial results. An additional finding is that process refinement for this type of application can be expressed in terms of a finite state machine.
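    The closing observation, that process refinement can be expressed as a finite state machine, can be illustrated with a minimal sketch; the states and transition events below are illustrative assumptions, not the paper's actual mission model.

        from enum import Enum, auto

        class MissionState(Enum):
            # Hypothetical mission states for a weeding robot.
            FOLLOW_ROW = auto()
            WEEDING = auto()
            HEADLAND_TURN = auto()
            DONE = auto()

        # Illustrative transition table: (state, event) -> next state.
        # Events would come from fused sensor estimates (detected state transitions).
        TRANSITIONS = {
            (MissionState.FOLLOW_ROW, "weed_detected"): MissionState.WEEDING,
            (MissionState.WEEDING, "weed_removed"): MissionState.FOLLOW_ROW,
            (MissionState.FOLLOW_ROW, "row_end"): MissionState.HEADLAND_TURN,
            (MissionState.HEADLAND_TURN, "row_acquired"): MissionState.FOLLOW_ROW,
            (MissionState.FOLLOW_ROW, "field_done"): MissionState.DONE,
        }

        def step(state, event):
            # Unknown events leave the mission state unchanged.
            return TRANSITIONS.get((state, event), state)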

  • 3.
    Ericson, Stefan K.
    et al.
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap. Högskolan i Skövde, Forskningscentrum för Virtuella system.
    Åstrand, Björn S.
    School of Information Science, Computer and Electrical Engineering, Halmstad University, Halmstad, Sweden.
    Analysis of two visual odometry systems for use in an agricultural field environment, 2018. In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 166, pp. 116-125. Journal article (Refereed)
    Abstract [en]

    This paper analyses two visual odometry systems for use in an agricultural field environment. The impact of various design parameters and camera setups is evaluated in a simulation environment. Four real field experiments were conducted using a mobile robot operating in an agricultural field. The robot was controlled to travel in a regular back-and-forth pattern with headland turns. The experimental runs were 1.8–3.1 km long and consisted of 32–63,000 frames. The results indicate that a camera angle of 75° gives the best results with the least error. An increased camera resolution only improves the result slightly. The algorithm must be able to reduce error accumulation by adapting the frame rate. The results also illustrate the difficulties of estimating roll and pitch using a downward-facing camera. The best results for full 6-DOF position estimation were obtained on a 1.8-km run using 6680 frames captured from the forward-facing cameras. The translation error (x, y, z) is 3.76% and the rotational error (i.e., roll, pitch, and yaw) is 0.0482 deg m⁻¹. The main contributions of this paper are an analysis of design option impacts on visual odometry results and a comparison of two state-of-the-art visual odometry algorithms, applied to agricultural field data.
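    The two headline error figures can be reproduced along the lines of the sketch below: a hedged reading that takes the translation error as final 3D drift over path length and the rotational error as final orientation drift per metre (the paper may instead average over sub-segments; function and variable names are illustrative).

        import numpy as np

        def vo_drift(est_xyz, gt_xyz, est_rpy, gt_rpy):
            # est_xyz, gt_xyz: (N, 3) estimated / ground-truth positions [m].
            # est_rpy, gt_rpy: (N, 3) roll, pitch, yaw angles [deg].
            path_len = np.sum(np.linalg.norm(np.diff(gt_xyz, axis=0), axis=1))
            # Translation error: final position drift as a percentage of distance.
            t_err = np.linalg.norm(est_xyz[-1] - gt_xyz[-1]) / path_len * 100.0
            # Rotational error: final angular drift per metre (wrap-around ignored).
            r_err = np.linalg.norm(est_rpy[-1] - gt_rpy[-1]) / path_len
            return t_err, r_err  # [%], [deg/m]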

  • 4.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Åstrand, B.
    Halmstad University.
    A vision-guided mobile robot for precision agriculture, 2009. In: Proceedings of the 7th European Conference on Precision Agriculture / [ed] Eldert J. van Henten, D. Goense and C. Lokhorst, Wageningen Academic Publishers, 2009, pp. 623-630. Conference paper (Refereed)
    Abstract [en]

    In this paper we have developed a mobile robot which is able to perform crop-scale operations using vision as the only sensor. The system consists of a row-following system and a visual odometry system. The row-following system captures images from a front-looking camera on the robot, and the crop rows are extracted using the Hough transform. Both the distance to the rows and the heading angle are provided, and both are used to control the steering. The visual odometry system uses two cameras in a stereo setup pointing perpendicular to the ground. This system measures the travelled distance by measuring the ground movement and compensates for height variation. Experiments are performed on an artificial field due to the season. The results show that the visual odometry has an accuracy better than 2.1% of the travelled distance.
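    A minimal OpenCV sketch of the row-extraction step described above (Hough transform on a front-camera image); the excess-green segmentation and all thresholds are illustrative assumptions, not the authors' tuning.

        import cv2
        import numpy as np

        def detect_row(bgr):
            # Segment vegetation with a simple excess-green threshold (assumed preprocessing).
            b, g, r = cv2.split(bgr.astype(np.float32))
            mask = ((2 * g - r - b) > 20).astype(np.uint8) * 255
            edges = cv2.Canny(mask, 50, 150)
            # Standard Hough transform; each line is returned as (rho, theta).
            lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)
            if lines is None:
                return None
            rho, theta = lines[0][0]
            # theta gives the heading error and rho the lateral offset that a
            # steering controller would act on.
            return rho, theta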

  • 5.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Åstrand, Björn
    Halmstad University, Sweden.
    Algorithms for visual odometry in outdoor field environment, 2007. In: RA '07: Proceedings of the 13th IASTED International Conference on Robotics and Applications / [ed] Klaus Schilling, ACTA Press, 2007, pp. 414-419. Conference paper (Refereed)
  • 6.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Åstrand, Björn
    School of Information Science, Computer and Electrical Engineering, Halmstad University, Halmstad, Sweden.
    Row-detection on an agricultural field using omnidirectional camera, 2010. In: The IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010): Conference Proceedings, IEEE conference proceedings, 2010, pp. 4982-4987. Conference paper (Refereed)
    Abstract [en]

    This paper describes a method of detecting parallel rows on an agricultural field using an omnidirectional camera. The method works both on cameras with a fisheye lens and cameras with a catadioptric lens. A combination of an edge-based method and a Hough transform method is suggested to find the rows. The vanishing point of several parallel rows is estimated using a second Hough transform. The method is evaluated on synthetic images generated with calibration data from real lenses. Scenes with several rows are produced, where each plant is positioned with a specified error. Experiments are performed on these synthetic images and on real field images. The results show that good accuracy is obtained on the vanishing point once it is detected correctly. Further, they show that the edge-based method works best when the rows consist of solid lines, and the Hough method works best when the rows consist of individual plants. The experiments also show that the combined method provides better detection than using the methods separately.
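    The vanishing-point step can be illustrated with a simpler stand-in for the paper's second Hough transform: intersect the detected (rho, theta) row lines pairwise and take a robust centre of the intersections. A sketch under that substitution:

        import numpy as np

        def vanishing_point(lines):
            # lines: list of (rho, theta) pairs for the detected rows.
            # Pairwise intersections stand in for the second Hough transform here.
            pts = []
            for i in range(len(lines)):
                for j in range(i + 1, len(lines)):
                    (r1, t1), (r2, t2) = lines[i], lines[j]
                    # Each line satisfies x*cos(theta) + y*sin(theta) = rho.
                    A = np.array([[np.cos(t1), np.sin(t1)],
                                  [np.cos(t2), np.sin(t2)]])
                    if abs(np.linalg.det(A)) < 1e-6:
                        continue  # near-parallel in the image, unstable intersection
                    pts.append(np.linalg.solve(A, np.array([r1, r2])))
            # Median of the intersections as a robust vanishing-point estimate.
            return np.median(pts, axis=0) if pts else None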

  • 7.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Åstrand, Björn
    Halmstad University.
    Stereo Visual Odometry for Mobile Robots on Uneven Terrain, 2008. In: Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008 / [ed] Sio-Iong Ao, IEEE Computer Society, 2008, pp. 150-157. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a stereo visual odometry system for mobile robots that is not sensitive to uneven terrain. Two cameras are mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, where flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of traveled distance on surfaces where the maximum height variation is measured to 96 mm. The variance over eight test runs, totalling 5.6 m, is measured to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.
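    The normalized cross correlation step can be sketched with OpenCV's template matcher; the patch size and the pixel-to-metre conversion are illustrative assumptions.

        import cv2

        def ground_shift(prev_gray, cur_gray, patch=128):
            # Estimate the image-plane translation between two downward-facing
            # frames using normalized cross correlation (cv2.TM_CCORR_NORMED).
            h, w = prev_gray.shape
            y0, x0 = (h - patch) // 2, (w - patch) // 2
            tmpl = prev_gray[y0:y0 + patch, x0:x0 + patch]
            # Correlate the previous frame's central patch against the current frame.
            res = cv2.matchTemplate(cur_gray, tmpl, cv2.TM_CCORR_NORMED)
            _, _, _, (px, py) = cv2.minMaxLoc(res)
            # Pixel shift; metres per pixel would follow from the stereo-derived height.
            return px - x0, py - y0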

  • 8.
    Ericson, Stefan
    et al.
    Högskolan i Skövde, Institutionen för teknik och samhälle.
    Åstrand, Björn
    Halmstad University, Sweden.
    Visual Odometry System for Agricultural Field Robots, 2008. In: Proceedings of the World Congress on Engineering and Computer Science 2008: WCECS 2008, October 22 - 24, 2008, San Francisco, USA / [ed] S. I. Ao, Craig Douglas, W. S. Grundfest, Lee Schruben, Jon Burgstone, International Association of Engineers, 2008, pp. 619-624. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a visual odometry system for agricultural field robots that is not sensitive to uneven terrain. A stereo camera system is mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross correlation. A method for evaluating the system is developed, where flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of traveled distance on surfaces where the maximum height variation is measured to 96 mm. The variance over eight test runs, totalling 5.6 m, is measured to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.

  • 9.
    Huang, Rui
    et al.
    School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing, China.
    Ericson, Stefan
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap. Högskolan i Skövde, Forskningscentrum för Virtuella system.
    An Efficient Way to Estimate the Focus of Expansion, 2018. In: 2018 3rd IEEE International Conference on Image, Vision and Computing (ICIVC 2018), IEEE, 2018, pp. 691-695. Conference paper (Refereed)
    Abstract [en]

    Detecting independent motion from a single camera is a difficult task in computer vision, because the captured image sequences are combinations of the objects' movements and the camera's ego-motion. One major branch is to find the focus of expansion (FOE) instead. This is ideal for the situation commonly seen in a UAV's camera system, where translation dominates the camera's motion while the rotation is relatively small. To separate the ego-motion and the scene structure, many researchers have used the directional flow as the theoretical basis and extracted its properties related to the FOE. In this paper, we formulate finding the FOE as an optimization problem: the position of the FOE minimizes the standard deviation of the directional flow over all directions, subject to the introduced constraint. The experiments show that the proposed methods outperform the previous method.
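    A sketch of the formulation described above, treating the FOE as the point from which the flow field radiates and minimising the misalignment between flow directions and radial directions (a simplified stand-in for the paper's standard-deviation objective and constraint):

        import numpy as np
        from scipy.optimize import minimize

        def find_foe(points, flows):
            # points: (N, 2) pixel positions; flows: (N, 2) optical flow vectors.
            # Under dominant translation, each flow vector points away from the FOE,
            # so the best FOE candidate makes the flow field maximally radial.
            def cost(foe):
                radial = points - foe  # direction FOE -> point
                radial = radial / (np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9)
                unit_flow = flows / (np.linalg.norm(flows, axis=1, keepdims=True) + 1e-9)
                # Zero when every flow vector is perfectly radial from the FOE.
                return np.mean(1.0 - np.sum(radial * unit_flow, axis=1))

            res = minimize(cost, x0=points.mean(axis=0), method="Nelder-Mead")
            return res.x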

  • 10.
    Lidell, Anton
    et al.
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap.
    Ericson, Stefan
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap. Högskolan i Skövde, Forskningsmiljön Virtuell produkt- och produktionsutveckling.
    Ng, Amos H. C.
    Högskolan i Skövde, Institutionen för ingenjörsvetenskap. Högskolan i Skövde, Forskningsmiljön Virtuell produkt- och produktionsutveckling. Division of Industrial Engineering & Management, Uppsala University, Sweden.
    The Current and Future Challenges for Virtual Commissioning and Digital Twins of Production Lines, 2022. In: SPS2022: Proceedings of the 10th Swedish Production Symposium / [ed] Amos H. C. Ng; Anna Syberfeldt; Dan Högberg; Magnus Holm, Amsterdam; Berlin; Washington, DC: IOS Press, 2022, pp. 508-519. Conference paper (Refereed)
    Abstract [en]

    The use of virtual commissioning has increased in the last decade, but challenges remain before this software code validation method is in widespread use. One extension of virtual commissioning is digital twin technology, which allows for further improved accuracy. The aim of this paper is to review existing standards and approaches to developing virtual commissioning, through a literature review and interviews with experts in the industry. First, the definitions and classifications related to virtual commissioning and digital twins are reviewed, followed by an exploration of the approaches for the development of virtual commissioning and digital twins reported in the literature. Then, in three interviews with experts of varying backgrounds and competencies, views on the virtual technologies are assessed to provide new insight for the industry. The findings of the literature review and interviews include the apparent need for standardisation in the field and that a sought-after standard in the form of ISO 23247-1 is underway. The key finding of this paper is that the digital twin is a concept with a promising future in combination with other technologies of Industry 4.0. We also outline the challenges and possibilities of virtual commissioning and the digital twin, which could be used as a starting point for further research in standardisations and improvements sprung from the new standard.

    Full text (pdf)
    fulltext