his.se Publications
1 - 50 of 59
  • 1.
    Amorim, Joni A.
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Hendrix, Maurice
    Coventry University Technology Park, Coventry, UK.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Llinas, James
    State University of New York at Buffalo, USA.
    Gustavsson, Per M.
    Försvarshögskolan, Militärvetenskapliga institutionen (MVI), Ledningsvetenskapliga avdelningen (LVA).
    Brodin, Martin
    Actea Consulting, Sweden.
Cyber Security Training Perspectives (2013). Conference paper (Refereed)
    Abstract [en]

    Building comprehensive cyber security strategies to protect people, infrastructure and assets demands research on methods and practices to reduce risks. Once the methods and practices are identified, there is a need to develop training for the many stakeholders involved, from security experts to the end user. In this paper, we discuss new approaches for training, which includes the development of serious games for training on cyber security. The identification of the theoretical framework to be used for situation and threat assessment receives special consideration.

  • 2.
    Amorim, Joni A.
    et al.
    University of Skövde, The Informatics Research Centre. University of Skövde, School of Informatics.
    Yano, Edgar T.
    Department of Computer Science, Instituto Tecnologico de Aeronautica, São José dos Campos, Brazil.
    Åhlfeldt, Rose-Mharie
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Gustavsson, Per M.
    Combitech, SAAB Group, Skövde.
    Awareness and training: Identification of relevant security skills and competencies (2014). In: Engineering Education in a Technology-Dependent World: Book of Abstracts / [ed] Claudio da Rocha Brito, Melany M. Ciampi, Guimarães: INTERTECH, 2014, p. 37 (57 p.). Conference paper (Refereed)
    Abstract [en]

    In order to identify needed skills and competencies for privacy and security, we propose a systematic process that maps privacy and security threats to related controls that are required to prevent, detect or remove such threats. This work suggests how to apply the process, while discussing how games and simulations can be used both to develop the desired behavior and to monitor the current competency level.

  • 3.
    Amorim, Joni A.
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Åhlfeldt, Rose-Mharie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre. Saab Training Systems, Saab AB, Skövde, Sweden.
    Gustavsson, Per M.
    Saab Training Systems, Saab AB Skövde, Sweden.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Privacy and Security in Cyberspace: Training Perspectives on the Personal Data Ecosystem (2013). In: European Intelligence and Security Informatics Conference (EISIC), Proceedings CD / [ed] Joel Brynielsson and Fredrik Johansson, IEEE conference proceedings, 2013, p. 139-142, article id 6657140. Conference paper (Refereed)
    Abstract [en]

    There is a growing understanding that privacy is an essential component of security. In order to decrease the probability of data breaches, the design of information systems, processes and architectures should incorporate considerations related to both privacy and security. This incorporation may benefit from the offering of appropriate training. Accordingly, this paper discusses how to better offer training while considering new developments that involve both multimedia production and the “gamification” of training. The paper suggests the combined use of two frameworks: the EduPMO Framework, useful for the management of large-scale projects that may involve a consortium of organizations developing multimedia for the offering of training, and the Game Development Framework, useful for identifying the main components of the serious game for training on privacy by design to be developed as part of the training offering.

  • 4.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Information Fusion from Databases, Sensors and Simulations: Annual Report 2005 (2006). Report (Other academic)
  • 5.
    Andler, Sten F.
    et al.
    University of Skövde, School of Humanities and Informatics.
    Brohede, Marcus
    University of Skövde, School of Humanities and Informatics.
    Information Fusion from Databases, Sensors and Simulations: Annual Report 2006 (2007). Report (Other academic)
  • 6.
    Andler, Sten F.
    et al.
    University of Skövde, School of Humanities and Informatics.
    Brohede, Marcus
    University of Skövde, School of Humanities and Informatics.
    Information Fusion from Databases, Sensors and Simulations: Annual Report 2007 (2008). Report (Other academic)
  • 7.
    Andler, Sten F.
    et al.
    University of Skövde, School of Humanities and Informatics.
    Brohede, Marcus
    University of Skövde, School of Humanities and Informatics.
    Information Fusion from Databases, Sensors and Simulations: Annual Report 2008 (2009). Report (Other academic)
  • 8.
    Andler, Sten F
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Brohede, Marcus
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Gustavsson, Sanny
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Mathiason, Gunnar
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    DeeDS NG: Architecture, Design, and Sample Application Scenario (2007). In: Handbook of Real-Time and Embedded Systems, CRC Press, 2007. Chapter in book (Other academic)
  • 9.
    Andler, Sten F.
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Fredin, Mikael
    Saab Microwave Systems AB, Sweden.
    Gustavsson, Per M.
    George Mason Univ., USA.
    van Laere, Joeri
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Nilsson, Maria
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Svenson, Pontus
    Swedish Defence Research Agency, Sweden.
    SMARTracIn: a concept for spoof resistant tracking of vessels and detection of adverse intentions (2009). In: Sensors, and Command, Control, Communications, and Intelligence (C3I) Technologies for Homeland Security and Homeland Defense VIII / [ed] Edward M. Carapezza, SPIE - International Society for Optical Engineering, 2009, p. 73050G-1-73050G-9, article id 73050G. Conference paper (Refereed)
    Abstract [en]

    The aim of maritime surveillance systems is to detect threats early enough to take appropriate actions. We present the results of a study on maritime domain awareness performed during the fall of 2008. We analyze an identified capability gap of worldwide surveillance in the maritime domain, and report from a user workshop addressing the identified gap. We describe a SMARTracIn concept system that integrates information from surveillance systems with background knowledge on normal conditions to help users detect and visualize anomalies in vessel traffic. Land-based systems that cover the coastal waters as well as airborne, space-borne and ships covering open sea are considered. Sensor data are combined with intelligence information from ship reporting systems and databases. We describe how information fusion, anomaly detection and semantic technology can be used to help users achieve more detailed maritime domain awareness. Human operators are a vital part of this system and should be active components in the fusion process. We focus on the problem of detecting anomalous behavior in ocean-going traffic, and a room and door segmentation concept to achieve this. This requires the ability to identify vessels that enter into areas covered by sensors as well as the use of information management systems that allow us to quickly find all relevant information.

  • 10.
    Andler, Sten F.
    et al.
    University of Skövde, Department of Computer Science.
    Hansson, Jörgen
    University of Skövde, Department of Computer Science.
    Eriksson, Joakim
    University of Skövde, Department of Computer Science.
    Mellin, Jonas
    University of Skövde, Department of Computer Science.
    Berndtsson, Mikael
    University of Skövde, Department of Computer Science.
    Eftring, Bengt
    University of Skövde, Department of Computer Science.
    DeeDS: Towards a Distributed and Active Real-Time Database System (1996). In: ACM SIGMOD Record, Vol. 25, no 1, p. 38-40. Article in journal (Refereed)
    Abstract [en]

    DeeDS combines active database functionality with critical timing constraints and integrated system monitoring. Since the reactive database mechanisms, or rule management system, must meet critical deadlines, we must employ methods that make triggering of rules and execution of actions predictable. We will focus on the scheduling issues associated with dynamic scheduling of workloads where the triggered transactions have hard, firm or soft deadlines, and how transient overloads may be resolved by substituting transactions by computationally cheaper ones. The rationale for a loosely coupled general purpose event monitoring facility, that works in tight connection with the scheduler, is presented. For performance and predictability, the scheduler and event monitor are executing on a separate CPU from the rest of the system. Real-time database accesses in DeeDS are made predictable and efficient by employing methods such as main memory resident data, full replication, eventual consistency, and prevention of global deadlocks.

  • 11.
    Andler, Sten F.
    et al.
    University of Skövde, School of Humanities and Informatics.
    Niklasson, Lars
    University of Skövde, School of Humanities and Informatics.
    Olsson, Björn
    University of Skövde, School of Humanities and Informatics.
    Persson, Anne
    University of Skövde, School of Humanities and Informatics.
    Planstedt, Tomas
    Ericsson Microwave Systems AB, Skövde, Sweden.
    De Vin, Leo J.
    University of Skövde, School of Technology and Society.
    Wangler, Benkt
    University of Skövde, School of Humanities and Informatics.
    Ziemke, Tom
    University of Skövde, School of Humanities and Informatics.
    Information Fusion from Databases, Sensors and Simulations: A Collaborative Research Program (2005). In: Proceedings: 29th Annual IEEE/NASA Software Engineering Workshop, IEEE Computer Society, 2005, p. 234-241. Conference paper (Refereed)
    Abstract [en]

    This paper provides an overview of a collaborative research program in information fusion from databases, sensors and simulations. Information fusion entails the combination of data from multiple sources, to generate information that cannot be derived from the individual sources. This area is of strategic importance for industry and defense, as well as public administration areas such as health care, and needs to be pursued as an academic subject. A large number of industrial partners are supporting and participating in the development of the area. The paper describes the program’s general approach and main research areas, with a particular focus on the role of information fusion in systems development.

  • 12.
    Atif, Yacine
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Ding, Jianguo
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jeusfeld, Manfred
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Yuning, Jiang
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Brax, Christoffer
    CombiTech AB, Skövde, Sweden.
    Gustavsson, Per M.
    CombiTech AB, Skövde, Sweden.
    Cyber-Threat Intelligence Architecture for Smart-Grid Critical Infrastructures Protection (2017). Conference paper (Refereed)
    Abstract [en]

    Critical infrastructures (CIs) are becoming increasingly sophisticated with embedded cyber-physical systems (CPSs) that provide managerial automation and autonomic controls. Yet these advances expose CI components to new cyber-threats, leading to a chain of dysfunctionalities with catastrophic socio-economic implications. We propose a comprehensive architectural model to support the development of incident management tools that provide situation awareness and cyber-threat intelligence for CI protection, with a special focus on smart-grid CI. The goal is to unleash forensic data from CPS-based CIs to perform predictive analytics. In doing so, we use AI (Artificial Intelligence) paradigms for data collection, threat detection, and cascade-effects prediction.

  • 13.
    Atif, Yacine
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jiang, Yuning
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jeusfeld, Manfred A.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Ding, Jianguo
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Brax, Christoffer
    Combitech.
    Haglund, Daniel
    Combitech.
    Lindström, Björn
    Combitech.
    Cyber-threat analysis for Cyber-Physical Systems: Technical report for Package 4, Activity 3 of ELVIRA project (2018). Report (Other academic)
    Abstract [en]

    Smart grid employs ICT infrastructure and network connectivity to optimize efficiency and deliver new functionalities. This evolution is associated with an increased risk for cybersecurity threats that may hamper smart grid operations. Power utility providers need tools for assessing risk of prevailing cyberthreats over ICT infrastructures. The need for frameworks to guide the development of these tools is essential to define and reveal vulnerability analysis indicators. We propose a data-driven approach for designing testbeds to evaluate the vulnerability of cyberphysical systems against cyberthreats. The proposed framework uses data reported from multiple components of cyberphysical system architecture layers, including physical, control, and cyber layers. At the physical layer, we consider component inventory and related physical flows. At the control level, we consider control data, such as SCADA data flows in industrial and critical infrastructure control systems. Finally, at the cyber layer level, we consider existing security and monitoring data from cyber-incident event management tools, which are increasingly embedded into the control fabrics of cyberphysical systems.

  • 14.
    Atif, Yacine
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jiang, Yuning
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Ding, Jianguo
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jeusfeld, Manfred
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Nero, Eva
    Combitech, Sweden.
    Brax, Christoffer
    Combitech, Sweden.
    Haglund, Daniel
    Combitech, Sweden.
    Multi-agent Systems for Power Grid Monitoring: Technical report for Package 4.1 of ELVIRA project (2018). Report (Other academic)
    Abstract [en]

    This document reports a technical description of ELVIRA project results obtained as part of Work-package 4.1, entitled “Multi-agent systems for power grid monitoring”. The ELVIRA project is a collaboration between researchers in the School of IT at the University of Skövde and Combitech Technical Consulting Company in Sweden, with the aim to design, develop and test a testbed simulator for critical infrastructure cybersecurity. This report outlines intelligent approaches that continuously analyze data flows generated by Supervisory Control And Data Acquisition (SCADA) systems, which monitor contemporary power grid infrastructures. However, cybersecurity threats and security mechanisms cannot be analyzed and tested on actual systems, and thus testbed simulators are necessary to assess vulnerabilities and evaluate the infrastructure resilience against cyberattacks. This report suggests an agent-based model to simulate SCADA-like cyber-component behaviour when facing cyber-infection, in order to experiment with and test intelligent mitigation mechanisms.

  • 15.
    Birgersson, Ragnar
    et al.
    University of Skövde, School of Humanities and Informatics.
    Mellin, Jonas
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F
    University of Skövde, School of Humanities and Informatics.
    Bounds on Test Effort for Event-Triggered Real-Time Systems (1999). Report (Other academic)
    Abstract [en]

    The test effort required for full test coverage is much higher in an event-triggered than in a time-triggered real-time system. This makes it difficult to attain confidence in the correctness of event-triggered real-time applications by testing, which is a necessary complement to other verification methods. We present a more general upper bound on the test effort of constrained event-triggered real-time systems, assuming multiple resources (a refinement of previous results). The emphasis is on system level testing of application timeliness, assuming that sufficient confidence in its functional correctness has been attained. Covered fault types include incorrect assumptions about temporal attributes of application and execution environment, and synchronization faults. An analysis of the effects that our constraints have on predictability and efficiency shows that the use of designated preemption points is required. A key factor in this approach is the ability to reduce the number of required test cases while maintaining full test coverage.

  • 16.
    Boström, Henrik
    et al.
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Brohede, Marcus
    University of Skövde, School of Humanities and Informatics.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics.
    Karlsson, Alexander
    University of Skövde, School of Humanities and Informatics.
    van Laere, Joeri
    University of Skövde, School of Humanities and Informatics.
    Niklasson, Lars
    University of Skövde, School of Humanities and Informatics.
    Nilsson, Marie
    University of Skövde, School of Humanities and Informatics.
    Persson, Anne
    University of Skövde, School of Humanities and Informatics.
    Ziemke, Tom
    University of Skövde, School of Humanities and Informatics.
    On the Definition of Information Fusion as a Field of Research (2007). Report (Other academic)
    Abstract [en]

    A more precise definition of the field of information fusion can be of benefit to researchers within the field, who may use such a definition when motivating their own work and evaluating the contribution of others. Moreover, it can enable researchers and practitioners outside the field to more easily relate their own work to the field and more easily understand the scope of the techniques and methods developed in the field. Previous definitions of information fusion are reviewed from that perspective, including definitions of data and sensor fusion, and their appropriateness as definitions for the entire research field is discussed. Based on strengths and weaknesses of existing definitions, a novel definition is proposed, which is argued to effectively fulfill the requirements that can be put on a definition of information fusion as a field of research.

  • 17.
    Brax, Christoffer
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Karlsson, Alexander
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Niklasson, Lars
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Evaluating Precise and Imprecise State-Based Anomaly Detectors for Maritime Surveillance (2010). In: Proceedings of the 13th International Conference on Information Fusion, IEEE conference proceedings, 2010, article number 5711997. Conference paper (Refereed)
    Abstract [en]

    We extend the State-Based Anomaly Detection approach by introducing precise and imprecise anomaly detectors using the Bayesian and credal combination operators, where evidences over time are combined into a joint evidence. We use imprecision in order to represent the sensitivity of the classification regarding an object being normal or anomalous. We evaluate the detectors on a real-world maritime dataset containing recorded AIS data and show that the anomaly detectors outperform previously proposed detectors based on Gaussian mixture models and kernel density estimators. We also show that our introduced anomaly detectors perform slightly better than the State-Based Anomaly Detection approach with a sliding window.

  • 18.
    Brohede, Marcus
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    A Distributed Whiteboard Infrastructure for Information Fusion Simulations (2009). In: Proceedings of the 32nd Annual IEEE Software Engineering Workshop (SEW-32) / [ed] Breitman, K; Sterritt, R; Bohner, S, IEEE Computer Society Press, 2009, p. 134-142. Conference paper (Refereed)
    Abstract [en]

    We argue that architectural ideas from DeeDS, a distributed real-time database, can be used to create a whiteboard infrastructure usable in information fusion applications as well as in fault-tolerant simulations. We discuss the need for fault-tolerant real-time simulation in information fusion, and how this can be supported by the whiteboard infrastructure. There are many reasons to perform real-time simulations (including environment simulation): the use of real hardware or systems can be prohibitively expensive or dangerous to people and equipment, or may simply not be available at all. The combination of real and simulated objects, or nodes, is of particular interest when it is prohibitively expensive, dangerous, or impossible to test sufficient numbers of collaborating nodes in a realistic application. The need for fault tolerance of such real-time simulations, with a mixture of real and simulated nodes, occurs because it is difficult or impossible to restart physical nodes or wait for restart of simulated nodes, if one or more simulation instances fail. The key problem when mixing real and simulated nodes is the occurrence of "external actions" that cannot easily be undone and redone when a failure has occurred. We describe a natural approach to support multiple degrees of fault tolerance in real-time simulations, based on optimistic synchronization on top of a whiteboard architecture. The whiteboard architecture is a natural and useful infrastructure metaphor for information fusion, an information-based exchange that allows real and simulated nodes to be freely mixed.

  • 19.
    Brohede, Marcus
    et al.
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Using Distributed Active Real-Time Database Functionality in Information-Fusion Infrastructures (2005). In: RTiS 2005: proceedings of Real Time in Sweden 2005, the 8th Biennial SNART Conference on Real-Time Systems / [ed] Sten F. Andler, Anton Cervin, Skövde: Skövde University, 2005, p. 5-10. Conference paper (Other academic)
    Abstract [en]

    We present a list of requirements that need to be addressed by an infrastructure for information fusion where applications have real-time requirements. The requirements are grouped into configuration requirements, temporal requirements, and robustness requirements. We show how the functionality of a distributed active real-time database system (DARTDBS) can meet many of the given requirements, and therefore, argue that it is suitable for use in an information-fusion infrastructure with real-time requirements. The design of a particular DARTDBS, the DeeDS architecture and prototype, is presented as proof of concept. A comparison with some alternative infrastructures is briefly discussed. We describe a small distributed real-time simulation experiment using DeeDS as infrastructure, and discuss open questions such as how to deal with uncertainty management of information sources, recovery of information fusion nodes, and harmonizing data structures from different information sources.

  • 20.
    Brohede, Marcus
    et al.
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Son, Sang Hyuk
    Department of Computer Science, University of Virginia, 151 Engineer’s Way, P.O. Box 400740, USA.
    Optimistic database-driven distributed real-time simulation (05F-SIW-031) (2005). In: Fall 2005 Simulation Interoperability Workshop: Fall SIW 2005, Curran Associates, Inc., 2005, p. 223-228. Conference paper (Refereed)
    Abstract [en]

    In this paper we present an optimistic synchronization protocol for distributed real-time simulations that uses a database as communication and storage mechanism. Each node in the simulation is also a database node, and communication in the simulation is done by storing to and reading from the database. The underlying replication protocol in the database then makes sure that all updates are propagated. The progress in the simulation is optimistic, i.e., each node tries to simulate as far ahead as possible without waiting for input from any other node. Since the simulations are real-time, we must guarantee that no events are delivered too early or too late. Also, recovery of a node must be done within predictable time due to the real-time constraints. Since all updates in the simulation are done through transactions, we have a well-defined foundation for recovery, and we show how the recovery can be done deterministically. For the simulation to function (and keep deadlines) during network partitions we allow local commits in the database. This requires that all data required on a specific node must be reachable from that node, i.e., no remote accesses should be needed. However, allowing local commits may introduce conflicting updates. These conflicts are detected and resolved predictably.

  • 21.
    De Vin, Leo
    et al.
    University of Skövde, School of Technology and Society.
    Andler, Sten F.
    University of Skövde, School of Technology and Society.
    Ng, Amos
    University of Skövde, School of Technology and Society.
    Moore, P. R.
    Pu, J.
    Wong, B. C.-B.
    Information Fusion: What can the manufacturing sector learn from the defence industry? (2005). In: IMC 22: Challenges facing manufacturing: proceedings of the 22nd International Manufacturing Conference, 31st August to 2nd September 2005 / [ed] John Vickery, Dublin: Institute of Technology, Tallaght, 2005, p. 363-371. Conference paper (Refereed)
    Abstract [en]

    This paper discusses information fusion, including its nature as well as some models for information fusion. Research on information fusion is dominated by defence applications, and therefore most models are to a certain extent defence-specific; it is explained how these can be made more generic by adapting them. It is explained how the manufacturing sector can benefit from information fusion research; some analogies between issues in manufacturing and issues in military applications are given. A specific area in which the manufacturing sector can benefit from research on information fusion is virtual manufacturing. Many issues related to decision support through modelling, simulation and synthetic environments are identical for manufacturing and defence applications. A particular area of interest for the future will be verification, validation and accreditation of modelling and simulation components for synthetic environments with various involved parties.

  • 22.
    De Vin, Leo J.
    et al.
    University of Skövde, School of Technology and Society.
    Ng, Amos H. C.
    University of Skövde, School of Technology and Society.
    Oscarsson, Jan
    University of Skövde, School of Technology and Society.
    Andler, Sten F.
    University of Skövde, School of Technology and Society.
    Information Fusion for Simulation Based Decision Support in manufacturing2005In: FAIM 2005: Volume I: Proceedings of the 15th International conference on flexible automation and intelligent manufacturing, July 18th-20th, 2005 / [ed] Esther Alvarez, Jalal Ashayeri, William G. Sullivan, Munir Ahmad, Bilbao: University of Deusto , 2005, p. 136-144Conference paper (Refereed)
  • 23.
    De Vin, Leo
    et al.
    University of Skövde, School of Technology and Society.
    Ng, Amos
    University of Skövde, School of Technology and Society.
    Oscarsson, Jan
    University of Skövde, School of Technology and Society.
    Andler, Sten F.
    University of Skövde, School of Technology and Society.
    Information Fusion for Simulation Based Decision Support in Manufacturing2006In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 22, no 5-6, p. 429-436Article in journal (Refereed)
    Abstract [en]

    Robust and informed decisions are important for the efficient and effective operation of installed production facilities. The paper discusses information fusion (IF), including a generic model for IF, and situations for decision-making. The paper also discusses current and future use of manufacturing resource simulation for design/configuration, operational planning and scheduling, and service and maintenance of manufacturing systems. Many of these applications use IF in some way, as is explained in more detail for simulation based service and maintenance. An extension of the generic model for IF is presented which incorporates modeling and simulation, and active databases, as used in a simulation based service and maintenance system at the authors’ laboratory.

  • 24.
    Ding, Jianguo
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Atif, Yacine
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jeusfeld, Manfred
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    CPS-based Threat Modeling for Critical Infrastructure Protection2017In: Performance Evaluation Review, ISSN 0163-5999, E-ISSN 1557-9484, Vol. 45, no 2, p. 129-132Article in journal (Refereed)
    Abstract [en]

    Cyber-Physical Systems (CPSs) are augmenting traditional Critical Infrastructures (CIs) with data-rich operations. This integration creates complex interdependencies that expose CIs and their components to new threats. A systematic approach to threat modeling is necessary to assess CIs’ vulnerability to cyber, physical, or social attacks. We suggest a new threat modeling approach to systematically synthesize knowledge about the safety management of complex CIs and situational awareness that helps understanding the nature of a threat and its potential cascading-effects implications.

  • 25.
    Ding, Jianguo
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Mathiason, Gunnar
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Towards Threat Modeling for CPS-based Critical Infrastructure Protection2015In: Proceedings of the International Emergency Management Society (TIEMS), 22nd TIEMS Annual Conference: Evolving threats and vulnerability landscape: new challenges for the emergency management / [ed] Snjezana Knezic & Meen Poudyal Chhetri, Brussels: TIEMS, The International Emergency Management Society , 2015, Vol. 22Conference paper (Refereed)
    Abstract [en]

    With the evolution of modern Critical Infrastructures (CI), more Cyber-Physical systems are integrated into the traditional CIs. This makes the CIs a multidimensional complex system, which is characterized by integrating cyber-physical systems into CI sectors (e.g., transportation, energy or food & agriculture). This integration creates complex interdependencies and dynamics among the system and its components. We suggest using a model with a multi-dimensional operational specification to allow detection of operational threats. Embedded (and distributed) information systems are critical parts of the CI where disruption can lead to serious consequences. Embedded information system protection is therefore crucial. As there are many different stakeholders of a CI, comprehensive protection must be viewed as a cross-sector activity to identify and monitor the critical elements, evaluate and determine the threat, and eliminate potential vulnerabilities in the CI. A systematic approach to threat modeling is necessary to support the CI threat and vulnerability assessment. We suggest a Threat Graph Model (TGM) to systematically model the complex CIs. Such modeling is expected to help the understanding of the nature of a threat and its impact throughout the system. In order to handle threat cascading, the model must capture local vulnerabilities as well as how a threat might propagate to other components. The model can be used for improving the resilience of the CI by encouraging a design that enhances the system's ability to predict threats and mitigate their damage. This paper surveys and investigates the various threats and current approaches to threat modeling of CI. We suggest integrating both a vulnerability model and an attack model, and we incorporate the interdependencies within a CI and across CI sectors. Finally, we present a multi-dimensional threat modeling approach for critical infrastructure protection.

  • 26.
    Ericsson, Ann-marie
    et al.
    University of Skövde, Department of Computer Science.
    Nilsson, Robert
    University of Skövde, Department of Computer Science.
    Andler, Sten F.
    University of Skövde, Department of Computer Science.
    Operator Patterns for Analysis of Composite Events in Timed Automata2003In: WIP Proceedings: 24th IEEE Real-Time Systems Symposium, 2003, p. 1555-1558Conference paper (Refereed)
    Abstract [en]

    Event-triggered real-time systems interact with the environment by executing actions in response to monitored events. Such systems may be implemented using event condition action (ECA) rules, which execute an action if the associated event occurs and a specified condition is true. However, the ECA rule paradigm is known to be hard to analyze with respect to correctness and timeliness, which is not conducive to the high predictability requirements typically associated with real-time systems. To still take advantage of the ECA rule paradigm when event-triggered real-time systems are developed, we propose an approach where systems are specified and analyzed in a high-level formal language (timed automata) and later transformed into the ECA rule paradigm. We especially focus on a high-level approach for specifying and analyzing composite event occurrences in timed automata.
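The ECA (event-condition-action) rule paradigm this abstract refers to can be illustrated with a minimal sketch: a rule fires its action only when its triggering event occurs and its condition holds. The names (`EcaRule`, `dispatch`) and the temperature scenario are hypothetical, not from the paper.

```python
# Minimal sketch of an event-condition-action (ECA) rule engine.

class EcaRule:
    def __init__(self, event, condition, action):
        self.event = event          # name of the triggering event
        self.condition = condition  # predicate over the system state
        self.action = action        # run when the event occurs and condition holds

def dispatch(rules, event, state):
    """Fire every rule whose event matches and whose condition evaluates true."""
    for rule in rules:
        if rule.event == event and rule.condition(state):
            rule.action(state)

# Example: raise an alarm when a temperature update arrives while overheated.
state = {"temp": 95, "alarm": False}
rules = [EcaRule("temp_update",
                 lambda s: s["temp"] > 90,
                 lambda s: s.update(alarm=True))]
dispatch(rules, "temp_update", state)   # state["alarm"] becomes True
```

The analysis difficulty the abstract mentions stems from interactions between such rules (cascading triggers, composite events), which is why the authors specify and analyze the system in timed automata before transforming it into rules.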

  • 27.
    Eriksson, Anders
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    Software Engineering, George Mason University, Fairfax, VA 22030-4444, United States.
    Model transformation impact on test artifacts: An empirical study2012In: Proceedings of the Workshop on Model-Driven Engineering, Verification and Validation, MoDeVVa 2012, Association for Computing Machinery (ACM), 2012, p. 5-10Conference paper (Refereed)
    Abstract [en]

    Development environments that support Model-Driven Development often focus on model-level functional testing, enabling verification of design models against their specifications. However, developers of safety-critical software systems are also required to show that tests cover the structure of the implementation. Unfortunately, the implementation structure can diverge from the model depending on choices such as the model compiler or target language. Therefore, structural coverage at the model level may not guarantee coverage of the implementation. We present results from an industrial experiment that demonstrates the model-compiler effect on test artifacts in xtUML models when these models are transformed into C++. Test artifacts, i.e., predicates and clauses, are used to satisfy the structural code coverage criterion, in this case MCDC, which is required by the US Federal Aviation Administration. The results of the experiment show not only that the implementation contains more test artifacts than the model, but also that the test artifacts can be deterministically enumerated during translation. The analysis identifies two major sources for these additional test artifacts. © 2012 ACM.

  • 28.
    González-Hernández, Loreto
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    George Mason University, USA.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Potena, Pasqualina
    RISE SICS, Västerås.
    Bohlin, Markus
    RISE SICS, Västerås.
    Using Mutant Stubbornness to Create Minimal and Prioritized Test Sets2018In: 2018 IEEE International Conference on Software Quality, Reliability and Security (QRS), IEEE Computer Society, 2018, p. 446-457Conference paper (Refereed)
    Abstract [en]

    In testing, engineers want to run the most useful tests early (prioritization). When tests are run hundreds or thousands of times, minimizing a test set can result in significant savings (minimization). This paper proposes a new analysis technique to address both the minimal test set and the test case prioritization problems. This paper precisely defines the concept of mutant stubbornness, which is the basis for our analysis technique. We empirically compare our technique with other test case minimization and prioritization techniques in terms of the size of the minimized test sets and how quickly mutants are killed. We used seven C language subjects from the Siemens Repository, specifically the test sets and the killing matrices from a previous study. We used 30 different orders for each set and ran every technique 100 times over each set. Results show that our analysis technique performed significantly better than prior techniques for creating minimal test sets and was able to establish new bounds for all cases. Also, our analysis technique killed mutants as fast as or faster than prior techniques. These results indicate that our mutant stubbornness technique constructs test sets that are both minimal in size and effectively prioritized, as well as or better than other techniques.
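The minimization side of the problem can be illustrated with a plain greedy reduction over a killing matrix. This is a generic set-cover heuristic of the kind the paper compares against, not the stubbornness-based technique it proposes; the matrix values are invented.

```python
# Greedy test-set minimization over a killing matrix:
# kill_matrix[test] = set of mutant ids that the test kills.

def greedy_minimize(kill_matrix):
    remaining = set().union(*kill_matrix.values())  # mutants still unkilled
    chosen = []
    while remaining:
        # Pick the test that kills the most remaining mutants.
        best = max(kill_matrix, key=lambda t: len(kill_matrix[t] & remaining))
        if not kill_matrix[best] & remaining:
            break  # remaining mutants are unkillable by any test
        chosen.append(best)
        remaining -= kill_matrix[best]
    return chosen

km = {"t1": {1, 2}, "t2": {2, 3, 4}, "t3": {4}}
print(greedy_minimize(km))   # -> ['t2', 't1']
```

Note that the greedy order doubles as a prioritization: tests are emitted in order of how many new mutants each one kills.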

  • 29.
    Grindal, Mats
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    George Mason Univ, Dept Informat & Software Engn, Fairfax, VA 22030 USA.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    An Evaluation of Combination Strategies for Test Case Selection2006In: Journal of Empirical Software Engineering, ISSN 1382-3256, E-ISSN 1573-7616, Vol. 11, no 4, p. 583-611Article in journal (Refereed)
    Abstract [en]

    This paper presents results from a comparative evaluation of five combination strategies. Combination strategies are test case selection methods that combine “interesting” values of the input parameters of a test subject to form test cases. This research comparatively evaluated five combination strategies: the All Combination strategy (AC), the Each Choice strategy (EC), the Base Choice strategy (BC), Orthogonal Arrays (OA) and the algorithm from the Automatic Efficient Test Generator (AETG). AC satisfies n-wise coverage, EC and BC satisfy 1-wise coverage, and OA and AETG satisfy pair-wise coverage. The All Combinations strategy was used as a “gold standard” strategy; it subsumes the others but is usually too expensive for practical use. The others were used in an experiment that used five programs seeded with 128 faults. The combination strategies were evaluated with respect to the number of test cases, the number of faults found, failure size, and number of decisions covered. The strategy that requires the least number of tests, Each Choice, found the smallest number of faults. Although the Base Choice strategy requires fewer test cases than Orthogonal Arrays and AETG, it found as many faults. Analysis also shows some properties of the combination strategies that appear significant. The two most important results are that the Each Choice strategy is unpredictable in terms of which faults will be revealed, possibly indicating that faults are found by chance, and that the Base Choice and the pair-wise combination strategies to some extent target different types of faults.
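The two 1-wise strategies compared above can be sketched as follows (the parameter values are invented examples, not from the study): Each Choice cycles values so every interesting value appears at least once, while Base Choice starts from a designated base test and varies one parameter at a time.

```python
# Sketches of two 1-wise combination strategies for test case selection.

def each_choice(params):
    """1-wise coverage: every interesting value appears in at least one test."""
    width = max(len(v) for v in params.values())
    names = list(params)
    return [tuple(params[n][i % len(params[n])] for n in names)
            for i in range(width)]

def base_choice(params, base):
    """Start from a base test; each other test varies exactly one parameter."""
    names = list(params)
    tests = [tuple(base[n] for n in names)]
    for n in names:
        for v in params[n]:
            if v != base[n]:
                tests.append(tuple(v if m == n else base[m] for m in names))
    return tests

params = {"os": ["linux", "win"], "db": ["pg", "my", "sq"]}
print(each_choice(params))                                    # 3 tests
print(base_choice(params, {"os": "linux", "db": "pg"}))       # 4 tests
```

Base Choice needs 1 + sum(|values| - 1) tests, which explains the paper's observation that it uses fewer tests than pair-wise strategies while still exploiting semantic information through the choice of base test.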

  • 30.
    Grindal, Mats
    et al.
    University of Skövde, School of Humanities and Informatics.
    Lindström, Birgitta
    University of Skövde, School of Humanities and Informatics.
    Offutt, Jeff
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F
    University of Skövde, School of Humanities and Informatics.
    An Evaluation of Combination Strategies for Test Case Selection2003Report (Other academic)
    Abstract [en]

    In this report we present the results from a comparative evaluation of five combination strategies. Combination strategies are test case selection methods that combine interesting values of the input parameters of a test object to form test cases. One of the investigated combination strategies, namely the Each Choice strategy, satisfies 1-wise coverage, i.e., each interesting value of each parameter is represented at least once in the test suite. Two of the strategies, the Orthogonal Arrays and Heuristic Pair-Wise strategies both satisfy pair-wise coverage, i.e., every possible pair of interesting values of any two parameters are included in the test suite. The fourth combination strategy, the All Values strategy, generates all possible combinations of the interesting values of the input parameters. The fifth and last combination strategy, the Base Choice combination strategy, satisfies 1-wise coverage but in addition makes use of some semantic information to construct the test cases.

    Except for the All Values strategy, which is only used as a reference point with respect to the number of test cases, the combination strategies are evaluated and compared with respect to number of test cases, number of faults found, test suite failure density, and achieved decision coverage in an experiment comprising five programs, similar to Unix commands, seeded with 131 faults. As expected, the Each Choice strategy finds the smallest number of faults among the evaluated combination strategies. Surprisingly, the Base Choice strategy performs as well, in terms of detecting faults, as the pair-wise combination strategies, despite fewer test cases. Since the programs and faults in our experiment may not be representative of actual testing problems in an industrial setting, we cannot draw any general conclusions regarding the number of faults detected by the evaluated combination strategies. However, our analysis shows some properties of the combination strategies that appear significant in spite of the programs and faults not being representative. The two most important results are that the Each Choice strategy is unpredictable in terms of which faults will be detected, i.e., most faults found are found by chance, and that the Base Choice and the pair-wise combination strategies to some extent target different types of faults.

  • 31.
    Grindal, Mats
    et al.
    University of Skövde, School of Humanities and Informatics.
    Offutt, Jeff
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Combination Testing Strategies: A Survey2005In: Software testing, verification & reliability, ISSN 0960-0833, E-ISSN 1099-1689, Vol. 15, no 3, p. 167-199Article in journal (Refereed)
    Abstract [en]

    Combination strategies are test case selection methods that identify test cases by combining values of the different test object input parameters based on some combinatorial strategy. This survey presents 16 different combination strategies, covering more than 40 papers that focus on one or several combination strategies. This collection represents most of the existing work performed on combination strategies. This survey describes the basic algorithms used by the combination strategies. Some properties of combination strategies, including coverage criteria and theoretical bounds on the size of test suites, are also included in this description. This survey paper also includes a subsumption hierarchy that attempts to relate the various coverage criteria associated with the identified combination strategies.

  • 32.
    Gustavsson, Sanny
    et al.
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Continuous Consistency Management in Distributed Real-Time Databases with Multiple Writers of Replicated Data2005In: Proceedings: 19th IEEE International Parallel and Distributed Processing Symposium IPDPS 2005, IEEE Computer Society, 2005, p. 137b-Conference paper (Refereed)
    Abstract [en]

    We introduce a continuous convergence protocol for handling locally committed and possibly conflicting updates to replicated data. The protocol supports local consistency and predictability while allowing replicas to deterministically diverge and converge as updates are committed and replicated. We discuss how applications may exploit the protocol characteristics and describe an implementation where conflicting updates are detected, qualified by a partial update order, and resolved using application-specific forward conflict resolution.

  • 33.
    Hassan, M. Mahdi
    et al.
    Karlstads Universitet.
    Afzal, Wasif
    Mälardalens Högskola.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Shah, Syed M. A.
    SICS.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Blom, Martin
    Karlstads Universitet.
    Testability and Software Performance: A Systematic Mapping Study2016Conference paper (Refereed)
    Abstract [en]

    In most of the research on software testability, functional correctness of the software has been the focus, while the evidence regarding testability and non-functional properties such as performance is sporadic. The objective of this study is to present the current state of the art related to issues of importance, types and domains of software under test, types of research, contribution types and design evaluation methods concerning testability and software performance. We find that observability, controllability and testing effort are the main testability issues, while timeliness and response time (i.e., time constraints) are the main performance issues in focus. The primary studies in the area use diverse types of software under test within different domains, with real-time systems being a dominant domain. The researchers have proposed many different methods in the area; however, these methods lack implementation in practice.

  • 34.
    Hassan, Mohammad Mahdi
    et al.
    Karlstad University, Sweden.
    Afzal, Wasif
    Mälardalen University, Sweden.
    Blom, Martin
    Karlstad University, Sweden.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, The Informatics Research Centre. University of Skövde, School of Informatics.
    Eldh, Sigrid
    Ericsson AB, Sweden.
    Testability and Software Robustness: A Systematic Literature Review2015In: Proceedings 41st Euromicro Conference on Software Engineering and Advanced Applications SEAA 2015, IEEE Computer Society, 2015, p. 341-348Conference paper (Refereed)
  • 35.
    Jiang, Yuning
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Ding, Jianguo
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Atif, Yacine
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Jeusfeld, Manfred
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Lindström, Birgitta
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Brax, Christoffer
    Combitech, Sweden.
    Haglund, Daniel
    Combitech, Sweden.
    Complex Dependencies Analysis: Technical Description of Complex Dependencies in Critical Infrastructures, i.e. Smart Grids. Work Package 2.1 of the ELVIRA Project2018Report (Other academic)
    Abstract [en]

    This document reports a technical description of ELVIRA project results obtained as part of Work-package 2.1, entitled “Complex Dependencies Analysis”. In this technical report, we review recent research in which connections are regarded as influencing factors on IT systems monitoring critical infrastructure, based on which potential dependencies and resulting disturbances are identified and categorized. Each kind of dependency is discussed based on our own entity-based model. Among these dependencies, logical and functional connections are analysed in more detail with respect to modelling and simulation techniques.

  • 36.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    An Empirical Comparison of Bayesian and Credal Combination Operators2010In: FUSION 2010: 13th international Conference on Information Fusion, 26-29 July 2010, EICC, Edinburgh, UK, IEEE conference proceedings, 2010, p. Article number 5711907-Conference paper (Refereed)
    Abstract [en]

    We are interested in whether or not representing and maintaining imprecision is beneficial when combining evidences from multiple sources. We perform two experiments that contain different levels of risk and where we measure the performance of the Bayesian and credal combination operators by using a simple score function that measures the informativeness of a reported decision set. We show that the Bayesian combination operator performed on centroids of operand credal sets outperforms the credal combination operator when no risk is involved in the decision problem. We also show that if a risk component is present in the decision problem, a simple cautious decision policy for the Bayesian combination operator can be constructed that outperforms the corresponding credal decision policy.
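The "Bayesian combination operator performed on centroids of operand credal sets" can be sketched for discrete distributions as below. A true credal combination would operate on the whole sets rather than their centroids; the extreme points used here are invented examples, not data from the experiments.

```python
# Sketch: Bayesian combination applied to centroids of credal sets.

def bayes_combine(p, q):
    """Bayesian combination: normalized elementwise product of two distributions."""
    joint = [a * b for a, b in zip(p, q)]
    z = sum(joint)
    return [x / z for x in joint]

def centroid(extreme_points):
    """Centroid of a credal set given as a list of extreme-point distributions."""
    n = len(extreme_points)
    return [sum(p[i] for p in extreme_points) / n
            for i in range(len(extreme_points[0]))]

# Two credal sets over a binary state, each given by two extreme points.
A = [[0.6, 0.4], [0.8, 0.2]]   # centroid: [0.7, 0.3]
B = [[0.5, 0.5], [0.7, 0.3]]   # centroid: [0.6, 0.4]
fused = bayes_combine(centroid(A), centroid(B))
print(fused)   # a single precise posterior distribution
```

Collapsing each credal set to its centroid discards the imprecision before combining, which is exactly the trade-off the abstract evaluates: a precise (Bayesian) decision versus a cautious decision set derived from the full credal combination.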

  • 37.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    An Empirical Comparison of Bayesian and Credal Networks for Dependable High-Level Information Fusion2008In: Proceedings of the 11th International Conference on Information Fusion, IEEE Press, 2008, p. 1359-1366Conference paper (Refereed)
    Abstract [en]

    Bayesian networks are often proposed as a method for high-level information fusion. However, a Bayesian network relies on strong assumptions about the underlying probabilities. In many cases it is not realistic to require such precise probability assessments. We show that there exists a significant set of problems where credal networks outperform Bayesian networks, thus enabling more dependable decision making for this type of problems. A credal network is a graphical probabilistic method that utilizes sets of probability distributions, e.g., interval probabilities, for representation of belief. Such a representation allows one to properly express epistemic uncertainty, i.e., uncertainty that can be reduced if more information becomes available. Since reducing uncertainty has been proposed as one of the main goals of information fusion, the ability to represent epistemic uncertainty becomes an important aspect in all fusion applications.

  • 38.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    An Empirical Comparison of Bayesian and Credal Set Theory for Discrete State Estimation2010In: Information Processing and Management of Uncertainty in Knowledge-Based Systems. Theory and Methods: 13th International Conference, IPMU 2010, Dortmund, Germany, June 28–July 2, 2010. Proceedings, Part I / [ed] Eyke Hüllermeier, Rudolf Kruse, Frank Hoffmann, Springer Berlin/Heidelberg, 2010, p. 80-89Conference paper (Refereed)
    Abstract [en]

    We are interested in whether or not there exist any advantages of utilizing credal set theory for the discrete state estimation problem. We present an experiment where we compare in total six different methods, three based on Bayesian theory and three on credal set theory. The results show that Bayesian updating performed on centroids of operand credal sets significantly outperforms the other methods. We analyze the result based on degree of imprecision, position of extreme points, and second-order distributions.

  • 39.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre. Swedish Defence Research Agency (FOI), Stockholm.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Characterization and Empirical Evaluation of Bayesian and Credal Combination Operators2011In: Journal of Advances in Information Fusion, ISSN 1557-6418, Vol. 6, no 2, p. 150-166Article in journal (Refereed)
    Abstract [en]

    We address the problem of combining independent evidences from multiple sources by utilizing the Bayesian and credal combination operators. We present measures for degree of conflict and imprecision, which we use in order to characterize the behavior of the operators through a number of examples. We introduce discounting operators that can be used whenever information about the reliability of sources is available. The credal discounting operator discounts a credal set with respect to an interval of reliability weights, hence, we allow for expressing reliability of sources imprecisely. We prove that the credal discounting operator can be computed by using the extreme points of its operands. We also perform two experiments containing different levels of risk where we compare the performance of the Bayesian and credal combination operators by using a simple score function that measures the informativeness of a reported decision set. We show that the Bayesian combination operator performed on centroids of operand credal sets outperforms the credal combination operator when no risk is involved in the decision problem. We also show that if a risk component is present in the decision problem, a simple cautious decision policy for the Bayesian combination operator can be constructed that outperforms the corresponding credal decision policy.

  • 40.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Imprecise Probability as an Approach to Improved Dependability in High-Level Information Fusion2008In: Interval / Probabilistic Uncertainty and Non-Classical Logics / [ed] Van-Nam Huynh, Yoshiteru Nakamori, Hiroakira Ono, Jonathan Lawry, Vladik Kreinovich, Hung T. Nguyen, Springer Berlin/Heidelberg, 2008, p. 70-84Conference paper (Other academic)
    Abstract [en]

    The main goal of information fusion can be seen as improving human or automatic decision-making by exploiting diversities in information from multiple sources. High-level information fusion aims specifically at decision support regarding situations, often expressed as “achieving situation awareness”. A crucial issue for decision making based on such support is trust, which can be defined as “accepted dependence”, where dependence or dependability is an overall term for many other concepts, e.g., reliability. This position paper reports on ongoing and planned research concerning imprecise probability as an approach to improved dependability in high-level information fusion. We elaborate on high-level information fusion from a generic perspective, and a partial mapping from a taxonomy of dependability to high-level information fusion is presented. Three application domains in which experiments are planned to be implemented are depicted: defense, manufacturing, and precision agriculture. We conclude that high-level information fusion, as an application-oriented research area where precise probability (Bayesian theory) is commonly adopted, provides an excellent evaluation ground for imprecise probability.

  • 41.
    Karlsson, Alexander
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Johansson, Ronnie
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    On the Behavior of the Robust Bayesian Combination Operator and the Significance of Discounting2009In: ISIPTA ’09: Proceedings of the Sixth International Symposium on Imprecise Probability: Theories and Applications / [ed] Thomas Augustin, Frank P. A. Coolen, Serafin Moral, Matthias C. M. Troffaes, Society for Imprecise Probability , 2009, p. 259-268Conference paper (Refereed)
    Abstract [en]

    We study the combination problem for credal sets via the robust Bayesian combination operator. We extend Walley's notion of degree of imprecision and introduce a measure for degree of conflict between two credal sets. Several examples are presented in order to explore the behavior of the robust Bayesian combination operator in terms of imprecision and conflict. We further propose a discounting operator that suppresses a source given an interval of reliability weights, and highlight the importance of using such weights whenever additional information about the reliability of a source is available.
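
    As an illustrative sketch only (not the paper's implementation), the idea of robustly combining two credal sets and discounting an unreliable source can be shown for the simplest case, a binary frame where each credal set is a probability interval for one event. All function names are hypothetical; the combination applies Bayes' rule to every pair of extreme points, and discounting mixes the source with the uniform distribution over an interval of reliability weights:

    ```python
    # Hypothetical sketch: robust Bayesian combination of two credal sets
    # over a binary frame, each given as an interval (low, high) for P(A).
    from itertools import product

    def bayes_combine(p, q):
        """Bayesian combination of two point probabilities for a binary frame."""
        num = p * q
        return num / (num + (1 - p) * (1 - q))

    def robust_combine(interval_p, interval_q):
        """Combine two credal intervals by combining all extreme points."""
        results = [bayes_combine(p, q)
                   for p, q in product(interval_p, interval_q)]
        return (min(results), max(results))

    def discount(interval_p, w_interval, uniform=0.5):
        """Discounting: mix the source with the uniform distribution for
        every reliability weight in the given interval."""
        mixed = [w * p + (1 - w) * uniform
                 for p in interval_p for w in w_interval]
        return (min(mixed), max(mixed))

    combined = robust_combine((0.6, 0.8), (0.7, 0.9))
    imprecision = combined[1] - combined[0]  # Walley-style degree of imprecision
    ```

    A fully reliable source (weight interval (1.0, 1.0)) is left unchanged by discounting, while lower weights pull the interval toward the uniform distribution, suppressing the source's influence.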

  • 42.
    Lindström, Birgitta
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. George Mason University, Fairfax VA, USA.
    Pettersson, Paul
    Mälardalen University, Västerås, Sweden.
    Sundmark, Daniel
    Swedish Institute of Computer Science, Kista, Sweden.
    Mutating Aspect-Oriented Models to Test Cross-Cutting Concerns2015In: 2015 IEEE 8th International Conference on Software Testing, Verification and Validation Workshops, ICSTW 2015 - Proceedings, IEEE conference proceedings, 2015, Article number 7107456Conference paper (Refereed)
  • 43.
    Lindström, Birgitta
    et al.
    University of Skövde, Department of Computer Science.
    Mellin, Jonas
    University of Skövde, Department of Computer Science.
    Andler, Sten F.
    University of Skövde, Department of Computer Science.
    Testability of dynamic real-time systems2002In: Proceedings of Eighth International Conference on Real-Time Computing Systems and Applications (RTCSA2002), 2002, p. 93-97Conference paper (Refereed)
  • 44.
    Lindström, Birgitta
    et al.
    University of Skövde, School of Humanities and Informatics.
    Nilsson, Robert
    University of Skövde, School of Humanities and Informatics.
    Ericsson, AnnMarie
    University of Skövde, School of Humanities and Informatics.
    Grindal, Mats
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Eftring, Bengt
    University of Skövde, School of Humanities and Informatics.
    Offutt, Jeff
    George Mason University, Fairfax, VA, USA.
    Six Issues in Testing Event-Triggered Real-Time Systems2007Report (Other academic)
    Abstract [en]

    Verification of real-time systems is a complex task, with problems coming from issues like concurrency. A previous paper suggested dealing with these problems by using a time-triggered design, which gives good support both for testing and formal analysis. However, a time-triggered solution is not always feasible and an event-triggered design is needed. Event-triggered systems are far more difficult to test than time-triggered systems.

    This paper revisits previously identified testing problems from a new perspective and identifies additional problems for event-triggered systems. The paper also presents an approach to deal with these problems. The TETReS project assumes a model-driven development process. We combine research within three different fields: (i) transformation of rule sets between timed automata specifications and ECA rules with maintained semantics, (ii) increasing testability in event-triggered systems, and (iii) development of test case generation methods for event-triggered systems.

  • 45.
    Lindström, Birgitta
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Testability of Dynamic Real-Time Systems: An Empirical Study of Constrained Execution Environment Implications2008In: 1st IEEE International Conference on Software Testing, Verification, and Validation (ICST 2008), IEEE , 2008, p. 112-120Conference paper (Refereed)
    Abstract [en]

    Real-time systems must respond to events in a timely fashion; in hard real-time systems the penalty for a missed deadline is high. It is therefore necessary to design hard real-time systems so that the timing behavior of the tasks can be predicted. Static real-time systems have prior knowledge of the worst-case arrival patterns and resource usage. Therefore, a schedule can be calculated off-line and tasks can be guaranteed to have sufficient resources to complete (resource adequacy). Dynamic real-time systems, on the other hand, do not have such prior knowledge, and therefore must react to events when they occur. They also must adapt to changes in the urgencies of various tasks, and fairly allocate resources among the tasks. A disadvantage of static real-time systems is that the requirement on resource adequacy makes them expensive and often impractical. Dynamic real-time systems, on the other hand, have the disadvantage of being less predictable and therefore difficult to test. Hence, in dynamic systems, timeliness is hard to guarantee and reliability is often low. Using a constrained execution environment, we attempt to increase the testability of such systems. An initial step is to identify factors that affect testability. We present empirical results on how various factors in the execution environment impact the testability of real-time systems. The results show that some of the factors previously identified as possibly impacting testability do not have an impact, while others do.

  • 46.
    Lindström, Birgitta
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    George Mason University, USA.
    González-Hernández, Loreto
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Identifying Useful Mutants to Test Time Properties2018In: 2018 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), IEEE Computer Society, 2018, p. 69-76Conference paper (Refereed)
    Abstract [en]

    Real-time systems have to be verified and tested for timely behavior as well as functional behavior. Thus, time is an extra dimension that adds to the complexity of software testing. A timed automata model with a model-checker can be used to generate timed test traces. To properly test the timely behavior, the set of test traces should challenge the different time constraints in the model. This paper describes and adapts mutation operators that target such time constraints in timed automata models. Time mutation operators apply a delta to the time constraints to help testers design tests that exceed the time constraints. We suggest that the size of this delta determines how easy the mutant is to kill and that the optimal delta varies by the program, mutation operator, and the individual mutant. To avoid trivial and equivalent time mutants, the delta should be set individually for each mutant. We discuss mutant subsumption and define the problem of finding dominator mutants in this new domain. In this position paper, we outline an iterative tuning process where a statistical model-checker, UPPAAL SMC, is used to: (i) create a tuned set of dominator time mutants, and (ii) generate test traces that kill the mutants.
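
    The core of the time-mutation idea described above can be sketched in a few lines (hypothetical names and structure, not the paper's tooling or UPPAAL SMC itself): a time constraint such as a guard "x <= bound" is mutated by a delta, and a timed test trace kills the mutant if the original and mutated models disagree on it. Too small a |delta| leaves no distinguishing trace in practice, which is why the paper argues the delta must be tuned per mutant:

    ```python
    # Hypothetical sketch of a time mutation operator on a guard "clock <= bound".
    from dataclasses import dataclass

    @dataclass
    class TimeGuard:
        clock: str
        bound: float  # guard: clock <= bound

        def allows(self, value: float) -> bool:
            return value <= self.bound

    def mutate(guard: TimeGuard, delta: float) -> TimeGuard:
        """Tighten (negative delta) or loosen (positive delta) the constraint."""
        return TimeGuard(guard.clock, guard.bound + delta)

    def kills(original: TimeGuard, mutant: TimeGuard, trace_value: float) -> bool:
        """A timed trace kills the mutant if the two models disagree on it."""
        return original.allows(trace_value) != mutant.allows(trace_value)

    g = TimeGuard("x", 10.0)
    m = mutate(g, -2.0)  # mutant guard: x <= 8
    # A trace reaching the guard at x = 9 distinguishes original from mutant;
    # a trace at x = 5 does not.
    ```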

  • 47.
    Lindström, Birgitta
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Offutt, Jeff
    George Mason University, Fairfax VA, USA.
    Sundmark, Daniel
    Swedish Institute of Computer Science, Kista, Sweden.
    Andler, Sten F.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Pettersson, Paul
    Mälardalen University, Västerås, Sweden.
    Using mutation to design tests for aspect-oriented models2017In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 81, p. 112-130Article in journal (Refereed)
    Abstract [en]

    Context: Testing for properties such as robustness or security is complicated because their concerns are often repeated in many locations and muddled with the normal code. Such “cross-cutting concerns” include things like interrupt events, exception handling, and security protocols. Aspect-oriented (AO) modeling allows developers to model the cross-cutting behavior independently of the normal behavior, thus supporting model-based testing of cross-cutting concerns. However, mutation operators defined for AO programs (source code) are usually not applicable to AO models (AOMs), and operators defined for models do not target the AO features. Objective: We present a method to design abstract tests at the aspect-oriented model level. We define mutation operators for aspect-oriented models and evaluate the generated mutants for an example system. Method: AOMs are mutated with novel operators that specifically target the AO modeling features. Test traces killing these mutant models are then generated. The generated and selected traces are abstract tests that can be transformed to concrete black-box tests and run at the implementation level, to evaluate the behavior of the woven cross-cutting concerns (combined aspect and base models). Results: This paper is a significant extension of our paper at Mutation 2015. We present a complete fault model, additional mutation operators, and a thorough analysis of the mutants generated for an example system. Conclusions: The analysis shows that some mutants are stillborn (syntactically illegal) but none is equivalent (exhibiting the same behavior as the original model). Additionally, our AOM-specific mutation operators can be combined with pre-existing operators to mutate code or models without any overlap.

  • 48.
    Mathiason, Gunnar
    et al.
    University of Skövde, School of Humanities and Informatics.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics.
    Jagszent, Daniel
    Institute for Program Structures and Data Organization, University of Karlsruhe.
    Virtual full replication by static segmentation for multiple properties of data objects2005In: RTiS 2005: proceedings of Real time in Sweden 2005, the 8th Biennal SNART Conference on Real-Time Systems / [ed] Sten F. Andler, Anton Cervin, Skövde: Skövde University , 2005, p. 11-18Conference paper (Refereed)
    Abstract [en]

    We implement virtual full replication for a distributed real-time database by segmenting the database on multiple data properties. Virtual full replication provides the application with an image of full replication in a partially replicated database, by replicating data to meet the actual data needs of the users of the data. This is useful since fully replicated real-time databases that allow updates at all nodes do not scale well: updates must be replicated to every other node for replica consistency, including nodes where only a small share of the database will ever be used. We propose an algorithm that segments the database on multiple data properties without causing a combinatorial problem. We show, by analysis and an implementation, that scalability for such a system can be improved due to scalable resource usage, while the application semantics of full replication remain unchanged.

  • 49.
    Mathiason, Gunnar
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Kang, Woochul
    University of Virginia, USA.
    Exploring a Multi-Tiered Whiteboard Infrastructure for Information Fusion in Wireless Sensor Networks2008In: Proceedings of the second Skövde Workshop on Information Fusion Topics (SWIFT 2008) / [ed] H. Boström, R. Johansson, Joeri van Laere, Skövde: University of Skövde , 2008, p. 63-66Conference paper (Refereed)
    Abstract [en]

    It is important for the lifetime of a wireless sensor network (WSN) to reduce the amount of data transferred through the network. As a typical approach, sensor data is filtered before updates are propagated to a node at the edge of the network, where the data can be fused. Information fusion inside the network can reduce the amount of data propagated, by fusing data before and during propagation, without losing the information value in it. We explore infrastructures for distributed fusion, with fusion nodes located at strategic nodes inside the network, as an approach to structured distributed fusion for WSNs. We propose an infrastructure for a whiteboard approach that uses a distributed real-time database with virtual full replication. With such an approach, both raw and fused data are logically available at all nodes and physically available where used, such that only used data will be propagated and consume resources. The actual resource usage will be relative to the actual demand for data, rather than to the amount of data published on the whiteboard. We present an exploration of such an infrastructure, and point out key future research questions for such a whiteboard approach.

  • 50.
    Mathiason, Gunnar
    et al.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Andler, Sten F.
    University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
    Son, S H
    University of Virginia.
    Virtual Full Replication by Adaptive Segmentation2007In: 13th IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2007), IEEE Computer Society, 2007, p. 327-337Conference paper (Refereed)
    Abstract [en]

    We propose virtual full replication by adaptive segmentation (ViFuR-A), and evaluate its ability to maintain scalability in a replicated real-time database. With full replication and eventual consistency, transaction timeliness becomes independent of network delays for all transactions. However, full replication does not scale well, since all updates must be replicated to all nodes, even when data is needed only at a subset of the nodes. With virtual full replication that adapts to actual data needs, resource usage can be bounded and the database can be made scalable. We propose a scheme for adaptive segmentation that detects new data needs and adapts replication accordingly. The scheme includes an architecture, a scalable protocol, and a replicated directory service that together maintain scalability. We show that adaptive segmentation bounds the required storage at a significantly lower level than static segmentation, for a typical workload where the data needs change repeatedly. Adaptation time can be kept constant for this workload when there are sufficient resources. Also, the storage is constant with an increasing number of nodes and linear with an increasing rate of change to data needs.
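
    A minimal sketch of the adaptive idea, under assumptions and with hypothetical names (this is not the ViFuR-A protocol): a directory records which nodes actually need each data object, the replica set for an object is exactly those nodes, and a newly detected access extends the replica set instead of replicating every update everywhere:

    ```python
    # Hypothetical sketch: replicate updates only to nodes with a registered
    # data need, and adapt the replica set when a new need is detected.
    from collections import defaultdict

    class SegmentedDirectory:
        def __init__(self):
            self.needs = defaultdict(set)  # object -> nodes that use it

        def register_need(self, node, obj):
            """Adaptive step: a newly detected access extends the replica set."""
            self.needs[obj].add(node)

        def replica_set(self, obj):
            """Only nodes with an actual need receive updates for obj."""
            return self.needs[obj]

    d = SegmentedDirectory()
    d.register_need("node1", "sensor_a")
    d.register_need("node2", "sensor_a")
    d.register_need("node1", "sensor_b")
    # Updates to sensor_b now go to one node instead of all nodes.
    ```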
