Högskolan i Skövde

his.se Publications
Inconsistency: Friend or Foe
The School of Business and Informatics, University of Borås, Sweden.
The School of Business and Informatics, University of Borås, Sweden.
University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre. (Skövde Cognition and Artificial Intelligence Lab (SCAI))
2007 (English) In: The 2007 International Joint Conference on Neural Networks: IJCNN 2007 Conference Proceedings, IEEE, 2007, p. 1383-1388. Conference paper, Published paper (Refereed)
Abstract [en]

One way of obtaining accurate yet comprehensible models is to extract rules from opaque predictive models. When evaluating rule extraction algorithms, one frequently used criterion is consistency; i.e., the algorithm must produce similar rules every time it is applied to the same problem. Rule extraction algorithms based on evolutionary algorithms are, however, inherently inconsistent, which is regarded as their main drawback. In this paper, we argue that consistency is an overvalued criterion, and that inconsistency can even be beneficial in some situations. The study contains two experiments, both using publicly available data sets, where rules are extracted from neural network ensembles. The first experiment shows that it is normally possible to extract several different rule sets from an opaque model, all with high and similar accuracy. The implication is that consistency is, from that perspective, useless: why should one specific rule set be considered superior? Clearly, obtaining several accurate and comprehensible descriptions of the relationship should instead be regarded as an advantage. In the second experiment, rule extraction is used for probability estimation; more specifically, an ensemble of extracted trees is used to obtain probability estimates. Here, it is exactly the inconsistency of the rule extraction algorithm that makes the suggested approach possible.
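The probability-estimation idea in the abstract can be sketched as follows. This is a hypothetical illustration only: it uses scikit-learn decision trees fit to an opaque model's predictions as stand-in "extracted rule sets", and a random forest as a stand-in for the neural network ensemble; it does not reproduce the authors' evolutionary rule extraction algorithm. The point it demonstrates is the one argued in the paper: different runs yield different but similarly accurate trees, and averaging their votes turns that inconsistency into probability estimates.

```python
# Hypothetical sketch (not the paper's method): surrogate trees are fit to
# the opaque model's predictions, with bootstrap resampling playing the
# role of the rule extractor's inherent inconsistency.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Opaque model: a random forest standing in for a neural network ensemble.
opaque = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
y_opaque = opaque.predict(X_tr)  # surrogates learn the model, not the raw labels

# "Inconsistent" extraction: each run (seed + bootstrap sample) produces a
# different, comprehensible tree approximating the same opaque model.
rng = np.random.default_rng(0)
trees = []
for seed in range(10):
    idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)
    tree = DecisionTreeClassifier(max_depth=4, random_state=seed)
    tree.fit(X_tr[idx], y_opaque[idx])
    trees.append(tree)

# Each tree alone is an accurate description of the relationship...
accs = [t.score(X_te, y_te) for t in trees]
# ...and the ensemble's vote fraction per instance is a probability estimate,
# which only works because the extracted trees differ from one another.
proba = np.mean([t.predict(X_te) for t in trees], axis=0)
```

A single consistent extractor would return the same tree every time, so the vote fraction would collapse to 0 or 1; the spread of values in `proba` exists only because the extracted trees disagree on borderline instances.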

Place, publisher, year, edition, pages
IEEE, 2007. p. 1383-1388
Series
Proceedings of the International Joint Conference on Neural Networks, ISSN 2161-4393, E-ISSN 2161-4407
National Category
Computer Sciences; Information Systems
Research subject
Technology
Identifiers
URN: urn:nbn:se:his:diva-2104
DOI: 10.1109/IJCNN.2007.4371160
ISI: 000254291101059
Scopus ID: 2-s2.0-51749099818
ISBN: 978-1-4244-1380-5 (electronic)
ISBN: 1-4244-1380-X (print)
ISBN: 978-1-4244-1379-9 (print)
OAI: oai:DiVA.org:his-2104
DiVA, id: diva2:32380
Conference
The 2007 International Joint Conference on Neural Networks, IJCNN 2007, August 12-17, 2007, Renaissance Orlando Resort, Florida, USA
Available from: 2008-05-30 Created: 2008-05-30 Last updated: 2021-04-22. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Niklasson, Lars
