Högskolan i Skövde

Building Neural Network Ensembles using Genetic Programming
School of Business and Informatics, University of Borås, Sweden.
School of Business and Informatics, University of Borås, Sweden.
School of Business and Informatics, University of Borås, Sweden.
Högskolan i Skövde, Institutionen för kommunikation och information. Högskolan i Skövde, Forskningscentrum för Informationsteknologi. (Skövde Cognition and Artificial Intelligence Lab (SCAI))
2006 (English). In: The 2006 IEEE International Joint Conference on Neural Network Proceedings, IEEE, 2006, pp. 1260-1265. Conference paper, published paper (peer-reviewed)
Abstract [en]

In this paper we present and evaluate a novel algorithm for ensemble creation. The main idea of the algorithm is to first independently train a fixed number of neural networks (here ten) and then use genetic programming to combine these networks into an ensemble. The use of genetic programming makes it possible to not only consider ensembles of different sizes, but also to use ensembles as intermediate building blocks. The final result is therefore more correctly described as an ensemble of neural network ensembles. The experiments show that the proposed method, when evaluated on 22 publicly available data sets, obtains very high accuracy, clearly outperforming the other methods evaluated. In this study several micro techniques are used, and we believe that they all contribute to the increased performance. One such micro technique, aimed at reducing overtraining, is the training method, called tombola training, used during genetic evolution. When using tombola training, training data is regularly resampled into new parts, called training groups. Each ensemble is then evaluated on every training group and the actual fitness is determined solely from the result on the hardest part.
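The tombola-training idea described above can be sketched in a few lines: the training data is resampled into groups, a candidate ensemble is scored on every group, and its fitness is the result on the hardest group alone. The sketch below is illustrative only; the function names (`tombola_fitness`, `majority_vote`), the interleaved resampling scheme, and the use of plain accuracy are assumptions, not the paper's exact procedure. The `majority_vote` helper also hints at why GP can nest ensembles: an ensemble is itself just a predictor, so it can serve as an intermediate building block.

```python
import random

def tombola_fitness(ensemble_predict, X, y, n_groups=4, rng=None):
    """Tombola-training fitness (sketch): resample the training data into
    groups and score the ensemble on each; fitness is the accuracy on the
    *hardest* group, which discourages overtraining on any one subset."""
    rng = rng or random.Random(0)
    idx = list(range(len(X)))
    rng.shuffle(idx)                                  # resample into new groups
    groups = [idx[i::n_groups] for i in range(n_groups)]
    accs = []
    for g in groups:
        correct = sum(1 for i in g if ensemble_predict(X[i]) == y[i])
        accs.append(correct / len(g))
    return min(accs)                                  # result on hardest group

def majority_vote(members):
    """Combine member predictors into one predictor by majority vote.
    Because the result is again a predictor, such ensembles can be nested,
    giving an 'ensemble of ensembles' as in the paper."""
    def predict(x):
        votes = [m(x) for m in members]
        return max(set(votes), key=votes.count)
    return predict
```

In a GP setting, each evolved individual would be a tree whose leaves are the ten trained networks and whose internal nodes are combiners like `majority_vote`; `tombola_fitness` would then drive selection.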

Place, publisher, year, edition, pages
IEEE, 2006, pp. 1260-1265
Series
Proceedings of the International Joint Conference on Neural Networks, ISSN 2161-4393, E-ISSN 2161-4407
HSV category
Identifiers
URN: urn:nbn:se:his:diva-1806
DOI: 10.1109/IJCNN.2006.246836
ISI: 000245125902029
Scopus ID: 2-s2.0-38049049329
ISBN: 0-7803-9490-9 (print)
ISBN: 978-0-7803-9490-2 (print)
OAI: oai:DiVA.org:his-1806
DiVA, id: diva2:32082
Conference
International Joint Conference on Neural Networks 2006, IJCNN '06, Vancouver, BC, 16 July 2006 through 21 July 2006
Available from: 2007-10-10 Created: 2007-10-10 Last updated: 2021-04-22 Bibliographically checked

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text (DOI), Scopus

Person

Löfström, Tuve; König, Rikard; Niklasson, Lars
