Classification of Microarrays with kNN: Comparison of Dimensionality Reduction Methods
Stockholm University.
University of Skövde, School of Humanities and Informatics. University of Skövde, The Informatics Research Centre.
2007 (English). In: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2007) / [ed] H. Yin et al., Springer Berlin/Heidelberg, 2007, pp. 800-809. Conference paper, published paper (refereed).
Abstract [en]

Dimensionality reduction can often improve the performance of the k-nearest neighbor classifier (kNN) on high-dimensional data sets such as microarrays. The effect of the choice of dimensionality reduction method on the predictive performance of kNN for classifying microarray data is an open issue, and four common dimensionality reduction methods, Principal Component Analysis (PCA), Random Projection (RP), Partial Least Squares (PLS), and Information Gain (IG), are compared on eight microarray data sets. It is observed that all dimensionality reduction methods result in more accurate classifiers than those obtained from the raw attributes. Furthermore, both PCA and PLS reach their best accuracies with fewer components than the other two methods, and RP needs far more components than the others to outperform kNN on the non-reduced data set. None of the dimensionality reduction methods can be concluded to generally outperform the others, although PLS is superior on all four binary classification tasks. The main conclusion of the study is that the choice of dimensionality reduction method can be of major importance when classifying microarrays with kNN.
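The pipeline the abstract describes, reducing the attribute space before applying kNN, can be illustrated with a minimal sketch. This is not the paper's code: the data below is synthetic, the component count and k are arbitrary placeholder choices, and only the PCA variant of the four compared methods is shown.

```python
import numpy as np

def pca_fit_transform(X, n_components):
    """Project X onto its top principal components via centered SVD."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are principal axes, ordered by explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def knn_predict(X_train, y_train, X_test, k=3):
    """Classic kNN: majority vote among the k nearest training points."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        values, counts = np.unique(nearest, return_counts=True)
        preds.append(values[np.argmax(counts)])
    return np.array(preds)

# Tiny synthetic "microarray-like" setting: few samples, many attributes.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 20)                          # two classes, 20 samples each
X = rng.normal(size=(40, 100)) + y[:, None] * 2.0  # class-shifted features

Z = pca_fit_transform(X, n_components=5)  # reduce 100 attributes to 5 components
preds = knn_predict(Z[:30], y[:30], Z[30:], k=3)
```

As in the study, the interesting comparison is between classifying on `Z` (reduced) versus the raw `X`, and how accuracy varies with the number of retained components.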

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2007, pp. 800-809.
Series
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743, 1611-3349 ; 4881
Research subject
Technology
Identifiers
URN: urn:nbn:se:his:diva-3681
ISI: 000252394900080
Scopus ID: 2-s2.0-38449102440
ISBN: 978-3-540-77225-5
OAI: oai:DiVA.org:his-3681
DiVA: diva2:292961
Conference
8th International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2007), Birmingham, 16 December 2007 through 19 December 2007
Available from: 2010-02-10 Created: 2010-02-10 Last updated: 2013-03-18

Open Access in DiVA

No full text

Scopus

Search in DiVA

By author/editor
Boström, Henrik
By organisation
School of Humanities and Informatics; The Informatics Research Centre
