his.se Publications
An Infinite Replicated Softmax Model for Topic Modeling
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Skövde Artificial Intelligence Lab (SAIL))
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Skövde Artificial Intelligence Lab (SAIL)). ORCID iD: 0000-0003-2973-3112
Jönköping University, School of Engineering (JTH), Department of Computer Science and Informatics. ORCID iD: 0000-0003-2900-9335
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Skövde Artificial Intelligence Lab (SAIL)). ORCID iD: 0000-0003-2949-4123
2019 (English). In: Modeling Decisions for Artificial Intelligence: 16th International Conference, MDAI 2019, Milan, Italy, September 4–6, 2019, Proceedings / [ed] Vicenç Torra, Yasuo Narukawa, Gabriella Pasi, Marco Viviani, Springer, 2019, p. 307–318.
Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we describe the infinite replicated Softmax model (iRSM) as an adaptive topic model, utilizing the combination of the infinite restricted Boltzmann machine (iRBM) and the replicated Softmax model (RSM). In our approach, the iRBM extends the RBM by enabling its hidden layer to adapt to the data at hand, while the RSM allows for modeling a low-dimensional latent semantic representation of a corpus. The combination of the two results in a method that is able to self-adapt to the number of topics within the document corpus and hence renders manual identification of the correct number of topics superfluous. We propose a hybrid training approach to effectively improve the performance of the iRSM. An empirical evaluation is performed on a standard data set and the results are compared to the results of a baseline topic model. The results show that the iRSM adapts its hidden layer size to the data and, when trained in the proposed hybrid manner, outperforms the base RSM model.
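The RSM component of the abstract admits a compact sketch. Below is a minimal, illustrative CD-1 (contrastive divergence) training step for a replicated Softmax RBM in plain NumPy. This is not the authors' implementation: the variable names, learning rate, and dimensions are our own assumptions, and the iRBM's adaptive growth of the hidden layer is omitted entirely. The key RSM-specific details are that the visible layer holds word counts, the hidden bias is scaled by the document length D, and the negative phase resamples D words from a softmax over the vocabulary.

```python
import numpy as np

rng = np.random.default_rng(0)
K, H = 50, 10                      # vocabulary size, number of hidden topic units

# model parameters: word-topic weights, word biases, topic biases
W = 0.01 * rng.standard_normal((K, H))
b = np.zeros(K)
a = np.zeros(H)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v, lr=0.05):
    """One CD-1 update on a word-count vector v of length K."""
    global W, b, a
    D = v.sum()                              # document length scales the hidden bias
    # positive phase: hidden topic probabilities given the observed counts
    ph = sigmoid(v @ W + D * a)
    h = (rng.random(H) < ph).astype(float)
    # negative phase: softmax over the vocabulary, resample D words multinomially
    logits = W @ h + b
    p = np.exp(logits - logits.max())
    p /= p.sum()
    v_neg = rng.multinomial(int(D), p).astype(float)
    ph_neg = sigmoid(v_neg @ W + D * a)
    # gradient step on the CD-1 approximation of the log-likelihood gradient
    W += lr * (np.outer(v, ph) - np.outer(v_neg, ph_neg))
    b += lr * (v - v_neg)
    a += lr * D * (ph - ph_neg)
    return v_neg
```

After training, the hidden probabilities `sigmoid(v @ W + D * a)` serve as the low-dimensional topic representation of a document; the iRSM of the paper additionally lets H grow during training.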

Place, publisher, year, edition, pages
Springer, 2019. p. 307-318
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 11676
Keywords [en]
Restricted Boltzmann machine, Unsupervised learning, Topic modeling, Adaptive Neural Network
National Category
Computer Sciences; Language Technology (Computational Linguistics)
Research subject
Skövde Artificial Intelligence Lab (SAIL)
Identifiers
URN: urn:nbn:se:his:diva-17664
DOI: 10.1007/978-3-030-26773-5_27
ISBN: 978-3-030-26772-8 (print)
ISBN: 978-3-030-26773-5 (electronic)
OAI: oai:DiVA.org:his-17664
DiVA, id: diva2:1350115
Conference
16th International Conference, MDAI 2019, Milan, Italy, September 4–6, 2019
Available from: 2019-09-09 Created: 2019-09-10 Last updated: 2019-09-11
Bibliographically approved

Open Access in DiVA

The full text will be freely available from 2020-07-25 00:00

Other links

Publisher's full text

Authority records BETA

Huhnstock, Nikolas Alexander; Karlsson, Alexander; Riveiro, Maria; Steinhauer, H. Joe
