Jameel, Shoaib, Fu, Zihao, Shi, Bei, Lam, Wai, Schockaert, Steven (2019) Word Embedding as Maximum A Posteriori Estimation. In: Proceedings of the AAAI Conference on Artificial Intelligence. 33 (1). pp. 6562-6569. Association for the Advancement of Artificial Intelligence (doi:10.1609/aaai.v33i01.33016562) (KAR id:70009)
PDF (Author's Accepted Manuscript, updated version, 264 kB) | Language: English
Official URL: https://doi.org/10.1609/aaai.v33i01.33016562
Abstract
The GloVe word embedding model relies on solving a global optimization problem, which can be reformulated as a maximum likelihood estimation problem. In this paper, we propose to generalize this approach to word embedding by considering parametrized variants of the GloVe model and incorporating priors on these parameters. To demonstrate the usefulness of this approach, we consider a word embedding model in which each context word is associated with a corresponding variance, intuitively encoding how informative it is. Using our framework, we can then learn these variances together with the resulting word vectors in a unified way. We experimentally show that the resulting word embedding models outperform GloVe, as well as many popular alternatives.
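The reformulation described in the abstract can be illustrated with a short sketch. The first line is the standard GloVe objective (well known and not specific to this paper); the second is an illustrative maximum a posteriori objective under a Gaussian likelihood in which each context word j has its own variance \(\sigma_j^2\) with a prior \(p(\sigma_j)\). The exact parametrization, the treatment of GloVe's weighting function \(f\), and the choice of prior used in the paper are not reproduced here; they are assumptions of this sketch.

```latex
% GloVe objective: word vector w_i, context vector \tilde{w}_j, biases b_i, \tilde{b}_j,
% co-occurrence count x_{ij}, weighting function f.
\[
J_{\mathrm{GloVe}} \;=\; \sum_{i,j} f(x_{ij})\,
  \bigl( w_i^{\top}\tilde{w}_j + b_i + \tilde{b}_j - \log x_{ij} \bigr)^2
\]

% Illustrative MAP objective (sketch): Gaussian likelihood on \log x_{ij} with a
% per-context-word variance \sigma_j^2, plus a prior p(\sigma_j); GloVe's weighting
% is omitted here for clarity and the paper's exact formulation may differ.
\[
J_{\mathrm{MAP}} \;=\; \sum_{i,j}
  \left[
    \frac{\bigl( w_i^{\top}\tilde{w}_j + b_i + \tilde{b}_j - \log x_{ij} \bigr)^2}{2\sigma_j^{2}}
    + \log \sigma_j
  \right]
  \;-\; \sum_{j} \log p(\sigma_j)
\]
```

Minimizing the second objective jointly over the word vectors, biases, and variances corresponds to MAP estimation: a large learned \(\sigma_j\) downweights the reconstruction error for context word j, intuitively marking it as less informative.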
| Item Type: | Conference or workshop item (Proceeding) |
|---|---|
| DOI/Identification number: | 10.1609/aaai.v33i01.33016562 |
| Subjects: | Q Science > QA Mathematics (inc Computing science) > QA 76 Software, computer programming |
| Divisions: | Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing |
| Depositing User: | Shoaib Jameel |
| Date Deposited: | 09 Nov 2018 10:56 UTC |
| Last Modified: | 05 Nov 2024 12:32 UTC |
| Resource URI: | https://kar.kent.ac.uk/id/eprint/70009 (The current URI for this page, for reference purposes) |