
Relation Induction in Word Embeddings Revisited

Bouraoui, Zied and Jameel, Shoaib and Schockaert, Steven (2018) Relation Induction in Word Embeddings Revisited. In: Proceedings of the 27th International Conference on Computational Linguistics. Association for Computational Linguistics, pp. 1627-1637. ISBN 978-1-948087-50-6. (KAR id:67263)

PDF Author's Accepted Manuscript (Language: English)


Given a set of instances of some relation, the relation induction task is to predict which other word pairs are likely to be related in the same way. While it is natural to use word embeddings for this task, standard approaches based on vector translations turn out to perform poorly. To address this issue, we propose two probabilistic relation induction models. The first model is based on translations, but uses Gaussians to explicitly model the variability of these translations and to encode soft constraints on the source and target words that may be chosen. In the second model, we use Bayesian linear regression to encode the assumption that there is a linear relationship between the vector representations of related words, which is considerably weaker than the assumption underlying translation-based models.
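The first model described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hypothetical example, assuming a diagonal Gaussian fitted over the translation vectors (target minus source) of known related pairs, which is then used to score candidate pairs by the log-density of their translation vector. The function names, the toy data, and the diagonal-covariance simplification are all assumptions for illustration; the second model in the paper would replace this fixed-translation assumption with Bayesian linear regression from source to target vectors.

```python
import numpy as np

def fit_translation_gaussian(sources, targets):
    """Fit a diagonal Gaussian over the translation vectors t - s
    of known related (source, target) embedding pairs."""
    diffs = targets - sources
    mu = diffs.mean(axis=0)
    var = diffs.var(axis=0) + 1e-6  # smoothing guards against tiny samples
    return mu, var

def log_density(diff, mu, var):
    """Log-density of a diagonal Gaussian evaluated at a candidate
    translation vector `diff`."""
    return -0.5 * np.sum((diff - mu) ** 2 / var + np.log(2 * np.pi * var))

# Toy data: pairs related by an (approximately) shared translation.
rng = np.random.default_rng(0)
shift = np.array([1.0, -2.0, 0.5])
sources = rng.normal(size=(20, 3))
targets = sources + shift + rng.normal(scale=0.05, size=(20, 3))

mu, var = fit_translation_gaussian(sources, targets)

# A candidate pair consistent with the relation should score higher
# than an unrelated pair whose translation vector is arbitrary.
good = log_density((sources[0] + shift) - sources[0], mu, var)
bad = log_density(rng.normal(size=3), mu, var)
```

Modelling the variance of the translations, rather than using a single averaged offset vector, is what lets this formulation express how much a relation's translation varies across instances and downweight relations that are not translation-like.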

Item Type: Book section
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing
Depositing User: Shoaib Jameel
Date Deposited: 11 Jun 2018 12:37 UTC
Last Modified: 08 Dec 2022 22:55 UTC