Category: Poincaré Embeddings for Learning Hierarchical Representations

Maximilian Nickel, Douwe Kiela, Poincaré Embeddings for Learning Hierarchical Representations, arXiv:1705.08039 [cs.AI]


Abstract

Representation learning has become an invaluable approach for learning from symbolic data such as text and graphs. However, while complex symbolic datasets often exhibit a latent hierarchical structure, state-of-the-art methods typically learn embeddings in Euclidean vector spaces, which do not account for this property. For this purpose, we introduce a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space -- or more precisely into an n-dimensional Poincaré ball. Due to the underlying hyperbolic geometry, this allows us to learn parsimonious representations of symbolic data by simultaneously capturing hierarchy and similarity. We introduce an efficient algorithm to learn the embeddings based on Riemannian optimization and show experimentally that Poincaré embeddings outperform Euclidean embeddings significantly on data with latent hierarchies, both in terms of representation capacity and in terms of generalization ability.

Subjects: Artificial Intelligence (cs.AI); Machine Learning (cs.LG); Machine Learning (stat.ML)

Summary and Comments

Just as words can be represented by vectors in Euclidean space [1] [2], this paper [3] considers representing words and other symbolic objects by vectors in a Riemannian space, namely the Poincaré ball of hyperbolic geometry. The authors find that for the many systems that contain latent hierarchical relations, this non-Euclidean vector representation is more appropriate, since it captures hierarchy and similarity at the same time; a minimal sketch of the core computation is given below.
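To make the two core ingredients concrete, here is a minimal numpy sketch (not the authors' released code) of the Poincaré distance and of a single Riemannian SGD step in which the Euclidean gradient is rescaled by the inverse of the Poincaré metric; the function names, learning rate, and the epsilon projection back into the ball are illustrative choices, not taken verbatim from the paper.

  # Minimal sketch of Poincaré-ball embedding computations (assumed interface).
  import numpy as np

  EPS = 1e-5  # numerical guard to keep points strictly inside the unit ball

  def poincare_distance(u, v):
      """d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
      sq_u = np.sum(u * u)
      sq_v = np.sum(v * v)
      sq_diff = np.sum((u - v) ** 2)
      x = 1.0 + 2.0 * sq_diff / ((1.0 - sq_u) * (1.0 - sq_v))
      return np.arccosh(x)

  def riemannian_sgd_step(theta, euclidean_grad, lr=0.01):
      """Rescale the Euclidean gradient by (1 - ||theta||^2)^2 / 4 (the inverse
      Poincaré metric), take a gradient step, then project back into the ball."""
      scale = (1.0 - np.sum(theta * theta)) ** 2 / 4.0
      theta = theta - lr * scale * euclidean_grad
      norm = np.linalg.norm(theta)
      if norm >= 1.0:
          theta = theta / norm * (1.0 - EPS)  # pull the point just inside the ball
      return theta

Because the metric factor (1 - ||theta||^2)^2 / 4 shrinks toward zero near the boundary of the ball, points deep in the hierarchy (near the boundary) move slowly while points near the origin move freely, which is what lets a low-dimensional Poincaré ball encode tree-like structure.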

References

  1. T. Mikolov, K. Chen, G. Corrado, J. Dean, Efficient Estimation of Word Representations in Vector Space, arXiv:1301.3781 [cs.CL], 2013.
  2. T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, J. Dean, Distributed Representations of Words and Phrases and their Compositionality, Advances in Neural Information Processing Systems 26, 2013, pp. 3111-3119.
  3. M. Nickel, D. Kiela, Poincaré Embeddings for Learning Hierarchical Representations, arXiv:1705.08039 [cs.AI], 2017.
