Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs. In this work, we propose holographic embeddings (HolE) to learn compositional vector space representations of entire knowledge graphs. The proposed method is related to holographic models of associative memory in that it employs circular correlation to create compositional representations. By using correlation as the compositional operator, HolE can capture rich interactions but simultaneously remains efficient to compute, easy to train, and scalable to very large datasets. In extensive experiments we show that holographic embeddings are able to outperform state-of-the-art methods for link prediction in knowledge graphs and relational learning benchmark datasets.

In: Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16), Phoenix, Arizona, USA.

A Review of Relational Machine Learning for Knowledge Graphs. Maximilian Nickel, Kevin Murphy, Volker Tresp, Evgeniy Gabrilovich. Proceedings of the IEEE 104(1):11–33, January 2016. DOI: 10.1109/JPROC.2015.2483592. http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=7358050

Relational machine learning studies methods for the statistical analysis of relational, or graph-structured, data. In this paper, we provide a review of how such statistical models can be “trained” on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph). In particular, we discuss two fundamentally different kinds of statistical relational models, both of which can scale to massive data sets. The first is based on latent feature models such as tensor factorization and multiway neural networks. The second is based on mining observable patterns in the graph. We also show how to combine these latent and observable models to get improved modeling power at decreased computational cost. Finally, we discuss how such statistical models of graphs can be combined with text-based information extraction methods for automatically constructing knowledge graphs from the Web. To this end, we also discuss Google's knowledge vault project as an example of such combination.
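The abstract above names latent feature models such as tensor factorization as one of the two model classes it reviews. A well-known example of such a model is the bilinear RESCAL factorization, which scores a triple (s, p, o) as a_s^T R_p a_o. The sketch below is illustrative only; the embedding dimension, variable names, and random data are assumptions, not material from the paper.

    # Illustrative sketch of a bilinear (RESCAL-style) latent feature score for a
    # knowledge-graph triple (s, p, o): each entity gets a latent vector, each
    # relation a latent matrix, and the triple is scored as a_s^T R_p a_o.
    import numpy as np

    d = 50                                  # latent dimension (assumed for illustration)
    rng = np.random.default_rng(0)
    a_s = rng.normal(size=d)                # latent factors of the subject entity
    a_o = rng.normal(size=d)                # latent factors of the object entity
    R_p = rng.normal(size=(d, d))           # latent factors of the relation

    score = a_s @ R_p @ a_o                 # higher score = triple considered more plausible
    prob = 1.0 / (1.0 + np.exp(-score))     # optional squashing to a probability
    print(prob)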

Holographic Embeddings of Knowledge Graphs. Maximilian Nickel, Lorenzo Rosasco, Tomaso Poggio. 2015. Keywords: Associative Memory, Knowledge Graph, Machine Learning.

Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs. In this work, we propose holographic embeddings (HolE) to learn compositional vector space representations of entire knowledge graphs. The proposed method is related to holographic models of associative memory in that it employs circular correlation to create compositional representations. By using correlation as the compositional operator, HolE can capture rich interactions but simultaneously remains efficient to compute, easy to train, and scalable to very large datasets. In extensive experiments we show that holographic embeddings are able to outperform state-of-the-art methods for link prediction in knowledge graphs and relational learning benchmark datasets.

Published 11/16/2015. http://hdl.handle.net/1721.1/100203
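The abstract above identifies circular correlation as HolE's compositional operator: a pair of entity embeddings is composed with circular correlation and then matched against the relation embedding. The following is a minimal sketch of that operator and the resulting triple score; it is an illustration written for this listing, not the authors' code, and the dimension and variable names are assumptions.

    # Circular correlation, [a * b]_k = sum_i a_i * b_{(k + i) mod d}, computed in
    # O(d log d) via the FFT identity  a * b = ifft(conj(fft(a)) * fft(b)).
    import numpy as np

    def circular_correlation(a, b):
        return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

    d = 64                                              # embedding dimension (assumed)
    rng = np.random.default_rng(0)
    e_s, e_o = rng.normal(size=d), rng.normal(size=d)   # subject / object entity embeddings
    r_p = rng.normal(size=d)                            # relation embedding

    # Score a triple (s, p, o) by matching the relation embedding against the
    # composed pair representation: sigma(r_p . (e_s * e_o)).
    score = 1.0 / (1.0 + np.exp(-(r_p @ circular_correlation(e_s, e_o))))
    print(score)

Note that the composed representation e_s * e_o stays d-dimensional (unlike a full tensor product), which is what keeps the operator efficient while still mixing all pairwise components.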

A Review of Relational Machine Learning for Knowledge Graphs: From Multi-Relational Link Prediction to Automated Knowledge Graph Construction. Maximilian Nickel, Kevin Murphy, Volker Tresp, Evgeniy Gabrilovich. 2015. http://hdl.handle.net/1721.1/100193

Querying Factorized Probabilistic Triple Databases. Denis Krompaß, Maximilian Nickel, Volker Tresp. In: The Semantic Web – ISWC 2014, Lecture Notes in Computer Science, vol. 8797, Springer International Publishing, pp. 114–129. ISBN 978-3-319-11914-4. DOI: 10.1007/978-3-319-11915-1_8. http://dx.doi.org/10.1007/978-3-319-11915-1_8

Reducing the Rank in Relational Factorization Models by Including Observable Patterns. Maximilian Nickel, Xueyan Jiang, Volker Tresp.

Tensor factorization has become a popular method for learning from multi-relational data. In this context, the rank of the factorization is an important parameter that determines runtime as well as generalization ability. To identify conditions under which factorization is an efficient approach for learning from relational data, we derive upper and lower bounds on the rank required to recover adjacency tensors. Based on our findings, we propose a novel additive tensor factorization model to learn from latent and observable patterns on multi-relational data and present a scalable algorithm for computing the factorization. We show experimentally both that the proposed additive model does improve the predictive performance over pure latent variable methods and that it also reduces the required rank — and therefore runtime and memory complexity — significantly.

In: Advances in Neural Information Processing Systems 27, Curran Associates, Inc., pp. 1179–1187. http://papers.nips.cc/paper/5448-reducing-the-rank-in-relational-factorization-models-by-including-observable-patterns.pdf
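To make the additive idea in the abstract above concrete, the sketch below combines a latent bilinear factorization term with a weighted score over observable graph-pattern features. It is an illustrative reading of the abstract, not the paper's exact model; the feature function, names, and sizes are all assumptions.

    # Illustrative additive score for a triple (s, p, o): a low-rank latent term plus
    # a linear term over hand-crafted, observable graph features.
    import numpy as np

    d, m = 10, 3                                        # latent rank and feature count (assumed)
    rng = np.random.default_rng(0)
    a_s, a_o = rng.normal(size=d), rng.normal(size=d)   # latent entity factors
    R_p = rng.normal(size=(d, d))                       # latent relation factor
    w_p = rng.normal(size=m)                            # per-relation weights for observable features

    def observable_features(s, p, o):
        # Stub returning fixed illustrative values, e.g. "does the inverse edge exist?",
        # "is there a length-2 path from s to o?", "prior frequency of relation p".
        return np.array([1.0, 0.0, 0.3])

    score = a_s @ R_p @ a_o + w_p @ observable_features("s", "p", "o")
    print(score)

Because the observable term can absorb part of the signal, the latent term can work with a much smaller rank, which is where the runtime and memory savings mentioned in the abstract come from.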