Knowledge Graph Embedding Space

From GM-RKB

A Knowledge Graph Embedding Space is a graph embedding space in which the entities and relations of a knowledge graph are represented as continuous vectors.



References

2023

  • chat
    • Q: What is a neural knowledge graph embedding?
    • A: A neural knowledge graph embedding is a method that represents entities and relationships in a knowledge graph as numerical vectors (embeddings) in a high-dimensional space. These embeddings are learned using neural networks and capture the structure and meaning of the relationships in the graph. This representation enables various tasks such as link prediction and entity classification to be performed in a computationally efficient manner.
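The idea above can be sketched with a minimal TransE-style scorer, a common knowledge graph embedding approach in which a triple (head, relation, tail) is plausible when head + relation lands close to tail in vector space. The entity names and dimensions below are hypothetical toy choices, and the embeddings are randomly initialized rather than learned:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension; real systems use hundreds of dimensions

# Hypothetical toy vocabulary of entities and relations.
entities = ["aspirin", "headache", "ibuprofen"]
relations = ["treats"]

# Random vectors stand in for embeddings; in practice these are
# learned by gradient descent over observed triples.
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(head: str, relation: str, tail: str) -> float:
    """TransE-style score: negative L2 distance between
    (head + relation) and tail; higher means more plausible."""
    diff = ent_emb[head] + rel_emb[relation] - ent_emb[tail]
    return -float(np.linalg.norm(diff))

# Link prediction: rank candidate tails for ("aspirin", "treats", ?).
ranked = sorted(entities, key=lambda t: score("aspirin", "treats", t),
                reverse=True)
```

With trained embeddings, the top-ranked tail would be the model's link prediction; here the ranking is arbitrary because the vectors are untrained.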

2023

  • (Suri et al., 2023) ⇒ Kunal Suri, Atul Singh, Prakhar Mishra, Swapna Sourav Rout, and Rajesh Sabapathy. (2023). “Language Models Sounds the Death Knell of Knowledge Graphs.” arXiv preprint arXiv:2301.03980
    • QUOTE: ... Healthcare has been dealing with an explosion in information about different types of drugs, diseases, and procedures. This paper argues that using Knowledge Graphs is not the best solution for solving problems in this domain. We present experiments using LLMs for the healthcare domain to demonstrate that language models provide the same functionality as knowledge graphs, thereby making knowledge graphs redundant.

2017

  • (Wang et al., 2017) ⇒ Quan Wang, Zhendong Mao, Bin Wang, and Li Guo. (2017). “Knowledge Graph Embedding: A Survey of Approaches and Applications.” IEEE Transactions on Knowledge and Data Engineering 29, no. 12
    • ABSTRACT: Knowledge graph (KG) embedding is to embed components of a KG including entities and relations into continuous vector spaces, so as to simplify the manipulation while preserving the inherent structure of the KG. It can benefit a variety of downstream tasks such as KG completion and relation extraction, and hence has quickly gained massive attention. In this article, we provide a systematic review of existing techniques, including not only the state-of-the-arts but also those with latest trends. Particularly, we make the review based on the type of information used in the embedding task. Techniques that conduct embedding using only facts observed in the KG are first introduced. We describe the overall framework, specific model design, typical training procedures, as well as pros and cons of such techniques. After that, we discuss techniques that further incorporate additional information besides facts. We focus specifically on the use of entity types, relation paths, textual descriptions, and logical rules. Finally, we briefly introduce how KG embedding can be applied to and benefit a wide variety of downstream tasks such as KG completion, relation extraction, question answering, and so forth.