Jointly Learning Knowledge Graph Embeddings, Fine Grain Entity Types and Language Models

Author/Creator

Author/Creator ORCID

Date

2020-01-20

Department

Computer Science and Electrical Engineering

Program

Computer Science

Citation of Original Publication

Rights

Distribution Rights granted to UMBC by the author.
This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu

Abstract

The study aims to combine knowledge graph embedding models with fine-grained entity type prediction models to learn a better representation of entities, relationships, and coarse- to fine-grained entity types. These entity embeddings can be used to predict a variety of downstream information, including new facts that should be in the knowledge graph, dubious entries that may currently be in the knowledge graph erroneously, and types for an entity, from coarse-grained ones like "person" to fine-grained, hierarchical ones like "professional athlete" (rather than just "athlete"). The study shows that the performance of learning knowledge graph embeddings and fine-grained entity types jointly is comparable to learning them independently. This could be useful for corpora and applications where the available information is ambiguous, missing, or incomplete. Learned embeddings from this combined model could also help improve the performance of natural language processing tasks such as language modeling. This work illustrates that learning real-valued representations of entities and relationships together with a language model improves factual prediction and the understanding of sequential patterns.
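As a rough illustration of the kind of joint objective the abstract describes, one could pair a translational fact-scoring model (here a TransE-style distance, one common choice; the thesis may use a different scoring function) with a multi-label type-prediction head that shares the same entity embeddings. All names, dimensions, and the loss weighting below are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes (illustrative, not from the thesis):
# 5 entities, 2 relations, 3 entity types, embedding dimension 4.
n_entities, n_relations, n_types, dim = 5, 2, 3, 4

# Shared entity embeddings used by BOTH objectives, so the fact-scoring
# signal and the type signal shape the same representation.
E = rng.normal(scale=0.1, size=(n_entities, dim))      # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))     # relation embeddings
W_type = rng.normal(scale=0.1, size=(dim, n_types))    # type-prediction head

def transe_score(h, r, t):
    """TransE-style plausibility: smaller ||h + r - t|| = more plausible fact."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def type_probs(e):
    """Sigmoid over a linear head: multi-label coarse/fine type prediction."""
    return 1.0 / (1.0 + np.exp(-(E[e] @ W_type)))

def joint_loss(h, r, t, t_neg, type_labels, margin=1.0):
    """Joint objective for one example: margin ranking loss on a true triple
    vs. a corrupted one, plus binary cross-entropy on the head entity's types."""
    fact_loss = max(0.0, margin + transe_score(h, r, t) - transe_score(h, r, t_neg))
    p = type_probs(h)
    type_loss = -np.mean(type_labels * np.log(p + 1e-9)
                         + (1 - type_labels) * np.log(1 - p + 1e-9))
    return fact_loss + type_loss
```

In a full training loop the two terms would be minimized together by gradient descent, so that embeddings useful for predicting missing facts are also constrained to encode the entity's type hierarchy.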