TIB Leibniz Information Centre for Science and Technology
Jennifer D’Souza is a Postdoctoral Researcher in the R&D Department at the TIB Leibniz Information Centre for Science and Technology. Her research interests centre on developing supervised machine learning techniques for natural language processing to facilitate text mining and automated information extraction. Her current primary research theme is knowledge graph construction from scientific text using NLP methods. Aside from this, she is also interested in scientometrics.
Talks and Events
State-of-the-Art Transformer Language Models for Knowledge Graph Construction from Text
Knowledge graphs (KGs) play a crucial role in many modern applications. Industrial knowledge is scattered across large volumes of both structured and unstructured data sources, and bringing it together in a unified knowledge graph can add significant value. However, automatically constructing a KG from natural language text is challenging due to the ambiguity and imprecision of natural language. Recently, many approaches have been proposed to transform natural language text into triples for constructing KGs. Among these, approaches based on transformer language models lead in many subtasks of knowledge graph construction, such as entity and relation extraction. This presentation focuses on state-of-the-art transformer-based methods, techniques, and tools for constructing knowledge graphs from text, along with their capabilities, limitations, and current challenges. It summarizes research progress on KG construction from text with a specific focus on the information acquisition branch, namely entity and relation extraction, covering state-of-the-art transformer methods and tools. This will be useful for any practitioner interested in building knowledge graphs for their organization.
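The core text-to-triples step the abstract describes can be illustrated with a minimal sketch. The toy extractor below uses a small hand-written relation lexicon and regular expressions purely for illustration; a real pipeline of the kind discussed in the talk would replace this with transformer-based entity and relation extraction models.

```python
import re

# Hypothetical toy relation lexicon; a real system would learn
# relations from data rather than enumerate them by hand.
RELATIONS = ["is located in", "works for", "develops"]

def extract_triples(sentence):
    """Return a (subject, relation, object) triple from a simple
    declarative sentence, or None if no known relation matches.
    This rule-based stand-in mimics the output format of an
    entity/relation extraction model, not its method."""
    for rel in RELATIONS:
        match = re.match(rf"(.+?)\s+{re.escape(rel)}\s+(.+?)\.?$", sentence)
        if match:
            return (match.group(1), rel, match.group(2))
    return None

triple = extract_triples("TIB is located in Hannover.")
print(triple)  # ('TIB', 'is located in', 'Hannover')
```

Each extracted triple becomes one edge in the knowledge graph, with the subject and object as nodes; entity linking and disambiguation, which the abstract flags as challenging, would then merge surface forms like "TIB" with their canonical KG entities.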
- Natural Language Processing and Understanding
- Deep Learning for and with Knowledge Graphs
- Data Modeling