Overview
This interface presents DisamKB v1.5, a large general-domain knowledge base (KB) constructed entirely from a large language model (LLM). It demonstrates the feasibility of large-scale KB construction from LLMs, while highlighting specific challenges around entity recognition, entity and property canonicalization, and taxonomy construction.
Based on GPT-4.1, DisamKB v1.5 contains 100 million triples for more than 6.1 million entities, built at a cost 10x lower than that of previous KBC projects. We also provide DisamKB v1.1 for download.
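A KB of this kind is a collection of (subject, predicate, object) triples grouped by entity. As a minimal sketch of how such a dump might be consumed — assuming a hypothetical tab-separated format with one triple per line, which is an illustration and not the actual DisamKB distribution format:

```python
def parse_triple(line: str) -> tuple[str, str, str]:
    """Split one tab-separated dump line into a (subject, predicate, object) triple."""
    subject, predicate, obj = line.rstrip("\n").split("\t")
    return subject, predicate, obj


def group_by_subject(lines):
    """Group triples by subject entity: subject -> list of (predicate, object)."""
    kb: dict[str, list[tuple[str, str]]] = {}
    for line in lines:
        s, p, o = parse_triple(line)
        kb.setdefault(s, []).append((p, o))
    return kb


if __name__ == "__main__":
    # Illustrative sample lines, not real DisamKB data.
    sample = [
        "Marie Curie\tinstanceOf\tperson",
        "Marie Curie\tfieldOfWork\tphysics",
    ]
    kb = group_by_subject(sample)
    print(kb["Marie Curie"])  # [('instanceOf', 'person'), ('fieldOfWork', 'physics')]
```

Grouping by subject makes the per-entity view explicit, which is the natural unit for the canonicalization and taxonomy challenges mentioned above.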
DisamKB is a landmark for two fields:
- For NLP, it provides, for the first time, constructive insights into the knowledge (or beliefs) of LLMs.
- For the Semantic Web, it shows novel ways forward for the long-standing challenge of general-domain KB construction.