LLMs and Knowledge Graphs: The Technological Siblings
LLMs like ChatGPT and knowledge graphs (KGs) like Wikidata are two of the most exciting AI technologies today.
On the surface, they seem quite distinct — LLMs generate remarkably human-like text while KGs structure facts as networks of entities and relationships.
However, LLMs and KGs have complementary strengths, making them two sides of the same coin when it comes to building robust AI systems.
Both LLMs and KGs aim to capture and represent comprehensive knowledge about the world. LLMs like GPT-3 acquire world knowledge by pre-training on massive text corpora covering diverse topics and genres.
This allows them to generate fluent text on almost any subject.
KGs directly store factual knowledge as interconnected networks of real-world entities, their attributes, and relationships between them.
Though their knowledge representation formats differ, LLMs and KGs have the same goal of modeling knowledge.
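To make the KG side of that comparison concrete, here is a minimal sketch (plain Python, no graph library) of how a knowledge graph stores facts as subject–predicate–object triples. The entities and relations below are illustrative examples, not actual Wikidata identifiers.

```python
from collections import defaultdict

# Each fact is a triple linking two entities (or an entity and a literal value).
triples = [
    ("Marie Curie", "occupation", "physicist"),
    ("Marie Curie", "award received", "Nobel Prize in Physics"),
    ("Nobel Prize in Physics", "conferred by", "Royal Swedish Academy of Sciences"),
]

# Index triples by subject so an entity's attributes and relationships
# can be looked up directly.
graph = defaultdict(list)
for subject, predicate, obj in triples:
    graph[subject].append((predicate, obj))

# Retrieving everything the graph knows about an entity is a direct lookup,
# which is what makes KG facts straightforward to inspect, verify, and update.
for predicate, obj in graph["Marie Curie"]:
    print(f"Marie Curie --{predicate}--> {obj}")
```

An LLM's knowledge, by contrast, is distributed across its parameters and can only be elicited by prompting, which is exactly why the structured, queryable form above pairs so naturally with it.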