Language Models Meet World Models

Dec 15, 2023

About

Large language models (LMs) have achieved remarkable success on many language tasks. Recent work has also shown that knowledge of the world can emerge in large LMs [15], enabling them to assist decision-making for embodied tasks [1, 7, 10, 14, 8]. However, the world knowledge exhibited by current large LMs is often not robust and cannot be grounded in physical environments without additional models. This hinders LMs' ability to perform complex reasoning and planning tasks reliably. For example, when creating action plans to move blocks to a target state, GPT-3 achieves a success rate of only 1%, compared to 78% for humans [16].

Humans, by contrast, perform deliberate reasoning and planning based on a mental model of the world (i.e., a world model, WM) that lets us simulate actions and their effects on the world's state [2, 5, 3]. WMs encoding knowledge of the physical world can drastically improve the data efficiency and robustness of intelligent agents. However, WMs have typically been studied in reinforcement learning and robotics, fields conceptually distinct from language modeling. This gap points to enormous new opportunities for connecting WMs and LMs, both to enhance LMs' reasoning and planning capabilities in embodied and general settings and to address the limitations above. Emerging studies at the intersection of WMs and LMs have already demonstrated promising results [9, 6, 18, 19, 12].

This tutorial summarizes and presents a unified view of connecting WMs and LMs, and highlights the opportunities for improved machine reasoning and planning based on (or even beyond) large LMs through world modeling. We will review recent work on learning WMs [5, 13] and on using them to learn and perform embodied tasks [17, 4, 11]. We will show how LMs can utilize external WMs to compensate for their lack of grounded world knowledge [19, 12], and how LMs themselves can learn WMs from embodied experiences beyond text data [10, 18] and use these internal WMs to guide complex reasoning [6].
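To make the core idea concrete, here is a minimal sketch of planning with a world model: candidate action sequences are simulated inside the WM (rather than executed in the real environment), and a sequence whose predicted end state reaches the goal is returned. The environment, the function names (`world_model`, `plan_with_world_model`), and the toy 1-D task are all hypothetical illustrations, not APIs or methods from the works cited above.

```python
# Hypothetical sketch: lookahead planning inside a learned/known world model.
# The WM here is a toy deterministic transition function on a 1-D position.
from itertools import product

def world_model(state, action):
    """Toy WM: predicts the next state given the current state and an action."""
    return state + {"left": -1, "stay": 0, "right": +1}[action]

def plan_with_world_model(start, goal, horizon=3):
    """Exhaustively simulate action sequences in the WM and return the first
    sequence whose simulated final state reaches the goal, or None."""
    for seq in product(["left", "stay", "right"], repeat=horizon):
        state = start
        for action in seq:
            state = world_model(state, action)  # simulate, don't act
        if state == goal:
            return list(seq)
    return None  # no plan found within the horizon

plan = plan_with_world_model(start=0, goal=2, horizon=3)
```

The key point of the sketch is that all trial-and-error happens in simulation: the agent queries the WM instead of the physical environment, which is what makes WM-based planning data-efficient. In the LM setting, `world_model` would be replaced by an external simulator or by the LM's own learned state-prediction ability.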

Presented at NeurIPS 2023.