[2410.07836] Masked Generative Priors Improve World Models Sequence Modelling Capabilities

View a PDF of the paper titled Masked Generative Priors Improve World Models Sequence Modelling Capabilities, by Cristian Meo, Mircea Lica, Zarif Ikram, Akihiro Nakano, Vedant Shah, Aniket Rajiv Didolkar, Dianbo Liu, Anirudh Goyal, and Justin Dauwels
Abstract: Deep Reinforcement Learning (RL) has become the dominant paradigm for creating artificial agents in complex environments. Model-based approaches, which are RL methods equipped with world models that predict environment dynamics, are among the most promising directions for improving data efficiency, a critical step toward bridging the gap between research and real-world deployment. In particular, world models enhance sample efficiency by learning in imagination, which involves training a generative sequence model of the environment in a self-supervised manner. Recently, Masked Generative Modelling has emerged as a more efficient and superior inductive bias for modelling and generating token sequences. Building on the Stochastic Transformer-based wOrld Model (STORM) architecture, we replace the traditional MLP prior with a Masked Generative Prior (e.g., MaskGIT prior) and introduce GIT-STORM. GIT-STORM demonstrates substantial performance gains in RL tasks on the Atari 100k benchmark. Moreover, we apply Transformer-based world models to continuous action environments, and we validate the approach through qualitative and quantitative analyses on the DeepMind Control Suite, showcasing the effectiveness of Transformer-based world models in this new domain.
Submission history
From: Cristian Meo [view email]
[v1] Thu, 10 Oct 2024 11:52:07 UTC (7,764 KB)
[v2] Sun, 13 Oct 2024 19:38:53 UTC (9,308 KB)
[v3] Mon, 28 Oct 2024 14:46:43 UTC (9,308 KB)
[v4] Mon, 2 Dec 2024 12:44:48 UTC (11,868 KB)
[v5] Wed, 30 Apr 2025 17:22:52 UTC (11,849 KB)