LLMs struggle to play games like Conway's Game of Life and falter on reasoning tasks that require long chains of steps. They also exhibit the Reversal Curse: having learned a fact stated in one direction in training data (e.g. "A is B"), they often cannot answer the reversed question ("what is B?"). However, with careful prompting and intermediate access to memory and computation, they can learn to predict cellular automata to some extent.
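To make the benchmark concrete, here is a minimal sketch of a single Game of Life update step, the kind of deterministic rule-following the post tests LLMs on. This is an illustrative implementation, not code from the article; it assumes the standard B3/S23 rules on a toroidal (wrap-around) grid.

```python
# Illustrative sketch (not from the article): one step of Conway's Game of Life
# under the standard B3/S23 rules, with toroidal (wrap-around) boundaries.
def life_step(grid):
    """Return the next generation of a 2D grid of 0/1 cells."""
    rows, cols = len(grid), len(grid[0])

    def neighbors(r, c):
        # Count the 8 surrounding live cells, wrapping at the edges.
        return sum(
            grid[(r + dr) % rows][(c + dc) % cols]
            for dr in (-1, 0, 1)
            for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
        )

    # A dead cell with exactly 3 neighbors is born; a live cell
    # with 2 or 3 neighbors survives; everything else dies.
    return [
        [
            1 if neighbors(r, c) == 3 or (grid[r][c] and neighbors(r, c) == 2) else 0
            for c in range(cols)
        ]
        for r in range(rows)
    ]

# A "blinker" oscillates between horizontal and vertical with period 2.
blinker = [[0] * 5 for _ in range(5)]
for c in (1, 2, 3):
    blinker[2][c] = 1
after = life_step(blinker)  # now vertical: cells (1,2), (2,2), (3,2)
```

Predicting `after` from `blinker` requires applying the same local rule at every cell, which is exactly the kind of many-step, state-tracking computation the post argues plain next-token prediction handles poorly.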

24m read time · From strangeloopcanon.com
Table of contents

- Failure mode - Why can’t GPT learn Wordle?
- Another failure mode: Why can’t GPT learn Cellular Automata?
- Sidenote: attempts to teach transformers Cellular Automata
- How have we solved this so far
- How much can LLMs really learn?
- LLMs cannot reset their own context
- More agents are all you need
- Conclusions