A new project proposal for Pivotal suggests training transformers on synthetic toy languages. By building these languages around known computational primitives, such as the patterns induction heads learn to complete, researchers retain full control over the data-generating process. This control makes it possible to isolate how model components interact, offering a clearer path to understanding circuit compositionality than analyzing networks trained on natural data.
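As a minimal sketch of what a controlled data-generating process might look like, the snippet below builds sequences for the classic induction-head task: a bigram "A B" is planted early in the sequence, the sequence ends with "A", and the correct next token is "B". The function name, parameters, and task design here are illustrative assumptions, not part of the proposal itself.

```python
import random

def make_induction_sequence(vocab_size=50, length=20, seed=None):
    """Illustrative generator (hypothetical, not from the proposal):
    produce a token sequence where predicting the final token requires
    induction-head-style copying: ... A B ... A -> B."""
    rng = random.Random(seed)
    tokens = [rng.randrange(vocab_size) for _ in range(length)]
    a, b = rng.randrange(vocab_size), rng.randrange(vocab_size)
    # Plant the bigram "A B" somewhere in the first half of the sequence.
    i = rng.randrange(length // 2)
    tokens[i], tokens[i + 1] = a, b
    # End the sequence with "A"; the target completion is "B".
    tokens[-1] = a
    return tokens, b

seq, target = make_induction_sequence(seed=0)
```

Because the primitive needed to solve the task is fixed by construction, a model trained on such data can be probed for exactly that mechanism, rather than reverse-engineered from scratch.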