The AI field is celebrating benchmarks while the ceiling draws closer. Every few months a new model arrives, more often than not with more parameters than its predecessor. This is applauded in the name of scaling, yet scaling towards memorisation is antithetical to generalisation, ML's foundational goal. Increasing parameter count works, but it proves that the curr