Note
This is very very rough. More a rambling than a cohesive notebook of thoughts yet.
I think when people compare AI to the Python versus C versus assembly lineage, they miss something important: those mappings are provably correct¹. When you map from natural language to code, something is missing: how do you know you’re doing it correctly?
You can write tests, but those tests are themselves specified in natural language and then translated to code, so you’re back at the same problem. Of course, the AI is supposed to be the accurate mapping that tells you whether that translation is actually correct.
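To make that circularity concrete, here’s a toy example of my own (not from any real project): the English spec is “remove duplicates from a list while keeping the original order”, and the test is just another translation of that sentence into code, with the same room for mistranslation.

```python
# Hypothetical spec, in English: "remove duplicates from a list,
# keeping the first occurrence of each item in its original position."

def dedupe_keep_order(items):
    """Candidate implementation (imagine an AI produced this from the spec)."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

def test_dedupe_keep_order():
    # This test is itself a translation of the English spec above. If that
    # translation is wrong (say, it forgets that order matters), the test
    # can pass while the program quietly misses the point.
    assert dedupe_keep_order([3, 1, 3, 2, 1]) == [3, 1, 2]

test_dedupe_keep_order()
```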
If we move up an abstraction layer, what will the new primitives to learn be?
Coding will never be useless
Even if modern languages mean you don’t have to think about memory management all the time, you can’t escape the fact that memory exists. If you ever want to do lower-level work, you will have to deal with memory as an objective thing.
If you never learn about memory, you’re stuck in the higher abstraction playground that someone built for you.
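As a tiny illustration, even a language that manages memory for you can’t hide the fact that memory is there and measurable. A minimal sketch in Python (the exact byte counts are my assumption and depend on the interpreter build):

```python
import sys

# Even with garbage collection, memory never stops being real.
# CPython boxes every int in an object, so a "number" costs a few dozen
# bytes rather than one machine word (exact sizes depend on the build).
print(sys.getsizeof(1))          # roughly 28 bytes on 64-bit CPython
print(sys.getsizeof(10**100))    # grows with the magnitude of the value

# A million small ints in a list: the list alone holds ~8 MB of pointers,
# before counting the boxed int objects those pointers refer to.
xs = list(range(1_000_000))
print(sys.getsizeof(xs))
```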
Time horizons matter for abstractions
Perhaps one of the most important graphs of 2025 was METR’s plot of the time horizons that AI models can handle. One of the induction-type problems with AI is that although models work at the scale of a couple of hours, they fall apart at the scale of days. The exact timescales may have shifted by the time you read this, but this is very unlike previous paradigm shifts, where once you translated to a higher layer you could expect that translation to keep working no matter how far you scaled. One of the oddities is that LLMs don’t seem to be fully invariant to time (do they get lazier during holidays???) or space the way other pure-math things are. If your abstraction falls apart at long time horizons, can you just break one task up into lots of tiny ones that each fit within the valid horizon? A sketch of what that might look like is below.
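I don’t know, but the shape of the idea is easy to sketch. Everything below is hypothetical: plan_subtasks and run_agent stand in for whatever planner and agent loop you actually have, and HORIZON is whatever duration the model is genuinely reliable for.

```python
from datetime import timedelta

# Hypothetical horizon within which the model stays reliable (made-up value).
HORIZON = timedelta(hours=2)

def plan_subtasks(task, horizon):
    """Hypothetical planner: split a days-long task into pieces that are
    each supposed to fit inside the reliable time horizon."""
    raise NotImplementedError

def run_agent(subtask, budget):
    """Hypothetical agent call with a wall-clock budget."""
    raise NotImplementedError

def run_long_task(task):
    # Run the horizon-sized pieces one after another. The open question
    # above is whether the glue between pieces re-introduces exactly the
    # long-horizon failure the decomposition was meant to avoid.
    results = []
    for subtask in plan_subtasks(task, HORIZON):
        results.append(run_agent(subtask, budget=HORIZON))
    return results
```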
What is worth learning?
This is slightly orthogonal to using AI to learn, since the question here is which targets are even worth aiming at now that AI is very good at translating natural language into code. Will code become like arithmetic, where it still pays to learn it even though we have calculators?
¹ In the limit, assuming no bugs and whatnot. You get what I mean though.