I like the analogy. I have a lot of trouble explaining to people that LLMs are anything more than just a “most likely next token” predictor. Because that is exactly what an LLM is, but the description is so abstract that it abstracts away everything that's actually interesting about them lol
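
For the curious, here's roughly what "most likely next token predictor" cashes out to mechanically: a minimal greedy-decoding sketch (using gpt2 purely as an example model; any causal LM works the same way, and real systems usually sample rather than argmax):

```python
# Sketch: the model maps a token sequence to a distribution over the
# next token; "prediction" is just picking from that distribution.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # example model choice
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The capital of France is", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(10):
        logits = model(ids).logits          # shape: (1, seq_len, vocab_size)
        next_id = logits[0, -1].argmax()    # literally "most likely next token"
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
```

Everything interesting (the world knowledge, the in-context reasoning) lives inside how those logits get computed, which this loop treats as a black box.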