Autocomplete
It’s called lexical analysis, or lexical tokenization. It existed long before LLMs, it doesn’t rely on stolen code, and it doesn’t consume a small village’s worth of electricity. Superficial parallels with chatbots do not make it AI – it’s a fucking algorithm.
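To show how little magic is involved, here's a minimal sketch of that algorithm: tokenize the buffer with a regex lexer, then prefix-match against the identifiers you found. The identifier pattern and the prefix-matching strategy here are illustrative choices, not any particular editor's actual implementation.

```python
import re

def tokenize(source: str) -> set[str]:
    # Lexical tokenization: pull identifier-shaped tokens out of the text.
    return set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*", source))

def complete(prefix: str, identifiers: set[str]) -> list[str]:
    # Plain prefix matching over the token set: no model, no inference.
    return sorted(t for t in identifiers if t.startswith(prefix) and t != prefix)

source = """
def parse_header(raw):
    header_length = len(raw)
    return header_length
"""

tokens = tokenize(source)
print(complete("head", tokens))  # -> ['header_length']
```

Real editors add a trie or fuzzy scoring on top, but the core is exactly this: a deterministic pass over the text you already wrote.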