But an LLM as a node in a framework that can call a python library isn't how these systems are configured. They're just not that sophisticated.
So much of what Sam Altman is doing is brute force, which is why he thinks he needs a $1T investment in new power to build his next-generation model.
DeepSeek gets at the edges of this through their partitioned model. But you're still asking a lot for a machine to intuit whether a query can be solved with some existing python routine the system has yet to identify.
It doesn't scale to AGI, but it does reduce hallucinations.
It has to scale to AGI, because a central premise of AGI is a system that can improve itself.
It just doesn't match the OpenAI development model, which is to just scrape and sort data hoping the Internet already has the solution to every problem.
outhouseperilous@lemmy.dbzer0.com · 3 days ago
You'd still be better off starting with a '50s language processor, then grafting on some API calls.
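The "'50s language processor" idea can be sketched as an ELIZA-style pattern matcher with API calls grafted on, as the comment suggests. This is a minimal illustration; the rules and the weather/time endpoints are made up for the example, not any real service.

```python
import re

# Hypothetical rule table: a 1950s/60s-style pattern matcher (think ELIZA)
# whose matched phrases are routed to hardcoded "API calls".
RULES = [
    (re.compile(r"weather in (\w+)", re.I), lambda m: f"GET /weather?city={m.group(1)}"),
    (re.compile(r"time in (\w+)", re.I), lambda m: f"GET /time?city={m.group(1)}"),
]

def route(utterance: str):
    """Return the API call string for the first matching rule, else None."""
    for pattern, make_call in RULES:
        m = pattern.search(utterance)
        if m:
            return make_call(m)
    return None
```

Anything outside the rule table falls through to `None`, which is exactly the brittleness the next comment pushes back on.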
jsomae@lemmy.ml · 2 days ago
In what context? LLMs are extremely good at bridging from natural language to API calls. I dare say it's one of the few use cases that have decisively landed on "yes, this is something LLMs are actually good at." Maybe not five nines of reliability, but language itself doesn't have five nines of reliability.
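The natural-language-to-API bridge this comment describes usually works via structured "tool calling": the model is shown a tool schema and emits a JSON call, which the framework validates and dispatches. A minimal sketch, assuming an OpenAI-style function-calling setup; the `get_weather` tool and the hardcoded model reply are illustrative stand-ins for an actual LLM response.

```python
import json

# Hypothetical tool schema we would hand to the LLM along with the user's message.
TOOLS = {
    "get_weather": {
        "description": "Look up current weather for a city",
        "parameters": {"city": "string"},
    },
}

def dispatch(tool_call_json: str) -> str:
    """Validate a model-produced tool call against the schema, then 'execute' it."""
    call = json.loads(tool_call_json)
    name, args = call["name"], call["arguments"]
    if name not in TOOLS:
        raise ValueError(f"unknown tool: {name}")
    missing = set(TOOLS[name]["parameters"]) - set(args)
    if missing:
        raise ValueError(f"missing arguments: {missing}")
    return f"calling {name} with {args}"

# Pretend the LLM returned this for "what's the weather in Oslo?"
model_reply = '{"name": "get_weather", "arguments": {"city": "Oslo"}}'
```

The point being made upthread holds here too: the LLM only has to produce well-formed JSON against a known schema, a much narrower task than open-ended reasoning.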