Comment on AGI achieved
jsomae@lemmy.ml 5 days ago
Transformers were pretty novel in 2017; I don't know if they were really around before that.
Anyway, I'm doubtful that a larger corpus is what's needed at this point. (Though that said, there's a lot more text remaining in instant messenger chat logs like Discord that probably has yet to be integrated into LLMs. Not sure.) I'm also doubtful that scaling up is going to keep working, but it wouldn't surprise me that much if it does keep working for a long while. My guess is that there are some small tweaks to be discovered that really improve things a lot, but that still basically look like repetitive training, as you put it.