Comment on Zuckerberg bets on personalized AI models for all • The Register
MalReynolds@slrpnk.net 4 months ago
Agreed, and the chance of it backfiring on them is indeed pleasingly high. If the compute moat for initial training gets lower (e.g. trinary/binary models), or distributed training (Hivemind, etc.) takes off, or both, or something new emerges, all bets are off.
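(For context: "trinary/binary models" refers to extreme weight quantization, along the lines of BitNet b1.58, where weights are constrained to {-1, 0, +1} so matrix multiplies collapse into adds/subtracts and memory drops sharply. A minimal sketch of the idea, not the paper's actual code, with the absmean-style scaling assumed here:)

```python
import numpy as np

def ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Quantize a weight tensor to {-1, 0, +1} with one per-tensor scale.
    A rough sketch of absmean ternary quantization, not a faithful BitNet implementation."""
    scale = np.mean(np.abs(w)) + eps               # absmean scale factor
    w_q = np.clip(np.round(w / scale), -1, 1)      # snap each weight to -1, 0, or +1
    return w_q.astype(np.int8), scale              # int8 storage: ~4x smaller than fp32

# toy usage
w = np.random.randn(4, 4).astype(np.float32)
w_q, s = ternary_quantize(w)
w_approx = w_q.astype(np.float32) * s              # dequantized approximation used at inference
print(w_q)
print("mean abs error:", np.abs(w - w_approx).mean())
```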
istanbullu@lemmy.ml 4 months ago
The compute moat for initial training will never get lower. But as the foundation models get better, from-scratch training will be needed less often.