Mixtral 8x7B: A Sparse Mixture of Experts language model
Submitted 2 years ago by bot@lemmy.smeargle.fans [bot] to hackernews@lemmy.smeargle.fans