Mixtral 8x7B: A Sparse Mixture of Experts language model

0 likes

Submitted 2 years ago by bot@lemmy.smeargle.fans [bot] to hackernews@lemmy.smeargle.fans

https://arxiv.org/abs/2401.04088
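For context: the paper describes a decoder-only transformer in which each feed-forward block is replaced by 8 experts, and a router picks 2 of them per token, so only a fraction of the parameters are active for any given token. Below is a minimal NumPy sketch of that top-2 routing idea as described in the abstract; the class name, layer sizes, and the ReLU activation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class SparseMoELayer:
    """Toy sparse mixture-of-experts feed-forward layer.

    For each token, a linear router scores all experts, the top-k
    (k=2, as in Mixtral) are selected, and their outputs are combined
    with softmax weights computed over just those k scores.
    """
    def __init__(self, d_model=32, d_ff=64, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        self.router = rng.normal(scale=0.02, size=(d_model, n_experts))
        # Each expert is a small two-layer MLP (hypothetical sizes).
        self.w1 = rng.normal(scale=0.02, size=(n_experts, d_model, d_ff))
        self.w2 = rng.normal(scale=0.02, size=(n_experts, d_ff, d_model))

    def __call__(self, x):
        # x: (n_tokens, d_model)
        logits = x @ self.router                             # (n_tokens, n_experts)
        top = np.argsort(-logits, axis=-1)[:, :self.top_k]   # top-k expert ids per token
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            weights = softmax(logits[t, top[t]])  # renormalise over the chosen experts only
            for w, e in zip(weights, top[t]):
                hidden = np.maximum(x[t] @ self.w1[e], 0.0)  # ReLU stand-in for the real activation
                out[t] += w * (hidden @ self.w2[e])
        return out

tokens = np.random.default_rng(1).normal(size=(4, 32))
print(SparseMoELayer()(tokens).shape)  # (4, 32): only 2 of 8 experts run per token
```

The point of the sparsity is that compute per token scales with the 2 selected experts rather than all 8, while total parameter count (and capacity) scales with all of them.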

HN Discussion

source

Comments
