SegMoE is a framework for dynamically combining Stable Diffusion models into a Mixture of Experts within minutes, without any training. It lets you build larger models on the fly that offer broader knowledge, better prompt adherence, and better image quality. It is inspired by mergekit’s mixtral branch, but for Stable Diffusion models.
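At its core this is the standard sparse mixture-of-experts pattern: a gating network scores the experts and only the top-k are evaluated for a given input. A minimal sketch of that gating idea (the names, dimensions, and plain matrix "experts" are illustrative stand-ins, not SegMoE's actual internals):

```python
import numpy as np

def topk_moe(x, experts, gate_w, k=2):
    """Route input x through the top-k experts chosen by a softmax gate.

    experts: list of weight matrices standing in for the expert
    feed-forward blocks (in SegMoE these come from different
    Stable Diffusion checkpoints).
    """
    logits = x @ gate_w                    # one gating score per expert
    top = np.argsort(logits)[-k:]          # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts
    # Output is the gate-weighted sum of only the chosen experts' outputs;
    # the remaining experts are never evaluated (sparse routing).
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d = 8
experts = [rng.standard_normal((d, d)) for _ in range(4)]
gate_w = rng.standard_normal((d, 4))
x = rng.standard_normal(d)
y = topk_moe(x, experts, gate_w, k=2)
```

Because only k of the experts run per input, a merged model can carry the combined knowledge of several checkpoints while paying roughly the compute cost of a single one.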
segmind/SegMoE: Segmind Mixture of Diffusion Experts
Submitted 7 months ago by Even_Adder@lemmy.dbzer0.com to stable_diffusion@lemmy.dbzer0.com
https://github.com/segmind/segmoe