ostris/Z-Image-De-Turbo · A De-distilled Z-Image-Turbo
Submitted 5 weeks ago by Even_Adder@lemmy.dbzer0.com to stable_diffusion@lemmy.dbzer0.com
https://huggingface.co/ostris/Z-Image-De-Turbo
scrubbles@poptalk.scrubbles.tech 5 weeks ago
Stupid question, what is a distilled model?
Zarxrax@lemmy.world 5 weeks ago
A distilled model is a more lightweight version of a full model which can run in fewer steps at slightly reduced quality.
Z-image-turbo is a distilled model, and the full version of the model will be released soon.
This post is about someone attempting to undo the distillation to approximate the full model, I guess. Which is basically pointless, because as I said, the actual full model will be released soon.
scrubbles@poptalk.scrubbles.tech 5 weeks ago
Thanks!
Even_Adder@lemmy.dbzer0.com 5 weeks ago
It’s basically when you use a larger model to train a smaller one. You train the student model on a mix of outputs generated by the teacher model and ground-truth data, and by some strange alchemy I don’t quite understand, you end up with a much smaller model that behaves like the teacher model.
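The "alchemy" is usually just a combined loss: the student is penalized both for disagreeing with the teacher's output distribution (soft targets) and for missing the ground-truth label (hard targets). A minimal NumPy sketch of that classic knowledge-distillation loss — all names and numbers here are illustrative, not taken from the Z-Image repo:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution so the teacher's
    # "dark knowledge" about near-miss classes carries more signal.
    z = logits / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label,
                      alpha=0.5, temperature=2.0):
    # Soft-target term: cross-entropy between the teacher's and
    # the student's temperature-softened output distributions.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft_loss = -np.sum(p_teacher * np.log(p_student + 1e-12))

    # Hard-target term: ordinary cross-entropy against ground truth.
    p = softmax(student_logits)
    hard_loss = -np.log(p[true_label] + 1e-12)

    # alpha blends the two objectives.
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A student whose logits already match the teacher's gets a lower loss than one that disagrees, which is the whole training signal. (Diffusion models like Z-Image-Turbo use step-count distillation rather than this classification setup, but the teacher/student idea is the same.)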
scrubbles@poptalk.scrubbles.tech 5 weeks ago
Thanks for explaining!