ostris/Z-Image-De-Turbo · A De-distilled Z-Image-Turbo

4 likes

Submitted 16 hours ago by Even_Adder@lemmy.dbzer0.com to stable_diffusion@lemmy.dbzer0.com

https://huggingface.co/ostris/Z-Image-De-Turbo


Comments

  • scrubbles@poptalk.scrubbles.tech 15 hours ago

    Stupid question, what is a distilled model?

    • Even_Adder@lemmy.dbzer0.com 15 hours ago

      It’s basically when you use a larger model to train a smaller one. You train the student model on a dataset of outputs generated by the teacher model along with ground-truth data, and by some strange alchemy I don’t quite understand you end up with a much smaller model that behaves like the teacher. (A rough sketch of this setup is at the end of the thread.)

      • scrubbles@poptalk.scrubbles.tech 12 hours ago

        Thanks for explaining!

    • Zarxrax@lemmy.world 15 hours ago

      A distilled model is a lighter-weight version of a full model that can run in fewer steps at slightly reduced quality.

      Z-Image-Turbo is a distilled model, and the full version of the model will be released soon.

      This post is referring to someone attempting to somehow undo the distillation to make an approximation of the full model, I guess, which is basically pointless because, as I said, the actual full model will release soon.

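For the curious, the teacher–student training described in the thread can be sketched in a few lines of PyTorch. The models, data, and loss weighting below are toy stand-ins rather than anything from the actual Z-Image-Turbo recipe; real diffusion-model distillation matches denoising predictions across timesteps, but the basic idea of pulling a smaller student toward both the frozen teacher's outputs and the ground truth is the same.

```python
# Toy sketch of teacher-student distillation (hypothetical models and data;
# not the actual Z-Image training code).
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 64))  # stand-in "large" model
student = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 64))    # stand-in "small" model

teacher.eval()                      # the teacher stays frozen during distillation
for p in teacher.parameters():
    p.requires_grad_(False)

opt = torch.optim.AdamW(student.parameters(), lr=1e-4)
mse = nn.MSELoss()
alpha = 0.5                         # balance between teacher signal and ground truth

for step in range(100):
    x = torch.randn(16, 64)         # input batch (e.g. noised latents)
    y = torch.randn(16, 64)         # ground-truth targets

    with torch.no_grad():
        y_teacher = teacher(x)      # "soft" targets produced by the teacher

    y_student = student(x)
    # The student is pulled toward both the teacher's outputs and the real data.
    loss = alpha * mse(y_student, y_teacher) + (1 - alpha) * mse(y_student, y)

    opt.zero_grad()
    loss.backward()
    opt.step()
```

Increasing alpha weights the teacher's soft targets more heavily; setting it to zero reduces the loop to ordinary supervised training.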