Comment on Gaze of the Golden Plains

j4k3@lemmy.world 3 weeks ago

That dot pattern works here, but it usually means some generation layers are saturated or mismatched in odd ways. I get it all the time if I run LoRAs too high or a weight in the prompt is too strong. It could be because I always use a custom sampler with a beta scheduler and have modified the PyTorch code that scales and tiles the QKV bias, plus the Python libraries that call it. Dunno, but I see this pattern a lot when playing with new LoRAs.

I quit Flux though. It is too slow, and the lack of negative prompts is a massive regression IMO. Maybe if it were configured like llama.cpp, with execution split between CPU and GPU, it would be better, but GPU-only runs my laptop fire-hot even with a 16GB GPU.
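For anyone wondering what "running LoRAs too high" does mechanically: a minimal sketch, assuming the standard low-rank update where the effective weight is W' = W + strength · (B @ A). All shapes and names here are made up for illustration, not from any particular model, but the point is that cranking the strength inflates the update relative to the frozen base weight, which is one way activations end up saturated.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 64, 4                      # hypothetical model dim and LoRA rank
W = rng.normal(0, 0.02, (d, d))   # frozen base weight
A = rng.normal(0, 0.02, (r, d))   # LoRA down-projection
B = rng.normal(0, 0.02, (d, r))   # LoRA up-projection

def merged(strength):
    # Effective weight with the LoRA delta scaled by its strength.
    return W + strength * (B @ A)

x = rng.normal(size=d)
for s in (0.7, 1.0, 3.0):
    # A squashing nonlinearity makes the saturation visible: as the
    # delta grows, more outputs pile up near the +/-1 rails.
    y = np.tanh(merged(s) @ x)
    print(f"strength={s}: mean |activation| = {np.abs(y).mean():.3f}")
```

At strength 0 you get the base model back exactly; the farther above the trained strength you push it, the larger the delta's share of the merged weight, which matches the "too high" failure mode described above.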
