(clarityxchaos) (2025)
Image description: An antelope standing in the middle of a vast, golden savanna rendered in a pointillist style. The antelope has spiraled horns and a brown coat that blends into the warm hues of the yellow and orange grass. The background features a clear blue sky with scattered white clouds, distant mountains, and numerous trees on the horizon.
Full Generation Parameters:
Pointillist style painting rendered in vivid colors with a focus on distinct dots to define form and texture. A lone antelope stands against a backdrop of subtly textured savanna grasses, its horns curving upward in an unexpected spiral pattern.
Steps: 6, Sampler: Euler, Seed: 454764497, VAE: FLUX1\ae.safetensors, Model: schnellmodeAesthetic_q4GGUF.gguf, Model hash: a87b2fa090, Lora_0 Model hash: 59f6a131cb, Lora_0 Model name: Pointillism_Style_FLUX.safetensors, Lora_0 Strength clip: 1.0, Lora_0 Strength model: 0.58
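For anyone wondering what the two Lora_0 strength values do: they scale how strongly the LoRA delta is merged into the diffusion model and the text encoder, respectively. A minimal sketch of the usual low-rank scaling, assuming a standard LoRA update; the function and variable names here are illustrative, not the actual Flux/ComfyUI code:

```python
import torch

def apply_lora(weight: torch.Tensor,
               lora_down: torch.Tensor,  # A: (rank, in_features)
               lora_up: torch.Tensor,    # B: (out_features, rank)
               alpha: float,
               strength: float) -> torch.Tensor:
    """Merge a LoRA delta into a base weight matrix.

    The delta is the low-rank product scaled by alpha/rank, then by the
    user-facing strength (e.g. 0.58 for the diffusion model, 1.0 for CLIP).
    """
    rank = lora_down.shape[0]
    delta = (lora_up @ lora_down) * (alpha / rank)
    return weight + strength * delta

# Illustrative usage with the strengths from the parameters above:
# dit_weight  = apply_lora(dit_weight,  A, B, alpha=16, strength=0.58)
# clip_weight = apply_lora(clip_weight, A, B, alpha=16, strength=1.0)
```

Running the model strength below 1.0 (0.58 here) while keeping the clip strength at 1.0 tones down the style's effect on the image layers while leaving the prompt conditioning at full strength.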
j4k3@lemmy.world 3 weeks ago
That dot pattern works here, but it usually means some generation layers are saturated or mismatched in odd ways. I get it all the time if I run LoRAs too high or a weight in the prompt is too strong. Could also be that I always use a custom sampler with a beta scheduler and have modified the code that scales and tiles the QKV bias, both in PyTorch and in the Python libraries that call it. Dunno, but I see this pattern a lot when playing with new LoRAs. I quit Flux, though. It is too slow, and the lack of negative prompts is a massive regression IMO. Maybe if it were configured like llama.cpp, with execution split between CPU and GPU, it would be better, but GPU-only puts my laptop into fire-hot mode even with a 16GB GPU.
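For the CPU/GPU split part, diffusers does offer something in that direction for Flux. A minimal sketch, assuming the diffusers FluxPipeline and the plain schnell checkpoint rather than the quantized GGUF setup used for this image; the prompt is just an example:

```python
import torch
from diffusers import FluxPipeline

# Load the schnell variant in bf16; the GGUF/quantized path used in the post
# would go through a different loader.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
)

# Keep only the submodule currently executing on the GPU; everything else
# stays in system RAM. Not a per-layer split like llama.cpp, but it is the
# closest offload mode the library exposes out of the box.
pipe.enable_model_cpu_offload()

image = pipe(
    "pointillist painting of a lone antelope on a golden savanna",
    num_inference_steps=6,
    guidance_scale=0.0,  # schnell is distilled, so no real CFG / negative prompt
).images[0]
image.save("antelope.png")
```

This only moves whole components (text encoders, transformer, VAE) between devices, so it saves VRAM rather than splitting compute, and the lack of negative prompts on schnell comes from the distillation, not the offload mode.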