Comment on Question about upcoming trends and hardware requirements

altima_neo@lemmy.zip ⁨2⁩ ⁨months⁩ ago

Basically, avoid AMD if you’re serious about it. DirectML just can’t compete with CUDA; Stable Diffusion performance on Nvidia blows AMD away.

A 4090 is as fast as it gets for consumer hardware. I’ve got a 3090, and it has the same amount of VRAM as a 4090 (24 GB), but it’s nowhere near as fast. So a 3090/Ti would be a good budget option.

However, if you’re willing to wait, Nvidia is reportedly announcing the 5000 series in January. I’m not sure when it will actually release, though, and there’s the usual stock shortage with a new series launch. But the 5090 is rumored to have 32 GB of VRAM.
