Comment on Advice - Getting started with LLMs

Zworf@beehaw.org ⁨6⁩ ⁨months⁩ ago

It depends on your prompt/context size too. The larger the context, the more memory you need. Check your GPU's memory usage with GPU-Z across different models and scenarios.
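To see why context size matters, you can roughly estimate the KV cache, which grows linearly with context length. A minimal sketch (the model dimensions below are assumptions for a hypothetical 7B-class model, not measured figures):

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, context_len, bytes_per_elem=2):
    """Rough size of a transformer's KV cache for one sequence.

    Keys and values are both cached, hence the factor of 2.
    bytes_per_elem=2 assumes fp16/bf16 weights for the cache.
    """
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

# Hypothetical example: 32 layers, 32 KV heads, head dim 128,
# fp16 cache, 4096-token context:
gib = kv_cache_bytes(32, 32, 128, 4096) / 1024**3
print(f"{gib:.2f} GiB")  # → 2.00 GiB
```

This sits on top of the model weights themselves, which is why a long prompt can push a model that "fits" at short context past your VRAM limit.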
