Comment on Advice - Getting started with LLMs
Zworf@beehaw.org 6 months ago
It depends on your prompt/context size too. The larger the context, the more memory you need. Try checking your GPU's memory usage with GPU-Z across different models and scenarios.
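For a rough sense of why context size matters, here's a back-of-the-envelope sketch of KV-cache memory growth, assuming a Llama-2-7B-style model (32 layers, 4096 hidden size) running in fp16. The numbers are illustrative only and come on top of the memory used by the model weights themselves.

```python
# Rough estimate of KV-cache VRAM as context length grows.
# Assumed defaults roughly match a 7B Llama-style model in fp16; adjust for your model.

def kv_cache_bytes(context_len, n_layers=32, hidden_size=4096, bytes_per_elem=2):
    # 2x for keys and values, one cached entry per layer per token
    return 2 * n_layers * hidden_size * bytes_per_elem * context_len

for ctx in (512, 2048, 4096, 8192):
    gib = kv_cache_bytes(ctx) / 1024**3
    print(f"context {ctx:>5}: ~{gib:.2f} GiB of KV cache on top of the weights")
```

So going from a 2k to an 8k context on a model like this can add a few extra GiB of VRAM by itself, which is why watching GPU-Z while you change models and prompt sizes is a good habit.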