Thanks in advance.

ffhein

For LLMs it entirely depends on what size models you want to use and how fast you want them to run. Since there are diminishing returns to increasing model size, i.e. a 14B model isn’t twice as good as a 7B model, the best bang for the buck comes from the smallest model you think has acceptable quality. And if you consider generation speeds of around 1 token/second acceptable, you’ll probably get more value for money using partial offloading.

If your answer is “I don’t know what models I want to run”, then a second-hand RTX 3090 is probably your best bet. If you want to run larger models, building a rig with multiple (used) RTX 3090s is probably still the cheapest way to do it.
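For anyone wondering what partial offloading looks like in practice, here’s a rough sketch using llama-cpp-python (my own example, not something from this thread; the model file and layer count are placeholders you’d tune to your hardware):

```python
# Sketch of partial offloading with llama-cpp-python (illustrative setup;
# model path and layer count are placeholders).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # any GGUF file you have locally
    n_gpu_layers=20,  # offload only some layers to the GPU; the rest run from system RAM
    n_ctx=4096,       # context window
)

out = llm("Explain partial offloading in one sentence.", max_tokens=128)
print(out["choices"][0]["text"])
```

The more layers you can fit in VRAM, the faster it goes; with few or no layers offloaded you end up around that ~1 token/second range on a typical desktop CPU.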

Max

I feel like this really depends on what hardware you have access to. What are you interested in doing? How long are you willing to wait for it to generate, and how good do you want it to be?

You can pull off around 0.5 words per second with one of the Mistral models on the CPU with 32GB of RAM. The Stable Diffusion image models work okay with about 8-16GB of VRAM.
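For reference, running a Stable Diffusion checkpoint in that 8-16GB VRAM range with the diffusers library might look roughly like this (my own sketch; the model id and settings are just one common setup):

```python
# Rough sketch of running Stable Diffusion on a mid-range GPU.
# Assumes diffusers + a CUDA card; checkpoint and options are illustrative.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; swap in whatever you use
    torch_dtype=torch.float16,         # half precision roughly halves VRAM use
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()        # trades a little speed for lower peak VRAM

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```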

@istanbullu@lemmy.ml

Automatic1111 for Stable Diffusion and Ollama for LLMs
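Once Ollama is running it exposes an HTTP API on localhost, so other self-hosted services can talk to it. A minimal call might look like this (my own sketch; the model name is just an example, use whatever you’ve pulled):

```python
# Minimal sketch of querying a local Ollama server.
# Assumes `ollama serve` is running and the model has already been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # e.g. after `ollama pull llama3`
        "prompt": "Summarize what self-hosting means in one sentence.",
        "stream": False,    # return a single JSON object instead of a stream
    },
    timeout=120,
)
print(resp.json()["response"])
```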

