I’m interested in hosting something like this, and I’d like to know experiences regarding this topic.

The main reason to host this is privacy, and also to integrate my own PKM data (markdown files, mainly).

Feel free to recommend videos, articles, other Lemmy communities, etc.

@Haggunenons@lemmy.world

Mixtral is an amazing one that isn’t super slow and doesn’t require incredible hardware for decent speed.

In general this guy has really good videos/tutorials for the latest tools.

Display Name

Not with much success, but I’ve been using Hugging Face for a couple of days now. You may want to have a look into it.

@AlphaAutist@lemmy.world

I haven’t tried any of them, but I did just listen to a podcast the other week where they talk about LlamaGPT vs Ollama and other related tools. If you’re interested, it’s episode 540: Uncensored AI on Linux by Linux Unplugged.

@TCB13@lemmy.world

“Uncensored” models are bullshit, everything but uncensored. Just ask one for a Windows XP Pro key and you’ll see how uncensored they really are.

@TCB13@lemmy.world

Yes, mostly https://gpt4all.io/, only to find out that even the “uncensored” models are bullshit and won’t even provide you with a Windows XP Pro key. That’s kind of my benchmark for models nowadays. :P

CashewNut [UK]

Will it tell you how to make meth?

amzd

ollama + codellama works perfectly. I use it from Neovim with a plugin called gen.nvim, I think.
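For anyone curious, the usual ollama CLI flow is just two commands; this is a sketch, where the model tag and prompt are placeholders and it assumes `ollama` is installed and on your PATH:

```shell
# Hedged sketch of the typical Ollama CLI flow. "codellama:7b" is an assumed
# model tag -- pick one that fits your hardware.
MODEL="codellama:7b"

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"                                   # download the weights
  ollama run "$MODEL" "Write a hello world in Python"    # one-off prompt
else
  echo "ollama is not installed; see the Ollama site for install instructions"
fi
```

Editor plugins like gen.nvim then shell out to (or talk HTTP to) the same local server.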

SuperiorOne

I’m actively using ollama with Docker to run the llama2:13b model. It generally works fine, but it’s heavy on resources, as expected.
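With that setup, anything on the host can talk to the model over Ollama’s HTTP API on port 11434. A minimal Python sketch (the docker commands mirror the official image’s README, the model tag matches the setup above, and the network call is left commented so the snippet has no side effects):

```python
# Hedged sketch: talking to a locally running Ollama server (default port 11434).
# Assumes the container was started roughly as the official image suggests:
#   docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
#   docker exec -it ollama ollama pull llama2:13b

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's generate endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    # stream=False asks for one complete JSON response instead of streamed chunks.
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_payload("llama2:13b", "Explain self-hosting in one sentence.")

# With the server up, the actual call is a plain JSON POST, e.g.:
#   import json, urllib.request
#   req = urllib.request.Request(OLLAMA_URL, data=json.dumps(payload).encode(),
#                                headers={"Content-Type": "application/json"})
#   print(json.loads(urllib.request.urlopen(req).read())["response"])
```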

@hactar42@lemmy.world

I’ve played around with a few of them. I’ve found LM Studio the most robust and user-friendly.

@hottari@lemmy.ml

Last time I checked, out of all the options available, Serge was the simplest to host and use. Though you need a beefy computer to get fast and/or good responses.
