Is there a way to host an LLM in a Docker container on my home server while still leveraging the GPU on my main PC?