Selfhosted smeeps 3mo ago 90%

Uses for local AI?

I'm using Ollama on my server with the WebUI. It has no GPU, so replies aren't quick, but they aren't too slow either.

I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?
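One thing that makes a local Ollama instance worth keeping is that other apps and scripts on your network can call it over its REST API. As a minimal sketch, here is how a script could query it, assuming Ollama's default endpoint (`http://localhost:11434`) and a hypothetical model name (`llama3`) that you would swap for whatever you've pulled:

```python
import json
import urllib.request

# Assumption: Ollama is running at its default address. Change the
# host/port and model name to match your own server setup.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask(model: str, prompt: str) -> str:
    """Send the prompt to Ollama and return the model's reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Example integration: summarize a note, tag a document, etc.
    print(ask("llama3", "Give one good selfhosted use for a local LLM."))
```

The same pattern is what apps like paperless-ngx or Home Assistant plugins use under the hood when they integrate with a local model, so keeping the VM gives every service on your LAN a shared LLM endpoint.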

Comments 55