
Ollama

Free


Open Source · Local AI · CLI

Run open-source LLMs locally with a simple CLI. Supports LLaMA, Mistral, Gemma, and many more models.
