C:\Users\manuel>ollama
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
=====Start ollama=====
// in the foreground, with debug output (OLLAMA_DEBUG enables verbose logging)
set OLLAMA_DEBUG=1
ollama serve
// in background
start /B ollama serve >NUL 2>&1
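When started in the background with `start /B`, the server gives no feedback on the console. A small Python sketch can probe whether anything is listening on Ollama's default port (11434; host and timeout here are my own choices, not from the Ollama docs):

```python
import socket

def server_up(host: str = "127.0.0.1", port: int = 11434, timeout: float = 1.0) -> bool:
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("ollama reachable:", server_up())
```

This only checks that the port is open, not that the API is healthy; for a stricter check you could GET `http://localhost:11434/` and look for a 200 response.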
=====Generate a response=====
// cmd.exe does not treat single quotes as quoting, so the JSON must use escaped double quotes
curl http://localhost:11434/api/generate -d "{\"model\": \"llama3\", \"prompt\": \"Why is the sky blue?\"}"
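By default `/api/generate` streams its answer as one JSON object per line, each carrying a `"response"` fragment, with the final object marked `"done": true`. A minimal sketch for reassembling the full text from such a stream (the sample lines below are illustrative, not captured output):

```python
import json

def collect_response(lines):
    """Join the "response" fragments from Ollama's streamed NDJSON lines."""
    parts = []
    for line in lines:
        obj = json.loads(line)
        parts.append(obj.get("response", ""))
        if obj.get("done"):  # final object carries stats, no more text
            break
    return "".join(parts)

# Illustrative fragments in the documented streaming format:
stream = [
    '{"model":"llama3","response":"The sky ","done":false}',
    '{"model":"llama3","response":"is blue.","done":false}',
    '{"model":"llama3","response":"","done":true}',
]
print(collect_response(stream))  # The sky is blue.
```

In a real client you would feed it the response body line by line (e.g. `for line in urllib.request.urlopen(req):`); passing `"stream": false` in the request avoids the reassembly entirely.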
=====Chat with a model=====
curl http://localhost:11434/api/chat -d "{\"model\": \"llama3\", \"messages\": [{\"role\": \"user\", \"content\": \"why is the sky blue?\"}]}"
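The chat endpoint keeps no server-side history: each call must resend the whole `messages` array. A sketch of maintaining that array between turns (model name and the `"stream": false` flag are my choices for the example):

```python
import json

def add_turn(history, role, content, model="llama3"):
    """Append one message and return the JSON payload for the next /api/chat call."""
    history.append({"role": role, "content": content})
    return json.dumps({"model": model, "messages": history, "stream": False})

history = []
payload = add_turn(history, "user", "why is the sky blue?")
# POST `payload` to http://localhost:11434/api/chat, then record the reply
# so the model sees the conversation so far on the next call:
# add_turn(history, "assistant", reply_text)
```

Forgetting to append the assistant's reply is the usual reason a chat loop seems to "lose" context between questions.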
=====Stop ollama=====
tasklist | findstr ollama
ollama app.exe                12580 Console    1     19.184 K
ollama.exe                    12628 Console    1     44.324 K
ollama_llama_server.exe       11772 Console    1  5.174.068 K
// /F is needed because console processes ignore the polite close request
taskkill /F /IM ollama_llama_server.exe
=====Links=====
* [[https://ollama.com/|Homepage]]
* [[https://www.gpu-mart.com/blog/how-to-install-and-use-ollama-webui-on-windows|Installing Open Web UI on Windows]]