C:\Users\manuel>ollama
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
Start the server in the background and run a model:

start /B ollama serve >NUL 2>&1
ollama run llama3
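The server listens on http://localhost:11434 by default. A minimal Python sketch to check that it responds, assuming the requests package is installed (/api/tags lists the locally available models):

import requests

# Ask the local Ollama server which models are available.
resp = requests.get("http://localhost:11434/api/tags")
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])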
Generate a response
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?"
}'
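The single-quoted JSON above is written for Unix-like shells; in cmd.exe the quoting has to be adjusted. The generate endpoint streams its answer as one JSON object per line, with the partial text in the response field and done set to true in the last object. A rough Python equivalent of the same request, again assuming requests is installed:

import json
import requests

# Stream a completion piece by piece from the local Ollama server.
payload = {"model": "llama3.1", "prompt": "Why is the sky blue?"}
with requests.post("http://localhost:11434/api/generate", json=payload, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            print()
            break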
Chat with a model
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'
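Like generate, the chat endpoint streams by default; setting "stream": false returns a single JSON object whose message.content holds the whole reply. A sketch of the same chat call in Python, under the same assumptions as above:

import requests

# Send one chat turn and read the complete (non-streamed) reply.
payload = {
    "model": "llama3.1",
    "stream": False,
    "messages": [
        {"role": "user", "content": "why is the sky blue?"},
    ],
}
resp = requests.post("http://localhost:11434/api/chat", json=payload)
resp.raise_for_status()
print(resp.json()["message"]["content"])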
tasklist | findstr ollama
ollama app.exe               12580 Console                 1     19.184 K
ollama.exe                   12628 Console                 1     44.324 K
ollama_llama_server.exe      11772 Console                 1  5.174.068 K

taskkill /IM ollama_llama_server.exe
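Note: without /F, taskkill only sends a close request; taskkill /F /IM ollama_llama_server.exe forcibly terminates the process if it does not exit on its own.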