1. Ollama cannot use all CPU cores? Ollama is only using 4 of my 8 cores
-
Ollama does not have an option to set the number of CPUs directly (it does for GPUs), but you can try setting the `num_thread` option to a value higher than the default of 8 and see how it works for you:
-
```shell
curl --location 'http://127.0.0.1:11434/api/chat' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "llama3",
    "messages": [
      { "role": "user", "content": "why is the sky blue?" }
    ],
    "options": { "num_thread": 16 }
  }'
```
-
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-request-with-options
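The same request can be sketched in Python using only the standard library. This is a minimal sketch, assuming a local Ollama server on the default port 11434; the actual network call is commented out so the payload construction can be inspected on its own:

```python
import json
import urllib.request

# Build the same chat request as the curl example above,
# raising num_thread (note: singular, not "num_threads").
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "why is the sky blue?"}],
    "stream": False,  # ask for a single JSON response instead of a stream
    "options": {"num_thread": 16},
}

req = urllib.request.Request(
    "http://127.0.0.1:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send; requires a running Ollama server, otherwise URLError.
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])
```

While the request runs, you can watch CPU utilization (e.g. with `htop`) to check whether more cores are actually being used.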
2. Using Ollama with third-party packages
To be updated...