1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
2. Download deepseek-r1
ollama run deepseek-r1:1.5b
3. Check that the download succeeded
ollama list
4. Run the model to confirm it works
ollama run deepseek-r1:1.5b
5. Drive the model from Python with the ollama library
1) Install the ollama library
pip install ollama
2) Test it (streaming output and multi-turn context)
import ollama

conversation_history = []

# Streaming output with multi-turn context: the full history is
# sent on every call so the model can follow the conversation.
def api_stream_chat(text: str, modelName: str):
    print('-----------------------------------------')
    print('Question:', text)
    conversation_history.append(
        {
            "role": "user",
            "content": text
        }
    )
    stream = ollama.chat(
        stream=True,
        model=modelName,
        messages=conversation_history
    )
    # Print each chunk as it arrives and collect the full answer
    print('Answer: ', end='')
    answer = ''
    for chunk in stream:
        piece = chunk['message']['content']
        answer += piece
        print(piece, end='', flush=True)
    print()
    # Append the assistant reply so the next turn keeps the context
    conversation_history.append({"role": "assistant", "content": answer})
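The "context" in the function above is nothing more than the growing `conversation_history` list: each user prompt and each assistant reply is appended, and the whole list is resent on every call. A minimal sketch of that bookkeeping, with a hypothetical `fake_reply` standing in for `ollama.chat` so it runs without an Ollama server:

```python
# Sketch of the multi-turn bookkeeping used by api_stream_chat.
# fake_reply is a hypothetical stand-in for the model, so the
# history logic can be shown without a running Ollama server.
conversation_history = []

def fake_reply(messages):
    # Echo the latest user message; a real call would use ollama.chat(...)
    return "echo: " + messages[-1]["content"]

def chat_turn(text: str) -> str:
    conversation_history.append({"role": "user", "content": text})
    answer = fake_reply(conversation_history)
    # Store the reply so the next turn sees the full conversation
    conversation_history.append({"role": "assistant", "content": answer})
    return answer

chat_turn("hello")
chat_turn("how are you")
# After two turns the history holds four messages with alternating roles
print(len(conversation_history))  # → 4
```

Because the whole list is resent each turn, long conversations grow the prompt; trimming or summarizing old messages is a common follow-up once the history gets large.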