This article shows how to build ollama-webui from source and connect it to a local ollama instance, which makes it easier to add your own features later.
Environment
- OS: CentOS 7
- CPU: 14 cores / 28 threads
- RAM: 32 GB
- GPU: Tesla P40, 24 GB
- Driver: 535
- CUDA: 12.2
- Ollama: 0.3.0
Local ollama
For installing ollama itself, see:
[第二十四篇-Ollama-在线安装](https://blog.csdn.net/hai4321/article/details/138241623)
Verify the installation by opening http://192.168.31.222:11434/ in a browser; the page should respond with:
Ollama is running
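The browser check above can also be scripted. A minimal sketch, assuming the same host and port as in this article (192.168.31.222:11434); the `check_response` helper is a hypothetical name introduced here for illustration:

```shell
#!/bin/sh
# Assumed endpoint from this article; adjust for your server.
OLLAMA_URL="http://192.168.31.222:11434/"

# Compare a response body against the banner ollama returns when healthy.
check_response() {
  if [ "$1" = "Ollama is running" ]; then
    echo "ollama ok"
  else
    echo "unexpected response: $1"
  fi
}

# On the server you would feed it the real response:
#   check_response "$(curl -s "$OLLAMA_URL")"
# Here we pass the expected banner directly so the sketch runs standalone.
check_response "Ollama is running"
```

Running this against the live server instead of the literal string turns it into a simple health check you can drop into cron or a deploy script.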
Install Node.js
Download node-v18.20.2-linux-x64.tar.gz from the npmmirror archive:
https://registry.npmmirror.com/binary.html?path=node/v18.20.2/
Then extract it:
tar -zxf node-v18.20.2-linux-x64.tar.gz
Edit /etc/profile (vim /etc/profile) and add the following. Note that NODE_HOME should point at the install root, and the bin directory must actually be prepended to PATH:
export NODE_HOME=/opt/soft/node-v18.20.2-linux-x64
export PATH=$NODE_HOME/bin:$PATH
Then reload the profile so the change takes effect in the current shell:
source /etc/profile
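After reloading the profile, it's worth confirming that the node on PATH is the version just installed. A small sketch; the `actual` placeholder stands in for the real `node -v` output so the snippet runs even where Node.js is not installed:

```shell
#!/bin/sh
# Expected version from this article's download.
expected="v18.20.2"

# On the server, use the real binary:
#   actual=$(node -v)
# Placeholder so the sketch is self-contained:
actual="v18.20.2"

if [ "$actual" = "$expected" ]; then
  echo "node $expected active"
else
  # A mismatch usually means another node earlier on PATH is shadowing this one.
  echo "different node on PATH: $actual"
fi
```

If the versions disagree, check `which node` to see which installation is winning on PATH.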