Setting Up a C++ Runtime on Kylin Linux V10 [Install a Newer GCC] - version `GLIBCXX_3.4.25' not found

🥇 Copyright: This article is an original first publication by 【墨理学AI】. Thanks for reading, and for your likes, comments, and follows.


Running ollama on a new server produced the following error:

ollama list

ollama: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found (required by ollama)

  • sudo yum install devtoolset-8-gcc devtoolset-8-gcc-c++ devtoolset-8-libstdc++-devel
  • https://grok.com/chat/2d2ec2d7-5452-4b0b-9578-732cccea23f3
  • After installation:
Installed:
  devtoolset-8-binutils-2.30-55.el7.2.x86_64     devtoolset-8-gcc-8.3.1-3.2.el7.x86_64     devtoolset-8-gcc-c++-8.3.1-3.2.el7.x86_64     devtoolset-8-libstdc++-devel-8.3.1-3.2.el7.x86_64     devtoolset-8-runtime-8.1-1.el7.x86_64     environment-modules-4.2.5-3.ky10.x86_64    
  scl-utils-1:2.0.2-8.ky10.x86_64               

Complete!

/usr/lib64/libstdc++.so.6 still lacks GLIBCXX_3.4.25 after all of this:

  • devtoolset-8 intentionally keeps using the system libstdc++.so.6 at runtime (the newer symbols are linked in statically from its libstdc++_nonshared.a), so installing it never adds new version nodes under /usr/lib64. Building gcc-8.5.0 by hand does not change /usr/lib64 either, because make install writes only to the configure prefix.
strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX

GLIBCXX_3.4
GLIBCXX_3.4.1
GLIBCXX_3.4.2
GLIBCXX_3.4.3
GLIBCXX_3.4.4
GLIBCXX_3.4.5
GLIBCXX_3.4.6
GLIBCXX_3.4.7
GLIBCXX_3.4.8
GLIBCXX_3.4.9
GLIBCXX_3.4.10
GLIBCXX_3.4.11
GLIBCXX_3.4.12
GLIBCXX_3.4.13
GLIBCXX_3.4.14
GLIBCXX_3.4.15
GLIBCXX_3.4.16
GLIBCXX_3.4.17
GLIBCXX_3.4.18
GLIBCXX_3.4.19
GLIBCXX_3.4.20
GLIBCXX_3.4.21
GLIBCXX_3.4.22
GLIBCXX_3.4.23
GLIBCXX_3.4.24
GLIBCXX_DEBUG_MESSAGE_LENGTH
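The listing above tops out at GLIBCXX_3.4.24, which is exactly why the loader complains. A quick way to print just the highest version node a libstdc++ exports is a version sort; the sketch below runs on sample data so it is copy-paste safe, and on a real system you would feed it from `strings /usr/lib64/libstdc++.so.6 | grep '^GLIBCXX_3'` instead:

```shell
# Version-sort the GLIBCXX nodes and keep only the highest one.
printf 'GLIBCXX_3.4\nGLIBCXX_3.4.24\nGLIBCXX_3.4.9\n' | sort -V | tail -n 1
# → GLIBCXX_3.4.24
```

ollama needs at least GLIBCXX_3.4.25, which first shipped with GCC 8's libstdc++.so.6.0.25.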

The fix suggested by grok3, now verified:

Ultimately I adopted Step 7: Alternative - Install a Newer GCC from Source

If devtoolset-8 or the system’s libstdc++ still doesn’t provide GLIBCXX_3.4.25, you can install GCC 8.5.0 from source to ensure you get libstdc++.so.6.0.25.

  1. Download and Extract:

    wget https://ftp.gnu.org/gnu/gcc/gcc-8.5.0/gcc-8.5.0.tar.gz
    tar -xzf gcc-8.5.0.tar.gz
    cd gcc-8.5.0
    
  2. Install Dependencies:

    sudo yum install gmp-devel mpfr-devel libmpc-devel
    
  3. Configure and Build:

    ./configure --prefix=/usr/local/gcc-8.5.0 --disable-multilib --enable-languages=c,c++
    make -j$(nproc)
    sudo make install
    
  4. Update LD_LIBRARY_PATH:

    export LD_LIBRARY_PATH=/usr/local/gcc-8.5.0/lib64:$LD_LIBRARY_PATH
    
  5. Verify:

    strings /usr/local/gcc-8.5.0/lib64/libstdc++.so.6 | grep GLIBCXX
    

    Ensure GLIBCXX_3.4.25 is present, then run:

    # Start the service
    ollama serve
    ollama list
    

Step 8: Kylin Linux-Specific Considerations

Kylin Linux V10 is based on a Red Hat/CentOS-like system, but its repositories may be customized. If yum install devtoolset-8-libstdc++-devel fails or doesn’t provide the expected library, check Kylin’s official repositories or documentation for newer GCC packages. You can also try enabling the Kylin equivalent of the CentOS SCL repository:

sudo yum install kylin-release-scl

Then retry installing devtoolset-8 packages.


Step 9: Test and Troubleshoot

After applying one of the above solutions, test ollama:

ollama list

If the error persists, provide the output of the following commands to help diagnose further:

strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX
rpm -qa | grep libstdc++
ldd $(which ollama) | grep libstdc++
find / -name "libstdc++.so*" 2>/dev/null

Notes

  • Conda Issue: The Conda environment is a likely culprit, as it often overrides system libraries. Deactivating Conda or updating libstdcxx-ng is a quick fix to test.
  • System Library Safety: Avoid replacing /usr/lib64/libstdc++.so.6 directly unless necessary, as it may break other applications. Using LD_LIBRARY_PATH is safer.
  • GCC 8.3.1: devtoolset-8's compiler does target GLIBCXX_3.4.25, but it supplies the newer symbols by static linking rather than by shipping a newer libstdc++.so.6, so the system library is never upgraded; Conda interference can also produce the same error message.
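Following the System Library Safety note above, the override can be scoped to a single command instead of exported shell-wide. A minimal sketch, using `env` as a stand-in for the real `ollama list` (the prefix is the one from Step 7):

```shell
# The variable is set only for this one child process, so everything else
# keeps using the system libstdc++; swap `env | grep ...` for `ollama list`.
LD_LIBRARY_PATH=/usr/local/gcc-8.5.0/lib64 env | grep '^LD_LIBRARY_PATH='
# → LD_LIBRARY_PATH=/usr/local/gcc-8.5.0/lib64
```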

If you share the outputs of the commands above, I can pinpoint the exact issue and provide a more targeted solution.


flash-attn needs an even newer GCC to build ("We need GCC 9 or later"):

pip install -U flash-attn --no-build-isolation

/home/ai/big_data/anaconda3/envs/qwen/lib/python3.12/site-packages/torch/include/c10/util/C++17.h:13:2: error: #error "You're trying to build PyTorch with a too old version of GCC. We need GCC 9 or later."

So install gcc (GCC) 11.4.0 for the current user


# Download, build, and install a newer GCC

# Install the build dependencies first (same set as for GCC 8.5.0 above)
sudo yum install gmp-devel mpfr-devel libmpc-devel

wget https://ftp.gnu.org/gnu/gcc/gcc-11.4.0/gcc-11.4.0.tar.gz

tar -xzf gcc-11.4.0.tar.gz

cd gcc-11.4.0/

# A prefix under $HOME means make install needs no sudo
./configure --prefix=/home/ai/usr/local/gcc11_4 --disable-multilib --enable-languages=c,c++

# Depending on machine performance and network speed, this takes roughly 10-20 minutes
make -j$(nproc)

make install

# Check whether the new libstdc++.so.6 contains GLIBCXX_3.4.25
strings /home/ai/usr/local/gcc11_4/lib64/libstdc++.so.6 | grep GLIBCXX


vim ~/.bashrc

# Add these two lines (bin goes on PATH; lib64 belongs on LD_LIBRARY_PATH, not PATH):
export PATH=/home/ai/usr/local/gcc11_4/bin:$PATH
export LD_LIBRARY_PATH=/home/ai/usr/local/gcc11_4/lib64:$LD_LIBRARY_PATH

# Then reload the shell configuration
source ~/.bashrc
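The idea behind those two exports: both PATH and LD_LIBRARY_PATH are colon-separated lists searched left to right, so prepending makes the new toolchain win over /usr/bin/gcc. A tiny sketch with a sample value:

```shell
# The first entry in the list is consulted first, so `gcc` now resolves
# to the freshly built 11.4.0 rather than the system compiler.
DEMO_PATH="/home/ai/usr/local/gcc11_4/bin:/usr/bin:/bin"
echo "$DEMO_PATH" | cut -d: -f1
# → /home/ai/usr/local/gcc11_4/bin
```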


# Installation succeeded

gcc --version
gcc (GCC) 11.4.0
Copyright © 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
(base) [ai@localhost ~]$ g++ --version
g++ (GCC) 11.4.0
Copyright © 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Point ollama.service at the newly installed C++ runtime via its environment variables


✅ Solution: make the Ollama service use your own libstdc++.so.6

Approach: set LD_LIBRARY_PATH in the systemd unit so your new libstdc++ is loaded first


✅ Step 1: Edit the Ollama systemd service file

sudo vim /etc/systemd/system/ollama.service

In the [Service] section, add your libstdc++ path:

[Service]
ExecStart=/usr/bin/ollama serve
User=ai
Group=ai
Environment=HOME=/home/ai
Environment=OLLAMA_MODELS=/data/ollama/models
Environment=LD_LIBRARY_PATH=/home/ai/usr/local/gcc11_4/lib64:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64
Restart=always
RestartSec=3
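An alternative to editing the unit file in place (a standard systemd pattern, assuming `systemctl edit` is available on Kylin V10): put the change in a drop-in override, so package upgrades that rewrite ollama.service do not clobber it. Running `sudo systemctl edit ollama` opens an override file; add:

```ini
[Service]
Environment=LD_LIBRARY_PATH=/home/ai/usr/local/gcc11_4/lib64:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64
```

then reload and restart as in Step 3 below.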

For reference, the ollama service configuration on my workstation:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="CUDA_VISIBLE_DEVICES=0"
#Environment="PATH=/opt/excadb/15/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/cuda/bin"
Environment="OLLAMA_HOST=0.0.0.0"

[Install]
WantedBy=default.target

🔥 Key points:

  • LD_LIBRARY_PATH must include your gcc11_4/lib64 path
  • Order matters: put /home/ai/usr/local/gcc11_4/lib64 first so it is searched before everything else
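The ordering rule can be seen directly: the dynamic loader tries LD_LIBRARY_PATH directories left to right, so the first entry wins when two directories both contain a libstdc++.so.6. A sketch with the value from the unit file above:

```shell
# Print the search order the loader will use; gcc11_4/lib64 comes first,
# so its libstdc++ shadows any copy in the CUDA directories.
LDP="/home/ai/usr/local/gcc11_4/lib64:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64"
echo "$LDP" | tr ':' '\n'
```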

✅ Step 3: Reload systemd and restart the service

sudo systemctl daemon-reload
sudo systemctl restart ollama
sudo systemctl status ollama

Enable the service at boot

sudo systemctl enable ollama

Start the service immediately

sudo systemctl start ollama

Stop the service

sudo systemctl stop ollama

✅ Step 4: Check the logs to confirm success

sudo journalctl -u ollama -f

✅ A successful log should contain:

INFO[0000] Starting Ollama server...

rather than the GLIBCXX_3.4.25 not found error.

❤️ Learning AI together


  • ❤️ If this article helped even a little, please like and comment to encourage the author's careful work

