Fixing a Keras error caused by a protobuf version mismatch

The error message was:

TypeError: Descriptors cannot be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
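If downgrading is not an option (for example, another package needs a newer protobuf), the second workaround can also be applied from inside the script itself. A minimal sketch, assuming the environment variable is set before anything imports protobuf:

import os

# Force the pure-Python protobuf implementation. This must run before
# tensorflow/keras (and therefore protobuf) is imported, otherwise the
# faster C++ implementation has already been loaded.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"

import tensorflow as tf
import keras

As the error text warns, parsing is noticeably slower in this mode, so treat it as a stopgap rather than a permanent fix.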

How I fixed it:

Running pip show protobuf showed that protobuf 5.29.4 was installed, which is clearly too new.
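The same check can be done from inside the interpreter; a quick snippet, nothing here is specific to this project:

import google.protobuf
print(google.protobuf.__version__)  # printed 5.29.4 in this environment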

Following the hint in the error message, downgrade protobuf: protobuf 4.x and later refuse to create descriptors from _pb2.py files generated by older protoc, which is exactly what triggers this error.

pip install protobuf==3.20.0

OK, the error is gone!
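As a final sanity check, a short script along these lines (purely illustrative) should now import everything cleanly and report the pinned version:

import google.protobuf
import tensorflow as tf
import keras

print("protobuf  :", google.protobuf.__version__)  # expect 3.20.0 after the downgrade
print("tensorflow:", tf.__version__)
print("keras     :", keras.__version__)

If the project has a requirements file, it is also worth recording protobuf==3.20.0 there so a later pip install does not pull a 5.x release back in.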
