Python: BERT Error - Some weights of the model checkpoint were not used when initializing BertModel

This post looks at the weight-initialization warning that can appear when loading a pretrained BERT model with the transformers library, explains why it happens (the checkpoint was trained on a different task or with a different architecture), and shows how to suppress the message. In short: there is nothing to worry about; just train the model on your downstream task.


When loading a pretrained model from the transformers library, the following message appears:

Some weights of the model checkpoint at bert-base-multilingual-cased were not used when initializing BertForTokenClassification_: ['cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.bias', 'cls.seq_relationship.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.seq_relationship.weight']
- This IS expected if you are initializing BertForTokenClassification_ from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForTokenClassification_ from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of BertForTokenClassification_ were not initialized from the model checkpoint at bert-base-multilingual-cased and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
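
For reference, here is a minimal sketch of the kind of call that triggers this warning; the checkpoint name matches the log above, and num_labels is an arbitrary illustrative value:

from transformers import BertForTokenClassification

# The checkpoint was pretrained with MLM/NSP heads (the cls.* weights in the
# log above), which BertForTokenClassification does not use, so they are
# dropped. The new token-classification layer has no pretrained weights and
# is randomly initialized.
model = BertForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased",
    num_labels=9,  # illustrative label count for a tagging task
)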

Solution:

This is not an error. The warning means that the pretraining heads stored in the checkpoint (the cls.* weights for masked language modeling and next-sentence prediction) are not used by the task-specific model, and the newly added classifier layer starts from random weights rather than pretrained ones. If that matches your situation, there is nothing to worry about; the model just needs to be trained on the downstream task, as the sketch below shows.
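
Continuing the sketch above, a minimal downstream training step; the batch and labels here are dummies purely for illustration, and a real run would iterate over a labeled dataset:

import torch
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

batch = tokenizer(["Hello from Berlin"], return_tensors="pt")
labels = torch.zeros_like(batch["input_ids"])  # dummy all-zero tags

# The loss flows through the randomly initialized classifier head,
# which is exactly the layer the warning says needs training.
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()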

You can suppress the message by lowering the transformers logging verbosity before loading the model:

from transformers import logging

# set_verbosity_warning() would still show this WARNING-level message;
# set_verbosity_error() hides everything below ERROR, including it.
logging.set_verbosity_error()

Alternatively, if you load the model after fine-tuning, from your own saved checkpoint, you will not see this message, because the saved weights match the architecture exactly.
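
A sketch of that second option, continuing from the training step above; "my-finetuned-ner" is a hypothetical output directory:

from transformers import BertForTokenClassification

# Save the fine-tuned weights, then reload from the local checkpoint.
# The saved architecture now matches exactly, so no warning is printed.
model.save_pretrained("my-finetuned-ner")
reloaded = BertForTokenClassification.from_pretrained("my-finetuned-ner")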
