• Bugs encountered while using transformers


    1. The following model_kwargs are not used by the model: ['encoder_hidden_states', 'encoder_attention_mask'] (note: typos in the generate arguments will also show up in this list)

    The error above appears when calling generate() on the text_decoder; it is caused by an incompatible transformers version.

    from transformers import AutoModel, AutoConfig, BertGenerationDecoder

    decoder_config = AutoConfig.from_pretrained(args['text_checkpoint'])
    text_decoder = BertGenerationDecoder(config=decoder_config)
    output = text_decoder.generate(input_ids=cls_input_ids,
                                   encoder_hidden_states=encoder_hidden_states,
                                   encoder_attention_mask=encoder_attention_mask,
                                   max_length=args['max_seq_length'],
                                   do_sample=True,
                                   num_beams=args['beam_size'],
                                   length_penalty=1.0,
                                   use_cache=True,
                                   )

    Fix: switch transformers to one of the following version ranges: 4.15.0 <= transformers < 4.22.0, or transformers >= 4.25.0.

    For example: pip install transformers==4.25.1 or pip install transformers==4.20.1
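    The version ranges above can also be checked programmatically before calling generate(). A minimal sketch; the helper name `generate_kwargs_supported` is my own, not a transformers API:

```python
def generate_kwargs_supported(version_string):
    """Return True if the given transformers version falls in a range that
    accepts extra model_kwargs (e.g. encoder_hidden_states) in generate():
    4.15.0 <= v < 4.22.0, or v >= 4.25.0."""
    # Compare the first three numeric components as a tuple
    v = tuple(int(part) for part in version_string.split(".")[:3])
    return (4, 15, 0) <= v < (4, 22, 0) or v >= (4, 25, 0)

# Example: check the installed release before calling generate()
# import transformers
# assert generate_kwargs_supported(transformers.__version__)
```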

    2. No module named 'transformers.generation_beam_constraints' (with transformers==4.28.1)

    (1) Fix

    Change: from transformers import generation_beam_constraints

    To: from transformers.generation import beam_constraints

    (2) Another example

    Broken code:

    # runs on transformers == 4.23.1
    from transformers.generation_beam_constraints import Constraint
    from transformers.generation_beam_search import BeamScorer, BeamSearchScorer
    from transformers.generation_logits_process import (
        EncoderNoRepeatNGramLogitsProcessor,
        ForcedBOSTokenLogitsProcessor,
        ForcedEOSTokenLogitsProcessor,
        HammingDiversityLogitsProcessor,
        InfNanRemoveLogitsProcessor,
        LogitsProcessorList,
        MinLengthLogitsProcessor,
        NoBadWordsLogitsProcessor,
        NoRepeatNGramLogitsProcessor,
        PrefixConstrainedLogitsProcessor,
        RepetitionPenaltyLogitsProcessor,
        TemperatureLogitsWarper,
        TopKLogitsWarper,
        TopPLogitsWarper,
    )
    from transformers.generation_stopping_criteria import (
        MaxLengthCriteria,
        MaxTimeCriteria,
        StoppingCriteria,
        StoppingCriteriaList,
        validate_stopping_criteria,
    )

    Corrected code:

    # runs on transformers == 4.28.1
    from transformers.generation.beam_constraints import Constraint
    from transformers.generation.beam_search import BeamScorer, BeamSearchScorer
    from transformers.generation.logits_process import (
        EncoderNoRepeatNGramLogitsProcessor,
        ForcedBOSTokenLogitsProcessor,
        ForcedEOSTokenLogitsProcessor,
        HammingDiversityLogitsProcessor,
        InfNanRemoveLogitsProcessor,
        LogitsProcessorList,
        MinLengthLogitsProcessor,
        NoBadWordsLogitsProcessor,
        NoRepeatNGramLogitsProcessor,
        PrefixConstrainedLogitsProcessor,
        RepetitionPenaltyLogitsProcessor,
        TemperatureLogitsWarper,
        TopKLogitsWarper,
        TopPLogitsWarper,
    )
    from transformers.generation.stopping_criteria import (
        MaxLengthCriteria,
        MaxTimeCriteria,
        StoppingCriteria,
        StoppingCriteriaList,
        validate_stopping_criteria,
    )
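    When the same codebase must run on both the old flat module layout (pre-4.25) and the new `transformers.generation` subpackage, the two import paths can be tried in order instead of hard-coding one. A minimal sketch; the helper name `import_first` is my own, not a transformers API:

```python
import importlib

def import_first(module_paths, name):
    """Return attribute `name` from the first module path that imports
    successfully and defines it; raise ImportError if none do."""
    for path in module_paths:
        try:
            module = importlib.import_module(path)
        except ImportError:
            continue  # this layout is not available in the installed release
        if hasattr(module, name):
            return getattr(module, name)
    raise ImportError(f"cannot import {name!r} from any of {module_paths}")

# Usage: works on both the 4.23.x and 4.28.x layouts
# Constraint = import_first(
#     ["transformers.generation.beam_constraints",   # new subpackage layout
#      "transformers.generation_beam_constraints"],  # old flat layout
#     "Constraint",
# )
```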

  • Original post: https://blog.csdn.net/qq_34950042/article/details/133851978