• Bugs encountered while using transformers


    1. The following model_kwargs are not used by the model: ['encoder_hidden_states', 'encoder_attention_mask'] (note: typos in the generate arguments will also show up in this list)

    The error above appears when calling generate on a text_decoder; it is caused by an incompatible transformers version.

    from transformers import AutoModel, AutoConfig, BertGenerationDecoder

    # Build the decoder from a pretrained config
    decoder_config = AutoConfig.from_pretrained(args['text_checkpoint'])
    text_decoder = BertGenerationDecoder(config=decoder_config)

    # Passing encoder outputs to generate() triggers the error on
    # incompatible transformers versions
    output = text_decoder.generate(input_ids=cls_input_ids,
                                   encoder_hidden_states=encoder_hidden_states,
                                   encoder_attention_mask=encoder_attention_mask,
                                   max_length=args['max_seq_length'],
                                   do_sample=True,
                                   num_beams=args['beam_size'],
                                   length_penalty=1.0,
                                   use_cache=True)

    Solution: switch transformers to one of the following ranges: 4.15.0 <= transformers < 4.22.0, or transformers >= 4.25.0.

    For example: pip install transformers==4.25.1 or pip install transformers==4.20.1
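The two version ranges above can be checked programmatically before running the model. A minimal sketch (the helper names `_as_tuple` and `is_supported` are hypothetical, not from the original post):

```python
def _as_tuple(version):
    """Turn a dotted version string like '4.25.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def is_supported(version):
    """Return True if `version` falls in one of the ranges where the
    generate() call above is accepted:
    4.15.0 <= v < 4.22.0, or v >= 4.25.0."""
    v = _as_tuple(version)
    return (4, 15, 0) <= v < (4, 22, 0) or v >= (4, 25, 0)
```

In practice you would call `is_supported(transformers.__version__)` at startup and fail fast with a clear message instead of hitting the model_kwargs error mid-run.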

    2. No module named 'transformers.generation_beam_constraints' (with transformers==4.28.1)

    (1) Solution

    Change: from transformers import generation_beam_constraints

    to: from transformers.generation import beam_constraints
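If the code has to run on both old and new transformers releases, the rename can also be handled without pinning a version, by trying each module path in order. A minimal version-agnostic sketch (the helper `import_first` is hypothetical, not part of transformers):

```python
import importlib

def import_first(*module_paths):
    """Try each dotted module path in order and return the first module
    that imports successfully; raise ImportError if none do."""
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ImportError:
            continue
    raise ImportError(f"none of {module_paths} could be imported")

# Usage (assumes transformers is installed; which path exists depends
# on the installed version):
# beam_constraints = import_first(
#     "transformers.generation.beam_constraints",  # transformers >= 4.28
#     "transformers.generation_beam_constraints",  # older releases
# )
```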

    (2) Another example

    Problematic code:

    # Runs on transformers == 4.23.1
    from transformers.generation_beam_constraints import Constraint
    from transformers.generation_beam_search import BeamScorer, BeamSearchScorer
    from transformers.generation_logits_process import (
        EncoderNoRepeatNGramLogitsProcessor,
        ForcedBOSTokenLogitsProcessor,
        ForcedEOSTokenLogitsProcessor,
        HammingDiversityLogitsProcessor,
        InfNanRemoveLogitsProcessor,
        LogitsProcessorList,
        MinLengthLogitsProcessor,
        NoBadWordsLogitsProcessor,
        NoRepeatNGramLogitsProcessor,
        PrefixConstrainedLogitsProcessor,
        RepetitionPenaltyLogitsProcessor,
        TemperatureLogitsWarper,
        TopKLogitsWarper,
        TopPLogitsWarper,
    )
    from transformers.generation_stopping_criteria import (
        MaxLengthCriteria,
        MaxTimeCriteria,
        StoppingCriteria,
        StoppingCriteriaList,
        validate_stopping_criteria,
    )

    Corrected code:

    # Runs on transformers == 4.28.1
    from transformers.generation.beam_constraints import Constraint
    from transformers.generation.beam_search import BeamScorer, BeamSearchScorer
    from transformers.generation.logits_process import (
        EncoderNoRepeatNGramLogitsProcessor,
        ForcedBOSTokenLogitsProcessor,
        ForcedEOSTokenLogitsProcessor,
        HammingDiversityLogitsProcessor,
        InfNanRemoveLogitsProcessor,
        LogitsProcessorList,
        MinLengthLogitsProcessor,
        NoBadWordsLogitsProcessor,
        NoRepeatNGramLogitsProcessor,
        PrefixConstrainedLogitsProcessor,
        RepetitionPenaltyLogitsProcessor,
        TemperatureLogitsWarper,
        TopKLogitsWarper,
        TopPLogitsWarper,
    )
    from transformers.generation.stopping_criteria import (
        MaxLengthCriteria,
        MaxTimeCriteria,
        StoppingCriteria,
        StoppingCriteriaList,
        validate_stopping_criteria,
    )

• Original post: https://blog.csdn.net/qq_34950042/article/details/133851978