
[Question] Error loading the fine-tuned model for inference #40


Description

@LittleGreenYuan

What is your question?

Question: the fine-tuned model fails to load for inference.

```
RuntimeError: SequenceLabelingPipeline: SequenceLabelingModel: Error(s) in loading state_dict for SequenceLabelingModel:
    Missing key(s) in state_dict: "embedder.transformer_model.embeddings.position_ids".
```

What have you tried?

I followed the official fine-tuning tutorial exactly, without any additional code changes, but the automatically saved fine-tuned model cannot be loaded.

Code (if necessary)

```python
from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

# Load the automatically saved fine-tuned model from the training output directory
p = pipeline(
    Tasks.named_entity_recognition,
    '/root/experiments/experiments/CMeEE-cmeee/20240227_093521/output_best'
)
result = p('对儿童SARST细胞亚群的研究表明,与成人SARS相比,儿童细胞下降不明显,证明上述推测成立。')
print(result)
```
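A possible workaround (not part of the original report): the missing key is a fixed position-index buffer rather than a learned weight, and recent transformers releases no longer serialize it, which can trip strict state_dict loading in downstream code. Below is a minimal sketch that re-inserts the buffer into the saved checkpoint; the filename pytorch_model.bin, the checkpoint layout, and the 512 maximum sequence length are assumptions, so adjust them to match what AdaSeq actually wrote to output_best.

```python
import torch

# Assumed checkpoint location; adjust if AdaSeq saved the weights under a different name.
ckpt_path = ('/root/experiments/experiments/CMeEE-cmeee/'
             '20240227_093521/output_best/pytorch_model.bin')
state_dict = torch.load(ckpt_path, map_location='cpu')

missing_key = 'embedder.transformer_model.embeddings.position_ids'
if missing_key not in state_dict:
    # position_ids is just arange(0, max_position_embeddings) with a leading batch dim;
    # 512 is the usual BERT maximum and is an assumption here.
    state_dict[missing_key] = torch.arange(512).unsqueeze(0)
    torch.save(state_dict, ckpt_path)
```

After patching, re-running the pipeline snippet above should attempt the load again; pinning an older transformers release that still saves position_ids is another commonly reported alternative.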


What's your environment?

- AdaSeq Version (e.g., 1.0 or master): 0.6.6
- ModelScope Version (e.g., 1.0 or master): 1.12.0
- PyTorch Version (e.g., 1.12.1): 2.2.1
- OS (e.g., Ubuntu 20.04): Ubuntu 20.04
- Python version: 3.11
- CUDA/cuDNN version: 
- GPU models and configuration: RTX3090
- Any other relevant information:


Code of Conduct

- [X] I agree to follow this project's Code of Conduct
