
If LoRA pre-training did not use the template parameter, does merging the LoRA weights require specifying a template? #4185

Closed
DataNinja42298 opened this issue Jun 10, 2024 · 1 comment
Labels: solved (This problem has been already solved)

Comments

@DataNinja42298

Reminder

  • I have read the README and searched the existing issues.

System Info

### model

model_name_or_path: /nfs/***/Llama-3-8B
adapter_name_or_path: /nfs/***/pretrain_llama3_0524/checkpoint-58000
template: llama3
finetuning_type: lora

### export

export_dir: /nfs/***/llama3_8B_pt
export_size: 2
export_device: cpu
export_legacy_format: false

Reproduction

### model

model_name_or_path: /nfs/***/Llama-3-8B
adapter_name_or_path: /nfs/***/pretrain_llama3_0524/checkpoint-58000
template: llama3
finetuning_type: lora

### export

export_dir: /nfs/***/llama3_8B_pt
export_size: 2
export_device: cpu
export_legacy_format: false

Expected behavior

No response

Others

No response

github-actions bot added the `pending` label (This problem is yet to be addressed) on Jun 10, 2024
@hiyouga (Owner)

hiyouga commented Jun 10, 2024

You can specify `template: empty`.
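For context, the answer above means the merge/export config would look like the following sketch. Since the adapter was trained for continued pre-training (no chat template applied), `template: empty` avoids baking a chat format into the merged model; the paths here are placeholders, not the asker's actual paths:

```yaml
### model
model_name_or_path: /path/to/Llama-3-8B            # base model (placeholder path)
adapter_name_or_path: /path/to/pretrain_checkpoint # LoRA adapter (placeholder path)
template: empty                                    # no chat template was used in pre-training
finetuning_type: lora

### export
export_dir: /path/to/merged_model                  # output directory (placeholder path)
export_size: 2
export_device: cpu
export_legacy_format: false
```

If the merged model is later fine-tuned or served for chat, the appropriate template (e.g. `llama3`) can still be specified at that stage.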

hiyouga added the `solved` label, removed the `pending` label, and closed this as completed on Jun 10, 2024