Hello, in the CaMA reference link (https://github.com/zjunlp/KnowLM), besides the full pre-trained weights (ZhiXi-13B-Diff) there are also LoRA instruction-tuning weights (ZhiXi-13B-LoRA). If I want to use QiZhen-CaMA-13B-Checkpoint-12400, should it be applied to the complete CaMA weights obtained by merging the full pre-trained weights (ZhiXi-13B-Diff), or do I also need CaMA's LoRA instruction-tuning weights (ZhiXi-13B-LoRA)?
It should be used together with the fully pre-trained original CaMA; ZhiXi-13B-LoRA is not needed.
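For reference, a minimal sketch of how this might look with Hugging Face `transformers` and `peft`, assuming QiZhen-CaMA-13B-Checkpoint-12400 is a LoRA/PEFT adapter and that the CaMA base has already been recovered by merging ZhiXi-13B-Diff into LLaMA-13B. The local paths below are hypothetical placeholders, not paths shipped by the repository:

```python
# A minimal sketch, assuming QiZhen-CaMA-13B-Checkpoint-12400 is a LoRA adapter
# and the merged CaMA base (ZhiXi-13B-Diff + LLaMA-13B) is available locally.
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_path = "./cama-13b-merged"                # hypothetical: merged ZhiXi-13B-Diff + LLaMA-13B
adapter_path = "./QiZhen-CaMA-13B-Checkpoint-12400"  # hypothetical local path to the QiZhen checkpoint

tokenizer = LlamaTokenizer.from_pretrained(base_model_path)
base_model = LlamaForCausalLM.from_pretrained(base_model_path, device_map="auto")

# Apply the QiZhen checkpoint directly on the merged CaMA base;
# ZhiXi-13B-LoRA is not loaded at all.
model = PeftModel.from_pretrained(base_model, adapter_path)
model.eval()
```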