Current Behavior

An error is raised when running the command-line chat demo (`cli_demo.py`).

Expected Behavior

No response

Steps To Reproduce

1. Clone the chatglm3 code from GitHub.
2. Download the chatglm3-6b-base model from Hugging Face.
3. Set up the environment.
4. Run `python cli_demo.py`.

Environment

- OS: Ubuntu 20.04
- Python: 3.7
- Transformers: 4.26.1
- PyTorch: 2.30
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): True
Is it solved?
The `build_prompt` method in cli_demo.py has a bug; you can fix it like this:
```python
def build_prompt(history):
    prompt = "欢迎使用 ChatGLM-6B 模型,输入内容即可进行对话,clear 清空对话历史,stop 终止程序"
    for item in history:
        # history items are dicts like {"role": "user" | "assistant", "content": "..."},
        # so read them by key instead of tuple-unpacking the dict
        if item["role"] == "user":
            prompt += f"\n\n用户:{item['content']}"
        else:
            prompt += f"\n\nChatGLM-6B:{item['content']}"
    return prompt
```
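For context on why keyed access matters here: tuple-unpacking a list of two-key dicts (as the original loop effectively did) yields each dict's keys, not its values. A minimal sketch with a made-up history:

```python
# A sample chat history in the dict format cli_demo.py keeps (contents made up)
history = [
    {"role": "user", "content": "你好"},
    {"role": "assistant", "content": "你好！有什么可以帮你的吗？"},
]

# Tuple-unpacking each dict yields its KEYS in insertion order, not its values...
pairs = [(query, response) for query, response in history]
print(pairs)  # [('role', 'content'), ('role', 'content')]

# ...whereas keyed access retrieves the actual message text
contents = [item["content"] for item in history]
print(contents[0])  # 你好
```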