The dataset used for both training and testing is in sharegpt format and contains tool_function, with the following dataset definition:

'''
"new_test": {
  "file_name": "new_test.json",
  "formatting": "sharegpt",
  "columns": {
    "messages": "conversations",
    "tools": "tools"
  }
}
'''
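For context, a single record in new_test.json would then look roughly like the sketch below. The get_weather tool, the field values, and the exact role names (function_call, observation) are illustrative assumptions based on LLaMA-Factory's sharegpt convention, not taken from this issue:

'''
# Minimal, hypothetical sketch of one sharegpt-format record with tools.
# Assumptions (not from the issue): the "function_call"/"observation" roles
# and the JSON-string "tools" field follow LLaMA-Factory's sharegpt convention.
import json

record = {
    "conversations": [
        {"from": "human", "value": "What is the weather in Beijing?"},
        {"from": "function_call", "value": json.dumps(
            {"name": "get_weather", "arguments": {"city": "Beijing"}})},
        {"from": "observation", "value": json.dumps({"temperature_c": 3})},
        {"from": "gpt", "value": "It is currently 3°C in Beijing."},
    ],
    # "tools" holds a JSON string describing the callable functions.
    "tools": json.dumps([{
        "name": "get_weather",
        "description": "Query the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }]),
}

# new_test.json is a JSON array of such records.
with open("new_test.json", "w", encoding="utf-8") as f:
    json.dump([record], f, ensure_ascii=False, indent=2)
'''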
You can use the environment variable DISABLE_VERSION_CHECK.
I have already run export DISABLE_VERSION_CHECK=1, but the same error is still reported.
I ran into this problem as well. LLaMA-Factory requires transformers>=4.41.2,<=4.46.1; try transformers==4.41.2, which seems to work.
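If downgrading is not possible, a runtime shim is another option. The sketch below is an unofficial workaround inferred from the traceback above: the chatglm3-6b remote code (modeling_chatglm.py) still passes standardize_cache_format to GenerationMixin._extract_past_from_model_output, a keyword that newer transformers releases removed. The shim makes the method tolerate the stale keyword again; run it before generate() is called:

'''
# Hedged compatibility shim (an assumption-based sketch, not an official fix):
# newer transformers removed the `standardize_cache_format` keyword from
# GenerationMixin._extract_past_from_model_output, but the chatglm3-6b remote
# code still passes it. Apply this patch before loading/generating.
import inspect
from transformers.generation.utils import GenerationMixin

_orig = GenerationMixin._extract_past_from_model_output

if "standardize_cache_format" not in inspect.signature(_orig).parameters:
    def _patched(self, outputs, standardize_cache_format=False, **kwargs):
        # Swallow the removed keyword and forward the rest unchanged.
        return _orig(self, outputs, **kwargs)

    GenerationMixin._extract_past_from_model_output = _patched
'''

Pinning the version remains the simpler fix: pip install "transformers==4.41.2".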
Reminder
System Info
llamafactory version: 0.9.2.dev0

Reproduction
'''
[rank0]: Traceback (most recent call last):
[rank0]: File "/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/launcher.py", line 23, in
[rank0]: launch()
[rank0]: File "/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/launcher.py", line 19, in launch
[rank0]: run_exp()
[rank0]: File "/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/train/tuner.py", line 59, in run_exp
[rank0]: run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
[rank0]: File "/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/train/sft/workflow.py", line 128, in run_sft
[rank0]: predict_results = trainer.predict(dataset_module["eval_dataset"], metric_key_prefix="predict", **gen_kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/trainer_seq2seq.py", line 259, in predict
[rank0]: return super().predict(test_dataset, ignore_keys=ignore_keys, metric_key_prefix=metric_key_prefix)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/trainer.py", line 4042, in predict
[rank0]: output = eval_loop(
[rank0]: ^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/trainer.py", line 4158, in evaluation_loop
[rank0]: losses, logits, labels = self.prediction_step(model, inputs, prediction_loss_only, ignore_keys=ignore_keys)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/train/sft/trainer.py", line 130, in prediction_step
[rank0]: loss, generated_tokens, _ = super().prediction_step( # ignore the returned labels (may be truncated)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/trainer_seq2seq.py", line 331, in prediction_step
[rank0]: generated_tokens = self.model.generate(**generation_inputs, **gen_kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/generation/utils.py", line 2215, in generate
[rank0]: result = self._sample(
[rank0]: ^^^^^^^^^^^^^
[rank0]: File "/root/miniconda3/lib/python3.12/site-packages/transformers/generation/utils.py", line 3209, in _sample
[rank0]: model_kwargs = self._update_model_kwargs_for_generation(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/root/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 871, in _update_model_kwargs_for_generation
[rank0]: model_kwargs["past_key_values"] = self._extract_past_from_model_output(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: TypeError: GenerationMixin._extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'
[rank1]: Traceback (most recent call last):
[rank1]: (same traceback as rank 0, abbreviated here)
[rank1]: TypeError: GenerationMixin._extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'
E1231 14:59:58.227000 140023665247424 torch/distributed/elastic/multiprocessing/api.py:826] failed (exitcode: 1) local_rank: 0 (pid: 5387) of binary: /root/miniconda3/bin/python
Traceback (most recent call last):
File "/root/miniconda3/bin/torchrun", line 8, in
sys.exit(main())
^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/init.py", line 347, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/torch/distributed/run.py", line 879, in main
run(args)
File "/root/miniconda3/lib/python3.12/site-packages/torch/distributed/run.py", line 870, in run
elastic_launch(
File "/root/miniconda3/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 132, in call
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 263, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
/root/autodl-tmp/SHR-LLaMA-Factory/src/llamafactory/launcher.py FAILED
Failures:
[1]:
time : 2024-12-31_14:59:58
host : autodl-container-e9e04d9e7a-7d8eafec
rank : 1 (local_rank: 1)
exitcode : 1 (pid: 5388)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
Root Cause (first observed failure):
[0]:
time : 2024-12-31_14:59:58
host : autodl-container-e9e04d9e7a-7d8eafec
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 5387)
error_file: <N/A>
traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
'''
Expected behavior
No response
Others
No response