codes trying to find a wrapper_config which does not exist #2

Open
rabeehk opened this issue Oct 5, 2021 · 1 comment

rabeehk commented Oct 5, 2021

Hi,
I am running this command:

python3 cli.py \
    --method pet \
    --arch_method default \
    --data_dir /data/home/rkarimi/codes/fewshot/internship/fewshot/temp/FewNLU/data/fewglue/FewGLUE/BoolQ \
    --pattern_ids 0 \
    --model_type albert \
    --model_name_or_path albert-xxlarge-v2 \
    --dataset_name superglue \
    --task_name boolq \
    --output_dir /fsx/rkarimi/test/test1 \
    --do_eval \
    --do_train \
    --per_gpu_eval_batch_size 4 \
    --per_gpu_train_batch_size 4 \
    --gradient_accumulation_steps 1 \
    --max_seq_length 128 \
    --max_steps 10 \
    --sampler_seed 1 \
    --seed 1 \
    --warmup_step_ratio 0 \
    --learning_rate 1e-5 \
    --repetitions 1 \
    --use_cloze \
    --few_shot_setting dev32_split \
    --every_eval_ratio 0.02 \
    --cv_k 4 \
    --split_ratio 0.5 \
    --fix_deberta \
    --overwrite_output_dir

and I am getting the error below. Thanks for your help!

Traceback (most recent call last):
  File "cli.py", line 584, in <module>
    main()
  File "cli.py", line 518, in main
    results=iterative_run(dataprovider, eval_data, wrapper_config, train_eval_config, unlabeled_data, aug_data, output_dir=args.output_dir)
  File "cli.py", line 248, in iterative_run
    results=run(dataprovider, eval_data, wrapper_config, train_eval_config, output_dir, unlabeled_data, aug_data, save_unlabeled_logits=False)
  File "cli.py", line 395, in run
    wrapper = TransformerModelWrapper.from_pretrained(pattern_iter_output_dir)
  File "/data/home/rkarimi/codes/fewshot/internship/fewshot/temp/FewNLU/fewnlu/wrapper.py", line 128, in from_pretrained
    wrapper.config = wrapper._load_config(path)
  File "/data/home/rkarimi/codes/fewshot/internship/fewshot/temp/FewNLU/fewnlu/wrapper.py", line 168, in _load_config
    with open(os.path.join(path, CONFIG_NAME), 'r') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/fsx/rkarimi/test/test1/p0/f0-i0/wrapper_config.json'
zhouj8553 (Collaborator) commented
Thanks for your attention. This happens because the max step is too small, so no checkpoint was ever saved. Sorry that we did not anticipate this situation. We have now added a check to the code to avoid this failure mode; you can re-download the repository.
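
For anyone who hits this before re-downloading: below is a minimal sketch of the kind of guard the fix implies, assuming the module path and config file name shown in the traceback. load_wrapper_if_saved is a hypothetical helper for illustration, not the maintainers' actual patch.

import os

from fewnlu.wrapper import TransformerModelWrapper  # module path per the traceback

CONFIG_NAME = "wrapper_config.json"  # file name taken from the error message


def load_wrapper_if_saved(pattern_iter_output_dir):
    """Reload a trained wrapper only if a checkpoint was actually written."""
    config_path = os.path.join(pattern_iter_output_dir, CONFIG_NAME)
    if not os.path.exists(config_path):
        # With a very small --max_steps, training can finish before any
        # evaluation/checkpoint step fires, leaving the output directory empty.
        raise RuntimeError(
            f"No checkpoint found at {config_path}. Increase --max_steps "
            "(or lower --every_eval_ratio) so at least one checkpoint is "
            "saved before the wrapper is reloaded."
        )
    return TransformerModelWrapper.from_pretrained(pattern_iter_output_dir)

In practice, raising --max_steps well above 10 (so that at least one evaluation/save step runs) avoids the error with the original code as well.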
