System Info
transformers version: 4.27.0.dev0
Using distributed or parallel set-up in script?: No
Who can help?
@mayank31398 @joel
While testing the bigcode/santacoder-fast-inference model on the openai_human_eval dataset, I am getting the following warning. Is there something to be concerned about?
anaconda3/envs/NLPWorkSpace/lib/python3.9/site-packages/transformers-4.27.0.dev0-py3.9.egg/transformers/models/gpt_bigcode/modeling_gpt_bigcode.py:259: UserWarning: FALLBACK path has been
taken inside: runCudaFusionGroup. This is an indication that codegen Failed for some reason.
To debug try disable codegen fallback path via setting the env variable `export PYTORCH_NVFUSER_DISABLE=fallback`
(Triggered internally at /opt/conda/conda-bld/pytorch_1670525551200/work/torch/csrc/jit/codegen/cuda/manager.cpp:331.)
attn_weights = upcast_masked_softmax(attn_weights, attention_mask, mask_value, unscale, softmax_dtype)
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
My own task or dataset (give details below)
Reproduction
Running inference on OpenAI's HumanEval dataset leads to this warning, specifically when I use temperature=0.2 and top_p=0.2 in the model.generate method.
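Roughly, the setup looks like this (a minimal sketch, not the exact script; the prompt, max_new_tokens, dtype, and loading arguments are illustrative placeholders):

```python
# Minimal repro sketch; prompt, max_new_tokens, and dtype are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/santacoder-fast-inference"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, torch_dtype=torch.float16
).to("cuda")

# Example HumanEval-style prompt (placeholder).
prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

# Sampling settings from the report; the nvFuser warning appears during generation.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.2,
    top_p=0.2,
    max_new_tokens=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```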
Expected behavior
No Warning
Looks like the JIT failed for an unknown reason. I don't think it will break anything, but it probably means the kernels aren't fused, so things will run a bit slower. I can't really reproduce this; can you try running with the suggested environment variable to get the actual error?
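For instance (a sketch; exporting the variable in the shell before launching the script, as the warning text suggests, works just as well):

```python
# Disable the nvFuser fallback path so the underlying codegen error surfaces
# instead of the FALLBACK warning. Setting this before importing torch is the
# safest, since some environment variables are only read at import time.
import os
os.environ["PYTORCH_NVFUSER_DISABLE"] = "fallback"

import torch  # noqa: E402  (imported after the env var is set)
```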