
The generate commit message feature does not work #126

Open
arman-async opened this issue Oct 3, 2024 · 8 comments

Comments

@arman-async

This feature returns the following error when invoked:

"Unable to generate commit message: ConnectError: [unknown] CreateFile .... The filename, directory name, or volume label syntax is incorrect."

As you can see in the attached screenshot.

@kamil-kubiczek

kamil-kubiczek commented Nov 24, 2024

Same here.
My logs from the VS Code Codium output:

(VSCode) 2024-11-24 11:30:15.643 [INFO]: Completion request succeeded (367.85ms)
(VSCode) 2024-11-24 11:30:15.644 [INFO]: No completions were generated
(VSCode) 2024-11-24 11:30:15.644 [INFO]: provideInlineCompletionItems request succeeded (368.95ms)
E1124 11:30:54.883775 16676 rpcs_commit_messages.go:76] Failed to generate commit message: error grabbing LLM response: stream error

@vadimmelnicuk

Same here, getting the following error:
E1229 08:44:19.090288 20548 rpcs_commit_messages.go:77] Failed to generate commit message: error grabbing LLM response: stream error

@Aryxst

Aryxst commented Jan 8, 2025

same

@rogaha

rogaha commented Jan 14, 2025

same here

@rogaha

rogaha commented Jan 14, 2025

any updates?

@rogaha

rogaha commented Jan 14, 2025

E0114 02:05:26.323993 87537 embedding.go:42] GetEmbeddings error: Post "https://inference.codeium.com/exa.api_server_pb.ApiServerService/GetEmbeddings": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
E0114 02:28:09.567187 87537 completion.go:216] Completion 28209 from source PROVIDER_SOURCE_AUTOCOMPLETE error: context deadline exceeded
E0114 02:28:09.567214 87537 completions_state_manager.go:834] Error streaming completions: context deadline exceeded
E0114 09:21:39.450583 87537 telemetry.go:66] RecordCompletions error: unary response has zero messages
E0114 10:38:50.765038 87537 rpcs_commit_messages.go:77] Failed to generate commit message: error grabbing LLM response: stream error
E0114 10:42:21.370987 87537 rpcs_commit_messages.go:77] Failed to generate commit message: error grabbing LLM response: stream error
E0114 10:45:22.636228 87537 rpcs_commit_messages.go:77] Failed to generate commit message: error grabbing LLM response: stream error
E0114 10:49:08.951320 87537 rpcs_commit_messages.go:77] Failed to generate commit message: error grabbing LLM response: stream error

@Aryxst

Aryxst commented Jan 14, 2025

It looks like we are all getting the same error messages. If they haven't fixed it yet, does it work for anyone?

@rogaha

rogaha commented Jan 14, 2025

Moved to Copilot.

Labels: None yet
Projects: None yet
Development: No branches or pull requests
5 participants