
Is it possible to support GitHub Copilot Chat? #84

Open
pidgeon777 opened this issue Oct 9, 2024 · 2 comments

@pidgeon777

As the title says.

References:

https://docs.github.com/en/copilot/quickstart

https://github.com/CopilotC-Nvim/CopilotChat.nvim

@pidgeon777 (Author)

It is important to note that GitHub Copilot Chat supports various models, including GPT-4, GPT-4o, o1, and o1-mini. For more detailed information and updates, you can refer to the GitHub Copilot Chat changelog.

Given the widespread use and active subscriptions to GitHub Copilot Chat among developers, it would be highly beneficial for the lsp-ai plugin to incorporate support for these models. This integration would enhance the development experience by leveraging the advanced capabilities of GitHub Copilot Chat.

@arunoruto commented Nov 26, 2024

As far as I remember, Copilot is based on OpenAI's ChatGPT. I remember helix-gpt having a GitHub Copilot integration. I looked a bit more at the code and found that it uses the https://api.githubcopilot.com/chat/completions endpoint. I guess you can just take the open_ai examples and replace the endpoint with the GitHub Copilot one. Here is some more information about the API:

After doing more analysis, I determined that the API endpoint at https://api.githubcopilot.com/chat/completions was nothing more than an OpenAI API proxy, with a different auth wrapper (which authenticated using the Copilot bearer token instead of an OpenAI key). I could query any model, with nearly identical behavior to the real OpenAI API.
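If that holds, an lsp-ai model configuration could look like the open_ai examples with only the endpoint and token variable swapped. This is a sketch, not a verified config: the chat_endpoint and auth_token_env_var_name keys are taken from lsp-ai's open_ai backend examples, and the model name "gpt-4o" is an assumption.

```json
{
  "memory": { "file_store": {} },
  "models": {
    "copilot_chat": {
      "type": "open_ai",
      "chat_endpoint": "https://api.githubcopilot.com/chat/completions",
      "model": "gpt-4o",
      "auth_token_env_var_name": "GH_AUTH_TOKEN"
    }
  }
}
```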

I tried it out with helix, but I am getting the following error:

2024-11-26T21:55:58.707 helix_lsp::transport [ERROR] lsp-ai err <- "ERROR lsp_ai::transformer_worker: generating response: error decoding response body: expected value at line 1 column 1\n"
2024-11-26T21:55:58.707 helix_lsp::transport [ERROR] lsp-ai err <- "\n"
2024-11-26T21:55:58.707 helix_lsp::transport [ERROR] lsp-ai err <- "Caused by:\n"
2024-11-26T21:55:58.707 helix_lsp::transport [ERROR] lsp-ai err <- "    expected value at line 1 column 1\n"
2024-11-26T21:55:58.707 helix_lsp::transport [ERROR] lsp-ai <- InternalError: error decoding response body: expected value at line 1 column 1

EDIT My current lsp-ai config for Helix can be found in my flake repo. You can read it roughly like JSON: just replace each = with : and drop the trailing ;.

EDIT 2 I looked a bit more into the whole OpenAI API compatibility question. While api.githubcopilot.com is used now, there was a different endpoint until the beginning of 2024: according to GitHub's changelog, the API endpoint used to be copilot-proxy.githubusercontent.com. This is also referenced in this line of helix-gpt.
What does this mean? Chat should work with the https://api.githubcopilot.com/chat/completions endpoint, but completion will not. Running some manual curl requests, the /chat/completions endpoint expects a messages entry in the data, while completions uses a prompt. I tried manually submitting a request to https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions as described by helix-gpt, but my token isn't the right one for such requests... Looking at the code, the Copilot API token is used to obtain a session token, which is then used to make further requests.
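The contrast between the two request shapes described above can be sketched like this (untested against the live API; the model names and max_tokens value are illustrative assumptions, not confirmed values):

```shell
# /chat/completions expects a "messages" array of chat turns:
chat_payload='{"model":"gpt-4o","messages":[{"role":"user","content":"Say hello"}]}'

# The older copilot-codex completions endpoint expects a "prompt" string instead:
completion_payload='{"model":"copilot-codex","prompt":"fn main() {","max_tokens":64}'

echo "$chat_payload"
echo "$completion_payload"
```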
TL;DR:

  • Copilot chat should work using https://api.githubcopilot.com/chat/completions as an endpoint and the output of gh auth token as the token value. I used set -gx GH_AUTH_TOKEN $(gh auth token) to set the env variable in fish.
  • Completions need a different endpoint which requires the prompt property to be set.
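For bash/zsh users, the fish command above translates roughly to the following. This is a sketch under the assumptions in this thread: the gh CLI is installed and authenticated, the endpoint accepts a gh auth token as the bearer token, and "gpt-4o" is an accepted model name.

```shell
# Fetch a token with the gh CLI (bash/zsh equivalent of the fish set -gx above).
GH_AUTH_TOKEN="$(gh auth token 2>/dev/null || true)"
export GH_AUTH_TOKEN

if [ -n "$GH_AUTH_TOKEN" ]; then
  # Query the chat endpoint identified in this thread; payload shape and model
  # name are assumptions based on the OpenAI-compatible behavior described above.
  curl -s https://api.githubcopilot.com/chat/completions \
    -H "Authorization: Bearer $GH_AUTH_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"model":"gpt-4o","messages":[{"role":"user","content":"Say hello"}]}'
else
  echo "no gh token available; skipping request"
fi
```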
