It is described here: https://code.visualstudio.com/api/extension-guides/language-model
"Once you've built the prompt for the language model, you first select the language model you want to use with the selectChatModels method. This method returns an array of language models that match the specified criteria. If you are implementing a chat participant, we recommend that you instead use the model that is passed as part of the request object in your chat request handler. This ensures that your extension respects the model that the user chose in the chat model dropdown. Then, you send the request to the language model by using the sendRequest method."
Since I cannot seem to make it work with the latest VS Code update, let's wait a few weeks until it gets out of Insiders and into the standard VS Code version before we open it up.
🎯 Aim of the feature
GitHub Copilot, currently in preview, added the ability to select which LLM model should be used to generate the response.
We should check whether the SPFx Toolkit chat participant can be adapted to use it as well; currently we use GPT-4o. A rough before/after sketch is shown below.
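A minimal sketch of the change, assuming the participant currently picks GPT-4o explicitly via `selectChatModels` (illustrative only; the real SPFx Toolkit implementation may differ):

```ts
import * as vscode from 'vscode';

// Before (assumed): always select GPT-4o, regardless of the user's dropdown choice.
async function pickModelHardcoded(): Promise<vscode.LanguageModelChat | undefined> {
  const [gpt4o] = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4o' });
  return gpt4o;
}

// After (proposed): use the model carried on the chat request, i.e. the one
// the user selected in the chat model dropdown.
function pickModelFromRequest(request: vscode.ChatRequest): vscode.LanguageModelChat {
  return request.model;
}
```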
📷 Images (if possible) with expected result
No response
🤔 Additional remarks or comments
No response