Change model on the fly #415

Merged · 23 commits · Jun 22, 2024
Commits
227d18f
added a feature where the model parameter in openai_params can now be…
barnii77 Mar 23, 2024
c81ab76
on opening chat window, the model was not being collapsed for the set…
barnii77 Mar 23, 2024
e1c5a05
testing using debug output
barnii77 Mar 23, 2024
b5293e4
removed debug out again because it seems to now work magically?
barnii77 Mar 23, 2024
2240812
debugging
barnii77 Mar 23, 2024
45b4972
debugging
barnii77 Mar 23, 2024
4373c35
still debugging
barnii77 Mar 23, 2024
984492c
if the model is determined by a function, just display <dynamic> in s…
barnii77 Mar 23, 2024
04180ba
debugging
barnii77 Mar 23, 2024
6f5aa42
still debugging
barnii77 Mar 23, 2024
d62599a
had value, key instead of key, value in a for loop because I dont kno…
barnii77 Mar 23, 2024
43dbe14
seems to be working, testing it now
barnii77 Mar 23, 2024
ef70038
debug output for model
barnii77 Mar 23, 2024
bad2508
typo in toMessages function in settings.lua, fixed now
barnii77 Mar 23, 2024
64ec802
more debugging
barnii77 Mar 23, 2024
a38eca7
still debugging :(
barnii77 Mar 23, 2024
2f4ff37
vim.inspect missing
barnii77 Mar 23, 2024
fe25603
the plugin is tested and working, you can now switch models dynamical…
barnii77 Mar 23, 2024
295e26f
reformatted the config sample to be more readable. you can now pass a…
barnii77 Mar 23, 2024
2955f4f
finally, removed all debug notifications
barnii77 Mar 23, 2024
47701e1
Update api.lua
PaperTarsier692 Mar 27, 2024
64413d2
Merge pull request #1 from PaperTarsier692/patch-1
barnii77 Mar 27, 2024
62fc6fb
removed a goto statement by refactoring the code because goto does no…
barnii77 Mar 29, 2024
43 changes: 41 additions & 2 deletions README.md
@@ -140,11 +140,50 @@ or if you are using [lazy.nvim](https://github.com/folke/lazy.nvim):

## Configuration

`ChatGPT.nvim` comes with the following defaults, you can override them by
passing config as setup param
`ChatGPT.nvim` comes with the following defaults, you can override them by passing config as setup param

https://github.com/jackMort/ChatGPT.nvim/blob/f1453f588eb47e49e57fa34ac1776b795d71e2f1/lua/chatgpt/config.lua#L10-L182

### Example Configuration

A simple configuration of the chat model could look something like this:
```lua
{
  "jackMort/ChatGPT.nvim",
  event = "VeryLazy",
  config = function()
    require("chatgpt").setup({
      -- this config assumes you have OPENAI_API_KEY environment variable set
      openai_params = {
        -- NOTE: model can be a function returning the model name
        -- this is useful if you want to change the model on the fly
        -- using commands
        -- Example:
        -- model = function()
        --     if some_condition() then
        --         return "gpt-4-1106-preview"
        --     else
        --         return "gpt-3.5-turbo"
        --     end
        -- end,
        model = "gpt-4-1106-preview",
        frequency_penalty = 0,
        presence_penalty = 0,
        max_tokens = 4095,
        temperature = 0.2,
        top_p = 0.1,
        n = 1,
      }
    })
  end,
  dependencies = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
}
```
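
The comment above mentions changing the model on the fly using commands without spelling it out. Below is a minimal sketch of one way to wire that up; the `current_model` variable and the `:ChatGPTModel` user command are illustrative names, not part of the plugin or this diff.

```lua
-- Illustrative sketch only: current_model and :ChatGPTModel are made-up names.
local current_model = "gpt-3.5-turbo"

require("chatgpt").setup({
  openai_params = {
    -- the function is evaluated on every request, so updating
    -- current_model takes effect immediately
    model = function()
      return current_model
    end,
    max_tokens = 4095,
    temperature = 0.2,
  },
})

-- e.g. :ChatGPTModel gpt-4-1106-preview switches models without restarting Neovim
vim.api.nvim_create_user_command("ChatGPTModel", function(opts)
  current_model = opts.args
end, { nargs = 1 })
```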

### Secrets Management

Providing the OpenAI API key via an environment variable is dangerous, as it
15 changes: 12 additions & 3 deletions lua/chatgpt/api.lua
@@ -1,16 +1,24 @@
local job = require("plenary.job")
local Config = require("chatgpt.config")
local logger = require("chatgpt.common.logger")
local Utils = require("chatgpt.utils")

local Api = {}

function Api.completions(custom_params, cb)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
Api.make_call(Api.COMPLETIONS_URL, params, cb)
end

function Api.chat_completions(custom_params, cb, should_stop)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
-- custom_params contains "<dynamic>" when the configured model is a function rather than a constant
-- in that case, use the collapsed openai params (with the function evaluated) to get the real model
if params.model == "<dynamic>" then
params.model = openai_params.model
end
local stream = params.stream or false
if stream then
local raw_chunks = ""
@@ -90,7 +98,8 @@ function Api.chat_completions(custom_params, cb, should_stop)
end

function Api.edits(custom_params, cb)
local params = vim.tbl_extend("keep", custom_params, Config.options.openai_edit_params)
local openai_params = Utils.collapsed_openai_params(Config.options.openai_params)
local params = vim.tbl_extend("keep", custom_params, openai_params)
if params.model == "text-davinci-edit-001" or params.model == "code-davinci-edit-001" then
vim.notify("Edit models are deprecated", vim.log.levels.WARN)
Api.make_call(Api.EDITS_URL, params, cb)
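To make the interplay concrete: the settings panel sends the placeholder string "<dynamic>" in custom_params when the configured model is a function, and the collapsed params supply the concrete value. A rough illustration of that resolution, with made-up values:

```lua
-- Illustration only; the configured values here are made up.
local Utils = require("chatgpt.utils")

local configured = {
  model = function() return "gpt-4-1106-preview" end,
  temperature = 0.2,
}

-- what the settings panel passes along when the model is function-valued
local custom_params = { model = "<dynamic>", stream = true }

local openai_params = Utils.collapsed_openai_params(configured)
-- "keep" prefers values from custom_params, so model is still "<dynamic>" here
local params = vim.tbl_extend("keep", custom_params, openai_params)
if params.model == "<dynamic>" then
  params.model = openai_params.model -- resolved to "gpt-4-1106-preview"
end
```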
93 changes: 45 additions & 48 deletions lua/chatgpt/code_edits.lua
@@ -350,57 +350,54 @@ M.edit_with_instructions = function(output_lines, bufnr, selection, ...)
-- cycle windows
for _, popup in ipairs({ input_window, output_window, settings_panel, help_panel, instructions_input }) do
for _, mode in ipairs({ "n", "i" }) do
if mode == "i" and (popup == input_window or popup == output_window) then
goto continue
end

popup:map(mode, Config.options.edit_with_instructions.keymaps.cycle_windows, function()
-- #352 is a bug where active_panel is something not in here, maybe an
-- old window or something, lost amongst the global state
local possible_windows = {
input_window,
output_window,
settings_panel,
help_panel,
instructions_input,
unpack(open_extra_panels),
}

-- So if active_panel isn't something we expect it to be, make it do be.
if not inTable(possible_windows, active_panel) then
active_panel = instructions_input
end

local active_panel_is_in_extra_panels = inTable(open_extra_panels, active_panel)
if active_panel == instructions_input then
vim.api.nvim_set_current_win(input_window.winid)
active_panel = input_window
vim.api.nvim_command("stopinsert")
elseif active_panel == input_window and mode ~= "i" then
vim.api.nvim_set_current_win(output_window.winid)
active_panel = output_window
vim.api.nvim_command("stopinsert")
elseif active_panel == output_window and mode ~= "i" then
if #open_extra_panels == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
if not (mode == "i" and (popup == input_window or popup == output_window)) then
popup:map(mode, Config.options.edit_with_instructions.keymaps.cycle_windows, function()
-- #352 is a bug where active_panel is something not in here, maybe an
-- old window or something, lost amongst the global state
local possible_windows = {
input_window,
output_window,
settings_panel,
help_panel,
instructions_input,
unpack(open_extra_panels),
}

-- So if active_panel isn't something we expect it to be, make it do be.
if not inTable(possible_windows, active_panel) then
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[1].winid)
active_panel = open_extra_panels[1]
end
elseif active_panel_is_in_extra_panels then
-- next index with wrap around and 0 for instructions_input
local next_index = (active_panel_is_in_extra_panels + 1) % (#open_extra_panels + 1)
if next_index == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[next_index].winid)
active_panel = open_extra_panels[next_index]

local active_panel_is_in_extra_panels = inTable(open_extra_panels, active_panel)
if active_panel == instructions_input then
vim.api.nvim_set_current_win(input_window.winid)
active_panel = input_window
vim.api.nvim_command("stopinsert")
elseif active_panel == input_window and mode ~= "i" then
vim.api.nvim_set_current_win(output_window.winid)
active_panel = output_window
vim.api.nvim_command("stopinsert")
elseif active_panel == output_window and mode ~= "i" then
if #open_extra_panels == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[1].winid)
active_panel = open_extra_panels[1]
end
elseif active_panel_is_in_extra_panels then
-- next index with wrap around and 0 for instructions_input
local next_index = (active_panel_is_in_extra_panels + 1) % (#open_extra_panels + 1)
if next_index == 0 then
vim.api.nvim_set_current_win(instructions_input.winid)
active_panel = instructions_input
else
vim.api.nvim_set_current_win(open_extra_panels[next_index].winid)
active_panel = open_extra_panels[next_index]
end
end
end
end, {})
::continue::
end, {})
end
end
end

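The last commit in the list replaces the `goto continue` above with an inverted condition; the commit message explaining why is truncated. The refactor follows the usual pattern, sketched here with hypothetical helper names:

```lua
-- Hypothetical example of the same refactor; items, should_skip and process are made up.

-- before: skip the rest of the loop body with goto
for _, item in ipairs(items) do
  if should_skip(item) then
    goto continue
  end
  process(item)
  ::continue::
end

-- after: wrap the body in the negated condition instead
for _, item in ipairs(items) do
  if not should_skip(item) then
    process(item)
  end
end
```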
14 changes: 12 additions & 2 deletions lua/chatgpt/flows/chat/base.lua
@@ -517,7 +517,7 @@ function Chat:toMessages()
role = "assistant"
end
local content = {}
if self.params.model == "gpt-4-vision-preview" then
if Utils.collapsed_openai_params(self.params).model == "gpt-4-vision-preview" then
for _, line in ipairs(msg.lines) do
table.insert(content, createContent(line))
end
@@ -736,7 +736,17 @@ function Chat:get_layout_params()
end

function Chat:open()
self.settings_panel = Settings.get_settings_panel("chat_completions", self.params)
local displayed_params = Utils.table_shallow_copy(self.params)
-- if a param is provided as a function rather than a constant, display <dynamic> for now
-- TODO: to show the currently resolved model, the settings_panel would have to be
-- refreshed continuously or rewritten to handle a function-valued model
for key, value in pairs(self.params) do
if type(value) == "function" then
displayed_params[key] = "<dynamic>"
end
end
self.settings_panel = Settings.get_settings_panel("chat_completions", displayed_params)
self.help_panel = Help.get_help_panel("chat")
self.sessions_panel = Sessions.get_panel(function(session)
self:set_session(session)
24 changes: 24 additions & 0 deletions lua/chatgpt/utils.lua
@@ -2,6 +2,30 @@ local M = {}

local ESC_FEEDKEY = vim.api.nvim_replace_termcodes("<ESC>", true, false, true)

---@param tbl table
---@return table
function M.table_shallow_copy(tbl)
local copy = {}
for key, value in pairs(tbl) do
copy[key] = value
end
return copy
end

--- Collapses the openai params: any parameter that may be given either as a constant
--- or as a function is resolved to a constant by evaluating the function.
---@param openai_params table
---@return table
function M.collapsed_openai_params(openai_params)
local collapsed = M.table_shallow_copy(openai_params)
-- use copied version of table so the original model value remains a function and can still change
if type(collapsed.model) == "function" then
collapsed.model = collapsed.model()
end
return collapsed
end

function M.split(text)
local t = {}
for str in string.gmatch(text, "%S+") do
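A short usage example of the two new helpers; because `collapsed_openai_params` works on a shallow copy, the original `model` stays a function and can keep returning different values between requests (values here are illustrative):

```lua
-- Illustrative usage of the new utils helpers.
local Utils = require("chatgpt.utils")

local openai_params = {
  model = function() return "gpt-3.5-turbo" end,
  temperature = 0.2,
}

local collapsed = Utils.collapsed_openai_params(openai_params)

print(collapsed.model)            --> gpt-3.5-turbo  (function evaluated)
print(type(openai_params.model))  --> function       (original is not mutated)
```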