Documentation request: Ollama + Helix from scratch #57
Thank you for your thoughts. This is not a problem I will be able to 100% solve myself. I appreciate that you value the project enough to share your ideas, and I agree with you on most points. The current configuration guide is awful. For users to be successful, they must be aware of and understand how language servers work, how to add them to their editor, and how to debug them. This is an unrealistic expectation. You are right: a good guide going from zero to in-editor chatting / completion would be incredibly helpful. I will get this out. I don't think we can emulate Helix's batteries-included methodology here. There are too many moving pieces, and to a certain extent, I don't think it's bad that it requires some thought to configure, as this is a powerful, useful tool for developers if used correctly. What I want is to create an online configurator that asks a series of questions like:
What do you think of an online configurator that not only outputs (in the case of Helix) the languages.toml but explains what it does, how to use it, etc.? We can extend this to work for Neovim and other popular text editors. I want to remove the barrier of having to read through the configuration section of the wiki unless they want to get into the weeds. The logging is awful right now. I have a PR coming out this weekend (I think) that creates our own log file, which also includes all completion requests sent to LLMs in a nice, user-readable format. It's a massive improvement. I'm still debating whether to make it the default log file, or whether a command-line parameter should enable it. It will live in the user's cache directory (~/.cache on Linux). Once again, thank you for your thoughts! It's conversations like this that will make this project great.
RE: A guide that produces a setup for you -- that'd be super neat. Surely the simplest workflow is to push the LLM setup side of things onto something like Ollama, no? As they're working on a pretty simple:
We can push the more difficult part entirely there in the short term, thereby alleviating the documentation burden here. If this is still open in a few weeks, when I'll likely next get a chance to write some code, I'll throw in a few PRs, but it looks like your project already has significant interest and someone else will likely pick it up.
Hey, sorry to drop by; I don't know if this is the correct place to voice my problem, but I want to start by thanking you. Your project looks amazing, and as a hardcore Helix user I've been missing LLMs inside Helix since they started to become popular, so I'm really excited to get your LSP working. But currently I'm facing some issues, and I'd be really grateful if someone could guide me or share their configuration for Helix. So currently I've installed codellama through this command: I've decided to use codellama 7b, installed everything through the ollama CLI, and I was able to run the model locally. Next, I've configured the LSP in my languages.toml as follows:
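(For reference, a minimal sketch of what such a server definition might look like, assuming lsp-ai's documented Ollama backend config shape and reconstructing the parameter values visible in the logs further down; the Ollama side would be roughly `ollama pull codellama:7b` followed by `ollama run codellama:7b` to verify the model works. The name `model1` is just an arbitrary key.)

```toml
# Hypothetical reconstruction -- not the poster's exact file.
# Defines lsp-ai as a language server backed by a local Ollama model.
[language-server.lsp-ai]
command = "lsp-ai"

[language-server.lsp-ai.config.memory]
file_store = {}

[language-server.lsp-ai.config.models.model1]
type = "ollama"
model = "codellama:7b"

[language-server.lsp-ai.config.completion]
model = "model1"
# Note: Ollama expects "options.num_predict"; a misspelled key such as
# "option.num_predicts" is silently dropped, which would show up in the
# request logs as an empty "options": {}.
parameters = { max_context = 2000, options = { num_predict = 32 } }
```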
And I've added that configuration to my language:
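(Again a sketch, assuming the standard Helix way of attaching an extra server to a language; the language name is a guess based on the C++ file visible in the logs below.)

```toml
# Hypothetical: attach lsp-ai alongside the usual C++ server.
[[language]]
name = "cpp"
language-servers = ["clangd", "lsp-ai"]
```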
So far so good; everything seems about what I would expect from adding an LSP into languages.toml, apart from the bit of extra configuration. Now I think I must be missing a step, because the LSP crashes and fails to complete anything. I've tried a bunch of different tweaks, but maybe I'm missing something really obvious, or maybe the model can't work with this configuration. I've tried to debug with the log_file and even exported the
So if you have any solution, I'd be glad to hear it. Just for information, I'm on Ubuntu 24.04, and I'm using
I'm glad you like it! Don't apologize; the documentation is kind of rough right now, and I'm happy to help. Can you set the env variable `export LSP_AI_LOG=DEBUG` and then share your Helix logs?
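(One way to do that, assuming Helix's default log location on Linux; `hx -v` just bumps Helix's own log verbosity, and the file name is from the logs shared in this thread.)

```sh
# Make lsp-ai log at debug level, then start Helix from the same shell.
export LSP_AI_LOG=DEBUG
hx -v ClapTrap.cpp

# In another terminal, watch the Helix log as you type.
tail -f ~/.cache/helix/helix.log
```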
Yes, of course. This is what I get:
I don't see any errors in the logs. Can you type in your editor, use it until you would expect to see a completion, and share those logs?
helix.log |
I just tested your configuration on my computer and it is working fine for me. Your logs show a bunch of completion requests to the language server, and we can see the language server calling out to Ollama:

2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " INFO dispatch_request{request=Completion(CompletionRequest { id: RequestId(I32(1)), params: CompletionParams { text_document_position: TextDocumentPositionParams { text_document: TextDocumentIdentifier { uri: Url { scheme: \"file\", cannot_be_a_base: false, username: \"\", password: None, host: None, port: None, path: \"/home/pollivie/workspace/school/CPP-42/CPP03/ex00/ClapTrap.cpp\", query: None, fragment: None } }, position: Position { line: 71, character: 4 } }, work_done_progress_params: WorkDoneProgressParams { work_done_token: None }, partial_result_params: PartialResultParams { partial_result_token: None }, context: Some(CompletionContext { trigger_kind: Invoked, trigger_character: None }) } })}:do_generate{prompt=ContextAndCode(ContextAndCodePrompt { context: \"\", code: \"/* ************************************************************************** */\\n/* */\\n/* ::: :::::::: */\\n/* ClapTrap.cpp :+: :+: :+: */\\n/* +:+ +:+ +:+ */\\n/* By: pollivie <pollivie.student.42.fr> +#+ +:+ +#+ */\\n/* +#+#+#+#+#+ +#+ */\\n/* Created: 2024/08/21 13:30:54 by pollivie #+# #+# */\\n/* Updated: 2024/08/21 13:30:54 by pollivie ### ########.fr */\\n/* */\\n/* ************************************************************************** */\\n\\n#include \\\"ClapTrap.hpp\\\"\\n\\nClapTrap::ClapTrap() {\\n}\\n\\nClapTrap::ClapTrap(std::string name, unsigned int hit_points, unsigned int energy_points, unsigned int attack_damage)\\n : _name(name), _hit_points(hit_points), _energy_points(energy_points), _attack_damage(attack_damage) {\\n}\\n\\nClapTrap::ClapTrap(ClapTrap const &other) {\\n\\t*this = other;\\n}\\n\\nClapTrap &ClapTrap::operator=(const ClapTrap &other) {\\n\\tif (this != &other) {\\n\\t\\tthis->_name = other._name;\\n\\t\\tthis->_hit_points = other._hit_points;\\n\\t\\tthis->_energy_points = other._energy_points;\\n\\t\\tthis->_attack_damage = other._attack_damage;\\n\\t}\\n\\treturn (*this);\\n}\\n\\nbool ClapTrap::isAlive(void) const {\\n\\treturn (_hit_points != 0);\\n}\\n\\nbool ClapTrap::hasEnoughEnergy(void) const {\\n\\treturn (_energy_points != 0);\\n}\\n\\nvoid ClapTrap::attack(const std::string &target) {\\n\\tstd::cout << \\\"ClapTrap \\\" << _name;\\n\\tif (!isAlive()) {\\n\\t\\tstd::cout << \\\", can't attack because he is dead\\\";\\n\\t} else if (hasEnoughEnergy()) {\\n\\t\\tstd::cout << \\\" attacks \\\" << target << \\\", causing \\\" << _attack_damage << \\\" points of damage!\\\";\\n\\t\\t_energy_points -= 1;\\n\\t} else {\\n\\t\\tstd::cout << \\\", can't attack because he lacks energy\\\";\\n\\t}\\n\\tstd::cout << std::endl;\\n}\\n\\nvoid ClapTrap::takeDamage(const unsigned int amount) {\\n\\tstd::cout << \\\"ClapTrap \\\" << _name;\\n\\tif (!isAlive()) {\\n\\t\\tstd::cout << \\\" can't be attacked because he is dead\\\" << std::endl;\\n\\t} else if (amount >= _hit_points) {\\n\\t\\t_hit_points = 0;\\n\\t\\tstd::cout << \\\" took \\\" << amount << \\\" of damage\\\";\\n\\t\\tstd::cout << \\\" and is now dead\\\" << std::endl;\\n\\t} else {\\n\\t\\t_hit_points -= amount;\\n\\t\\tstd::cout << \\\" took \\\" << amount << \\\" of damage\\\";\\n\\t\\tstd::cout << \\\" and is now left with\\\" << _hit_points << \\\" points of health\\\" << std::endl;\\n\\t}\\n}\\n\\nvoid\", selected_text: None })
params=Object {\"max_context\": String(\"2000\"), \"option\": Object {\"num_predicts\": String(\"32\")}}}: lsp_ai::transformer_backends::ollama: Calling Ollama compatible completions API with parameters:\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- "{\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"keep_alive\": null,\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"model\": \"codellama:7b\",\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"options\": {},\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"prompt\": \"\\n\\n/* ************************************************************************** */\\n/* */\\n/* ::: :::::::: */\\n/* ClapTrap.cpp :+: :+: :+: */\\n/* +:+ +:+ +:+ */\\n/* By: pollivie <pollivie.student.42.fr> +#+ +:+ +#+ */\\n/* +#+#+#+#+#+ +#+ */\\n/* Created: 2024/08/21 13:30:54 by pollivie #+# #+# */\\n/* Updated: 2024/08/21 13:30:54 by pollivie ### ########.fr */\\n/* */\\n/* ************************************************************************** */\\n\\n#include \\\"ClapTrap.hpp\\\"\\n\\nClapTrap::ClapTrap() {\\n}\\n\\nClapTrap::ClapTrap(std::string name, unsigned int hit_points, unsigned int energy_points, unsigned int attack_damage)\\n : _name(name), _hit_points(hit_points), _energy_points(energy_points), _attack_damage(attack_damage) {\\n}\\n\\nClapTrap::ClapTrap(ClapTrap const &other) {\\n\\t*this = other;\\n}\\n\\nClapTrap &ClapTrap::operator=(const ClapTrap &other) {\\n\\tif (this != &other) {\\n\\t\\tthis->_name = other._name;\\n\\t\\tthis->_hit_points = other._hit_points;\\n\\t\\tthis->_energy_points = other._energy_points;\\n\\t\\tthis->_attack_damage = other._attack_damage;\\n\\t}\\n\\treturn (*this);\\n}\\n\\nbool ClapTrap::isAlive(void) const {\\n\\treturn (_hit_points != 0);\\n}\\n\\nbool ClapTrap::hasEnoughEnergy(void) const {\\n\\treturn (_energy_points != 0);\\n}\\n\\nvoid ClapTrap::attack(const std::string &target) {\\n\\tstd::cout << \\\"ClapTrap \\\" << _name;\\n\\tif (!isAlive()) {\\n\\t\\tstd::cout << \\\", can't attack because he is dead\\\";\\n\\t} else if (hasEnoughEnergy()) {\\n\\t\\tstd::cout << \\\" attacks \\\" << target << \\\", causing \\\" << _attack_damage << \\\" points of damage!\\\";\\n\\t\\t_energy_points -= 1;\\n\\t} else {\\n\\t\\tstd::cout << \\\", can't attack because he lacks energy\\\";\\n\\t}\\n\\tstd::cout << std::endl;\\n}\\n\\nvoid ClapTrap::takeDamage(const unsigned int amount) {\\n\\tstd::cout << \\\"ClapTrap \\\" << _name;\\n\\tif (!isAlive()) {\\n\\t\\tstd::cout << \\\" can't be attacked because he is dead\\\" << std::endl;\\n\\t} else if (amount >= _hit_points) {\\n\\t\\t_hit_points = 0;\\n\\t\\tstd::cout << \\\" took \\\" << amount << \\\" of damage\\\";\\n\\t\\tstd::cout << \\\" and is now dead\\\" << std::endl;\\n\\t} else {\\n\\t\\t_hit_points -= amount;\\n\\t\\tstd::cout << \\\" took \\\" << amount << \\\" of damage\\\";\\n\\t\\tstd::cout << \\\" and is now left with\\\" << _hit_points << \\\" points of health\\\" << std::endl;\\n\\t}\\n}\\n\\nvoid\",\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"raw\": true,\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- " \"stream\": false\n"
2024-08-22T11:07:11.218 helix_lsp::transport [ERROR] lsp-ai err <- "}\n"

What are you typing in your file? Is your file just a ton of asterisks? The weird thing is that I am not seeing any responses from Ollama. Have you tested that you can run codellama:7b locally? Also make sure to upgrade to the latest version of lsp-ai if you have not, as it has some better logging features.
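(One way to upgrade, assuming lsp-ai was installed from crates.io via cargo; adjust if you installed it another way.)

```sh
# Reinstall the latest published lsp-ai over the existing binary.
cargo install lsp-ai --force
```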
Oh, I've figured it out. I use a school header everywhere in my projects, and I think that must confuse the model or make its response a lot slower than it should be. I'm not used to completion taking this long, but I guess it makes sense on my hardware; I only have an RX 6700S, so not a top-of-the-line NVIDIA GPU. So I just need to wait about 10s and I actually get a suggestion. Sorry for bothering you with my logs, and thanks for taking the time. I think I'm going to try different models and see if one is snappy enough.
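(If you go down that route, swapping models is just a pull plus a config edit; `starcoder2:3b` here is only an illustrative example of a smaller model from the Ollama library, not a specific recommendation.)

```sh
# Pull a smaller code model, then point the lsp-ai model config at it,
# e.g. change model = "codellama:7b" to model = "starcoder2:3b".
ollama pull starcoder2:3b
```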
So, the existing example/xyz.tomls aren't exactly easy to get going with, in the Helix philosophy of "batteries included".
I think we have two issues:

1. Perhaps suggesting that users get started with a local `.helix/languages.toml`, which, if alongside their source, will get merged with their system's `languages.toml` as defined by wherever they set `$HELIX_RUNTIME`. This will be especially problematic to debug, as whatever language they choose to add `lsp-ai` to as an additional LSP will really blast their `$HELIX_LOG` (as it will at a minimum have multiple LSPs' output in there). See the sketch below for what such a local file might look like.
2. Having a bind set up for triggering autocomplete (or chat?) will involve them adding things to their regular `config.toml`; you cannot (at least on latest Helix) make a keybinding in a `languages.toml`.

Perhaps it'd be worth dropping the dozens of JSON snippets for supported models across the configuration docs in favour of one or two thorough examples that take users from soup to nuts.
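(To ground the first point: a minimal sketch of a project-local override, assuming current Helix merge behaviour; the language choice and server list are illustrative only.)

```toml
# .helix/languages.toml -- hypothetical per-project override, merged
# over the user's global languages.toml when Helix opens this project.
[language-server.lsp-ai]
command = "lsp-ai"

[[language]]
name = "rust"
language-servers = ["rust-analyzer", "lsp-ai"]
```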
As this is somewhat positioned as a Copilot alternative, I worry that the project will miss out on a great many users as a result of the amount of understanding required to get this tool running.
Hope you keep it up, this is a great idea!