Commit: README

olegklimov committed Oct 20, 2023
1 parent 7a2bcd4 commit 5a1ed5c
Showing 2 changed files with 69 additions and 7 deletions.
72 changes: 67 additions & 5 deletions README.md
@@ -1,11 +1,36 @@
-# Code Scratchpads
+# Refact LSP Server

-This code converts high level code completion calls to low level prompts and converts result back. This is useful for many IDE plugins (VS Code, JB) as a common code that handles the low level.
+This code converts high-level code completion or chat calls into low-level LLM prompts, and converts the results back.

It's written in Rust and compiles into the `refact-lsp` binary. This binary is bundled with the
[VS Code](https://github.com/smallcloudai/refact-vscode/),
[JetBrains IDEs](https://github.com/smallcloudai/refact-intellij),
[VS Classic](https://github.com/smallcloudai/refact-vs-classic/),
[Sublime Text](https://github.com/smallcloudai/refact-sublime/),
and
[Qt Creator](https://github.com/smallcloudai/refact-qtcreator)
plugins.

It's a great way to organize code for the plugins, because it can absorb all the common logic, such as caching, debouncing,
telemetry, and scratchpads for different models.
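To illustrate the kind of common logic meant here (the real implementation lives in Rust inside `refact-lsp`), a debouncer delays a completion request until keystrokes stop arriving, so only the last one reaches the model. A minimal Python sketch, not the actual plugin code:

```python
import threading
import time

class Debouncer:
    """Run `fn` only after calls stop arriving for `delay` seconds."""
    def __init__(self, delay, fn):
        self.delay = delay
        self.fn = fn
        self._timer = None
        self._lock = threading.Lock()

    def call(self, *args):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # a newer keystroke supersedes the pending one
            self._timer = threading.Timer(self.delay, self.fn, args)
            self._timer.start()

results = []
d = Debouncer(0.05, results.append)
for prefix in ("h", "he", "hel", "hell"):
    d.call(prefix)        # rapid keystrokes cancel the pending request
d.call("hello")
time.sleep(0.3)           # only the last call survives the debounce window
print(results)
```

With these timings, only `"hello"` is recorded; the four earlier calls are cancelled before their timers fire.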


## Compiling and Running

Depending on which API key you have handy (or whether you run a Refact self-hosted server):

```
cargo build && target/debug/refact-lsp --address-url Refact --api-key YYYY --http-port 8001 --lsp-port 8002 --logs-stderr
cargo build && target/debug/refact-lsp --address-url HF --api-key hf_XXXX --http-port 8001 --lsp-port 8002 --logs-stderr
cargo build && target/debug/refact-lsp --address-url http://127.0.0.1:8008/ --http-port 8001 --lsp-port 8002 --logs-stderr
```

Try `--help` for more options.


## Usage

-Simple example:
+HTTP example:

```
curl http://127.0.0.1:8001/v1/code-completion -k \
@@ -20,7 +45,6 @@ curl http://127.0.0.1:8001/v1/code-completion -k \
},
"multiline": true
},
-  "model": "bigcode/starcoder",
"stream": false,
"parameters": {
"temperature": 0.1,
@@ -31,5 +55,43 @@ curl http://127.0.0.1:8001/v1/code-completion -k \

Output is `[{"code_completion": "\n return \"Hello World!\"\n"}]`.
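The request body from the curl example can also be assembled programmatically. A hedged Python sketch: the field names follow the snippet shown above, and the full request schema may contain more fields than this.

```python
import json

# Hypothetical helper mirroring the curl example; field names are an
# assumption taken from the snippet above, not a complete schema.
def completion_request(sources, cursor_file, line, character, multiline=True):
    return {
        "inputs": {
            "sources": sources,  # {filename: file text}
            "cursor": {"file": cursor_file, "line": line, "character": character},
            "multiline": multiline,
        },
        "stream": False,
        "parameters": {"temperature": 0.1, "max_new_tokens": 20},
    }

payload = completion_request({"hello.py": "def hello_world():"}, "hello.py", 0, 18)
print(json.dumps(payload, indent=2))
```

POSTing this JSON to `http://127.0.0.1:8001/v1/code-completion` is then a plain HTTP call, as in the curl example.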

-To check out more examples, look at [code_scratchpads/tests/test_api.py](code_scratchpads/tests/test_api.py).
+[LSP example](examples/lsp_completion.ts)


## Telemetry

The flags `--basic-telemetry` and `--snippet-telemetry` control what telemetry is sent. To be clear: without
these flags, no telemetry is sent. These flags are typically controlled from the IDE plugin settings.

Basic telemetry means counters and error messages, without any information about you or your code. It is "compressed"
into the `.cache/refact/telemetry/compressed` folder, then from time to time it's sent and moved
to the `.cache/refact/telemetry/sent` folder.

"Compressed" means similar records are joined together, increasing a counter. "Sent" means the Rust binary
communicates with an HTTP endpoint specified in caps (see the Caps File section below) and sends the .json files exactly as
you see them in `.cache/refact/telemetry`. The files are human-readable.

When using Refact self-hosted server, telemetry goes to the self-hosted server, not to the cloud.
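To illustrate what "compressed" means, here is a Python sketch of joining similar records while increasing a counter. The record fields are hypothetical; the real format is whatever you find in `.cache/refact/telemetry/compressed`.

```python
from collections import Counter

# Sketch of the "compressed" idea: identical records are merged and a
# counter is increased. Field names here are made up for illustration.
def compress(records):
    counts = Counter((r["scope"], r["error_message"]) for r in records)
    return [
        {"scope": scope, "error_message": msg, "counter": n}
        for (scope, msg), n in counts.items()
    ]

raw = [
    {"scope": "completion", "error_message": "timeout"},
    {"scope": "completion", "error_message": "timeout"},
    {"scope": "chat", "error_message": ""},
]
print(compress(raw))
```

The two identical "timeout" records collapse into one record with `"counter": 2`.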


## Caps File

The `--address-url` parameter largely controls the behavior of this program. The address is first used
to construct the `$URL/coding_assistant_caps.json` address to fetch the caps file. There are also
compiled-in caps files you can use via the magic addresses "Refact" and "HF".

The caps file defines which models are running, where to send the telemetry, how to download a
tokenizer, and where the endpoints for the actual language models are. To read more, check out the
compiled-in caps in [caps.rs](src/caps.rs).
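The dispatch on `--address-url` can be sketched as follows. This is a Python rendering of the idea behind `load_caps()` in the `src/caps.rs` diff below, not the actual code; the real function also handles local file paths.

```python
COMPILED_IN = {"Refact", "HF"}  # magic addresses with compiled-in caps

def caps_location(address_url):
    # Magic names use compiled-in caps; anything else is treated as a
    # base URL to fetch coding_assistant_caps.json from.
    if address_url in COMPILED_IN:
        return "compiled-in caps for " + address_url
    return address_url.rstrip("/") + "/coding_assistant_caps.json"

print(caps_location("Refact"))
print(caps_location("http://127.0.0.1:8008/"))
```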


## Tests

The one to run most often is [test_edge_cases.py](tests/test_edge_cases.py).

You can also run [measure_humaneval_fim.py](tests/measure_humaneval_fim.py) for your favorite model.


## Credits

The initial version of this project was written by looking at llm-ls by @McPatate. He's a Rust fan who inspired this project!
4 changes: 2 additions & 2 deletions src/caps.rs
@@ -195,7 +195,7 @@ const HF_DEFAULT_CAPS: &str = r#"
}
"#;

-const SMC_DEFAULT_CAPS: &str = r#"
+const REFACT_DEFAULT_CAPS: &str = r#"
{
"cloud_name": "Refact",
"endpoint_template": "https://inference.smallcloud.ai/v1/completions",
@@ -219,7 +219,7 @@ pub async fn load_caps(
buffer = HF_DEFAULT_CAPS.to_string();
report_url = "<compiled-in-caps-hf>".to_string();
} else if cmdline.address_url == "Refact" {
-        buffer = SMC_DEFAULT_CAPS.to_string();
+        buffer = REFACT_DEFAULT_CAPS.to_string();
report_url = "<compiled-in-caps-smc>".to_string();
} else if not_http {
let base: PathBuf = PathBuf::from(cmdline.address_url.clone());
