Merge pull request #117 from B-urb/feature/default-fields
Feature/default fields
B-urb authored Oct 22, 2024
2 parents 28680b4 + 4f92be2 commit 32fb366
Showing 7 changed files with 381 additions and 47 deletions.
29 changes: 16 additions & 13 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -46,19 +46,22 @@ With these prerequisites met, you are now ready to proceed with the installation
The application requires setting environment variables for its configuration. Below is a table describing each environment variable, indicating whether it is required or optional, its default value (if any), and a brief description:


| Environment Variable | Required | Default Value | Description |
|---------------------------|---------|----------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `PAPERLESS_TOKEN` | Yes | None | The authentication token for accessing the Paperless API. |
| `PAPERLESS_BASE_URL` | Yes | None | The base URL for the Paperless API. |
| `PAPERLESS_FILTER`        | No      | "NOT tagged=true"                            | Filter string that selects the documents to fetch from Paperless. |
| `LANGUAGE`                | No      | "EN"                                         | Allows using translated base prompts (supported: EN, DE). |
| `OLLAMA_HOST` | No | "localhost" | The hostname where the Ollama service is running. |
| `OLLAMA_PORT` | No | "11434" | The port on which the Ollama service is accessible. |
| `OLLAMA_SECURE_ENDPOINT` | No | "false" | Whether to use HTTPS (`true`) or HTTP (`false`) for Ollama. |
| `OLLAMA_MODEL` | No | "llama2:13b" | The specific Ollama model to be used for processing. |
| `BASE_PROMPT`             | No      | see [Example Prompt](example/example.prompt) | Prompt given to the model when requesting metadata.<br/> Should list the custom fields in Paperless that you want doclytics to fill. |
| `LOG_LEVEL` | No | INFO | Log level |
| `MODE`                    | No      | 0                                            | :warning: **Experimental**: Mode of operation for custom fields. <br/> 0 = NoAnalyze (Doclytics ignores this field type), 1 = NoCreate (Doclytics does not create custom fields automatically in Paperless), 2 = Create (Doclytics automatically creates custom fields that do not exist in Paperless). For now, all fields are created with type "Text"; once this feature is stable, the type will be inferred. |
| `DOCLYTICS_TAGS`          | No      | 0                                            | :warning: **Experimental**: Mode of operation for tags, with the same values as `MODE`: 0 = NoAnalyze, 1 = NoCreate, 2 = Create. |
| `DOCLYTICS_DOCTYPE`       | No      | 0                                            | :warning: **Experimental**: Mode of operation for the document type, with the same values as `MODE`: 0 = NoAnalyze, 1 = NoCreate, 2 = Create. |
| `DOCLYTICS_CORRESPONDENT` | No      | 0                                            | :warning: **Experimental**: Mode of operation for the correspondent, with the same values as `MODE`: 0 = NoAnalyze, 1 = NoCreate, 2 = Create. |
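
As a rough illustration (not code from this repository), the optional variables above can be read with `std::env::var`, falling back to the defaults listed in the table; the helper name `env_or` is hypothetical:

```rust
use std::env;

// Hypothetical helper mirroring the table above: read an optional
// environment variable, falling back to its documented default.
fn env_or(name: &str, default: &str) -> String {
    env::var(name).unwrap_or_else(|_| default.to_string())
}

fn main() {
    // Optional variables with the defaults from the table.
    let filter = env_or("PAPERLESS_FILTER", "NOT tagged=true");
    let host = env_or("OLLAMA_HOST", "localhost");
    let port = env_or("OLLAMA_PORT", "11434");
    let model = env_or("OLLAMA_MODEL", "llama2:13b");

    println!("filter={} host={} port={} model={}", filter, host, port, model);
}
```

Required variables such as `PAPERLESS_TOKEN` and `PAPERLESS_BASE_URL` have no default, so a real loader would fail fast when they are missing instead of falling back.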



33 changes: 33 additions & 0 deletions src/error.rs
Original file line number Diff line number Diff line change
@@ -0,0 +1,33 @@
use std::fmt;

#[derive(Debug)]
pub enum ResponseError {
    Io(std::io::Error),
    ParseBody(std::num::ParseIntError),
    RequestError(std::io::Error),
    Other(String),
}

// Implement Display so the error can be formatted for users and logs.
impl fmt::Display for ResponseError {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match *self {
            ResponseError::Io(ref err) => write!(f, "IO error: {}", err),
            ResponseError::ParseBody(ref err) => write!(f, "Parse error: {}", err),
            ResponseError::RequestError(ref err) => write!(f, "Request error: {}", err),
            ResponseError::Other(ref err) => write!(f, "Other error: {}", err),
        }
    }
}

// Implement std::error::Error so the underlying cause can be inspected.
impl std::error::Error for ResponseError {
    fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
        match *self {
            ResponseError::Io(ref err) => Some(err),
            ResponseError::ParseBody(ref err) => Some(err),
            ResponseError::RequestError(ref err) => Some(err),
            ResponseError::Other(_) => None,
        }
    }
}
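
For illustration (not part of this diff), adding a `From` impl would let callers propagate underlying errors with `?`. A condensed, self-contained sketch, with the enum trimmed to two variants so it compiles on its own:

```rust
use std::fmt;

// Condensed copy of the ResponseError enum above, so this sketch is self-contained.
#[derive(Debug)]
pub enum ResponseError {
    Io(std::io::Error),
    Other(String),
}

impl fmt::Display for ResponseError {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match *self {
            ResponseError::Io(ref err) => write!(f, "IO error: {}", err),
            ResponseError::Other(ref err) => write!(f, "Other error: {}", err),
        }
    }
}

// Hypothetical From impl (not in the diff): lets `?` convert io::Error automatically.
impl From<std::io::Error> for ResponseError {
    fn from(err: std::io::Error) -> Self {
        ResponseError::Io(err)
    }
}

fn read_config(path: &str) -> Result<String, ResponseError> {
    // `?` converts the io::Error into ResponseError::Io via the From impl.
    Ok(std::fs::read_to_string(path)?)
}

fn main() {
    match read_config("/definitely/missing/file") {
        Ok(_) => println!("read config"),
        Err(e) => println!("{}", e),
    }
}
```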

5 changes: 1 addition & 4 deletions src/llm_api.rs
Original file line number Diff line number Diff line change
@@ -1,15 +1,12 @@
use ollama_rs::generation::completion::GenerationResponse;
use ollama_rs::generation::completion::request::GenerationRequest;
use ollama_rs::Ollama;

pub async fn generate_response(
    ollama: &Ollama,
    model: &String,
    prompt: String,
) -> std::result::Result<GenerationResponse, Box<dyn std::error::Error>> {
    let res = ollama
        .generate(GenerationRequest::new(model.clone(), prompt))
        .await;
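
After this change, callers compose the prompt themselves before calling `generate_response`. A sketch of the composition that previously lived inside the function, with `Document` reduced to a stand-in struct and `build_prompt` as a hypothetical helper:

```rust
// Simplified stand-in for the crate's Document type.
struct Document {
    content: String,
}

// The prompt composition that moved out of generate_response: the document
// content and base prompt are joined into the single String the function expects.
fn build_prompt(document: &Document, prompt_base: &str) -> String {
    format!("{} {}", document.content, prompt_base)
}

fn main() {
    let doc = Document { content: "Invoice #42 from ACME".to_string() };
    let prompt = build_prompt(&doc, "Extract the metadata as JSON.");
    println!("{}", prompt);
}
```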
