
Commit

docs: improved docs
JayGhiya committed Oct 7, 2024
1 parent 07c4deb commit 08c6674
Showing 4 changed files with 258 additions and 273 deletions.
8 changes: 8 additions & 0 deletions code-confluence/README
@@ -0,0 +1,8 @@
# Code Confluence Documentation


## HOW TO RUN LOCALLY?

1. Install dependencies: `yarn install`
2. Build the project: `yarn build`
3. Run the project: `yarn start`
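
For a quick local preview loop, the same steps can be chained (a minimal sketch, assuming Node.js and Yarn are already installed and you are inside the `code-confluence` directory):

```bash
# install dependencies, build the static docs site, then serve it locally
yarn install
yarn build
yarn start
```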
189 changes: 132 additions & 57 deletions code-confluence/docs/quickstart/how-to-run.md
@@ -2,78 +2,153 @@
sidebar_position: 2
---

# How to Run
# Quick Start Guide

Before you can run the Unoplat Code Confluence tool on your local machine, you need to set up the necessary configuration file. This file tells the tool where your codebase is located, where to output the analysis results, and other essential information. Below is a step-by-step guide to setting up your configuration.
Welcome to **Unoplat Code Confluence**! This guide will help you quickly set up and start using our platform to enhance your codebase management and collaboration.

## Table of Contents

1. [Introduction](#introduction)
2. [Prerequisites](#prerequisites)
3. [1. Graph Database Setup](#1-graph-database-setup)
- [Installation](#installation)
4. [2. Generate Summary and Ingest Codebase](#2-generate-summary-and-ingest-codebase)
- [Ingestion Configuration](#ingestion-configuration)
- [Running the Unoplat Code Confluence Ingestion Utility](#running-the-unoplat-code-confluence-ingestion-utility)
5. [3. Setup Chat Interface](#3-setup-chat-interface)
- [Query Engine Configuration](#query-engine-configuration)
- [Launch Query Engine](#launch-query-engine)
6. [Troubleshooting](#troubleshooting)



---

## Introduction

**Unoplat Code Confluence** empowers developers to effortlessly navigate and understand complex codebases. By leveraging a graph database and an intuitive chat interface, our platform enhances collaboration and accelerates onboarding.

## Prerequisites

Before you begin, ensure you have the following installed on your system:

- [Docker](https://www.docker.com/get-started)
- [Pipx](https://github.com/pypa/pipx)
- [Poetry](https://python-poetry.org/)

```json
{
    "local_workspace_path": "your path to codebase",
    "output_path": "directory path for markdown output",
    "output_file_name": "name of markdown output (example - xyz.md)",
    "codebase_name": "name of your codebase",
    "programming_language": "programming language type (example - java or python)",
    "repo":
    {
        "download_url": "archguard/archguard",
        "download_directory": "download directory for archguard tool"
    },
    "api_tokens":
    {
        "github_token": "your github pat for downloading archguard"
    },
    "llm_provider_config":
    {
        "openai":
        {
            "api_key": "YourApiKey",
            "model": "gpt-3.5-turbo-16k",
            "model_type": "chat",
            "max_tokens": 1024,
            "temperature": 0.0
        }
    },
    "logging_handlers":
    [
        {
            "sink": "~/Documents/unoplat/app.log",
            "format": "<green>{time:YYYY-MM-DD at HH:mm:ss}</green> | <level>{level}</level> | <cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> - <level>{message}</level>",
            "rotation": "10 MB",
            "retention": "10 days",
            "level": "INFO"
        }
    ]
}
```

```bash
pipx install poetry
```
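
If you want to double-check that the prerequisites are in place before moving on, a quick sanity check (assuming the tools are already on your `PATH`) looks like this:

```bash
# confirm the required tooling is installed and report versions
docker --version
pipx --version
poetry --version
```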

## 1. Graph Database Setup

**Configuration Note:** Do not change `download_url`, and set `programming_language` to either `java` or `python` (only Java and Python are supported at the moment).
### Installation

**LLM Provider Config**
1. **Run the Neo4j Container**

Model Providers Supported:
- OpenAI
- Together AI
- Anyscale
- AWS Anthropic
- Cohere (currently not working, issue already created and will be addressed soon)
- Ollama
```bash
docker run \
--name neo4j-container \
--restart always \
--publish 7474:7474 \
--publish 7687:7687 \
--env NEO4J_AUTH=neo4j/Ke7Rk7jB:Jn2Uz: \
--volume /Users/jayghiya/Documents/unoplat/neo4j-data:/data \
--volume /Users/jayghiya/Documents/unoplat/neo4j-plugins/:/plugins \
neo4j:5.23.0
```
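
To confirm the database came up cleanly (a minimal check, assuming the container name and ports from the command above), you can inspect the container and probe its HTTP endpoint; the Neo4j Browser should then be reachable at http://localhost:7474.

```bash
# check that the container is running and that Neo4j answers on its HTTP port
docker ps --filter "name=neo4j-container"
curl -I http://localhost:7474
```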


## 2. Generate Summary and Ingest Codebase

For configuration inside `llm_provider_config`, refer to the [Dspy Model Provider Documentation](https://github.com/stanfordnlp/dspy/blob/main/docs/model_providers.md).
### Ingestion Configuration

If you're looking for credits, sign up on [Together AI](https://www.together.ai/) and get $25 to run Code Confluence on a repository of your choice. You can also use Ollama as an alternative.
```json
{
"local_workspace_path": "/Users/jayghiya/Documents/unoplat/textgrad/textgrad",
"output_path": "/Users/jayghiya/Documents/unoplat",
"output_file_name": "unoplat_textgrad.md",
"codebase_name": "textgrad",
"programming_language": "python",
"repo": {
"download_url": "archguard/archguard",
"download_directory": "/Users/jayghiya/Documents/unoplat"
},
"api_tokens": {
"github_token": "Your github pat token"
},
"llm_provider_config": {
"openai": {
"api_key": "Your openai api key",
"model": "gpt-4o-mini",
"model_type": "chat",
"max_tokens": 512,
"temperature": 0.0
}
},
"logging_handlers": [
{
"sink": "~/Documents/unoplat/app.log",
"format": "<green>{time:YYYY-MM-DD at HH:mm:ss}</green> | <level>{level}</level> | <cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> | <magenta>{thread.name}</magenta> - <level>{message}</level>",
"rotation": "10 MB",
"retention": "10 days",
"level": "DEBUG"
}
],
"parallisation": 3,
"sentence_transformer_model": "jinaai/jina-embeddings-v3",
"neo4j_uri": "bolt://localhost:7687",
"neo4j_username": "neo4j",
"neo4j_password": "Ke7Rk7jB:Jn2Uz:"

}
```
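
Before kicking off ingestion, it can save a round trip to make sure the file parses as valid JSON (a small sketch, assuming you saved the config as `~/unoplat_config.json`; adjust the path to wherever yours lives):

```bash
# pretty-print the config; a parse error here means the JSON is malformed
python3 -m json.tool ~/unoplat_config.json
```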

**Together Example:**

```json
"llm_provider_config":
{
    "together":
    {
        "api_key": "YourApiKey",
        "model": "zero-one-ai/Yi-34B-Chat"
    }
}
```

### Running the Unoplat Code Confluence Ingestion Utility

1. **Installation**

```bash
pipx install 'git+https://github.com/unoplat/unoplat-code-confluence.git@main#subdirectory=unoplat-code-confluence'
```

2. **Run the Ingestion Utility**

```bash
unoplat-code-confluence --config /path/to/your/config.json
```


## 3. Setup Chat Interface

### Query Engine Configuration

```json
{
    "sentence_transformer_model": "jinaai/jina-embeddings-v3",
    "neo4j_uri": "bolt://localhost:7687",
    "neo4j_username": "neo4j",
    "neo4j_password": "your neo4j password",
    "provider_model_dict": {
        "model_provider": "openai/gpt-4o-mini",
        "model_provider_args": {
            "api_key": "your openai api key",
            "max_tokens": 500,
            "temperature": 0.0
        }
    }
}
```

### Launch Query Engine

1. **Installation**

```bash
pipx install 'git+https://github.com/unoplat/unoplat-code-confluence.git@main#subdirectory=unoplat-code-confluence-query-engine'
```

2. **Run the Query Engine**

```bash
unoplat-code-confluence-query-engine --config /path/to/your/config.json
```
16 changes: 4 additions & 12 deletions code-confluence/src/components/HomepageFeatures/index.js
@@ -8,9 +8,7 @@ const FeatureList = [
Svg: require('@site/static/img/collab.svg').default,
description: (
<>
Experience deterministic code parsing like never before. Unoplat Code Confluence extends CHAPI with ANTLR grammar
to parse codebases across any architecture and programming language. Gain a consistent and precise understanding of complex codebases,
enabling your team to navigate and comprehend projects with unparalleled clarity.
Unlock deterministic code parsing with Unoplat Code Confluence. Powered by CHAPI and ANTLR, it seamlessly parses any architecture and programming language. Gain clear, consistent insights into complex codebases, enabling your team to navigate and understand projects with exceptional clarity.
</>
),
},
@@ -19,22 +17,16 @@ const FeatureList = [
Svg: require('@site/static/img/documentation.svg').default,
description: (
<>
Accelerate onboarding and enhance collaboration. Our platform utilizes state-of-the-art LLM pipelines
to generate detailed objectives and summaries for every function, class, package, and the entire codebase in a depth-first search
fashion. This comprehensive documentation is created automatically, reducing onboarding time to almost zero and empowering
cross-team synergy.
Automatically generate comprehensive documentation covering every aspect of your codebase—including functions, classes, imports, relationships, and more. Our platform organizes this information into an intuitive graph structure that mirrors how developers think, reducing onboarding time to nearly zero and empowering your team to collaborate seamlessly with all the details they need.
</>
),
},
{
title: 'Focus on What Matters',
title: 'Engage With Your Code Effortlessly',
Svg: require('@site/static/img/happy_remote.svg').default,
description: (
<>
Engage with your codebase/s through grounded and context-aware chat. By ingesting code information into a graph database,
Unoplat Code Confluence provides an optimal representation of code relationships. Interact with your codebase using intuitive TUI
and chat with your codebase using advanced LLM pipelines and Graph Retrieval Augmented Generation (GraphRAG),
allowing for intuitive querying to retrieve context-specific information swiftly.
Interact with one or multiple codebases seamlessly through smart, context-aware chat. Unoplat Code Confluence organizes your code into an intuitive, interconnected system, allowing you to ask questions and instantly access the insights you need. Enhance collaboration and boost your productivity with effortless code exploration tailored to your needs.
</>
),
},
