
Feature Request: Support for Local Hosting of LLM API Endpoint with TextGenWebUI Integration #100

Closed
2good4hisowngood opened this issue Sep 26, 2023 · 1 comment


2good4hisowngood commented Sep 26, 2023


Feature Request

Description:
This feature request proposes integrating TextGenWebUI with ChatDev to support local hosting of large language model (LLM) API endpoints. Incorporating TextGenWebUI would give users a seamless interface for interacting with local LLMs, including model management and selection. In tandem with the previously proposed multiple LLM API endpoints feature, this would significantly expand ChatDev's capabilities, letting users orchestrate both local and remote LLMs and apply them to diverse tasks based on each model's expertise.

Feature Details:

TextGenWebUI Integration:

  • Integrate the TextGenWebUI interface within ChatDev, allowing users to effortlessly manage and interact with local LLMs.

List Models:

  • Enable users to view a list of models available to the endpoint, aiding in model selection and understanding the capabilities of each model.

Model Loading and Unloading:

  • Provide a mechanism to unload and load models on the fly. This capability can be especially beneficial when swapping between different models for specific tasks or when managing system resources.
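The unload/load flow described above could be driven through the legacy text-generation-webui model API (see api-example-model.py, linked under References). A minimal sketch, assuming a local endpoint on the default port 5000; `build_model_request` is a hypothetical helper, not part of either project:

```python
import requests

HOST = '127.0.0.1:5000'  # assumed address of a local text-generation-webui API

def build_model_request(action, model_name=None):
    # Build the JSON payload for the legacy /api/v1/model endpoint
    request = {'action': action}
    if model_name is not None:
        request['model_name'] = model_name
    return request

def model_api(request):
    # 'load' swaps the active model on the fly; 'unload' frees its memory
    response = requests.post(f'http://{HOST}/api/v1/model', json=request)
    response.raise_for_status()
    return response.json()

# Example (requires a running endpoint):
# model_api(build_model_request('unload'))
# model_api(build_model_request('load', 'some-local-model'))
```

Keeping the payload construction separate from the HTTP call would let ChatDev queue model swaps without holding a connection open.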

Parameter Configuration:

  • Offer settings for configuring chat parameters, including temperature and other relevant options, to fine-tune the behavior of the local LLM.

Model Selector:

  • If combined with the model selection feature request, users could orchestrate local models by unloading/loading different models to provide specialized expertise for varying tasks.
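Temperature and related options could be exposed as a plain settings payload passed through to the endpoint. A hedged sketch against the legacy /api/v1/generate endpoint; the exact parameter set is defined by text-generation-webui, and `build_generate_request` with its defaults is purely illustrative:

```python
import requests

HOST = '127.0.0.1:5000'  # assumed address of a local text-generation-webui API

def build_generate_request(prompt, temperature=0.7, max_new_tokens=250):
    # Generation parameters ride alongside the prompt in one flat payload;
    # the default values here are illustrative, not recommendations
    return {
        'prompt': prompt,
        'temperature': temperature,
        'max_new_tokens': max_new_tokens,
        'do_sample': True,
    }

def generate(prompt, **params):
    # POST to the legacy /api/v1/generate endpoint and return the text
    response = requests.post(f'http://{HOST}/api/v1/generate',
                             json=build_generate_request(prompt, **params))
    response.raise_for_status()
    return response.json()['results'][0]['text']

# Example (requires a running endpoint):
# generate('Summarize this module.', temperature=0.4)
```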

Local Hosting Support:

  • Ensure seamless integration and support for local hosting of the LLM API endpoint, allowing users to easily set up and manage their local LLM instances.

Expected Benefits:
Incorporating this feature will provide users with enhanced flexibility in managing and utilizing LLMs. It fosters experimentation with different model configurations and reduces reliance on external services, which can lead to cost savings. Moreover, with local hosting, users can optimize performance, reduce latency, and have better control over data privacy.

Additional Notes:
Given the potential complexity of integrating TextGenWebUI, it would be beneficial to provide comprehensive documentation and tutorials to aid users in the setup and management process. It is also crucial to ensure compatibility across different platforms and operating systems.

References:
Links to the TextGenWebUI repository and any other related resources will be included here to provide guidance on crafting the required scripts for endpoint interaction.

https://github.com/oobabooga/text-generation-webui/blob/main/api-examples/api-example-model.py
https://github.com/oobabooga/text-generation-webui/blob/main/api-examples/api-example-chat-stream.py

Listing the models available from the API:

import requests

# Address of the local text-generation-webui API endpoint
HOST = '0.0.0.0:5000'

def model_api(request):
    # POST a model-management action to the /api/v1/model endpoint
    response = requests.post(f'http://{HOST}/api/v1/model', json=request)
    response.raise_for_status()
    return response.json()

# Print the list of models available to the endpoint
print(model_api({'action': 'list'})['result'])

Related Issues/Pull Requests:
multi-llm support #98

Assignees:
Once this ticket is approved, I will gather all the necessary links and resources.

Thank you for taking this feature request into consideration. Integrating TextGenWebUI with ChatDev would undoubtedly provide a significant boost to the platform's capabilities, enriching the experience for users managing and experimenting with LLM API endpoints.


sammcj commented Nov 26, 2023

This would be so nice to have and would align with #27
