Integrate capabilities of hosting LLMs in DAN #47
It would be cool to include LLMs in DAN. I have been experimenting with ChatGPT and other open-source LLM implementations (in fact, much of DAN's code was written with ChatGPT). In my opinion, the question is not whether DAN should support LLMs, but when and how to provide that support. We are currently working on supporting Stable Diffusion. Taking a step-by-step approach, we should first finish the current tasks before expanding to other AI functions.
I believe that LLMs represent the current direction of AI and can serve as everyone's personal AI assistant. We should even have our own ChatGPT. DAN needs a higher-level view of AI, and an architecture that can integrate various open-source LLM packages as well as SD. I plan to integrate them into my AI server after procuring new machines. Recently, I have been exploring and experimenting with more open-source LLMs.
With so many open-source LLMs, plus new training-acceleration methods like DeepSpeed, it is not hard to imagine that many individuals and startups will want to train their own chatbots on top of these priors. We should probably make DAN more flexible, so it can integrate different open-source models and support both inference and fine-tuning.
LLMs have far more users than X-to-image models, no doubt.
I started a new project, OpenDAN-Personal-AI-Server-OS, which is designed to be a dedicated OS for AI apps, and LLMs will be supported natively.
In addition to computer vision, the other main area of AI is NLP. These days LLMs like ChatGPT are very hot and are creating new possibilities. There is huge demand to train and deploy these models. We could attract a lot of interest by supporting open-source LLMs like Alpaca, Vicuna, Dolly, etc.