Universal LLM Provider Connector for Java
When I first started exploring LLMs and Neural Networks in Python, experimenting was easy. But when I switched back to Java—the language I trust for its scalability and performance—I hit a roadblock. There weren’t any simple tools to help me work seamlessly with multiple LLM providers.
This had to be fixed.
The result? Hosp-AI
A library designed for quick prototyping with LLMs that is also fully compatible with production frameworks like Spring Boot.
Thanks to Adalflow, the inspiration behind this library.
- Fork the Repo
- Create a branch named after the type of change: issue-fix, documentation, or feature
- Open a PR
- Tools Support
- Support for the following LLM providers: OpenAI, Anthropic, Groq, Ollama
- PromptBuilder to build complex prompts
- Support for customizing the default client implementations (a flexible approach for integrating with frameworks like Spring Boot)
- Image Support
- Stream Response
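The core idea behind supporting multiple providers is a provider-agnostic client interface: calling code depends on one abstraction, and the concrete provider (OpenAI, Anthropic, Groq, or Ollama) is chosen at construction time. The sketch below illustrates that pattern in plain Java. Note that all class and method names here are hypothetical, not hosp-ai's actual API; see the Wiki for real usage.

```java
import java.util.Map;

// Hypothetical sketch of a provider-agnostic client interface.
// These names are illustrative only, not the hosp-ai API.
interface LlmClient {
    String complete(String prompt);
}

// A stub "provider" that simply echoes the prompt, standing in for a
// real backend such as OpenAI, Anthropic, Groq, or Ollama.
class EchoProvider implements LlmClient {
    private final String name;

    EchoProvider(String name) {
        this.name = name;
    }

    public String complete(String prompt) {
        return "[" + name + "] " + prompt;
    }
}

public class ProviderSketch {
    public static void main(String[] args) {
        // Swapping providers only changes the construction site, never
        // the calling code -- the key benefit of a uniform interface.
        Map<String, LlmClient> providers = Map.of(
                "openai", new EchoProvider("openai"),
                "ollama", new EchoProvider("ollama"));
        for (var entry : providers.entrySet()) {
            System.out.println(entry.getValue().complete("Hello"));
        }
    }
}
```

In a Spring Boot application, the same pattern lets you expose the client as a bean and inject whichever provider your configuration selects.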
- Add the JitPack repository to your pom.xml
```xml
<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>
```
- Add the hosp-ai dependency (check for the latest version)
```xml
<dependency>
    <groupId>com.github.r7b7</groupId>
    <artifactId>hosp-ai</artifactId>
    <version>v1.0.0-alpha.2</version>
</dependency>
```
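If you build with Gradle instead of Maven, the equivalent JitPack setup follows JitPack's standard convention (coordinates taken from the Maven snippet above):

```groovy
repositories {
    maven { url 'https://jitpack.io' }
}

dependencies {
    implementation 'com.github.r7b7:hosp-ai:v1.0.0-alpha.2'
}
```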
For working examples and tutorials, visit the Wiki.