MuleSoft Inference Connector provides access to inference offerings for Large Language Models, e.g. Groq, Hugging Face, GitHub Models.
The MAC Inference Connector supports the following inference offerings:
- GitHub Models
- Hugging Face
- Ollama
- Groq AI
- Portkey
- OpenRouter
- Cerebras
- Nvidia
- Together.ai
- Fireworks
- DeepInfra
- The maximum supported Java version is JDK 17; use JDK 17 only to run your application.
- Compilation must be done with JDK 8 (see the compiler configuration sketch after this list).
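A minimal sketch of the corresponding Maven compiler settings, assuming a standard `maven-compiler-plugin` setup (the plugin version shown is illustrative, not taken from this project's pom.xml):

```xml
<!-- Compile the connector against JDK 8 bytecode (plugin version is an assumption) -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.11.0</version>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
```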
Add the following dependency to your application's pom.xml:

```xml
<dependency>
  <groupId>io.github.mulesoft-ai-chain-project</groupId>
  <artifactId>mule4-inference-connector</artifactId>
  <version>0.2.0</version>
  <classifier>mule-plugin</classifier>
</dependency>
```
Alternatively, to use a local build, first build and install the connector into your local Maven repository (for example, with `mvn clean install`). Then add the following dependency to your application's pom.xml:
```xml
<dependency>
  <groupId>com.mulesoft.connectors</groupId>
  <artifactId>mule4-inference-connector</artifactId>
  <version>0.2.0</version>
  <classifier>mule-plugin</classifier>
</dependency>
```
You can also make this connector available as an asset in your Anypoint Exchange. This requires building the connector as above, but you will additionally need to make some changes to the pom.xml; for this reason, we recommend forking the repository. Then follow the MuleSoft documentation to modify and publish the asset.
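As a rough sketch, publishing to Exchange typically involves changes to the pom.xml along these lines; the organization ID is a placeholder, and the exact steps in the MuleSoft documentation take precedence:

```xml
<!-- Replace the groupId with your Anypoint Platform organization ID (placeholder value) -->
<groupId>YOUR_ANYPOINT_ORG_ID</groupId>

<!-- Point distributionManagement at your organization's Exchange Maven repository -->
<distributionManagement>
  <repository>
    <id>anypoint-exchange-v3</id>
    <name>Anypoint Exchange</name>
    <url>https://maven.anypoint.mulesoft.com/api/v3/organizations/YOUR_ANYPOINT_ORG_ID/maven</url>
  </repository>
</distributionManagement>
```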
- Check out the complete documentation at mac-project.ai
- Learn from the Getting Started YouTube Playlist
- 🌐 Website: mac-project.ai
- 📺 YouTube: @MuleSoft-MAC-Project
- 💼 LinkedIn: MAC Project Group