MCPHost Anywhere: Remote Ollama & Standardized Provider Interface #48
Features Added in This Pull Request
Here's a summary of the features I've added to mcphost:
- Remote Ollama Instance Support
- Unified Command-Line Interface
  - `--llm-url`: universal URL parameter for all providers
  - `--api-key`: universal API key parameter for all LLM providers
  - Provider-specific `--anthropic-url`, `--openai-url`, and `--ollama-url` overrides
  - Provider-specific `--anthropic-api-key` and `--openai-api-key` overrides
- Improved Error Handling
- Server Mode
- Documentation Updates
- Code Refinements
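The flag layout above could be wired up roughly like the following sketch. The flag names come from this PR's description; the struct, the `parseFlags` helper, and the fallback-to-universal logic are assumptions for illustration, not the actual mcphost implementation.

```go
package main

import (
	"flag"
	"fmt"
)

// providerConfig is a hypothetical container for the unified and
// provider-specific settings described in this PR.
type providerConfig struct {
	llmURL, apiKey       string // universal flags
	ollamaURL, openaiURL string // provider-specific overrides
}

func parseFlags(args []string) (*providerConfig, error) {
	fs := flag.NewFlagSet("mcphost", flag.ContinueOnError)
	cfg := &providerConfig{}
	fs.StringVar(&cfg.llmURL, "llm-url", "", "universal base URL for any provider")
	fs.StringVar(&cfg.apiKey, "api-key", "", "universal API key for any provider")
	fs.StringVar(&cfg.ollamaURL, "ollama-url", "", "Ollama-specific base URL")
	fs.StringVar(&cfg.openaiURL, "openai-url", "", "OpenAI-specific base URL")
	if err := fs.Parse(args); err != nil {
		return nil, err
	}
	// Assumed precedence: a provider-specific flag wins, otherwise
	// fall back to the universal --llm-url value.
	if cfg.ollamaURL == "" {
		cfg.ollamaURL = cfg.llmURL
	}
	if cfg.openaiURL == "" {
		cfg.openaiURL = cfg.llmURL
	}
	return cfg, nil
}

func main() {
	// Example: only the universal URL is given, so Ollama inherits it.
	cfg, err := parseFlags([]string{"--llm-url", "http://remote-host:11434"})
	if err != nil {
		panic(err)
	}
	fmt.Println(cfg.ollamaURL)
}
```

The same pattern extends to `--anthropic-url` and the per-provider API keys.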
These changes make mcphost more user-friendly, especially when working with different LLM providers, and significantly improve its ability to work with remote Ollama servers in distributed setups. The server mode is robust with proper timeout handling, making it suitable for production environments.