
Conversation

devin-ai-integration[bot] (Contributor) commented Sep 27, 2025

Fix issue #3609: Add URL validation for Ollama connections

Summary

Adds early URL validation specifically for Ollama connections to provide clear, actionable error messages when users specify invalid URLs like http://192.168.0.300:11434. Previously, users would get a cryptic "No route to host" error from litellm after the request failed.

Key Changes:

  • Added _validate_base_url() method to validate URL format and IP addresses
  • Extended _validate_call_params() to validate Ollama base_url and api_base parameters
  • Added comprehensive test coverage for invalid IPs, malformed URLs, and valid URLs
  • Only validates URLs for Ollama models to avoid affecting other LLM providers

Before: litellm.APIConnectionError: OllamaException - [Errno 65] No route to host
After: Invalid Ollama base_url: 'http://192.168.0.300:11434'. Please check that the URL format is correct and the IP address is valid. Example: 'http://localhost:11434' or 'http://192.168.1.100:11434'

Review & Testing Checklist for Human

⚠️ 3 critical items to verify:

  • Test the reported bug scenario: Create an LLM with model="ollama/llama3.1" and base_url="http://192.168.0.300:11434" and verify it shows the new clear error message instead of the cryptic litellm error
  • Test edge cases in validation logic: Try various URL formats (IPv6, internationalized domains, unusual ports) and IP addresses (edge cases like 192.168.0.1, 192.168.1.255) to ensure the validation logic handles them correctly
  • Verify no regressions: Test existing working Ollama configurations (localhost, valid private IPs, domain names) to ensure they still work without throwing validation errors

Notes

  • The IP detection uses a heuristic (all(part.isdigit() for part in hostname.split('.')) and len(hostname.split('.')) == 4) which may have edge cases worth testing
  • Model detection uses simple string matching ("ollama" in self.model.lower()) - verify this doesn't accidentally affect other providers
  • Non-Ollama models should completely skip this validation even with invalid URLs

Requested by: João ([email protected]) via Slack
Devin session: https://app.devin.ai/sessions/c740336126bb41529cf6799c59db01a7

- Add _validate_base_url method to LLM class to validate Ollama URLs
- Integrate URL validation into _validate_call_params for Ollama models
- Validate IP address format and reject invalid IPs like 192.168.0.300
- Provide clear, helpful error messages for invalid URLs
- Add comprehensive tests covering invalid IPs, malformed URLs, and valid URLs
- Only validate URLs for Ollama models to avoid breaking other providers
- Fixes litellm.APIConnectionError with unclear 'No route to host' messages

Co-Authored-By: João <[email protected]>
devin-ai-integration[bot] (Contributor, Author) commented:

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

- Fix whitespace in docstring blank lines (ruff W293)
- Add null check before calling _validate_base_url to fix mypy type error
- Ensure url_to_validate is not None before validation

Co-Authored-By: João <[email protected]>