When using a local API (such as the llama.cpp server's API), if this project is not deployed locally on the same LAN, the browser blocks the non-HTTPS request:

Mixed Content: The page at 'https://vercel.ddaiai.com/#/chat/1**********1' was loaded over HTTPS, but requested an insecure resource 'http://192.168.2.73:8086/v1/chat/completions'. This request has been blocked; the content must be served over HTTPS.

As a result, the model cannot be reached:
{ "message": "fetch error, pleace check url", "url": "http://192.168.2.73:8086/v1/chat/completions", "code": "fetch_error" }
The browser's error is only visible in the console; at the time I thought it was some other problem and spent a long time debugging the backend. It would help if, when the user sets the API endpoint, a warning were shown whenever the URL is not HTTPS, reminding them to put the local API behind an HTTPS reverse proxy.
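A sketch of the requested check, run when the user saves the endpoint in settings; the function name and message wording are hypothetical, not taken from this project's code:

```typescript
// Hypothetical validation for the API-endpoint field in settings.
// Warns when the page is served over HTTPS but the endpoint is plain HTTP,
// because the browser will block such mixed-content requests.
export function checkEndpointScheme(endpoint: string): string | null {
  let url: URL;
  try {
    url = new URL(endpoint);
  } catch {
    return "Invalid URL.";
  }
  if (window.location.protocol === "https:" && url.protocol === "http:") {
    return (
      "This page is served over HTTPS, so the browser will block requests to an " +
      "http:// endpoint (mixed content). Put the local API behind an HTTPS " +
      "reverse proxy, or run the app itself over HTTP on the local network."
    );
  }
  return null; // no problem detected
}

// Example usage on save (showWarning is a placeholder for the UI's toast/alert):
// const warning = checkEndpointScheme("http://192.168.2.73:8086/v1/chat/completions");
// if (warning) showWarning(warning);
```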
Putting the key on the server side means it could potentially be collected.