
node-chatgpt-proxy


A simple reverse proxy for chat.openai.com that bypasses Cloudflare protection by driving a real browser.

This project has powered chatgpt-plugin for months, and its performance and stability have improved substantially over that time.

Give it a star if this project helps you or if you are using any service built on it.

How to use

with source code

Prepare your config.json file, for example:

{
  "chromePath": "C:\\Program Files (x86)\\Microsoft\\Edge\\Application\\msedge.exe",
  "proxy": "http://127.0.0.1:7890"
}
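The chromePath value points at a locally installed Chromium-based browser, and proxy appears to be an upstream HTTP proxy for outbound traffic. On a Linux host the same file might look like this (the browser path below is only an illustration; adjust it to your installation):

{
  "chromePath": "/usr/bin/google-chrome",
  "proxy": "http://127.0.0.1:7890"
}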

Then use npm run start to start the server.

with docker

Just run docker run -d -p [PORT]:3000 --name node-chatgpt-proxy --shm-size=1gb geyinchi/node-chatgpt-proxy

Check the log: docker exec -it node-chatgpt-proxy tail -100f /var/log/node-chatgpt-proxy.log

Check for a new version and update: docker pull geyinchi/node-chatgpt-proxy && docker stop node-chatgpt-proxy && docker rm node-chatgpt-proxy && docker run -d -p [PORT]:3000 --name node-chatgpt-proxy --shm-size=1gb geyinchi/node-chatgpt-proxy
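If you prefer Docker Compose, the docker run command above translates to roughly the following docker-compose.yml (not shipped with the project; this is only a sketch based on the flags shown above):

# Sketch of a docker-compose.yml equivalent to the docker run command above.
services:
  node-chatgpt-proxy:
    image: geyinchi/node-chatgpt-proxy
    container_name: node-chatgpt-proxy
    shm_size: 1gb
    ports:
      - "3000:3000"   # pick any host port; the container listens on 3000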

about the API

Currently, only http://[ip]:[port]/backend-api/conversation is supported; you will need to test other APIs yourself.

You can use the default 3.5 model without a token. After logging in, you can use the gpt-4o model.
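Because this is a plain reverse proxy, a request to /backend-api/conversation has to carry whatever payload chat.openai.com itself expects, and that format is not documented by this project. The sketch below is only an assumption based on the classic ChatGPT web payload (action, messages, model, parent_message_id) and an SSE response; the field names, model slug, and auth header may need adjusting:

// conversation.mjs - hypothetical example, run with: node conversation.mjs
// Assumes Node 18+ (built-in fetch) and the classic chat.openai.com web payload.
import { randomUUID } from "node:crypto";

const res = await fetch("http://127.0.0.1:3000/backend-api/conversation", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Leave Authorization out for the not-logged-in 3.5 mode;
    // add "Authorization": "Bearer <accessToken>" after logging in for gpt-4o.
  },
  body: JSON.stringify({
    action: "next",
    messages: [
      {
        id: randomUUID(),
        author: { role: "user" },
        content: { content_type: "text", parts: ["Hello!"] },
      },
    ],
    parent_message_id: randomUUID(),
    model: "text-davinci-002-render-sha", // assumed slug for the default 3.5 model
  }),
});

// The upstream answers with a server-sent event stream; dump the raw chunks.
for await (const chunk of res.body) {
  process.stdout.write(Buffer.from(chunk).toString());
}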

Update: the OpenAI request format is now supported at http://localhost:3000/v1/chat/completions, with streaming support (stream: true).

The API key can be an accessToken or a nextToken.
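For the OpenAI-compatible endpoint, any standard chat-completions request should work. A minimal sketch with Node 18+, assuming the proxy accepts the accessToken (or nextToken) as the Bearer key; the model name below is only a placeholder and the accepted values depend on the proxy:

// openai-format.mjs - hypothetical example, run with: node openai-format.mjs
const res = await fetch("http://localhost:3000/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer <accessToken or nextToken>",
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",   // placeholder model name
    stream: false,            // set to true for SSE streaming
    messages: [{ role: "user", content: "Hello!" }],
  }),
});

const data = await res.json();
console.log(data.choices?.[0]?.message?.content);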

中文 (Chinese)

A simple reverse proxy for chat.openai.com that uses a browser to bypass Cloudflare protection.

This project currently powers the chatgpt-plugin plugin; it has seen long-term use by a moderate number of users and has proven stable.

If this project helps you, or you are using another open-source project built on it, please give it a free star. Thanks!

How to use

with source code

Prepare a config.json file, for example:

{
  "chromePath": "C:\\Program Files (x86)\\Microsoft\\Edge\\Application\\msedge.exe",
  "proxy": "http://127.0.0.8:7890"
}

Then use npm run start to start the service.

with docker

Just run docker run -d -p [PORT]:3000 --name node-chatgpt-proxy --shm-size=1gb geyinchi/node-chatgpt-proxy

Check the log: docker exec -it node-chatgpt-proxy tail -100f /var/log/node-chatgpt-proxy.log

Check for a new version and update: docker pull geyinchi/node-chatgpt-proxy && docker stop node-chatgpt-proxy && docker rm node-chatgpt-proxy && docker run -d -p [PORT]:3000 --name node-chatgpt-proxy --shm-size=1gb geyinchi/node-chatgpt-proxy

You can also refer to this blog post: https://ikechan8370.com/archives/da-jian-chatgpt-guan-fang-fan-xiang-dai-li

about the API

Currently, only http://[ip]:[port]/backend-api/conversation is supported; please test other APIs yourself.

You can omit the token to use the not-logged-in mode with the default 3.5 model. After logging in, you can use gpt-4o.

Update: the OpenAI format is now supported: http://localhost:3000/v1/chat/completions, with streaming support (stream: true).

The API key can be an accessToken or a nextToken.
