zhile-io's pandora project provides a way to call the API without consuming quota and without a VPN; see the explainer post for details. The projects on my profile page are all built on it and it has worked well for me, so I'm recommending it to everyone.
You can try the program below first, but please don't use my API key too heavily, since it can't serve two clients at the same time.
import openai
import time


def fake_api(query, max_tokens, stream_print, temperature):
    # query: user input; max_tokens: maximum tokens; stream_print: whether to print the streamed output; temperature: sampling temperature
    openai.api_key = "fk-yU3UOSY13E9WsAkKBTrkq1KiiqIsAXBs_J6OTh_yvJM"  # use the fake API key
    openai.api_base = "https://ai.fakeopen.com/v1/"
    start_time = time.time()  # record the start time
    response = openai.ChatCompletion.create(
        model='gpt-3.5-turbo',
        messages=[
            {'role': 'user', 'content': query}
        ],
        temperature=temperature,
        max_tokens=max_tokens,
        stream=True  # enable streaming output
    )
    result = ""  # empty string that collects the streamed output
    for chunk in response:
        # make sure the fields exist
        if 'choices' in chunk and 'delta' in chunk['choices'][0]:
            chunk_msg = chunk['choices'][0]['delta'].get('content', '')
            result += chunk_msg  # append this chunk to the result string
            if stream_print:
                print(chunk_msg, end='', flush=True)
                time.sleep(0.05)
    return result  # return the complete streamed result


if __name__ == '__main__':
    while True:
        query = input("You: ")
        full_result = fake_api(query, 1500, True, 1)  # save the result in full_result
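If the proxy does rate-limit you (see the Pool Token question below), a minimal retry sketch around fake_api might look like the following. This assumes the legacy openai<1.0 client used above, which exposes openai.error.RateLimitError; the wrapper name and backoff values are just illustrative, not part of pandora itself.

import time

import openai


def fake_api_with_retry(query, max_retries=3):
    # Hypothetical helper: retry fake_api() (defined above) with exponential
    # backoff when the upstream proxy reports a rate limit.
    for attempt in range(max_retries):
        try:
            return fake_api(query, 1500, True, 1)
        except openai.error.RateLimitError:
            wait = 2 ** attempt  # back off 1s, 2s, 4s between attempts
            print(f"\nRate limited, retrying in {wait}s...")
            time.sleep(wait)
    raise RuntimeError(f"Still rate limited after {max_retries} attempts")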
Hope this helps!
Thanks!
May I ask: when you use a Pool Token, do you still hit the rate limit?