support xAI #430

Merged: 1 commit, Nov 6, 2024
1 change: 1 addition & 0 deletions README-CN.md
@@ -27,6 +27,7 @@ bilingual_book_maker is an AI translation tool that uses ChatGPT to help users create …
- Translate with Caiyun: `--model caiyun --caiyun_key ${caiyun_key}`
- Translate with Gemini: `--model gemini --gemini_key ${gemini_key}`
- Translate with Tencent TranSmart (free): `--model tencentransmart`
- Translate with [xAI](https://x.ai): `--model xai --xai_key ${xai_key}`
- Translate with [Ollama](https://github.com/ollama/ollama) self-hosted models: `--ollama_model ${ollama_model_name}`
  - If the ollama server is not running locally, use `--api_base http://x.x.x.x:port/v1` to point to the ollama server address
- Add the `--test` option to preview the result if you have not paid for the service yet (there is a rate limit, so it is a bit slow)
1 change: 1 addition & 0 deletions README.md
@@ -34,6 +34,7 @@ Find more info here for using liteLLM: https://github.com/BerriAI/litellm/blob/m
- If you want to use a specific model alias with Gemini (e.g. `gemini-1.5-flash-002` or `gemini-1.5-flash-8b-exp-0924`), you can use `--model gemini --model_list gemini-1.5-flash-002,gemini-1.5-flash-8b-exp-0924`. `--model_list` takes a comma-separated list of model aliases.
- Support [Claude](https://console.anthropic.com/docs) model, use `--model claude --claude_key ${claude_key}`
- Support [Tencent TranSmart](https://transmart.qq.com) model (Free), use `--model tencentransmart`
- Support [xAI](https://x.ai) model, use `--model xai --xai_key ${xai_key}`
- Support [Ollama](https://github.com/ollama/ollama) self-hosted models, use `--ollama_model ${ollama_model_name}`
- If ollama server is not running on localhost, use `--api_base http://x.x.x.x:port/v1` to point to the ollama server address
- Use `--test` option to preview the result if you haven't paid for the service. Note that there is a limit and it may take some time.
10 changes: 10 additions & 0 deletions book_maker/cli.py
@@ -122,6 +122,14 @@ def main():
help="You can get Groq Key from https://console.groq.com/keys",
)

# for xAI
parser.add_argument(
"--xai_key",
dest="xai_key",
type=str,
help="You can get xAI Key from https://console.x.ai/",
)

parser.add_argument(
"--test",
dest="test",
@@ -376,6 +384,8 @@ def main():
        API_KEY = options.gemini_key or env.get("BBM_GOOGLE_GEMINI_KEY")
    elif options.model == "groq":
        API_KEY = options.groq_key or env.get("BBM_GROQ_API_KEY")
    elif options.model == "xai":
        API_KEY = options.xai_key or env.get("BBM_XAI_API_KEY")
    else:
        API_KEY = ""

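In effect, the new branch gives the `--xai_key` flag precedence over the `BBM_XAI_API_KEY` environment variable, mirroring the other providers. A minimal sketch of that precedence (the `resolve_xai_key` helper name is illustrative only; `main()` does this inline):

```python
import os


def resolve_xai_key(cli_key):
    # The --xai_key flag wins; otherwise fall back to the BBM_XAI_API_KEY
    # environment variable, exactly as the new elif branch in main() does.
    return cli_key or os.environ.get("BBM_XAI_API_KEY")
```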
2 changes: 2 additions & 0 deletions book_maker/translator/__init__.py
@@ -8,6 +8,7 @@
from book_maker.translator.groq_translator import GroqClient
from book_maker.translator.tencent_transmart_translator import TencentTranSmart
from book_maker.translator.custom_api_translator import CustomAPI
from book_maker.translator.xai_translator import XAIClient

MODEL_DICT = {
"openai": ChatGPTAPI,
@@ -25,5 +26,6 @@
"groq": GroqClient,
"tencentransmart": TencentTranSmart,
"customapi": CustomAPI,
"xai": XAIClient,
# add more here
}
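Registering `"xai": XAIClient` here is what makes `--model xai` resolvable. A small sketch of the lookup, assuming the CLI picks the translator class out of `MODEL_DICT` by the `--model` value (the surrounding cli.py code is not part of this diff):

```python
from book_maker.translator import MODEL_DICT

translator_cls = MODEL_DICT.get("xai")  # resolves to XAIClient after this change
if translator_cls is None:
    raise SystemExit("unsupported model name")
```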
20 changes: 20 additions & 0 deletions book_maker/translator/xai_translator.py
@@ -0,0 +1,20 @@
from openai import OpenAI

from .chatgptapi_translator import ChatGPTAPI

# Models currently exposed by the xAI API.
XAI_MODEL_LIST = [
    "grok-beta",
]


class XAIClient(ChatGPTAPI):
    """xAI translator client, reusing ChatGPTAPI via xAI's OpenAI-compatible endpoint."""

    def __init__(self, key, language, api_base=None, **kwargs) -> None:
        super().__init__(key, language)
        self.model_list = XAI_MODEL_LIST
        # xAI serves an OpenAI-compatible API; --api_base can override the default endpoint.
        self.api_url = str(api_base) if api_base else "https://api.x.ai/v1"
        self.openai_client = OpenAI(api_key=key, base_url=self.api_url)

    def rotate_model(self):
        # Only one xAI model is available for now, so rotation always selects it.
        self.model = self.model_list[0]
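For context, a hedged sketch of how the new client could be exercised directly; the `translate()` call is assumed to be inherited from `ChatGPTAPI` (it is not part of this diff), and the key value is a placeholder:

```python
from book_maker.translator.xai_translator import XAIClient

translator = XAIClient(key="xai-placeholder-key", language="Simplified Chinese")
translator.rotate_model()  # selects "grok-beta", the only entry in XAI_MODEL_LIST
# translate() is assumed to come from ChatGPTAPI and to hit https://api.x.ai/v1
# through the OpenAI client configured in __init__.
print(translator.translate("The quick brown fox jumps over the lazy dog."))
```

From the command line, the same path is exercised with `--model xai --xai_key ${xai_key}`, as documented in the README changes above.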