LLM-related notes #73
Ollama installation (mainland-China download reference)

Download the install script first:

```bash
curl -fsSL https://ollama.com/install.sh -o ollama_install.sh
```

Replace the download links with a GitHub mirror:

```bash
#!/bin/bash
# File path of the downloaded install script
FILE="ollama_install.sh"
# Point the binary downloads at a GitHub mirror (pinned to v0.3.4)
sed -i 's|https://ollama.com/download/ollama-linux-${ARCH}${VER_PARAM}|https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64|g' $FILE
sed -i 's|https://ollama.com/download/ollama-linux-amd64-rocm.tgz${VER_PARAM}|https://github.moeyy.xyz/https://github.com/ollama/ollama/releases/download/v0.3.4/ollama-linux-amd64-rocm.tgz|g' $FILE
```
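Once the URLs are rewritten, the patched script can be run directly; a minimal follow-up sketch, assuming the github.moeyy.xyz mirror above is reachable from your machine:

```bash
# Run the patched installer (it may prompt for sudo)
bash ollama_install.sh
```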
AutoDL download reference

```bash
source /etc/network_turbo            # AutoDL's academic network acceleration
sudo apt update
sudo apt install systemd systemctl lshw
curl -fsSL https://ollama.com/install.sh | sh

# systemctl start ollama.service
# kill -9 [ollama process number]

# Serve and run with a local proxy, keeping models on the data disk
http_proxy=127.0.0.1:7890 https_proxy=127.0.0.1:7890 OLLAMA_MODELS=/root/autodl-tmp/ollama ollama serve
http_proxy=127.0.0.1:7890 https_proxy=127.0.0.1:7890 OLLAMA_MODELS=/root/autodl-tmp/ollama ollama run deepseek-r1:70b
```
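With `ollama serve` running, the backend can be checked over its HTTP API (port 11434 by default); a minimal sketch, assuming the server is on the same machine and the model has already been pulled:

```bash
# List the models available locally
curl http://localhost:11434/api/tags

# One-off, non-streaming generation request (model name is just an example)
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:70b",
  "prompt": "Hello",
  "stream": false
}'
```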
GPT Academic installation

```bash
source /etc/network_turbo
http_proxy=127.0.0.1:7890 https_proxy=127.0.0.1:7890 git clone --depth=1 https://github.com/binary-husky/gpt_academic.git
cd gpt_academic
cp config.py config_private.py
vim config_private.py        # set API keys / model options here
conda init
source ~/.bashrc
conda create -n gptac_venv python=3.11
conda activate gptac_venv
python -m pip install -r requirements.txt
python main.py
```
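As an alternative to editing config_private.py in vim, the relevant keys can be set from the shell; a minimal sketch, assuming the key names `API_KEY` and `LLM_MODEL` from the sample config.py (check your checkout, and treat the values below as placeholders):

```bash
# Placeholder values; replace with your real key and preferred model
sed -i 's/^API_KEY = .*/API_KEY = "sk-your-key-here"/' config_private.py
sed -i 's/^LLM_MODEL = .*/LLM_MODEL = "gpt-4o-mini"/' config_private.py
```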
LLMs: chatbot arena
Webs: chatgpt next web
Apps: chatbox
RAG: RAGFlow
Others: deepclaude
Online usage

Relay API key providers (see the curl sketch below for how such a key is used):
All in one: https://www.closechat.org/ , fast and offers o3-mini, but too expensive
Also bought a relay API on Taobao: slow, and only o1-mini; a direct (official) key works reasonably well
Frontend: chatbox, available on both desktop and mobile
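Relay keys are generally drop-in replacements for the official OpenAI API, so a client only needs the provider's base URL and the key; a minimal sketch, assuming an OpenAI-compatible endpoint (the base URL and key below are placeholders, not actual closechat.org values):

```bash
# Placeholder base URL and key; substitute the values from your relay provider
BASE_URL="https://api.example-relay.com/v1"
API_KEY="sk-placeholder"

curl "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "o1-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```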
Local deployment

GPU: Apple M3 (Apple silicon)
Backend: ollama
Frontend: gpt_academic
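For this local setup, the backend only needs to be running before gpt_academic starts; a minimal sketch for macOS on Apple Silicon, assuming Homebrew is installed (the package name and the smaller model tag are assumptions sized for a laptop, not part of the notes above):

```bash
# Install and start the Ollama backend
brew install ollama
ollama serve &

# Pull a model small enough for local use; deepseek-r1:70b is likely too large for a laptop
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b
```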