The steps are the same on both platforms.
- Open local mode in the extension settings (no need to log in).
- Start the Ollama server (other OpenAI-compatible APIs are also supported) and keep it running in the background:

  ```bash
  # Allow the extension to reach the server, then start Ollama (keep it running in the background)
  export OLLAMA_ORIGINS="*"
  ollama serve
  # In a separate terminal, pull and load the model
  ollama run codegeex4
  ```
- Enter the API address and model name in the local mode settings (a quick check of these values is sketched after this list), then enjoy coding with CodeGeeX4!
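To confirm the server is reachable before pointing the extension at it, a request like the one below can be sent to the chat endpoint. This is a minimal sketch assuming Ollama's default address `http://localhost:11434` and its OpenAI-compatible API; adjust the address and model name if your setup differs.

```bash
# Sketch: check that the local server answers OpenAI-compatible chat requests
# (assumes Ollama's default port 11434; other backends may use a different address)
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "codegeex4",
        "messages": [{"role": "user", "content": "Write hello world in Python."}]
      }'
```

If this returns a JSON completion, the same address and model name should work in the local mode settings.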