Add magpie support llama cpp ollama #1356
Workflow file for this run

name: Benchmarks

on:
  push:
    branches:
      - "main"
      - "develop"
  pull_request:

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  benchmarks:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"
          # Looks like it's not working very well for other people:
          # https://github.com/actions/setup-python/issues/436
          # cache: "pip"
          # cache-dependency-path: pyproject.toml

      - uses: actions/cache@v4
        id: cache
        with:
          path: ${{ env.pythonLocation }}
          key: ${{ runner.os }}-python-${{ env.pythonLocation }}-${{ hashFiles('pyproject.toml') }}-benchmarks-v00

      - name: Install dependencies
        if: steps.cache.outputs.cache-hit != 'true'
        run: ./scripts/install_dependencies.sh

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          run: pytest tests/ --codspeed
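
The final step has the CodSpeed action invoke pytest with the --codspeed flag, so tests marked as benchmarks are the ones measured and reported. Below is a minimal sketch of such a test, assuming the pytest-codspeed plugin is installed by scripts/install_dependencies.sh; the function and test names are hypothetical and not taken from the repository.

    import pytest


    def fibonacci(n: int) -> int:
        # Toy workload standing in for the project code being benchmarked.
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a


    @pytest.mark.benchmark  # measured by pytest-codspeed when pytest runs with --codspeed
    def test_fibonacci() -> None:
        assert fibonacci(100) > 0

Running `pytest tests/ --codspeed` locally (with pytest-codspeed installed) executes the same benchmarks the workflow measures, which makes it easy to check a new benchmark before pushing.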