Initialize uv with lock file #181

Open: wants to merge 6 commits into base: main
17 changes: 12 additions & 5 deletions .github/workflows/benchmarks.yml
@@ -28,13 +28,20 @@ jobs:
uses: actions/setup-python@v5
with:
python-version: 3.8
- name: Install dependencies (including dev dependencies at frozen version)
- name: Set up uv
uses: astral-sh/setup-uv@…
with:
version: 0.5.22
python-version: ${{ matrix.python-version }}
- name: Install dependencies from uv lock
if: ${{ inputs.from-lock == 'true' }}
# NOTE: We're asserting that the lockfile is up to date
run: uv sync --locked --extra dev
shell: bash
- name: Install disk-objectstore
# I'm using pip install -e to make sure that coverage also properly traces
# the runs of the concurrent tests (maybe we can achieve this differently)
run: |
python -m pip install --upgrade pip
pip install -e .[optionaltests]
pip install -r requirements.lock
run: uv pip install -e .[dev]
- name: Run benchmarks
run: pytest --benchmark-only --benchmark-json output.json
- name: Store benchmark result
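The two new install steps can be reproduced locally. A minimal sketch, assuming uv 0.5.x is installed and a `uv.lock` file is present at the repository root (the commands mirror the workflow, they are not part of the diff):

```
# Install all locked dependencies plus the `dev` extra; --locked fails if
# uv.lock is out of date with pyproject.toml instead of silently re-resolving.
uv sync --locked --extra dev

# Editable install of the package itself, so coverage traces the real sources
uv pip install -e .[dev]
```

Note that `uv sync` installs into the project environment from the lockfile, while `uv pip install -e` layers the editable package on top.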
20 changes: 14 additions & 6 deletions .github/workflows/ci.yml
@@ -3,7 +3,7 @@

name: Continuous integration

on: [push, pull_request]
on: [push] # TODO put this back

jobs:
pre-commit:
@@ -39,13 +39,21 @@ jobs:
run: |
.github/workflows/setup-ssh-localhost.sh
ssh -v localhost
- name: Install dependencies (including dev dependencies at frozen version)

- name: Set up uv
uses: astral-sh/setup-uv@…
with:
version: 0.5.22
python-version: ${{ matrix.python-version }}
- name: Install dependencies from uv lock
if: ${{ inputs.from-lock == 'true' }}
# NOTE: We're asserting that the lockfile is up to date
run: uv sync --locked --extra dev --extra progressbar --extra optionaltests --extra examples
shell: bash
- name: Install disk-objectstore
# I'm using pip install -e to make sure that coverage also properly traces
# the runs of the concurrent tests (maybe we can achieve this differently)
run: |
python -m pip install --upgrade pip
pip install -e .[progressbar,optionaltests]
pip install -r requirements.lock
run: uv pip install -e .[dev,progressbar,optionaltests,examples]
- name: Test with pytest
# No need to run the benchmarks, they will run in a different workflow
# Also, run in very verbose mode so if there is an error we get a complete diff
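One caveat with the `if: ${{ inputs.from-lock == 'true' }}` condition: the `inputs` context is only populated when the workflow is invoked as a reusable workflow (`workflow_call`) or dispatched manually; under a plain `on: [push]` trigger the expression evaluates against an empty string and the step is skipped. A hypothetical trigger block that would make the input available (the name `from-lock` is taken from the diff, the rest is illustrative):

```
on:
  workflow_dispatch:
    inputs:
      from-lock:
        description: Install dependencies from uv.lock instead of resolving fresh
        type: string
        default: 'true'
```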
17 changes: 12 additions & 5 deletions .github/workflows/concurrency.yml
@@ -30,13 +30,20 @@ jobs:
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies (including dev dependencies at frozen version)
- name: Set up uv
uses: astral-sh/setup-uv@…
with:
version: 0.5.22
python-version: ${{ matrix.python-version }}
- name: Install dependencies from uv lock
if: ${{ inputs.from-lock == 'true' }}
# NOTE: We're asserting that the lockfile is up to date
run: uv sync --locked --extra dev
shell: bash
- name: Install disk-objectstore
# I'm using pip install -e to make sure that coverage also properly traces
# the runs of the concurrent tests (maybe we can achieve this differently)
run: |
python -m pip install --upgrade pip
pip install -e .[optionaltests]
pip install -r requirements.lock
run: uv pip install -e .[dev]
- name: Test with pytest
# Run only the concurrency tests, and repeating them 5 times to increase the chance that, if there is an issue
# only happening rarely, we notice it
4 changes: 4 additions & 0 deletions pyproject.toml
@@ -38,6 +38,7 @@ Source = 'https://github.com/aiidateam/disk-objectstore'

[project.optional-dependencies]
dev = [
'psutil',
'coverage',
'pre-commit',
'pytest',
@@ -139,3 +140,6 @@ commands = pytest {posargs}
description = Run CLI
commands = dostore {posargs}
"""

[tool.uv]
required-version = ">=0.5.21"
73 changes: 0 additions & 73 deletions requirements.lock

This file was deleted.
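With `requirements.lock` gone, the pinned environment lives in `uv.lock` instead. A sketch of how it would be regenerated and verified, assuming uv >= 0.5.21 as required by the new `[tool.uv]` table (commands are illustrative, not part of the diff):

```
# Re-resolve pyproject.toml and write uv.lock (the replacement for requirements.lock)
uv lock

# Assert that uv.lock is still consistent with pyproject.toml, erroring
# (instead of updating) if it is stale -- the same check the CI steps rely on
uv lock --locked
```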

4 changes: 4 additions & 0 deletions tests/conftest.py
@@ -87,6 +87,10 @@ def temp_dir():
:return: The path to the directory
:rtype: str
"""
# Force a garbage-collection pass so that unreferenced resources (e.g. a
# Container connection left open by a previous test) are finalized first.
import gc

gc.collect()

try:
dirpath = tempfile.mkdtemp()
yield Path(dirpath)
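The forced `gc.collect()` matters because CPython's reference counting alone cannot reclaim objects caught in a reference cycle; their finalizers (which would close a dangling connection) only run when the cyclic collector does. A minimal sketch with a hypothetical `Connection` class standing in for the container's resources:

```python
import gc

closed = []

class Connection:
    """Hypothetical stand-in for a resource whose __del__ releases a handle."""
    def __del__(self):
        closed.append(True)

conn = Connection()
conn.self_ref = conn  # reference cycle: refcounting alone never frees this
del conn
assert closed == []   # still trapped in the cycle, __del__ has not run

gc.collect()          # the cyclic collector finds the cycle and finalizes it
assert closed == [True]
```

Since PEP 442 (Python 3.4+) objects with `__del__` inside cycles are collected and finalized, so one `gc.collect()` call is enough to release them deterministically before the next test starts.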
12 changes: 12 additions & 0 deletions tests/test_container.py
@@ -1153,6 +1153,10 @@ def test_get_objects_stream_closes(temp_container, generate_random_data):
when it goes out of scope, so I also add the check that, inside the loop, at most one more file is open.
The final check seems to always pass even if I forget to close some file.
"""
# Force a collection so resources left unreferenced by earlier tests are
# finalized (see the TEMPORARY FIX note in test_get_objects_meta_doesnt_open).
import gc

gc.collect()

data = generate_random_data()
# Store
obj_md5s = _add_objects_loose_loop(temp_container, data)
@@ -1277,6 +1281,14 @@ def test_get_objects_meta_doesnt_open(
temp_container, generate_random_data
): # pylint: disable=invalid-name
"""Test that get_objects_meta does not open any file."""
# TEMPORARY FIX: As described in issue aiidateam/aiida-core/issues/6739,
# the container does not properly close its connection, so we have to
# force the garbage collector to delete unreferenced resources. This
# should be fixed with PR #179
import gc

gc.collect()

data = generate_random_data()
# Store
obj_md5s = _add_objects_loose_loop(temp_container, data)
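A related pattern (a sketch, not code from this PR) is to verify that a resource really was reclaimed after `gc.collect()`, using a weak reference, which observes the object without keeping it alive:

```python
import gc
import weakref

class Handle:
    """Hypothetical resource wrapper, e.g. something holding an open file."""

h = Handle()
ref = weakref.ref(h)  # observes h without adding a strong reference
del h                 # drop the only strong reference
gc.collect()          # sweep up anything trapped in a cycle as well
assert ref() is None  # the object was reclaimed; any __del__ cleanup has run
```

Asserting `ref() is None` at the start of a test gives a stronger guarantee than calling `gc.collect()` blindly: it fails loudly if something still holds the resource.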