Commit 3f3727c

Merge pull request #37 from athril/main
Automations for sparc.client
athril authored Jan 31, 2024
2 parents 87c7056 + 0a419f6 commit 3f3727c
Showing 12 changed files with 273 additions and 91 deletions.
51 changes: 30 additions & 21 deletions .github/workflows/CI.yml
@@ -1,56 +1,65 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: CI

on:
[push]

push:
tags:
- 'v*' # Execute if version is tagged
branches:
- main # Or if it is push to main
pull_request: # Or if it is any pull_request
workflow_call: # Or if the call comes from another workflow (reusability)

permissions:
contents: read

jobs:
build:

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Set up Python 3.10
uses: actions/setup-python@v3
uses: actions/setup-python@v5
with:
python-version: "3.10"

- name: Setup OpenGL
run: |
sudo apt-get install libopengl0 libglu1-mesa
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -e '.[test]'
- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
- name: Generate coverage report
run: |
PYTHONPATH=src
pytest --cov=./src tests/
pytest --cov=./src tests/ --cov-report term --cov-report=xml:coverage.xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
with:
verbose: true
- name: Install pypa/build
run: >-
python -m
pip install
build
--user
- name: Build a binary wheel and a source tarball
run: >-
python -m
build
--sdist
--wheel
--outdir dist/
directory: ./coverage/reports/
env_vars: OS,PYTHON
fail_ci_if_error: true
files: ./coverage.xml
flags: unittests
name: codecov-umbrella
token: ${{ secrets.CODECOV_TOKEN }}

- name: Check if the package builds
run: |
python -m pip install -U pip build
python -m build
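For reference, the coverage step above can be reproduced locally; a minimal sketch, assuming the layout used in the workflow (`src/` package sources, `tests/` test directory) and that `pytest` and `pytest-cov` are installed, e.g. via `pip install -e '.[test]'`:

```python
# Local equivalent of the "Generate coverage report" step (arguments copied from the workflow).
import sys

import pytest

# Mirror `PYTHONPATH=src` so the in-repo package is importable without installation.
sys.path.insert(0, "src")

# Terminal summary plus coverage.xml, which the Codecov step uploads.
exit_code = pytest.main(
    ["--cov=./src", "tests/", "--cov-report", "term", "--cov-report=xml:coverage.xml"]
)
sys.exit(exit_code)
```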
3 changes: 2 additions & 1 deletion .github/workflows/black-action.yml
@@ -5,12 +5,13 @@ jobs:
name: runner / black
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Check files using the black formatter
uses: rickstaa/action-black@v1
id: action_black
with:
black_args: "."

- name: Annotate diff changes using reviewdog
if: steps.action_black.outputs.is_formatted == 'true'
uses: reviewdog/action-suggester@v1
97 changes: 97 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,97 @@
# Build the package, release new version and publish it to PyPI after tests pass
# References:
# - https://discourse.jupyter.org/t/use-github-workflows-to-automatically-publish-to-pypi-when-new-tags-are-created/14941/2
# - https://dev.to/this-is-learning/manually-trigger-a-github-action-with-workflowdispatch-3mga
# - https://maciek.land/blog/automatic-releases-with-github-actions
# - https://www.loopwerk.io/articles/2021/automating-changelog/


name: Release new version
on:
workflow_dispatch:
inputs:
version:
description: "Please specify version, e.g.: v0.3.5"
default: "v0.0.0"
jobs:
verify:
runs-on: ubuntu-latest
steps:
- name: Check permissions
uses: actions-cool/check-user-permission@v2
id: checkUser
with:
require: 'admin'
check:
needs: verify
uses: ./.github/workflows/CI.yml

publish:
needs: check
runs-on: ubuntu-latest

# Specifying a GitHub environment for PyPI release is optional, but strongly encouraged
environment:
name: pypi
url: https://pypi.org/p/sparc.client
permissions:
# IMPORTANT: this permission is mandatory for trusted publishing on PyPI
id-token: write
contents: write
packages: write
steps:
- name: Set up Python 3.10
uses: actions/setup-python@v4
with:
python-version: '3.10'

- name: Checkout source
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Configure git user
run: |
git config user.email "[email protected]"
git config user.name "GitHub Actions (run by ${{ github.actor }})"
- name: Tag commit
uses: tvdias/[email protected]
with:
repo-token: ${{ github.token }}
tag: ${{ github.event.inputs.version }}

- name: Update CHANGELOG
id: changelog
uses: requarks/changelog-action@v1
with:
token: ${{ github.token }}
tag: ${{ github.event.inputs.version }}

- name: Commit and push CHANGELOG.md
uses: EndBug/add-and-commit@v9
with:
add: CHANGELOG.md
message: "chore: Update CHANGELOG.md"
pull: '--rebase --autostash'
tag: '-a ${{ github.event.inputs.version }} -m ${{ github.event.inputs.version }} --force'
tag_push: '--force'

- name: Build package
run: |
python -m pip install -U pip build
python -m build
- name: Publish Distribution to PyPI
uses: pypa/gh-action-pypi-publish@release/v1

- name: Create Release
uses: ncipollo/[email protected]
with:
allowUpdates: true
draft: false
makeLatest: true
name: ${{ github.event.inputs.version }}
tag: ${{ github.event.inputs.version }}
body: ${{ steps.changelog.outputs.changes }}
token: ${{ github.token }}
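Because the release workflow is triggered by `workflow_dispatch`, it can also be started from a script through the GitHub REST API rather than the web form; a minimal sketch, where the repository slug and the token environment variable are assumptions, not part of the workflow:

```python
# Sketch: trigger the "Release new version" workflow via the workflow_dispatch REST endpoint.
import os

import requests

OWNER_REPO = "nih-sparc/sparc.client"   # assumed upstream repository
WORKFLOW_FILE = "release.yml"           # workflow file added in this commit
token = os.environ["GITHUB_TOKEN"]      # assumed: a token allowed to run workflows

response = requests.post(
    f"https://api.github.com/repos/{OWNER_REPO}/actions/workflows/{WORKFLOW_FILE}/dispatches",
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {token}",
    },
    json={"ref": "main", "inputs": {"version": "v0.3.5"}},  # same `version` input as the manual form
)
response.raise_for_status()  # GitHub answers 204 No Content on success
```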
10 changes: 7 additions & 3 deletions .github/workflows/reviewdog.yml
@@ -6,28 +6,32 @@ on:

jobs:
reviewdog:

runs-on: ubuntu-latest

steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v4

- uses: actions/setup-python@v3
- uses: actions/setup-python@v5
with:
python-version: '3.10'

- name: Install
run: |
python -m pip install -U pip
pip install --progress-bar off -U .[checking]
- name: Apply formatters
run: |
pip install black blackdoc isort
black .
blackdoc .
isort .
- name: Reviewdog
uses: reviewdog/action-suggester@v1
with:
tool_name: formatters
github_token: ${{ secrets.GITHUB_TOKEN }}
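The same formatters the "Apply formatters" step runs can be applied locally before pushing, so reviewdog has nothing to suggest; a minimal sketch (tool list taken from the step above, invoked via `subprocess`):

```python
# Run black, blackdoc and isort over the checkout, as the reviewdog workflow does.
import subprocess

for command in (["black", "."], ["blackdoc", "."], ["isort", "."]):
    subprocess.run(command, check=True)  # raises CalledProcessError if a formatter fails
```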
14 changes: 11 additions & 3 deletions .github/workflows/sphinx.yml
@@ -1,14 +1,22 @@
name: Sphinx build

on: push
on:
push:
branches:
- main # Run if it is a push to main

jobs:
check:
uses: ./.github/workflows/CI.yml

build:
needs: check
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v3
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Build HTML
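The "Build HTML" step itself is truncated in this view; a local documentation build typically boils down to a single `sphinx-build` call. A minimal sketch, with the source and output paths assumed rather than taken from the workflow:

```python
# Sketch: build the Sphinx HTML documentation locally (paths are assumptions).
import subprocess

subprocess.run(["sphinx-build", "-b", "html", "docs", "docs/_build/html"], check=True)
```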
3 changes: 3 additions & 0 deletions .gitignore
@@ -27,6 +27,9 @@ share/python-wheels/
*.egg
MANIFEST

# dynamic versioning in _version.py (automatically built for PyPI only)
src/sparc/client/_version.py

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
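The ignored `src/sparc/client/_version.py` is generated at build time by the dynamic-versioning setup mentioned in the comment above; a minimal sketch of how such a generated file is usually consumed, assuming a setuptools-scm-style layout (the tool and the fallback string are assumptions):

```python
# Sketch: expose the build-time version, with a fallback for plain source checkouts.
try:
    from sparc.client._version import version as __version__  # written during the PyPI build
except ImportError:
    __version__ = "0.0.0.dev0"  # assumed fallback when _version.py has not been generated
```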
110 changes: 63 additions & 47 deletions CHANGELOG.md
@@ -1,51 +1,67 @@
v0.2.0
======
New functionalities:
* Added O2SparcService service:
* Introduced a `O2SparcSolver` class which is the main class for running computational jobs on o²S²PARC. This class holds the following methods:
* `submit_job`
* `get_job_progress`
* `job_done`
* `get_results`
* `get_job_log`
* Introduced `get_solver` method to `O2SparcService` which returns a `O2SparcSolver` object
* Scaffold Retrieval:
* Introducing the ability to use `sparc.client` to retrieve scaffolds or scaffold descriptions.
* The retrieved scaffold or scaffold description files can now be converted to a commonly used mesh format, such as VTK.
* Reuse of packages from the mapping tools codebase ensures efficient and standardized mesh conversion.
* MBF Segmentation Export:
* Added support for exporting MBF Segmentation data to a commonly used mesh format, like VTK.
* Segmentation Data Analysis:
* New functionality to analyze a given segmentation data file for suitability in the mapping tools fitting workflow, and provide a clear and informative report.
* Updated Documentation:
* Added the [SPARC Python Zinc Client tutorial](https://github.com/nih-sparc/sparc.client/blob/main/docs/tutorial-zinc.ipynb) to reflect the features related to Zinc.


v0.1.0
======
Fixes:
* download multiple files from Pennsieve #12
* pennsieve Download file API #14
* Github action updates: Reviewdog should run whenever a PR is modified after opening #15
* new tutorial in Jupyter Notebook
* README.md update


v0.0.2
======
# CHANGELOG

## [v0.2.0]

### :sparkles: New features

- Added O2SparcService service:

* Introduced an `O2SparcSolver` class, the main class for running computational jobs on o²S²PARC. It provides the following methods:
* `submit_job`
* `get_job_progress`
* `job_done`
* `get_results`
* `get_job_log`

- Introduced a `get_solver` method on `O2SparcService`, which returns an `O2SparcSolver` object

- Scaffold Retrieval:

* Introducing the ability to use `sparc.client` to retrieve scaffolds or scaffold descriptions.
* The retrieved scaffold or scaffold description files can now be converted to a commonly used mesh format, such as VTK.
* Reuse of packages from the mapping tools codebase ensures efficient and standardized mesh conversion.

- MBF Segmentation Export:

* Added support for exporting MBF Segmentation data to a commonly used mesh format, like VTK.

- Segmentation Data Analysis:

* New functionality to analyze a given segmentation data file for suitability in the mapping tools fitting workflow, and provide a clear and informative report.

- Updated Documentation:

* Added the [SPARC Python Zinc Client tutorial](https://github.com/nih-sparc/sparc.client/blob/main/docs/tutorial-zinc.ipynb) to reflect the features related to Zinc.
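To make the new o²S²PARC interface concrete, a minimal usage sketch follows; only the method names come from the changelog entries above, while the import path, constructor arguments, solver identifiers, and payload shape are assumptions:

```python
# Hypothetical end-to-end run of a computational job on o2S2PARC.
from time import sleep

from sparc.client.services.o2sparc import O2SparcService  # module path assumed

service = O2SparcService()  # constructor arguments (credentials, host) are configuration-dependent
solver = service.get_solver("simcore/services/comp/example-solver", "1.0.0")  # identifiers assumed

job = solver.submit_job({"input_file": "data/input.csv"})  # payload shape assumed
while not solver.job_done(job):
    print("progress:", solver.get_job_progress(job))
    sleep(5)

print("results:", solver.get_results(job))
print("log:", solver.get_job_log(job))
```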


## [v0.1.0]

### :bug: Bug Fixes

- download multiple files from Pennsieve #12
- pennsieve Download file API #14
- GitHub Actions updates: Reviewdog should run whenever a PR is modified after opening #15
- new tutorial in Jupyter Notebook
- README.md update

## [v0.0.2]

Alpha2 release of Python Sparc Client.
Major updates:
* Code coverage at 100%
* Sphinx documentation with Github Pages

### :sparkles: New features

- Code coverage at 100%
- Sphinx documentation with Github Pages

## [v0.0.1]

v0.0.1
======
Alpha release of Python Sparc Client.
Basic functionalities:
* automatic/manual module loading
* ServiceBase class for adding new modules
* Pennsieve module with basic functionalities:
* listing datasets, files, records
* downloading files
* Basic API support (GET/POST)

### :sparkles: New features

- automatic/manual module loading
- ServiceBase class for adding new modules
- Pennsieve module with basic functionalities:
* listing datasets, files, records
* downloading files
* Basic API support (GET/POST)
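A minimal sketch of the basic Pennsieve workflow listed in this entry; the module attribute and method names follow the bullet points, but the exact signatures, parameters, and endpoints are assumptions:

```python
# Hypothetical example of the initial Pennsieve functionality: listing, downloading, raw API calls.
from sparc.client import SparcClient  # top-level client with automatic module loading

client = SparcClient(connect=False)                              # constructor arguments assumed

datasets = client.pennsieve.list_datasets(limit=10)              # listing datasets (parameters assumed)
files = client.pennsieve.list_files(limit=5, query="manifest")   # listing files (parameters assumed)
client.pennsieve.download_file(file_list=files[:1])              # downloading files (signature assumed)

profile = client.pennsieve.get("https://api.pennsieve.io/user")  # basic GET support (endpoint assumed)
```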