develop into master v1.1.1 (#645)
* Fix workflow cache (#632)

* Copy workflow changes from PR #584

* Update whats_new.rst

* Remove "always deploy" used for testing

* Restore change to pytest instead of unittest

* Use mne data cache in tests too

* Remove duplicate restore cache (#634)
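
The cache fixes above revolve around GitHub's actions/cache step; the MNE data cache now uses a stable per-OS key. This fragment is taken from the updated workflows in the diff below:

```yaml
- name: Create/Restore MNE Data Cache
  id: cache-mne_data
  uses: actions/cache@v3
  with:
    path: ~/mne_data
    key: ${{ runner.os }}-mne_data
```

Because the key no longer depends on the branch or on hashed source files, every run on the same OS restores the same dataset cache instead of re-downloading.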

* Fix example Hinss2021 (#636)

* Dataset summary in CSV and leaderboard links (#635)

* Add summary tables as csv

* Change BaseDataset to include the dataset's summary table in the docstring

* Try update summary table in doc

* Add test for dataset summary table

* Fix future annotations

* Prepare summary tables before doc

* Make build dir if not exist

* Add rstate table

* Replace tables with CSVs in doc

* Add PapersWithCode column

* Fix initial white space

* Fix formatting of ints

* Remove tables from docstrings

* Add missing DemonsP300 row

* Update whats_new.rst

* Remove Shin2017B leaderboard

* Move PWC link outside of docstring table
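
The summary-table change stores per-dataset metadata as CSV files that the doc build picks up. A minimal sketch of producing such a file with the standard library; the column names and values here are illustrative, not MOABB's exact schema:

```python
import csv
from pathlib import Path

# Hypothetical per-dataset summary rows (illustrative columns, not MOABB's exact schema)
rows = [
    {"Dataset": "BNCI2014_001", "#Subj": "9", "#Sessions": "2", "PapersWithCode": "yes"},
    {"Dataset": "Liu2024", "#Subj": "50", "#Sessions": "1", "PapersWithCode": "no"},
]

out = Path("summary_motor_imagery.csv")
with out.open("w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```

Keeping counts as plain strings or true ints (rather than floats) avoids the trailing-`.0` artifacts the "Fix formatting of ints" commit addresses.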

* Add Liu2024 Dataset (#619)

* including Liu2024 Dataset

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Function data_path

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* data_infos and get_single_subject_data functions

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Data Description

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* updating get_single_subject fct and data_path & adding encoding fct

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Finishing the code

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* updating docstrings for data_path

* Updating dataset_summary and updating the get_single_subject fct to handle the case of existing file in path_electrodes

* adapting the return of get_single_subject_data fct

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Adding dataset description and preload = True when reading the data in the get_single_subject fct

* fix: codespell

* fix: changing the encoding to a static method

* repushing and resolving pre-commit conflicts

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* fix: changing the mapping

* fix: resolving the mismatch between the trigger and the events from the csv

* enh: using pylint to improve the code (remove unused variables, and change the module)

* modifying the python version in the pre-commit file

* adding description to enhancements

* adjusting the encoding

* adjusting comments and dataset description

* solving channels types and names issues & correcting encoding

* Correcting the number of trials per class and the total trials

* Correcting data description

* Adding the possibility to exclude/include break and instructions events

* Correcting code to include/exclude instr and break events

* Remove table from docstring

Signed-off-by: PierreGtch <[email protected]>

* Add CSV row
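
The commits above flesh out the two methods every MOABB dataset must provide. A hedged, standalone stub of that contract — the class name, metadata values, and paths below are placeholders, and `moabb` itself is deliberately not imported:

```python
# Sketch of the dataset interface Liu2024 implements. Method names mirror
# MOABB's BaseDataset contract; everything else is a placeholder.
class Liu2024Stub:
    def __init__(self):
        self.subjects = list(range(1, 51))  # placeholder subject count
        self.event_id = {"left_hand": 1, "right_hand": 2}
        self.code = "Liu2024"

    def data_path(self, subject):
        # In MOABB this downloads the file(s) if needed and returns local path(s)
        if subject not in self.subjects:
            raise ValueError(f"Invalid subject {subject}")
        return [f"/tmp/mne_data/liu2024/subject_{subject:02d}.edf"]

    def get_single_subject_data(self, subject):
        # MOABB's contract: a nested dict {session_name: {run_name: raw}}
        path = self.data_path(subject)[0]
        # placeholder for mne.io.read_raw_edf(path, preload=True)
        raw = f"<Raw loaded with preload=True from {path}>"
        return {"0": {"0": raw}}
```

The "preload = True" commit above matters because later event decoding needs the signal in memory, not lazily mapped from disk.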

---------

Signed-off-by: PierreGtch <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: bruAristimunha <[email protected]>
Co-authored-by: PierreGtch <[email protected]>
Co-authored-by: Pierre Guetschel <[email protected]>

* Add scripts to publish results on PaperWithCode (#561)

* Add scripts to publish results on paperwithcode

* Create tasks manually instead (API error 403 forbidden)

* Try to create the evaluation first and other things, not working...

* Update whats_new.rst

* Fix example from API's README; still results in a "403 Forbidden"

* Fix task_id

* Allow passing multiple results files

* Save commands used as comments

* Add comment
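
As the log notes, the PapersWithCode API repeatedly answered "403 Forbidden", so the sketch below only assembles a result payload and leaves the HTTP call as a comment. The endpoint path, field names, and token header are assumptions modeled on a typical REST API, not verified against the real service:

```python
import json

API_ROOT = "https://paperswithcode.com/api/v1"  # assumed base URL


def build_result_payload(evaluation_id, metric_name, score, method_name):
    """Assemble the JSON body for one leaderboard entry (fields are illustrative)."""
    return {
        "evaluation": evaluation_id,
        "metrics": {metric_name: score},
        "methodology": method_name,
    }


payload = build_result_payload("moabb-mi-lhrh", "ROC-AUC", 0.87, "CSP+LDA")
body = json.dumps(payload)
# A real call would look something like:
#   requests.post(f"{API_ROOT}/evaluations/{payload['evaluation']}/results/",
#                 json=payload, headers={"Authorization": f"Token {token}"})
# and, per the commit log above, may still return "403 Forbidden"
# without the right account permissions.
```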

---------

Signed-off-by: PierreGtch <[email protected]>
Signed-off-by: Bru <[email protected]>
Co-authored-by: Bru <[email protected]>

* [pre-commit.ci] pre-commit autoupdate (#631)

* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/pre-commit/pre-commit-hooks: v4.5.0 → v4.6.0](pre-commit/pre-commit-hooks@v4.5.0...v4.6.0)
- [github.com/psf/black: 24.3.0 → 24.4.2](psf/black@24.3.0...24.4.2)
- [github.com/asottile/blacken-docs: 1.16.0 → 1.18.0](adamchainz/blacken-docs@1.16.0...1.18.0)
- [github.com/PyCQA/flake8: 7.0.0 → 7.1.0](PyCQA/flake8@7.0.0...7.1.0)
- [github.com/astral-sh/ruff-pre-commit: v0.3.5 → v0.5.0](astral-sh/ruff-pre-commit@v0.3.5...v0.5.0)
- [github.com/codespell-project/codespell: v2.2.6 → v2.3.0](codespell-project/codespell@v2.2.6...v2.3.0)
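
These bumps land in `.pre-commit-config.yaml`; after the update, the corresponding fragment looks roughly like this (two of the six repos shown, hook ids are the conventional ones and may differ from the repository's actual config):

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 24.4.2
    hooks:
      - id: black
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.5.0
    hooks:
      - id: ruff
```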

* FIX: including new word

* including

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Bru <[email protected]>

* Optuna GridSearch (#630)

* Optuna

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Optuna - Categorical Distribution

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Add optuna to dependency and regenerate poetry

* Add optuna to dependency and regenerate poetry

* Add optuna to dependency and regenerate poetry

* Add test on within Session

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* Add test on within Session

* Add test on within Session

* enh: common function with dict

* enh: moving function to util

* fix: correcting the whats_new file.

* Add benchmark test, raise an error if the conversion didn't work, and expose the time_out parameter

* [pre-commit.ci] auto fixes from pre-commit.com hooks

* FIX: italian to eng

* EHN: making optuna optional

* FIX: fixing the workflow files

* FIX: changing the optuna file

* FIX: including optuna for the windows

---------

Signed-off-by: Bru <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: bruAristimunha <[email protected]>

* correct name (#642)

* V1.1.1 (#644)

* release 1.1.1

* increase the version

* increase the version

* removing the poetry.lock

* increase version

* changing the CITATION.cff

* upload version for artifact

* pylock -> pyproject.toml

---------

Signed-off-by: PierreGtch <[email protected]>
Signed-off-by: Bru <[email protected]>
Co-authored-by: PierreGtch <[email protected]>
Co-authored-by: Taha Habib <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Pierre Guetschel <[email protected]>
Co-authored-by: Igor Carrara <[email protected]>
Co-authored-by: Quentin Barthélemy <[email protected]>
7 people authored Sep 17, 2024
1 parent 996adea commit d2c980d
Showing 59 changed files with 1,211 additions and 3,785 deletions.
139 changes: 70 additions & 69 deletions .github/workflows/docs.yml
@@ -1,23 +1,27 @@
name: Docs

concurrency:
group: ${{ github.workflow }}-${{ github.event.number }}-${{ github.event.ref }}
cancel-in-progress: true


on:
push:
branches: [master, develop]
branches: [ master, develop ]
pull_request:
branches: [master, develop]
branches: [ master, develop ]
permissions:
contents: write
pages: write
id-token: write

jobs:
build_docs:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ["3.9"]
os: [ ubuntu-latest ]
python-version: [ "3.9" ]

steps:
- uses: actions/checkout@v4
@@ -27,7 +31,7 @@ jobs:
mkdir ~/mne_data
- name: Setup Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

@@ -37,63 +41,79 @@ jobs:
virtualenvs-create: true
virtualenvs-in-project: true

- name: Cache datasets and docs
id: cached-dataset-docs
- name: Create/Restore MNE Data Cache
id: cache-mne_data
uses: actions/cache@v3
with:
path: ~/mne_data
key: ${{ runner.os }}-mne_data

- name: Cache docs build
id: cache-docs
uses: actions/cache@v3
with:
key: doc-${{ github.head_ref }}-${{ hashFiles('moabb/datasets/**') }}
path: |
~/mne_data
docs/build
key: docs-build-${{ github.run_id }}-${{ github.run_attempt }}
path: docs/build

- name: Load cached venv
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
path: .venv
key:
docsvenv-${{ matrix.os }}-py${{matrix.python-version}}-${{
hashFiles('**/pyproject.toml') }}

- name: Install dependencies
if: steps.cached-dataset-docs.outputs.cache-hit != 'true'
run: poetry install --no-interaction --no-root --with docs --extras deeplearning
if: (steps.cached-poetry-dependencies.outputs.cache-hit != 'true')
run: poetry install --no-interaction --no-root --with docs --extras deeplearning --extras optuna

- name: Install library
run: poetry install --no-interaction --with docs --extras deeplearning
run: poetry install --no-interaction --with docs --extras deeplearning --extras optuna

- name: Build docs
run: |
cd docs && poetry run make html
# Create an artifact of the html output.
- uses: actions/upload-artifact@v2
- uses: actions/upload-artifact@v4
with:
name: DocumentationHTML
path: docs/build/html/

deploy_docs:
if: ${{ github.ref == 'refs/heads/master' }}
deploy_neurotechx:
if: ${{ github.ref == 'refs/heads/develop' }}
needs: build_docs
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest]
os: [ ubuntu-latest ]

steps:
- uses: actions/checkout@v4

- name: Create local data folder
run: |
mkdir ~/mne_data
- name: Cache datasets and docs
id: cached-dataset-docs
uses: actions/cache@v3
- name: Restore cached docs build
id: cache-docs
uses: actions/cache/restore@v3
with:
key: doc-${{ github.head_ref }}-${{ hashFiles('moabb/datasets/**') }}
path: |
~/mne_data
docs/build
key: docs-build-${{ github.run_id }}-${{ github.run_attempt }}
path: docs/build

- name: Checkout moabb.github.io
uses: actions/checkout@v4
- name: Check cache hit
if: steps.cache-docs.outputs.cache-hit != 'true'
run: exit 1

- name: Deploy Neurotechx Subpage
uses: peaceiris/actions-gh-pages@v4
with:
repository: "NeuroTechX/moabb.github.io"
path: moabb-ghio
token: ${{ secrets.MOABB_GHIO }}
github_token: ${{ secrets.GITHUB_TOKEN }}
deploy_key: ${{ secrets.ACTIONS_DEPLOY_KEY }}
external_repository: NeuroTechX/moabb.github.io
destination_dir: docs/
publish_branch: master
publish_dir: ./docs/build/html
cname: moabb.neurotechx.com/

deploy_gh_pages:
if: ${{ github.ref == 'refs/heads/develop' }}
@@ -102,47 +122,28 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest]
os: [ ubuntu-latest ]

steps:
- uses: actions/checkout@v4

- name: Create local data folder
run: |
mkdir ~/mne_data
- name: Cache datasets and docs
id: cached-dataset-docs
uses: actions/cache@v3
- name: Restore cached docs build
id: cache-docs
uses: actions/cache/restore@v3
with:
key: doc-${{ github.head_ref }}-${{ hashFiles('moabb/datasets/**') }}
path: |
~/mne_data
docs/build
key: docs-build-${{ github.run_id }}-${{ github.run_attempt }}
path: docs/build

- name: Checkout gh pages
uses: actions/checkout@v4
with:
ref: gh-pages
path: moabb-ghpages
- name: Check cache hit
if: steps.cache-docs.outputs.cache-hit != 'true'
run: exit 1

- name: Deploy Neurotechx Subpage
uses: peaceiris/actions-gh-pages@v3
- name: Deploy gh-pages
uses: peaceiris/actions-gh-pages@v4
with:
deploy_key: ${{ secrets.ACTIONS_DEPLOY_KEY }}
external_repository: NeuroTechX/moabb.github.io
github_token: ${{ secrets.GITHUB_TOKEN }}
deploy_key: ${{ secrets.MOABB_DEPLOY_KEY_NEW }}
destination_dir: docs/
publish_branch: master
publish_branch: gh-pages
publish_dir: ./docs/build/html
cname: moabb.neurotechx.com/

- name: Deploy on gh-pages
run: |
git config --global user.email "[email protected]"
git config --global user.name "Github Actions"
cd ~/work/moabb/moabb/moabb-ghpages
rm -Rf docs
cp -a ~/work/moabb/moabb/docs/build/html ./docs
git add -A
git commit -m "GH Actions update of GH pages ($GITHUB_RUN_ID - $GITHUB_RUN_NUMBER)"
git push origin gh-pages
cname: neurotechx.github.io/moabb/
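
The restructured docs workflow above hands the built HTML from `build_docs` to the two deploy jobs through a run-scoped cache key, failing fast if the handoff breaks. Reduced to its essentials (step layout condensed from the diff above):

```yaml
# build job: save docs/build under a key unique to this workflow run
- uses: actions/cache@v3
  with:
    path: docs/build
    key: docs-build-${{ github.run_id }}-${{ github.run_attempt }}

# deploy jobs: restore-only, and fail fast if the build was not cached
- uses: actions/cache/restore@v3
  id: cache-docs
  with:
    path: docs/build
    key: docs-build-${{ github.run_id }}-${{ github.run_attempt }}
- name: Check cache hit
  if: steps.cache-docs.outputs.cache-hit != 'true'
  run: exit 1
```

Keying on `run_id`/`run_attempt` ties the cache entry to a single run, so a deploy job can never pick up a stale build from an earlier run.
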
36 changes: 20 additions & 16 deletions .github/workflows/test-braindecode.yml
@@ -1,15 +1,13 @@
name: Test-braindecode

concurrency:
group: ${{ github.workflow }}-${{ github.event.number }}-${{ github.event.ref }}
cancel-in-progress: true


on:
push:
branches: [develop]
branches: [ develop ]
pull_request:
branches: [develop]
branches: [ develop ]

jobs:
test:
@@ -18,8 +16,8 @@ jobs:
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ["3.8"]
os: [ ubuntu-latest ]
python-version: [ "3.8" ]
defaults:
run:
shell: bash
@@ -33,8 +31,21 @@ jobs:
repository: braindecode/braindecode
path: braindecode

- name: Create local data folder
if: runner.os != 'Windows'
run: |
mkdir ~/mne_data
- name: Create/Restore MNE Data Cache
if: runner.os != 'Windows'
id: cache-mne_data
uses: actions/cache@v3
with:
path: ~/mne_data
key: ${{ runner.os }}-mne_data

- name: Setup Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

@@ -44,22 +55,15 @@ jobs:
virtualenvs-create: true
virtualenvs-in-project: true

- name: Create/Restore MNE Data Cache
id: cache-mne_data
uses: actions/cache@v3
with:
path: ~/mne_data
key: ${{ runner.os }}-mne

- name: Load cached venv
if: runner.os != 'Windows'
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
path: .venv
key:
testvenv-${{ matrix.os }}-py${{matrix.python-version}}-${{
hashFiles('**/poetry.lock') }}
testvenv-braindecode-${{ matrix.os }}-py${{matrix.python-version}}-${{
hashFiles('**/pyproject.toml') }}

- name: Install dependencies
if: |
33 changes: 22 additions & 11 deletions .github/workflows/test-devel.yml
@@ -1,15 +1,13 @@
name: Test-devel

concurrency:
group: ${{ github.workflow }}-${{ github.event.number }}-${{ github.event.ref }}
cancel-in-progress: true


on:
push:
branches: [develop]
branches: [ develop ]
pull_request:
branches: [develop]
branches: [ develop ]

jobs:
test:
@@ -18,16 +16,29 @@ jobs:
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest, windows-latest, macOS-latest]
python-version: ["3.9", "3.10"]
os: [ ubuntu-latest, windows-latest, macOS-latest ]
python-version: [ "3.9", "3.10" ]
defaults:
run:
shell: bash
steps:
- uses: actions/checkout@v4

- name: Create local data folder
if: runner.os != 'Windows'
run: |
mkdir ~/mne_data
- name: Create/Restore MNE Data Cache
if: runner.os != 'Windows'
id: cache-mne_data
uses: actions/cache@v3
with:
path: ~/mne_data
key: ${{ runner.os }}-mne_data

- name: Setup Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

@@ -45,21 +56,21 @@
path: .venv
key:
testvenv-${{ matrix.os }}-py${{matrix.python-version}}-${{
hashFiles('**/poetry.lock') }}
hashFiles('**/pyproject.toml') }}

- name: Install dependencies
if: |
(runner.os != 'Windows') &&
(steps.cached-poetry-dependencies.outputs.cache-hit != 'true')
run: poetry install --no-interaction --no-root --extras deeplearning
run: poetry install --no-interaction --no-root --extras deeplearning --extras optuna

- name: Install library (Linux/OSX)
if: ${{ runner.os != 'Windows' }}
run: poetry install --no-interaction --extras deeplearning
run: poetry install --no-interaction --extras deeplearning --extras optuna

- name: Install library (Windows)
if: ${{ runner.os == 'Windows' }}
run: poetry install --no-interaction
run: poetry install --no-interaction --extras optuna

- name: Run tests
run: |