
Release 2.4.0 #181

Merged: 92 commits, Nov 18, 2024
Commits
cfe8253
Replaced local unicycler module with nf-core module.
FloWuenne Jun 26, 2024
24f0cbc
Removed deprecated unicycler parameters from test_dfast.
FloWuenne Jun 27, 2024
2baeb3b
Updated CHANGELOG.
FloWuenne Jun 27, 2024
e922812
Removed no longer needed unicycler local files.
FloWuenne Jun 27, 2024
03f0dc8
Merge pull request #150 from FloWuenne/unicycler_nfcore
FloWuenne Jun 28, 2024
a49930e
fix missing dev tag in bumped version
Daniel-VM Jul 10, 2024
24ec64c
Merge pull request #152 from Daniel-VM/dev
Daniel-VM Jul 10, 2024
04091a3
remove files_unchanged items form linting
Daniel-VM Jul 10, 2024
1bb3278
fix files_unchanged to make a proper linting check
Daniel-VM Jul 10, 2024
b73c536
added changelog #153
Daniel-VM Jul 10, 2024
cc1882b
Merge pull request #153 from Daniel-VM/dev
Daniel-VM Jul 10, 2024
e8442e2
fix kmerfinder script to download reference
Daniel-VM Jul 10, 2024
3bfab15
add more resources to kmerfinder module to avoid memory related issues
Daniel-VM Jul 10, 2024
3e9cf3f
added changelog #154
Daniel-VM Jul 10, 2024
20830af
Merge pull request #154 from Daniel-VM/fix_kmerfinder_scripts
Daniel-VM Aug 28, 2024
084fec4
fixed current zenodo url to kmerfinderdb
Daniel-VM Aug 29, 2024
82adb62
update changelog in #157
Daniel-VM Aug 29, 2024
5f64486
Merge pull request #157 from Daniel-VM/fix_kmerfinderdb
Daniel-VM Aug 30, 2024
04def18
enhance KmerFinder support parsing multiple file format versions
Daniel-VM Aug 29, 2024
9280f89
update kmerfinderdb description
Daniel-VM Aug 29, 2024
b5b0407
update kmerfinder module
Daniel-VM Aug 30, 2024
fd7dd01
update kmerfinder module dir structure
Daniel-VM Aug 30, 2024
c2a94d7
update module paths and handle memory-errors in kmerfinder
Daniel-VM Aug 30, 2024
9e85db0
Merge pull request #159 from Daniel-VM/update_kmerfinder
Daniel-VM Sep 2, 2024
6250ad3
fix memory issues due to config in kmerfinder
Daniel-VM Sep 3, 2024
09a1eb6
Handle KmerFinder results when no species hit is detected
Daniel-VM Sep 3, 2024
ae525c0
handle prokka/bakkta channel when fasta file is empty
Daniel-VM Sep 3, 2024
fcf0ab9
update changelog in #160
Daniel-VM Sep 3, 2024
c66630f
Merge pull request #160 from Daniel-VM/dev
Daniel-VM Sep 3, 2024
41fd8c9
fix saving merged fastq files in fastp module
Daniel-VM Sep 5, 2024
7f438ec
#163
Daniel-VM Sep 5, 2024
f1f915c
extend description on save_merged param
Daniel-VM Sep 6, 2024
6e0d781
Fix CHANGELOG.md #163
Daniel-VM Sep 6, 2024
feaf603
Merge pull request #163 from Daniel-VM/dev
Daniel-VM Sep 6, 2024
488b5ab
append fastqc trim section to multiqc report
Daniel-VM Sep 13, 2024
b308a95
update multiqc config files to arrange preprocessing sections
Daniel-VM Sep 13, 2024
235bf18
update changelog #166
Daniel-VM Sep 13, 2024
7842632
fix prettier #166
Daniel-VM Sep 13, 2024
ed12940
Merge pull request #166 from Daniel-VM/multiqc_after_filtering_qc
Daniel-VM Sep 17, 2024
3045163
remove prams.save_merged from this pipeline
Daniel-VM Sep 10, 2024
8adff80
update CHANGELOG in #167
Daniel-VM Sep 17, 2024
05e784a
Merge pull request #167 from Daniel-VM/dev
Daniel-VM Sep 17, 2024
37488ee
fix wrong metadata in canu input channel
Daniel-VM Sep 17, 2024
a65648e
updated changelog in #168
Daniel-VM Sep 17, 2024
1cb4574
Merge pull request #168 from Daniel-VM/fix_canu_assembler
Daniel-VM Sep 18, 2024
8bf71a8
fixed input channel to minimap2_align
Daniel-VM Sep 17, 2024
663a770
refactor longread polishing and update medaka
Daniel-VM Sep 18, 2024
92bcfde
add condition to run miniasm on long/hybrid mode only
Daniel-VM Sep 18, 2024
fc6c0b9
Refined polishing step by 1-removing short reads; 2- selecting the po…
Daniel-VM Sep 20, 2024
c5c32d5
restricted the polishing step to long reads mode
Daniel-VM Sep 20, 2024
0917bda
update changelog in #169
Daniel-VM Sep 23, 2024
3f6a42d
Merge pull request #169 from Daniel-VM/fix_nanopolish_ont
Daniel-VM Sep 28, 2024
db3eb59
Template update for nf-core/tools version 3.0.0
nf-core-bot Oct 8, 2024
75ef2ec
Template update for nf-core/tools version 3.0.1
nf-core-bot Oct 9, 2024
4a10f2a
Template update for nf-core/tools version 3.0.2
nf-core-bot Oct 11, 2024
6b526ea
Merge branch 'dev' into nf-core-template-merge-3.0.1
Daniel-VM Oct 11, 2024
1cf3bd7
Merge branch 'dev+nf-core-template-merge-3.0.1' into nf-core-template…
Daniel-VM Oct 11, 2024
efbade0
mv fastp to older version due to inconsistency with fastq_trim_fastp_…
Daniel-VM Oct 11, 2024
b86a1a8
update CHANGELOG in #176
Daniel-VM Oct 11, 2024
2083999
Apply suggestions from code review
Daniel-VM Oct 15, 2024
8a8182a
[automated] Fix code linting
nf-core-bot Oct 15, 2024
5fbe68d
Merge pull request #176 from Daniel-VM/nf-core-template-merge-3.0.2
Daniel-VM Oct 15, 2024
cc0ae3a
added module cat_fastq
Daniel-VM Aug 29, 2024
b2ab59c
fix wrong variable naming
Daniel-VM Aug 29, 2024
1f2b944
update changelog #158
Daniel-VM Aug 29, 2024
2df4fca
fixed channel versions and rename variables
Daniel-VM Aug 29, 2024
cb0c18d
align spaces
Daniel-VM Oct 15, 2024
890f44c
fix bakta running only for one sample
Daniel-VM Oct 24, 2024
9e622f1
add changelog in #178
Daniel-VM Oct 24, 2024
10a7df8
include resequenced samples in test profile
Daniel-VM Oct 24, 2024
4ed19cb
remove view operator
Daniel-VM Oct 24, 2024
5a9559c
fix ambigous channel creation
Daniel-VM Oct 25, 2024
064996b
Merge pull request #178 from Daniel-VM/dev
Daniel-VM Oct 25, 2024
2bccce1
fixed matrix.test_name in linting
Daniel-VM Oct 25, 2024
fd2b54e
remove parameter section in git ci
Daniel-VM Oct 25, 2024
efe6309
Merge pull request #158 from Daniel-VM/concatenate_fasqs
Daniel-VM Oct 25, 2024
85bdf85
implement missing features in memory set up from nf-core tools 3.0.2
Daniel-VM Oct 25, 2024
aa42f95
update changelog #179
Daniel-VM Oct 25, 2024
02bf3dd
Merge pull request #179 from Daniel-VM/fix_linting_test_name
Daniel-VM Nov 5, 2024
8434c60
bump version 2.4.0
Daniel-VM Oct 24, 2024
f1809ce
update changelog #180
Daniel-VM Oct 25, 2024
9c95fb7
fix release date in #180
Daniel-VM Nov 5, 2024
067fb0f
Merge pull request #180 from Daniel-VM/bump-version-2.4.0
Daniel-VM Nov 5, 2024
262f741
uncomment required line for linting in --release mode
Daniel-VM Nov 5, 2024
e241ff8
update changelog in #182
Daniel-VM Nov 5, 2024
c476f50
Merge pull request #182 from Daniel-VM/dev
Daniel-VM Nov 6, 2024
ad68b1f
fix dfast conda issue by updating version
Daniel-VM Nov 6, 2024
0f7646a
update changelog in #183
Daniel-VM Nov 6, 2024
fc52c3e
Merge pull request #183 from Daniel-VM/dev
Daniel-VM Nov 6, 2024
bf367e4
added code review from #181
Daniel-VM Nov 18, 2024
cd1bf3e
fix linting in #184
Daniel-VM Nov 18, 2024
72cfd85
Merge pull request #184 from Daniel-VM/dev
Daniel-VM Nov 18, 2024
26 changes: 11 additions & 15 deletions .github/CONTRIBUTING.md
@@ -19,14 +19,18 @@ If you'd like to write some code for nf-core/bacass, the standard workflow is as
1. Check that there isn't already an issue about your idea in the [nf-core/bacass issues](https://github.com/nf-core/bacass/issues) to avoid duplicating work. If there isn't one already, please create one so that others know you're working on this
2. [Fork](https://help.github.com/en/github/getting-started-with-github/fork-a-repo) the [nf-core/bacass repository](https://github.com/nf-core/bacass) to your GitHub account
3. Make the necessary changes / additions within your forked repository following [Pipeline conventions](#pipeline-contribution-conventions)
4. Use `nf-core schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
4. Use `nf-core pipelines schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
5. Submit a Pull Request against the `dev` branch and wait for the code to be reviewed and merged

If you're not used to this workflow with git, you can start with some [docs from GitHub](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests) or even their [excellent `git` resources](https://try.github.io/).

## Tests

You have the option to test your changes locally by running the pipeline. For receiving warnings about process selectors and other `debug` information, it is recommended to use the debug profile.
You have the option to test your changes locally by running the pipeline. For receiving warnings about process selectors and other `debug` information, it is recommended to use the debug profile. Execute all the tests with the following command:

```bash
nf-test test --profile debug,test,docker --verbose
```

When you create a pull request with changes, [GitHub Actions](https://github.com/features/actions) will run automatic tests.
Typically, pull-requests are only fully reviewed when these tests are passing, though of course we can help out before then.
@@ -36,7 +40,7 @@ There are typically two types of tests that run:
### Lint tests

`nf-core` has a [set of guidelines](https://nf-co.re/developers/guidelines) which all pipelines must adhere to.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core lint <pipeline-directory>` command.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core pipelines lint <pipeline-directory>` command.

If any failures or warnings are encountered, please follow the listed URL for more documentation.

@@ -47,14 +51,6 @@ Each `nf-core` pipeline should be set up with a minimal set of test-data.
If there are any failures then the automated tests fail.
These tests are run both with the latest available version of `Nextflow` and also the minimum required version that is stated in the pipeline code.

You can run pipeline tests with the following command:

```bash
nextflow run nf-core/bacass \
-profile <test,test_long,test_hybrid,...>,<docker/singularity/.../institute> \
--outdir <OUTDIR>
```

## Patch

:warning: Only in the unlikely and regretful event of a release happening with a bug.
@@ -79,7 +75,7 @@ If you wish to contribute a new step, please use the following coding standards:
2. Write the process block (see below).
3. Define the output channel if needed (see below).
4. Add any new parameters to `nextflow.config` with a default (see below).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core schema build` tool).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core pipelines schema build` tool).
6. Add sanity checks and validation for all relevant parameters.
7. Perform local tests to validate that the new code works as expected.
8. If applicable, add a new test command in `.github/workflow/ci.yml`.
@@ -90,11 +86,11 @@ If you wish to contribute a new step, please use the following coding standards:

Parameters should be initialised / defined with default values in `nextflow.config` under the `params` scope.

Once there, use `nf-core schema build` to add to `nextflow_schema.json`.
Once there, use `nf-core pipelines schema build` to add to `nextflow_schema.json`.
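As an illustrative sketch only (the parameter names and defaults below are hypothetical, not taken from nf-core/bacass), a new parameter declared under the `params` scope might look like:

```groovy
// nextflow.config — sketch; parameter names and defaults are hypothetical
params {
    // Toggle an optional extra analysis step (hypothetical example)
    extra_polishing = false

    // Path to an optional custom reference database (hypothetical example)
    custom_db       = null
}
```

With the parameter present in `nextflow.config`, `nf-core pipelines schema build` will detect it and prompt you to add matching help text to `nextflow_schema.json`.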

### Default processes resource requirements

Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/master/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/main/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
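For illustration, a minimal `conf/base.config` following that labelled scheme might look like the sketch below; the specific CPU/memory/time figures are placeholders, not the template's exact defaults.

```groovy
// conf/base.config — sketch; resource figures are placeholders
process {
    // Default: a single-core process, scaled on retry
    cpus   = { 1 * task.attempt }
    memory = { 6.GB * task.attempt }
    time   = { 4.h * task.attempt }

    // Standardised nf-core labels step resources up for heavier processes
    withLabel: process_low {
        cpus   = { 2 * task.attempt }
        memory = { 12.GB * task.attempt }
    }
    withLabel: process_medium {
        cpus   = { 6 * task.attempt }
        memory = { 36.GB * task.attempt }
    }
    withLabel: process_high {
        cpus   = { 12 * task.attempt }
        memory = { 72.GB * task.attempt }
    }
}
```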

The process resources can be passed on to the tool dynamically within the process with the `${task.cpus}` and `${task.memory}` variables in the `script:` block.
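A hypothetical process sketch showing that interpolation (the tool name and its flags are invented for illustration, not a real pipeline module):

```groovy
// Hypothetical process — `some_assembler` and its flags are illustrative
process EXAMPLE_ASSEMBLY {
    label 'process_medium'

    input:
    tuple val(meta), path(reads)

    output:
    tuple val(meta), path("*.fasta"), emit: assembly

    script:
    """
    some_assembler \\
        --threads ${task.cpus} \\
        --memory ${task.memory.toGiga()}G \\
        --reads ${reads} \\
        --out ${meta.id}.fasta
    """
}
```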

@@ -107,7 +103,7 @@ Please use the following naming schemes, to make it easy to understand what is g

### Nextflow version bumping

If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core bump-version --nextflow . [min-nf-version]`
If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core pipelines bump-version --nextflow . [min-nf-version]`

### Images and figures

3 changes: 2 additions & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -17,8 +17,9 @@ Learn more about contributing: [CONTRIBUTING.md](https://github.com/nf-core/baca
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the pipeline conventions in the [contribution docs](https://github.com/nf-core/bacass/tree/master/.github/CONTRIBUTING.md)
- [ ] If necessary, also make a PR on the nf-core/bacass _branch_ on the [nf-core/test-datasets](https://github.com/nf-core/test-datasets) repository.
- [ ] Make sure your code lints (`nf-core lint`).
- [ ] Make sure your code lints (`nf-core pipelines lint`).
- [ ] Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- [ ] Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- [ ] Usage Documentation in `docs/usage.md` is updated.
- [ ] Output Documentation in `docs/output.md` is updated.
- [ ] `CHANGELOG.md` is updated.
26 changes: 22 additions & 4 deletions .github/workflows/awsfulltest.yml
@@ -1,18 +1,36 @@
name: nf-core AWS full size tests
# This workflow is triggered on published releases.
# This workflow is triggered on PRs opened against the master branch.
# It can be additionally triggered manually with GitHub actions workflow dispatch button.
# It runs the -profile 'test_full' on AWS batch

on:
release:
types: [published]
pull_request:
branches:
- master
workflow_dispatch:
pull_request_review:
types: [submitted]

jobs:
run-platform:
name: Run AWS full tests
if: github.repository == 'nf-core/bacass'
# run only if the PR is approved by at least 2 reviewers and against the master branch or manually triggered
if: github.repository == 'nf-core/bacass' && github.event.review.state == 'approved' && github.event.pull_request.base.ref == 'master' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
steps:
- uses: octokit/[email protected]
id: check_approvals
if: github.event_name != 'workflow_dispatch'
with:
route: GET /repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/reviews
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- id: test_variables
if: github.event_name != 'workflow_dispatch'
run: |
JSON_RESPONSE='${{ steps.check_approvals.outputs.data }}'
CURRENT_APPROVALS_COUNT=$(echo $JSON_RESPONSE | jq -c '[.[] | select(.state | contains("APPROVED")) ] | length')
test $CURRENT_APPROVALS_COUNT -ge 2 || exit 1 # At least 2 approvals are required
- name: Launch workflow via Seqera Platform
uses: seqeralabs/action-tower-launch@v2
# Add full size test data (but still relatively small datasets for few samples)
81 changes: 50 additions & 31 deletions .github/workflows/ci.yml
@@ -7,65 +7,84 @@ on:
pull_request:
release:
types: [published]
workflow_dispatch:

env:
NXF_ANSI_LOG: false
NXF_SINGULARITY_CACHEDIR: ${{ github.workspace }}/.singularity
NXF_SINGULARITY_LIBRARYDIR: ${{ github.workspace }}/.singularity

concurrency:
group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
cancel-in-progress: true

jobs:
test:
name: Run pipeline with test data
name: "Run pipeline with test data (${{ matrix.NXF_VER }} | ${{ matrix.test_name }} | ${{ matrix.profile }})"
# Only run on push if this is the nf-core dev branch (merged PRs)
if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/bacass') }}"
runs-on: ubuntu-latest
strategy:
matrix:
NXF_VER:
- "23.04.0"
- "24.04.2"
- "latest-everything"
profile:
- "conda"
- "docker"
- "singularity"
test_name:
- "test"
- "test_long"
- "test_long_miniasm"
- "test_hybrid"
- "test_dfast"
isMaster:
- ${{ github.base_ref == 'master' }}
# Exclude conda and singularity on dev
exclude:
- isMaster: false
profile: "conda"
- isMaster: false
profile: "singularity"

steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4

- name: Install Nextflow
- name: Set up Nextflow
uses: nf-core/setup-nextflow@v2
with:
version: "${{ matrix.NXF_VER }}"

- name: Disk space cleanup
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1
- name: Set up Apptainer
if: matrix.profile == 'singularity'
uses: eWaterCycle/setup-apptainer@main

- name: Run pipeline with test data
# For example: adding multiple test runs with different parameters
# Remember that you can parallelise this by using strategy.matrix
- name: Set up Singularity
if: matrix.profile == 'singularity'
run: |
nextflow run ${GITHUB_WORKSPACE} -profile test,docker --outdir results
mkdir -p $NXF_SINGULARITY_CACHEDIR
mkdir -p $NXF_SINGULARITY_LIBRARYDIR

profiles:
name: Run workflow profile
# Only run on push if this is the nf-core dev branch (merged PRs)
if: ${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/bacass') }}
runs-on: ubuntu-latest
env:
NXF_VER: "23.04.0"
NXF_ANSI_LOG: false
strategy:
matrix:
# Run remaining test profiles with minimum nextflow version
profile: [test_long_miniasm, test_hybrid, test_long, test_dfast, test_hybrid_dragonflye]
steps:
- name: Check out pipeline code
uses: actions/checkout@v2
- name: Set up Miniconda
if: matrix.profile == 'conda'
uses: conda-incubator/setup-miniconda@a4260408e20b96e80095f42ff7f1a15b27dd94ca # v3
with:
miniconda-version: "latest"
auto-update-conda: true
conda-solver: libmamba
channels: conda-forge,bioconda

- name: Install Nextflow
env:
CAPSULE_LOG: none
- name: Set up Conda
if: matrix.profile == 'conda'
run: |
wget -qO- get.nextflow.io | bash
sudo mv nextflow /usr/local/bin/
- name: Run pipeline with ${{ matrix.profile }} test profile
echo $(realpath $CONDA)/condabin >> $GITHUB_PATH
echo $(realpath python) >> $GITHUB_PATH

- name: Clean up Disk space
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1

- name: "Run pipeline with test data ${{ matrix.NXF_VER }} | ${{ matrix.test_name }} | ${{ matrix.profile }}"
run: |
nextflow run ${GITHUB_WORKSPACE} -profile ${{ matrix.profile }},docker --outdir results
nextflow run ${GITHUB_WORKSPACE} -profile ${{ matrix.test_name }},${{ matrix.profile }} --outdir ./results
53 changes: 43 additions & 10 deletions .github/workflows/download_pipeline.yml
@@ -1,4 +1,4 @@
name: Test successful pipeline download with 'nf-core download'
name: Test successful pipeline download with 'nf-core pipelines download'

# Run the workflow when:
# - dispatched manually
@@ -8,7 +8,7 @@ on:
workflow_dispatch:
inputs:
testbranch:
description: "The specific branch you wish to utilize for the test execution of nf-core download."
description: "The specific branch you wish to utilize for the test execution of nf-core pipelines download."
required: true
default: "dev"
pull_request:
@@ -39,9 +39,11 @@ jobs:
with:
python-version: "3.12"
architecture: "x64"
- uses: eWaterCycle/setup-singularity@931d4e31109e875b13309ae1d07c70ca8fbc8537 # v7

- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@4bb22c52d4f63406c49e94c804632975787312b3 # v2.0.0
with:
singularity-version: 3.8.3
apptainer-version: 1.3.4

- name: Install dependencies
run: |
@@ -54,33 +56,64 @@
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> ${GITHUB_ENV}
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> ${GITHUB_ENV}

- name: Make a cache directory for the container images
run: |
mkdir -p ./singularity_container_images

- name: Download the pipeline
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
run: |
nf-core download ${{ env.REPO_LOWERCASE }} \
nf-core pipelines download ${{ env.REPO_LOWERCASE }} \
--revision ${{ env.REPO_BRANCH }} \
--outdir ./${{ env.REPOTITLE_LOWERCASE }} \
--compress "none" \
--container-system 'singularity' \
--container-library "quay.io" -l "docker.io" -l "ghcr.io" \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io" \
--container-cache-utilisation 'amend' \
--download-configuration
--download-configuration 'yes'

- name: Inspect download
run: tree ./${{ env.REPOTITLE_LOWERCASE }}

- name: Count the downloaded number of container images
id: count_initial
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Initial container image count: $image_count"
echo "IMAGE_COUNT_INITIAL=$image_count" >> ${GITHUB_ENV}

- name: Run the downloaded pipeline (stub)
id: stub_run_pipeline
continue-on-error: true
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
- name: Run the downloaded pipeline (stub run not supported)
id: run_pipeline
if: ${{ job.steps.stub_run_pipeline.status == failure() }}
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -profile test,singularity --outdir ./results

- name: Count the downloaded number of container images
id: count_afterwards
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Post-pipeline run container image count: $image_count"
echo "IMAGE_COUNT_AFTER=$image_count" >> ${GITHUB_ENV}

- name: Compare container image counts
run: |
if [ "${{ env.IMAGE_COUNT_INITIAL }}" -ne "${{ env.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ env.IMAGE_COUNT_INITIAL }}
final_count=${{ env.IMAGE_COUNT_AFTER }}
difference=$((final_count - initial_count))
echo "$difference additional container images were \n downloaded at runtime . The pipeline has no support for offline runs!"
tree ./singularity_container_images
exit 1
else
echo "The pipeline can be downloaded successfully!"
fi
23 changes: 19 additions & 4 deletions .github/workflows/linting.yml
@@ -1,6 +1,6 @@
name: nf-core linting
# This workflow is triggered on pushes and PRs to the repository.
# It runs the `nf-core lint` and markdown lint tests to ensure
# It runs the `nf-core pipelines lint` and markdown lint tests to ensure
# that the code meets the nf-core guidelines.
on:
push:
@@ -41,17 +41,32 @@ jobs:
python-version: "3.12"
architecture: "x64"

- name: read .nf-core.yml
uses: pietrobolcato/[email protected]
id: read_yml
with:
config: ${{ github.workspace }}/.nf-core.yml

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install nf-core
pip install nf-core==${{ steps.read_yml.outputs['nf_core_version'] }}

- name: Run nf-core pipelines lint
if: ${{ github.base_ref != 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt pipelines lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Run nf-core lint
- name: Run nf-core pipelines lint --release
if: ${{ github.base_ref == 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md
run: nf-core -l lint_log.txt pipelines lint --release --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Save PR number
if: ${{ always() }}