diff --git a/sites/docs/src/content/api_reference/3.1.1/_static/autodoc_pydantic.css b/sites/docs/src/content/api_reference/3.1.1/_static/autodoc_pydantic.css new file mode 100644 index 0000000000..994a3e548d --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/_static/autodoc_pydantic.css @@ -0,0 +1,11 @@ +.autodoc_pydantic_validator_arrow { + padding-left: 8px; + } + +.autodoc_pydantic_collapsable_json { + cursor: pointer; + } + +.autodoc_pydantic_collapsable_erd { + cursor: pointer; + } \ No newline at end of file diff --git a/sites/docs/src/content/api_reference/3.1.1/api/index.md b/sites/docs/src/content/api_reference/3.1.1/api/index.md new file mode 100644 index 0000000000..561dfc555d --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/index.md @@ -0,0 +1,11 @@ +# nf-core/tools documentation + +This API documentation is for the [`nf-core/tools`](https://github.com/nf-core/tools) package. + +## Contents + +- [Pipeline commands]() (run by `nf-core pipelines lint`) +- [Module commands]() (run by `nf-core modules lint`) +- [Subworkflow commands]() (run by `nf-core subworkflows lint`) +- [nf-core/tools Python package API reference]() + - [nf-core/tools pipeline commands API reference]() diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/bump_version.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/bump_version.md new file mode 100644 index 0000000000..eefda70eb5 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/bump_version.md @@ -0,0 +1,60 @@ +# nf_core.pipelines.bump_version + +Bumps the version number in all appropriate files for +a nf-core pipeline. + +### `nf_core.pipelines.bump_version.bump_nextflow_version(pipeline_obj:{:python}`[`Pipeline{:python}`](../utils#nf_core.utils.Pipeline)`, new_version: str) → None{:python}` + +Bumps the required Nextflow version number of a pipeline. 
+ +- **Parameters:** + - **pipeline_obj** ([_nf_core.utils.Pipeline_](../utils#nf_core.utils.Pipeline)) – A Pipeline object that holds information + about the pipeline contents and build files. + - **new_version** (_str_) – The new version tag for the required Nextflow version. + +### `nf_core.pipelines.bump_version.bump_pipeline_version(pipeline_obj:{:python}`[`Pipeline{:python}`](../utils#nf_core.utils.Pipeline)`, new_version: str) → None{:python}` + +Bumps a pipeline version number. + +- **Parameters:** + - **pipeline_obj** ([_nf_core.utils.Pipeline_](../utils#nf_core.utils.Pipeline)) – A Pipeline object that holds information + about the pipeline contents and build files. + - **new_version** (_str_) – The new version tag for the pipeline. Semantic versioning only. + +### `nf_core.pipelines.bump_version.handle_error(message: str, required: bool){:python}` + +### `nf_core.pipelines.bump_version.log_change(old_content: str, new_content: str){:python}` + +### `nf_core.pipelines.bump_version.update_file_version(filename: str | Path, pipeline_obj:{:python}`[`Pipeline{:python}`](../utils#nf_core.utils.Pipeline)`, patterns: List[Tuple[str, str]], required: bool = True, yaml_key: List[str] | None = None) → None{:python}` + +Updates a file with a new version number. + +- **Parameters:** + - **filename** (_str_) – The name of the file to update. + - **pipeline_obj** ([_nf_core.utils.Pipeline_](../utils#nf_core.utils.Pipeline)) – A Pipeline object that holds information + about the pipeline contents. + - **patterns** (_List_ \*\[\*_Tuple_ \*\[\*_str_ _,_ _str_ _]_ _]_) – A list of tuples containing the regex patterns to + match and the replacement strings. + - **required** (_bool_ _,_ _optional_) – Whether the file is required to exist. Defaults to True. + - **yaml_key** (_Optional_ \*\[\*_List_ \*\[\*_str_ _]_ _]_ _,_ _optional_) – The YAML key to update. Defaults to None. 
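+
+The `patterns` argument of `update_file_version` is a list of `(regex, replacement)` tuples applied to the file contents. A minimal sketch of that convention — using a hypothetical `apply_version_patterns` helper, not the actual nf-core implementation:
+
+```python
+import re
+
+# Hypothetical helper illustrating the (pattern, replacement) tuples that
+# update_file_version expects -- this is NOT the nf-core implementation.
+def apply_version_patterns(content: str, patterns: list[tuple[str, str]]) -> str:
+    """Apply each regex/replacement pair in turn to the file content."""
+    for pattern, replacement in patterns:
+        content = re.sub(pattern, replacement, content)
+    return content
+
+# Bump a Nextflow manifest version from 1.0.0 to 1.1.0.
+config = "manifest {\n    version = '1.0.0'\n}"
+patterns = [(r"version\s*=\s*'1\.0\.0'", "version = '1.1.0'")]
+print(apply_version_patterns(config, patterns))
+```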
+ +### `nf_core.pipelines.bump_version.update_text_file(fn: Path, patterns: List[Tuple[str, str]], required: bool){:python}` + +Updates a text file with a new version number. + +- **Parameters:** + - **fn** (_Path_) – The name of the file to update. + - **patterns** (_List_ \*\[\*_Tuple_ \*\[\*_str_ _,_ _str_ _]_ _]_) – A list of tuples containing the regex patterns to + match and the replacement strings. + - **required** (_bool_) – Whether the file is required to exist. + +### `nf_core.pipelines.bump_version.update_yaml_file(fn: Path, patterns: List[Tuple[str, str]], yaml_key: List[str], required: bool){:python}` + +Updates a YAML file with a new version number. + +- **Parameters:** + - **fn** (_Path_) – The name of the file to update. + - **patterns** (_List_ \*\[\*_Tuple_ \*\[\*_str_ _,_ _str_ _]_ _]_) – A list of tuples containing the regex patterns to + match and the replacement strings. + - **yaml_key** (_List_ \*\[\*_str_ _]_) – The YAML key to update. + - **required** (_bool_) – Whether the file is required to exist. diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/create.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/create.md new file mode 100644 index 0000000000..598e2805ce --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/create.md @@ -0,0 +1,75 @@ +# nf_core.pipelines.create + +A Textual app to create a pipeline. + +### _`class{:python}`_`nf_core.pipelines.create.PipelineCreateApp(driver_class: Type[Driver] | None = None, css_path: str | PurePath | List[str | PurePath] | None = None, watch_css: bool = False){:python}` + +Bases: `App`\[`CreateConfig`] + +A Textual app to manage stopwatches. + +#### `BINDINGS{:python}`_: ClassVar\[list\[BindingType]]_ _= \[('d', 'toggle_dark', 'Toggle dark mode'), ('q', 'quit', 'Quit'), ('a', 'toggle_all', 'Toggle all')]_ + +#### `CSS_PATH{:python}`_: ClassVar\[CSSPathType | None]_ _= 'create.tcss'_ + +File paths to load CSS from. 
+ +#### `LOGGING_STATE{:python}`_= None_ + +#### `LOG_HANDLER{:python}`_= \_ + +#### `NFCORE_PIPELINE{:python}`_= True_ + +#### `SCREENS{:python}`_: ClassVar\[dict\[str, Screen\[Any] | Callable\[\[], Screen\[Any]]]]_ _= {'basic_details': BasicDetails(), 'choose_type': ChoosePipelineType(), 'final_details': FinalDetails(), 'github_exit': GithubExit(), 'github_repo': GithubRepo(), 'github_repo_question': GithubRepoQuestion(), 'logging': LoggingScreen(), 'type_custom': CustomPipeline(), 'type_nfcore': NfcorePipeline(), 'welcome': WelcomeScreen()}_ + +Screens associated with the app for the lifetime of the app. + +#### `SUB_TITLE{:python}`_: str | None_ _= 'Create a new pipeline with the nf-core pipeline template'_ + +A class variable to set the default sub-title for the application. + +To update the sub-title while the app is running, you can set the \[sub_title]\[textual.app.App.sub_title] attribute. +See also \[the Screen.SUB_TITLE attribute]\[textual.screen.Screen.SUB_TITLE]. + +#### `TEMPLATE_CONFIG{:python}`_= CreateConfig(org=None, name=None, description=None, author=None, version=None, force=True, outdir=None, skip_features=None, is_nfcore=None)_ + +#### `TITLE{:python}`_: str | None_ _= 'nf-core pipelines create'_ + +A class variable to set the _default_ title for the application. + +To update the title while the app is running, you can set the \[title]\[textual.app.App.title] attribute. +See also \[the Screen.TITLE attribute]\[textual.screen.Screen.TITLE]. 
+ +#### `_computes{:python}`_: ClassVar\[frozenset\[str]]_ _= frozenset({})_ + +#### `_css_type_name{:python}`_: str_ _= 'PipelineCreateApp'_ + +#### `_css_type_names{:python}`_: ClassVar\[frozenset\[str]]_ _= frozenset({'App', 'DOMNode', 'PipelineCreateApp'})_ + +#### `_decorated_handlers{:python}`_: dict\[type\[Message], list\[tuple\[Callable, str | None]]]_ _= {}_ + +#### `_inherit_bindings{:python}`_: ClassVar\[bool]_ _= True_ + +#### `_inherit_component_classes{:python}`_: ClassVar\[bool]_ _= True_ + +#### `_inherit_css{:python}`_: ClassVar\[bool]_ _= True_ + +#### `_merged_bindings{:python}`_: ClassVar\[\_Bindings | None]_ _= \_Bindings({'ctrl+c': Binding(key='ctrl+c', action='quit', description='Quit', show=False, key_display=None, priority=True), 'ctrl+backslash': Binding(key='ctrl+backslash', action='command_palette', description='', show=False, key_display=None, priority=True), 'd': Binding(key='d', action='toggle_dark', description='Toggle dark mode', show=True, key_display=None, priority=False), 'q': Binding(key='q', action='quit', description='Quit', show=True, key_display=None, priority=False), 'a': Binding(key='a', action='toggle_all', description='Toggle all', show=True, key_display=None, priority=False)})_ + +#### `_reactives{:python}`_: ClassVar\[dict\[str, Reactive]]_ _= {'ansi_theme_dark': Reactive(\, layout=False, repaint=True, init=False, always_update=False, compute=True, recompose=False), 'ansi_theme_light': Reactive(\, layout=False, repaint=True, init=False, always_update=False, compute=True, recompose=False), 'app_focus': Reactive(True, layout=False, repaint=True, init=False, always_update=False, compute=False, recompose=False), 'dark': Reactive(True, layout=False, repaint=True, init=False, always_update=False, compute=False, recompose=False), 'sub_title': Reactive('', layout=False, repaint=True, init=False, always_update=False, compute=False, recompose=False), 'title': Reactive('', layout=False, repaint=True, init=False, 
always_update=False, compute=False, recompose=False)}_ + +#### `action_toggle_all() → None{:python}` + +An action to toggle all Switches. + +#### `action_toggle_dark() → None{:python}` + +An action to toggle dark mode. + +#### `on_button_pressed(event: Pressed) → None{:python}` + +Handle all button pressed events. + +#### `on_mount() → None{:python}` + +#### `template_features_yml{:python}`_= {'adaptivecard': {'custom_pipelines': True, 'description': 'Enable pipeline status update messages through Microsoft Teams', 'help_text': 'This adds an Adaptive Card. A snippets of user interface.\nThis Adaptive Card is used as a template for pipeline update messages and it is compatible with Microsoft Teams.\n', 'linting': {'files_unchanged': \['.prettierignore']}, 'nfcore_pipelines': False, 'short_description': 'Support Microsoft Teams notifications', 'skippable_paths': \['assets/adaptivecard.json']}, 'changelog': {'custom_pipelines': True, 'description': 'Add a CHANGELOG.md file.', 'help_text': 'Having a \`CHANGELOG.md\` file in the pipeline root directory is useful to track the changes added to each version.\n\nYou can read more information on the recommended format here: https://keepachangelog.com/en/1.0.0/\n', 'linting': {'files_exist': \['CHANGELOG.md']}, 'nfcore_pipelines': False, 'short_description': 'Add a changelog', 'skippable_paths': \['CHANGELOG.md']}, 'ci': {'custom_pipelines': True, 'description': 'The pipeline will include several GitHub actions for Continuous Integration (CI) testing', 'help_text': 'Nf-core provides a set of Continuous Integration (CI) tests for Github.\nWhen you open a pull request (PR) on your pipeline repository, these tests will run automatically.\n\nThere are different types of tests:\n\* Linting tests check that your code is formatted correctly and that it adheres to nf-core standards\n For code linting they will use [prettier](https://prettier.io/).\n\* Pipeline tests run your pipeline on a small dataset to check that it works\n These 
tests are run with a small test dataset on GitHub and a larger test dataset on AWS\n\* Marking old issues as stale\n', 'linting': {'files_exist': \['.github/workflows/branch.yml', '.github/workflows/ci.yml', '.github/workflows/linting_comment.yml', '.github/workflows/linting.yml']}, 'nfcore_pipelines': False, 'short_description': 'Add Github CI tests', 'skippable_paths': \['.github/workflows/']}, 'citations': {'custom_pipelines': True, 'description': 'Include pipeline tools citations in CITATIONS.md and a method description in the MultiQC report (if enabled).', 'help_text': 'If adding citations, the pipeline template will contain a \`CITATIONS.md\` file to add the citations of all tools used in the pipeline.\n\nAdditionally, it will include a YAML file (\`assets/methods_description_template.yml\`) to add a Materials & Methods section describing the tools used in the pieline,\nand the logics to add this section to the output MultiQC report (if the report is generated).\n', 'linting': {'files_exist': \['CITATIONS.md']}, 'nfcore_pipelines': False, 'short_description': 'Include citations', 'skippable_paths': \['assets/methods_description_template.yml', 'CITATIONS.md']}, 'code_linters': {'custom_pipelines': True, 'description': 'The pipeline will include code linters and CI tests to lint your code: pre-commit, editor-config and prettier.', 'help_text': 'Pipelines include code linters to check the formatting of your code in order to harmonize code styles between developers.\nLinters will check all non-ignored files, e.g., JSON, YAML, Nextlow or Python files in your repository.\nThe available code linters are:\n\n- pre-commit (https://pre-commit.com/): used to run all code-linters on every PR and on ever commit if you run \`pre-commit install\` to install it in your local repository.\n- editor-config (https://github.com/editorconfig-checker/editorconfig-checker): checks rules such as indentation or trailing spaces.\n- prettier (https://github.com/prettier/prettier): 
enforces a consistent style (indentation, quoting, line length, etc).\n', 'linting': {'files_exist': \['.editorconfig', '.prettierignore', '.prettierrc.yml']}, 'nfcore_pipelines': False, 'short_description': 'Use code linters', 'skippable_paths': \['.editorconfig', '.pre-commit-config.yaml', '.prettierignore', '.prettierrc.yml', '.github/workflows/fix-linting.yml']}, 'codespaces': {'custom_pipelines': True, 'description': 'The pipeline will include a devcontainer configuration for GitHub Codespaces, providing a development environment with nf-core/tools and Nextflow installed.', 'help_text': 'The pipeline will include a devcontainer configuration.\nThe devcontainer will create a GitHub Codespaces for Nextflow development with nf-core/tools and Nextflow installed.\n\nGithub Codespaces (https://github.com/features/codespaces) is an online developer environment that runs in your browser, complete with VSCode and a terminal.\n', 'linting': {'files_unchanged': \['.github/CONTRIBUTING.md']}, 'nfcore_pipelines': False, 'short_description': 'Include GitHub Codespaces', 'skippable_paths': \['.devcontainer/devcontainer.json']}, 'documentation': {'custom_pipelines': True, 'description': 'Add documentation to the pipeline', 'help_text': 'This will add documentation markdown files where you can describe your pipeline.\nIt includes:\n- docs/README.md: A README file where you can describe the structure of your documentation.\n- docs/output.md: A file where you can explain the output generated by the pipeline\n- docs/usage.md: A file where you can explain the usage of the pipeline and its parameters.\n\nThese files come with an exemplary documentation structure written.\n', 'linting': {'files_exist': \['docs/output.md', 'docs/README.md', 'docs/usage.md']}, 'nfcore_pipelines': False, 'short_description': 'Add documentation', 'skippable_paths': \['docs']}, 'email': {'custom_pipelines': True, 'description': 'Enable sending emails on pipeline completion.', 'help_text': 'Enable the 
option of sending an email which will include pipeline execution reports on pipeline completion.\n', 'linting': {'files_exist': \['assets/email_template.html', 'assets/sendmail_template.txt', 'assets/email_template.txt'], 'files_unchanged': \['.prettierignore']}, 'nfcore_pipelines': False, 'short_description': 'Enable email updates', 'skippable_paths': \['assets/email_template.html', 'assets/sendmail_template.txt', 'assets/email_template.txt']}, 'fastqc': {'custom_pipelines': True, 'description': 'The pipeline will include the FastQC module which performs quality control analysis of input FASTQ files.', 'help_text': 'FastQC is a tool which provides quality control checks on raw sequencing data.\nThe pipeline will include the FastQC module.\n', 'nfcore_pipelines': True, 'short_description': 'Use fastqc', 'skippable_paths': \['modules/nf-core/fastqc/']}, 'github': {'custom_pipelines': True, 'description': 'Create a GitHub repository for the pipeline.', 'help_text': 'This will create a GitHub repository for the pipeline.\n\nThe repository will include:\n- Continuous Integration (CI) tests\n- Issues and pull requests templates\n\nThe initialisation of a git repository is required to use the nf-core/tools.\nThis means that even if you unselect this option, your pipeline will still contain a \`.git\` directory and \`.gitignore\` file.\n', 'linting': {'files_exist': \['.github/ISSUE_TEMPLATE/bug_report.yml', '.github/ISSUE_TEMPLATE/feature_request.yml', '.github/PULL_REQUEST_TEMPLATE.md', '.github/CONTRIBUTING.md', '.github/.dockstore.yml'], 'files_unchanged': \['.github/ISSUE_TEMPLATE/bug_report.yml', '.github/ISSUE_TEMPLATE/config.yml', '.github/ISSUE_TEMPLATE/feature_request.yml', '.github/PULL_REQUEST_TEMPLATE.md', '.github/workflows/branch.yml', '.github/workflows/linting_comment.yml', '.github/workflows/linting.yml', '.github/CONTRIBUTING.md', '.github/.dockstore.yml'], 'readme': \['nextflow_badge']}, 'nfcore_pipelines': False, 'short_description': 'Use a GitHub 
repository.', 'skippable_paths': \['.github', '.gitattributes']}, 'github_badges': {'custom_pipelines': True, 'description': 'The README.md file of the pipeline will include GitHub badges', 'help_text': 'The pipeline \`README.md\` will include badges for:\n\* AWS CI Tests\n\* Zenodo DOI\n\* Nextflow\n\* Conda\n\* Docker\n\* Singularity\n\* Launching on Nextflow Tower\n', 'linting': {'readme': \['nextflow_badge']}, 'nfcore_pipelines': False, 'short_description': 'Add Github badges', 'skippable_paths': False}, 'gitpod': {'custom_pipelines': True, 'description': 'Include the configuration required to use Gitpod.', 'help_text': 'Gitpod (https://www.gitpod.io/) provides standardized and automated development environments.\n\nIncluding this to your pipeline will provide an environment with the latest version of nf-core/tools installed and all its requirements.\nThis is useful to have all the tools ready for pipeline development.\n', 'nfcore_pipelines': False, 'short_description': 'Include a gitpod environment', 'skippable_paths': \['.gitpod.yml']}, 'igenomes': {'custom_pipelines': True, 'description': 'The pipeline will be configured to use a copy of the most common reference genome files from iGenomes', 'help_text': 'Nf-core pipelines are configured to use a copy of the most common reference genome files.\n\nBy selecting this option, your pipeline will include a configuration file specifying the paths to these files.\n\nThe required code to use these files will also be included in the template.\nWhen the pipeline user provides an appropriate genome key,\nthe pipeline will automatically download the required reference files.\n\nFor more information about reference genomes in nf-core pipelines,\nsee the [nf-core docs](https://nf-co.re/docs/usage/reference_genomes).\n', 'linting': {'files_exist': \['conf/igenomes.config', 'conf/igenomes_ignored.config']}, 'nfcore_pipelines': True, 'short_description': 'Use reference genomes', 'skippable_paths': \['conf/igenomes.config', 
'conf/igenomes_ignored.config']}, 'is_nfcore': {'custom_pipelines': False, 'description': '', 'help_text': '', 'linting': {'files_exist': \['CODE_OF_CONDUCT.md', 'assets/nf-core-{{short\_name}}\_logo_light.png', 'docs/images/nf-core-{{short\_name}}\_logo_light.png', 'docs/images/nf-core-{{short\_name}}\_logo_dark.png', '.github/ISSUE_TEMPLATE/config.yml', '.github/workflows/awstest.yml', '.github/workflows/awsfulltest.yml'], 'files_unchanged': \['CODE_OF_CONDUCT.md', 'assets/nf-core-{{short\_name}}\_logo_light.png', 'docs/images/nf-core-{{short\_name}}\_logo_light.png', 'docs/images/nf-core-{{short\_name}}\_logo_dark.png', '.github/ISSUE_TEMPLATE/bug_report.yml', '.github/CONTRIBUTING.md', '.github/PULL_REQUEST_TEMPLATE.md', 'assets/email_template.txt', 'docs/README.md'], 'multiqc_config': \['report_comment'], 'nextflow_config': \['manifest.name', 'manifest.homePage', 'validation.help.beforeText', 'validation.help.afterText', 'validation.summary.beforeText', 'validation.summary.afterText']}, 'nfcore_pipelines': False, 'short_description': "A custom pipeline which won't be part of the nf-core organisation but be compatible with nf-core/tools.", 'skippable_paths': \['.github/ISSUE_TEMPLATE/config', 'CODE_OF_CONDUCT.md', '.github/workflows/awsfulltest.yml', '.github/workflows/awstest.yml', '.github/workflows/release-announcements.yml']}, 'license': {'custom_pipelines': True, 'description': 'Add the MIT license file.', 'help_text': 'To protect the copyright of the pipeline, you can add a LICENSE file.\nThis option ads the MIT License. 
You can read the conditions here: https://opensource.org/license/MIT\n', 'linting': {'files_exist': \['LICENSE'], 'files_unchanged': \['LICENSE']}, 'nfcore_pipelines': False, 'short_description': 'Add a license File', 'skippable_paths': \['LICENSE']}, 'modules': {'custom_pipelines': True, 'description': 'Include all required files to use nf-core modules and subworkflows', 'help_text': 'It is \*recommended\* to use this feature if you want to use modules and subworkflows in your pipeline.\nThis will add all required files to use nf-core components or any compatible components from private repos by using \`nf-core modules\` and \`nf-core subworkflows\` commands.\n', 'linting': {'base_config': False, 'files_exist': \['conf/base.config', 'conf/modules.config', 'modules.json'], 'modules_config': False, 'modules_json': False, 'nfcore_components': False}, 'nfcore_pipelines': False, 'short_description': 'Use nf-core components', 'skippable_paths': \['conf/base.config', 'conf/modules.config', 'modules.json', 'modules', 'subworkflows']}, 'multiqc': {'custom_pipelines': True, 'description': 'The pipeline will include the MultiQC module which generates an HTML report for quality control.', 'help_text': 'MultiQC is a visualization tool that generates a single HTML report summarising all samples in your project. Most of the pipeline quality control results can be visualised in the report and further statistics are available in the report data directory.\n\nThe pipeline will include the MultiQC module and will have special steps which also allow the software versions to be reported in the MultiQC output for future traceability. 
For more information about how to use MultiQC reports, see http://multiqc.info.\n', 'linting': {'files_exist': \['assets/multiqc_config.yml'], 'files_unchanged': \['.github/CONTRIBUTING.md', 'assets/sendmail_template.txt'], 'multiqc_config': False}, 'nfcore_pipelines': True, 'short_description': 'Use multiqc', 'skippable_paths': \['assets/multiqc_config.yml', 'assets/methods_description_template.yml', 'modules/nf-core/multiqc/']}, 'nf_core_configs': {'custom_pipelines': True, 'description': 'The pipeline will include configuration profiles containing custom parameters required to run nf-core pipelines at different institutions', 'help_text': 'Nf-core has a repository with a collection of configuration profiles.\n\nThose config files define a set of parameters which are specific to compute environments at different Institutions.\nThey can be used within all nf-core pipelines.\nIf you are likely to be running nf-core pipelines regularly it is a good idea to use or create a custom config file for your organisation.\n\nFor more information about nf-core configuration profiles, see the [nf-core/configs repository](https://github.com/nf-core/configs)\n', 'linting': {'files_exist': \['conf/igenomes.config'], 'included_configs': False, 'nextflow_config': \['process.cpus', 'process.memory', 'process.time', 'custom_config', 'params.custom_config_version', 'params.custom_config_base']}, 'nfcore_pipelines': False, 'short_description': 'Add configuration files', 'skippable_paths': False}, 'nf_schema': {'custom_pipelines': True, 'description': 'Use the nf-schema Nextflow plugin for this pipeline.', 'help_text': '[nf-schema](https://nextflow-io.github.io/nf-schema/latest/) is used to validate input parameters based on a JSON schema.\nIt also provides helper functionality to create help messages, get a summary\nof changed parameters and validate and convert a samplesheet to a channel.\n', 'linting': {'files_exist': \['nextflow_schema.json'], 'nextflow_config': False, 
'schema_description': False, 'schema_lint': False, 'schema_params': False}, 'nfcore_pipelines': True, 'short_description': 'Use nf-schema', 'skippable_paths': \['subworkflows/nf-core/utils_nfschema_plugin', 'nextflow_schema.json', 'assets/schema_input.json', 'assets/samplesheet.csv']}, 'rocrate': {'custom_pipelines': True, 'description': 'Add a RO-Crate metadata file to describe the pipeline', 'help_text': 'RO-Crate is a metadata specification to describe research data and software.\nThis will add a \`ro-crate-metadata.json\` file to describe the pipeline.\n', 'linting': {'files_unchanged': \['.prettierignore'], 'files_warn': \['ro-crate-metadata.json']}, 'nfcore_pipelines': False, 'short_description': 'Add RO-Crate metadata', 'skippable_paths': \['ro-crate-metadata.json']}, 'seqera_platform': {'custom_pipelines': True, 'description': 'Add a YAML file to specify which output files to upload when launching a pipeline from the Seqera Platform', 'help_text': 'When launching a pipeline with the Seqera Platform, a \`tower.yml\` file can be used to add configuration options.\n\nIn the pipeline template, this file is used to specify the output files of you pipeline which will be shown on the reports tab of Seqera Platform.\nYou can extend this file adding any other desired configuration.\n', 'nfcore_pipelines': False, 'short_description': 'Add Seqera Platform output', 'skippable_paths': \['tower.yml']}, 'slackreport': {'custom_pipelines': True, 'description': 'Enable pipeline status update messages through Slack', 'help_text': 'This adds an JSON template used as a template for pipeline update messages in Slack.\n', 'linting': {'files_unchanged': \['.prettierignore']}, 'nfcore_pipelines': False, 'short_description': 'Support Slack notifications', 'skippable_paths': \['assets/slackreport.json']}, 'test_config': {'custom_pipelines': True, 'description': 'Add two default testing profiles', 'help_text': 'This will add two default testing profiles to run the pipeline with 
different inputs.\nYou can customise them and add other test profiles.\n\nThese profiles can be used to run the pipeline with a minimal testing dataset with \`nextflow run \ -profile test\`.\n\nThe pipeline will include two profiles: \`test\` and \`test_full\`.\nIn nf-core, we typically use the \`test\` profile to run the pipeline with a minimal dataset and the \`test_full\` to run the pipeline with a larger dataset that simulates a real-world scenario.\n', 'linting': {'files_exist': \['conf/test.config', 'conf/test_full.config', '.github/workflows/ci.yml'], 'files_unchanged': \['.github/CONTRIBUTING.md', '.github/PULL_REQUEST_TEMPLATE.md'], 'nextflow_config': False}, 'nfcore_pipelines': False, 'short_description': 'Add testing profiles', 'skippable_paths': \['conf/test.config', 'conf/test_full.config', '.github/workflows/awsfulltest.yml', '.github/workflows/awstest.yml', '.github/workflows/ci.yml']}, 'vscode': {'custom_pipelines': True, 'description': 'Add a VSCode configuration to render website admonitions', 'help_text': 'This will add a VSCode configuration file to render the admonitions in markdown files with the same style as the nf-core website.\n\nAdds the \`.vscode\` directory to the pipelinerepository.\n', 'nfcore_pipelines': False, 'short_description': 'Render website admonitions in VSCode', 'skippable_paths': \['.vscode']}}_ diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/download.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/download.md new file mode 100644 index 0000000000..1f13663d29 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/download.md @@ -0,0 +1,388 @@ +# nf_core.pipelines.download + +Downloads a nf-core pipeline to the local file system. 
+ +### _`exception{:python}`_`nf_core.pipelines.download.ContainerError(container, registry, address, absolute_URI, out_path, singularity_command, error_msg){:python}` + +Bases: `Exception` + +A class of errors related to pulling containers with Singularity/Apptainer. + +#### _`exception{:python}`_`ImageExistsError(error_log){:python}` + +Bases: `FileExistsError` + +The image already exists in the cache/output directory. + +#### _`exception{:python}`_`ImageNotFoundError(error_log){:python}` + +Bases: `FileNotFoundError` + +The image cannot be found in the registry. + +#### _`exception{:python}`_`InvalidTagError(error_log){:python}` + +Bases: `AttributeError` + +The image and registry are valid, but the (version) tag is not. + +#### _`exception{:python}`_`NoSingularityContainerError(error_log){:python}` + +Bases: `RuntimeError` + +The container image is not in the native Singularity Image Format. + +#### _`exception{:python}`_`OtherError(error_log){:python}` + +Bases: `RuntimeError` + +Undefined error with the container. + +#### _`exception{:python}`_`RegistryNotFoundError(error_log){:python}` + +Bases: `ConnectionRefusedError` + +The specified registry does not resolve to a valid IP address. + +### _`exception{:python}`_`nf_core.pipelines.download.DownloadError{:python}` + +Bases: `RuntimeError` + +A custom exception raised when nf-core pipelines download encounters a problem that was already anticipated. +In this case, we do not want to print the traceback, but give the user concise, helpful feedback instead.
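+
+The point of `DownloadError` is that anticipated failures are reported without a traceback. A minimal sketch of that convention, with a local stand-in class so the snippet runs without nf-core installed:
+
+```python
+# Local stand-in for nf_core.pipelines.download.DownloadError, so this
+# sketch is self-contained.
+class DownloadError(RuntimeError):
+    """Raised for problems the download code has already anticipated."""
+
+def download(pipeline: str) -> None:
+    # Hypothetical failure an nf-core download run might anticipate.
+    raise DownloadError(f"Not able to find pipeline '{pipeline}'")
+
+try:
+    download("no-such-pipeline")
+except DownloadError as err:
+    # Concise feedback for the user, no traceback printed.
+    print(f"ERROR: {err}")
+```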
+ +### _`class{:python}`_`nf_core.pipelines.download.DownloadProgress(*columns: str | ProgressColumn, console: Console | None = None, auto_refresh: bool = True, refresh_per_second: float = 10, speed_estimate_period: float = 30.0, transient: bool = False, redirect_stdout: bool = True, redirect_stderr: bool = True, get_time: Callable[[], float] | None = None, disable: bool = False, expand: bool = False){:python}` + +Bases: `Progress` + +Custom Progress bar class, allowing us to have two progress +bars with different columns / layouts. + +#### `get_renderables(){:python}` + +Get a number of renderables for the progress display. + +### _`class{:python}`_`nf_core.pipelines.download.DownloadWorkflow(pipeline=None, revision=None, outdir=None, compress_type=None, force=False, platform=False, download_configuration=None, additional_tags=None, container_system=None, container_library=None, container_cache_utilisation=None, container_cache_index=None, parallel_downloads=4){:python}` + +Bases: `object` + +Downloads a nf-core workflow from GitHub to the local file system. + +Can also download its Singularity container image if required. + +- **Parameters:** + - **pipeline** (_str_) – A nf-core pipeline name. + - **revision** (_List_ \*\[\*_str_ _]_) – The workflow revision(s) to download, like 1.0 or dev . Defaults to None. + - **outdir** (_str_) – Path to the local download directory. Defaults to None. + - **compress_type** (_str_) – Type of compression for the downloaded files. Defaults to None. + - **force** (_bool_) – Flag to force download even if files already exist (overwrite existing files). Defaults to False. + - **platform** (_bool_) – Flag to customize the download for Seqera Platform (convert to git bare repo). Defaults to False. + - **download_configuration** (_str_) – Download the configuration files from nf-core/configs. Defaults to None. + - **tag** (_List_ \*\[\*_str_ _]_) – Specify additional tags to add to the downloaded pipeline. Defaults to None. 
+ - **container_system** (_str_) – The container system to use (e.g., “singularity”). Defaults to None. + - **container_library** (_List_ \*\[\*_str_ _]_) – The container libraries (registries) to use. Defaults to None. + - **container_cache_utilisation** (_str_) – If a local or remote cache of already existing container images should be considered. Defaults to None. + - **container_cache_index** (_str_) – An index for the remote container cache. Defaults to None. + - **parallel_downloads** (_int_) – The number of parallel downloads to use. Defaults to 4. + +#### `compress_download() → None{:python}` + +Take the downloaded files and make a compressed .tar.gz archive. + +#### `download_configs(){:python}` + +Downloads the centralised config profiles from nf-core/configs to `self.outdir`. + +#### `download_wf_files(revision, wf_sha, download_url){:python}` + +Downloads workflow files from GitHub to the `self.outdir`. + +#### `download_workflow(){:python}` + +Starts a nf-core workflow download. + +#### `download_workflow_platform(location=None){:python}` + +Create a bare-cloned git repository of the workflow, so it can be launched with tw launch as pipeline + +#### `download_workflow_static(){:python}` + +Downloads a nf-core workflow from GitHub to the local file system in a self-contained manner. + +#### `find_container_images(workflow_directory: str) → None{:python}` + +Find container image names for workflow. + +Starts by using nextflow config to pull out any process.container +declarations. This works for DSL1. It should return a simple string with resolved logic, +but not always, e.g. not for differentialabundance 1.2.0 + +Second, we look for DSL2 containers. These can’t be found with +nextflow config at the time of writing, so we scrape the pipeline files. +This returns raw matches that will likely need to be cleaned. 
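+
+As the docstring above describes, DSL2 `container` directives are found by scraping pipeline files rather than via `nextflow config`. A simplified sketch of that scraping idea — the regex is illustrative, not the one nf-core/tools actually uses:
+
+```python
+import re
+
+# Illustrative regex for quoted arguments of DSL2 `container` directives.
+CONTAINER_RE = re.compile(r"container\s+['\"]([^'\"]+)['\"]")
+
+def scrape_containers(module_source: str) -> list[str]:
+    """Return the raw container strings found in a module file."""
+    return CONTAINER_RE.findall(module_source)
+
+module_nf = """
+process FASTQC {
+    container 'biocontainers/fastqc:0.12.1--hdfd78af_0'
+}
+"""
+print(scrape_containers(module_nf))
+```
+
+As noted above, these raw matches will typically still need cleaning before they can be downloaded.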
+
+#### `gather_registries(workflow_directory: str) → None{:python}`
+
+Fetch the registries from the pipeline config and CLI arguments and store them in a set.
+This is needed to symlink downloaded container images so Nextflow will find them.
+
+#### `get_revision_hash(){:python}`
+
+Find the hash of the specified revision / branch.
+
+#### `get_singularity_images(current_revision: str = '') → None{:python}`
+
+Loop through container names and download Singularity images.
+
+#### `prioritize_direct_download(container_list: List[str]) → List[str]{:python}`
+
+Helper function that takes a list of container images (URLs and Docker URIs),
+eliminates all Docker URIs for which a URL is also present, and returns the
+cleaned and deduplicated list.
+
+Conceptually, this works like so:
+
+Everything after the last slash should be identical, e.g. “scanpy:1.7.2--pyhdfd78af_0” in
+`['https://depot.galaxyproject.org/singularity/scanpy:1.7.2--pyhdfd78af_0', 'biocontainers/scanpy:1.7.2--pyhdfd78af_0']`
+
+`re.sub('.*/(.*)','\1',c)` drops everything up to the last slash from c (container_id).
+
+`d.get(k:=re.sub('.*/(.*)','\1',c),'')` assigns the truncated string to k (key) and gets the
+corresponding value from the dict if present, or else defaults to “”.
+
+If the regex pattern matches, the original container_id is assigned to the dict under the key k.
+`r"^$|(?!^http)"` matches an empty string (we didn’t have it in the dict yet and want to keep it in either case) or
+any string that does not start with http. If our current dict value already starts with http,
+we want to keep it and not replace it with whatever we have now (which might be the Docker URI).
+
+A regex that matches http, `r"^$|^http"`, could thus be used to prioritize the Docker URIs over http downloads.
+
+We also need to handle a special case: the Singularity downloads from Seqera Containers all end in ‘data’, although
+they are not equivalent, e.g.:
+
+‘’
+‘’
+
+Lastly, we want to remove at least a few Docker URIs for those modules that have an oras:// download link.
+
+#### `prompt_compression_type(){:python}`
+
+Ask the user if we should compress the downloaded files.
+
+#### `prompt_config_inclusion(){:python}`
+
+Prompt for inclusion of institutional configurations.
+
+#### `prompt_container_download(){:python}`
+
+Prompt whether to download container images or not.
+
+#### `prompt_pipeline_name(){:python}`
+
+Prompt for the pipeline name if not set with a flag.
+
+#### `prompt_revision() → None{:python}`
+
+Prompt for the pipeline revision / branch.
+Prompt the user for a revision tag if `--revision` was not set.
+If `--platform` is specified, allow selecting multiple revisions.
+The static download also allows for multiple revisions, but
+we do not prompt this option interactively.
+
+#### `prompt_singularity_cachedir_creation(){:python}`
+
+Prompt about using $NXF_SINGULARITY_CACHEDIR if not already set.
+
+#### `prompt_singularity_cachedir_remote(){:python}`
+
+Prompt about the index of a remote $NXF_SINGULARITY_CACHEDIR.
+
+#### `prompt_singularity_cachedir_utilization(){:python}`
+
+Ask if we should _only_ use $NXF_SINGULARITY_CACHEDIR without copying into the target.
+
+#### `read_remote_containers(){:python}`
+
+Reads the file specified as index for the remote Singularity cache dir.
+
+#### _`static{:python}`_`reconcile_seqera_container_uris(prioritized_container_list: List[str], other_list: List[str]) → List[str]{:python}`
+
+Helper function that takes a list of Seqera container URIs,
+extracts the software string and builds a regex from them to filter out
+similar containers from the second container list.
+
+prioritized_container_list = \[
+… “oras://community.wave.seqera.io/library/multiqc:1.25.1--f0e743d16869c0bf”,
+… “oras://community.wave.seqera.io/library/multiqc_pip_multiqc-plugins:e1f4877f1515d03c”
+… ]
+
+will be cleaned to
+
+\[‘library/multiqc:1.25.1’, ‘library/multiqc_pip_multiqc-plugins’]
+
+Subsequently, build a regex from those and filter out matching duplicates in other_list:
+
+#### `rectify_raw_container_matches(raw_findings){:python}`
+
+Helper function to rectify the raw extracted container matches into fully qualified container names.
+If multiple containers are found, any prefixed with http for direct download is prioritized.
+
+Example syntax:
+
+Early DSL2:
+
+```groovy
+if (workflow.containerEngine == 'singularity' && !params.singularity_pull_docker_container) {
+    container "https://depot.galaxyproject.org/singularity/fastqc:0.11.9--0"
+} else {
+    container "quay.io/biocontainers/fastqc:0.11.9--0"
+}
+```
+
+Later DSL2:
+
+```groovy
+container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+    'https://depot.galaxyproject.org/singularity/fastqc:0.11.9--0' :
+    'biocontainers/fastqc:0.11.9--0' }"
+```
+
+Later DSL2, variable is being used:
+
+```groovy
+container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
+    "https://depot.galaxyproject.org/singularity/${container_id}" :
+    "quay.io/biocontainers/${container_id}" }"
+
+container_id = 'mulled-v2-1fa26d1ce03c295fe2fdcf85831a92fbcbd7e8c2:afaaa4c6f5b308b4b6aa2dd8e99e1466b2a6b0cd-0'
+```
+
+DSL1 / Special case DSL2:
+
+```groovy
+container "nfcore/cellranger:6.0.2"
+```
+
+#### `singularity_copy_cache_image(container: str, out_path: str, cache_path: str | None) → None{:python}`
+
+Copy Singularity image from NXF_SINGULARITY_CACHEDIR to target folder.
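The walrus/regex idiom documented above for `prioritize_direct_download()` can be made concrete with a runnable sketch (a hypothetical simplification, not the exact library implementation):

```python
import re

def prioritize_direct_download(container_list):
    # Key each container by everything after the last slash; keep the
    # direct-download URL (http...) when both a URL and a Docker URI
    # share the same image name.
    d = {}
    for c in container_list:
        k = re.sub(r".*/(.*)", r"\1", c)
        # Overwrite only if we have nothing stored yet (empty string),
        # or the stored entry does not already start with "http".
        if re.match(r"^$|(?!^http)", d.get(k, "")):
            d[k] = c
    return sorted(d.values())
```

Because an entry starting with `http` is never overwritten, the direct-download URL wins regardless of input order.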
+
+#### `singularity_download_image(container: str, out_path: str, cache_path: str | None, progress:{:python}`[DownloadProgress](#nf_core.pipelines.download.DownloadProgress)) → None
+
+Download a Singularity image from the web.
+
+Use native Python to download the file.
+
+- **Parameters:**
+  - **container** (_str_) – A pipeline’s container name. Usually it is of similar format
+    to `https://depot.galaxyproject.org/singularity/name:version`
+  - **out_path** (_str_) – The final target output path
+  - **cache_path** (_str_ _,_ _None_) – The NXF_SINGULARITY_CACHEDIR path if set, None if not
+  - **progress** (_Progress_) – Rich progress bar instance to add tasks to.
+
+#### `singularity_image_filenames(container: str) → Tuple[str, str | None]{:python}`
+
+Check the Singularity cache for the image; copy it to the destination folder if found.
+
+- **Parameters:**
+  **container** (_str_) – A pipeline’s container name. Can be a direct download URL
+  or a Docker Hub repository ID.
+- **Returns:**
+  Returns a tuple of (out_path, cache_path).
+  : out_path is the final target output path. It may point to the NXF_SINGULARITY_CACHEDIR if cache utilisation was set to ‘amend’.
+  If cache utilisation was set to ‘copy’, it will point to the target folder, a subdirectory of the output directory. In the latter case,
+  cache_path may either be None (image is not yet cached locally) or point to the image in the NXF_SINGULARITY_CACHEDIR, so it will not be
+  downloaded from the web again, but directly copied from there. See get_singularity_images() for the implementation.
+- **Return type:**
+  tuple (str, str)
+
+#### `singularity_pull_image(container: str, out_path: str, cache_path: str | None, library: List[str], progress:{:python}`[DownloadProgress](#nf_core.pipelines.download.DownloadProgress)) → None
+
+Pull a Singularity image using `singularity pull`.
+
+Attempt to use a local installation of Singularity to pull the image.
+
+- **Parameters:**
+  - **container** (_str_) – A pipeline’s container name.
Usually it is of similar format
+    to `nfcore/name:version`.
+  - **library** (_list of str_) – A list of libraries to try for pulling the image.
+- **Raises:**
+  **Various exceptions possible from subprocess execution of Singularity.** –
+
+#### `symlink_singularity_images(image_out_path: str) → None{:python}`
+
+Create a symlink for each registry in the registry set that points to the image.
+We have dropped the explicit registries from the modules in favor of the configurable registries.
+Unfortunately, Nextflow still expects the registry to be part of the file name, so a symlink is needed.
+
+The base image, e.g. ./nf-core-gatk-4.4.0.0.img, will thus be symlinked as, for example, ./quay.io-nf-core-gatk-4.4.0.0.img
+by prepending all registries in self.registry_set to the image name.
+
+Unfortunately, our output image name may contain a registry definition (a Singularity image pulled from depot.galaxyproject.org,
+or an older pipeline version where the Docker registry was part of the image name in the modules). Hence, it must be stripped
+beforehand to ensure that it is really the base name.
+
+#### `wf_use_local_configs(revision_dirname){:python}`
+
+Edit the downloaded nextflow.config file to use the local config files.
+
+### _`class{:python}`_`nf_core.pipelines.download.WorkflowRepo(remote_url, revision, commit, additional_tags, location=None, hide_progress=False, in_cache=True){:python}`
+
+Bases: `SyncedRepo`
+
+An object to store details about a locally cached workflow repository.
+
+Important Attributes:
+: fullname: The full name of the repository, `nf-core/{self.pipelinename}`.
+local_repo_dir (str): The local directory, where the workflow is cloned into. Defaults to `$HOME/.cache/nf-core/nf-core/{self.pipeline}`.
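The registry symlink naming described for `symlink_singularity_images()` above can be illustrated with a small name-computation sketch (a hypothetical helper; the real method also creates the symlinks on disk):

```python
def registry_symlink_names(image_name: str, registry_set: set) -> list:
    # Strip any registry already baked into the image name so we are
    # left with the true base name ...
    base = image_name
    for registry in sorted(registry_set, key=len, reverse=True):
        prefix = registry.replace("/", "-") + "-"
        if base.startswith(prefix):
            base = base[len(prefix):]
    # ... then prepend every registry in the set to that base name.
    return sorted(f"{r.replace('/', '-')}-{base}" for r in registry_set)
```

For example, `quay.io-nf-core-gatk-4.4.0.0.img` with a registry set containing `quay.io` is first reduced to `nf-core-gatk-4.4.0.0.img` before the registry prefixes are re-applied.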
+
+#### `__add_additional_tags() → None{:python}`
+
+#### `access(){:python}`
+
+#### `bare_clone(destination){:python}`
+
+#### `checkout(commit){:python}`
+
+Checks out the repository at the requested commit
+
+- **Parameters:**
+  **commit** (_str_) – Git SHA of the commit
+
+#### `get_remote_branches(remote_url){:python}`
+
+Get all branches from a remote repository
+
+- **Parameters:**
+  **remote_url** (_str_) – The git url to the remote repository
+- **Returns:**
+  All branches found in the remote
+- **Return type:**
+  `set[str]`
+
+#### _`property{:python}`_`heads{:python}`
+
+#### `retry_setup_local_repo(skip_confirm=False){:python}`
+
+#### `setup_local_repo(remote, location=None, in_cache=True){:python}`
+
+Sets up the local git repository. If the repository has been cloned previously, it
+returns a git.Repo object of that clone. Otherwise it tries to clone the repository from
+the provided remote URL and returns a git.Repo of the new clone.
+
+- **Parameters:**
+  - **remote** (_str_) – git url of remote
+  - **location** (_Path_) – location where the clone should be created/cached.
+  - **in_cache** (_bool_ _,_ _optional_) – Whether to clone the repository from the cache. Defaults to False.
+
+Sets self.repo
+
+#### _`property{:python}`_`tags{:python}`
+
+#### `tidy_tags_and_branches(){:python}`
+
+Function to delete all tags and branches that are not of interest to the downloader.
+This allows a clutter-free experience in Seqera Platform. The untagged commits are evidently still available.
+
+However, due to local caching, the downloader might also want access to revisions that had been deleted before.
+In that case, don’t bother with re-adding the tags and rather download anew from GitHub.
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/index.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/index.md new file mode 100644 index 0000000000..207189bbcc --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/index.md @@ -0,0 +1,12 @@ +# API Reference + +- [nf_core.pipelines.bump_version](bump_version) +- [nf_core.pipelines.create](create) +- [nf_core.pipelines.download](download) +- [nf_core.pipelines.launch](launch) +- [nf_core.pipelines.lint](lint) +- [nf_core.pipelines.list](list) +- [nf_core.pipelines.params_file](params-file) +- [nf_core.pipelines.schema](schema) +- [nf_core.pipelines.sync](sync) +- [nf_core.pipelines.utils](utils) diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/launch.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/launch.md new file mode 100644 index 0000000000..72d9ef8cb1 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/launch.md @@ -0,0 +1,87 @@ +# nf_core.pipelines.launch + +Launch a pipeline, interactively collecting params + +### _`class{:python}`_`nf_core.pipelines.launch.Launch(pipeline=None, revision=None, command_only=False, params_in=None, params_out=None, save_all=False, show_hidden=False, url=None, web_id=None){:python}` + +Bases: `object` + +Class to hold config option to launch a pipeline + +#### `build_command(){:python}` + +Build the nextflow run command based on what we know + +#### `get_pipeline_schema(){:python}` + +Load and validate the schema from the supplied pipeline + +#### `get_web_launch_response(){:python}` + +Given a URL for a web-gui launch response, recursively query it until results are ready. 
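The recursive polling behaviour described for `get_web_launch_response()` can be sketched as a generic loop (hypothetical status field and timing; the real method parses the nf-core website's specific response format):

```python
import time

def poll_until_ready(fetch, interval: float = 2.0, max_tries: int = 30):
    """Call `fetch()` repeatedly until it reports that results are ready."""
    for _ in range(max_tries):
        response = fetch()
        if response.get("status") == "ready":  # hypothetical status value
            return response
        time.sleep(interval)
    raise TimeoutError("web launch response never became ready")
```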
+
+#### `launch_pipeline(){:python}`
+
+#### `launch_web_gui(){:python}`
+
+Send schema to the nf-core website and launch the input GUI.
+
+#### `launch_workflow(){:python}`
+
+Launch nextflow if required.
+
+#### `merge_nxf_flag_schema(){:python}`
+
+Take the Nextflow flag schema and merge it with the pipeline schema.
+
+#### `print_param_header(param_id, param_obj, is_group=False){:python}`
+
+#### `prompt_group(group_id, group_obj){:python}`
+
+Prompt for edits to a group of parameters (subschema in ‘definitions’)
+
+- **Parameters:**
+  - **group_id** – Parameter ID (string)
+  - **group_obj** – JSON Schema keys (dict)
+- **Returns:**
+  Dict of param_id: val answers
+
+#### `prompt_param(param_id, param_obj, is_required, answers){:python}`
+
+Prompt for a single parameter.
+
+#### `prompt_schema(){:python}`
+
+Go through the pipeline schema and prompt the user to change defaults.
+
+#### `prompt_web_gui(){:python}`
+
+Ask whether to use the web-based or CLI wizard to collect params.
+
+#### `sanitise_web_response(){:python}`
+
+The web builder returns everything as strings.
+Use the functions defined in the CLI wizard to convert to the correct types.
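Since the web builder returns everything as strings, the coercion performed by `sanitise_web_response()` can be illustrated per value like this (a hypothetical sketch, not the library's exact conversion rules):

```python
def sanitise_value(value: str):
    # Coerce booleans first, then numbers; fall back to the raw string.
    if value.lower() in ("true", "false"):
        return value.lower() == "true"
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    return value
```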
+
+#### `set_schema_inputs(){:python}`
+
+Take the loaded schema and set the defaults as the input parameters.
+If an nf_params.json file is supplied, apply these over the top.
+
+#### `single_param_to_questionary(param_id, param_obj, answers=None, print_help=True){:python}`
+
+Convert a JSONSchema param to a Questionary question
+
+- **Parameters:**
+  - **param_id** – Parameter ID (string)
+  - **param_obj** – JSON Schema keys (dict)
+  - **answers** – Optional preexisting answers (dict)
+  - **print_help** – Whether description and help_text should be printed (bool)
+- **Returns:**
+  Single Questionary dict, to be appended to the questions list
+
+#### `strip_default_params(){:python}`
+
+Strip parameters if they have not changed from the default.
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/lint.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/lint.md
new file mode 100644
index 0000000000..7d402ba205
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/lint.md
@@ -0,0 +1,27 @@
+# nf_core.pipelines.lint
+
+:::tip
+See the [Lint Tests]() docs for information about specific linting functions.
+:::
+
+
+
+Linting policy for nf-core pipeline projects.
+
+Tests Nextflow-based pipelines to check that they adhere to
+the nf-core community guidelines.
+
+### `nf_core.pipelines.lint.run_linting(pipeline_dir, release_mode: bool = False, fix=(), key=(), show_passed: bool = False, fail_ignored: bool = False, fail_warned: bool = False, sort_by: str = 'test', md_fn=None, json_fn=None, hide_progress: bool = False) → Tuple[PipelineLint, ComponentLint | None, ComponentLint | None]{:python}`
+
+Runs all nf-core linting checks on a given Nextflow pipeline project,
+in either release mode or normal mode (default). Returns an object
+of type `PipelineLint` when finished.
+
+- **Parameters:**
+  - **pipeline_dir** (_str_) – The path to the Nextflow pipeline root directory
+  - **release_mode** (_bool_) – Set this to True if linting should be run in release mode.
+    See `PipelineLint` for more information.
+- **Returns:**
+  An object of type `PipelineLint` that contains all the linting results.
+  An object of type `ComponentLint` that contains all the linting results for the modules.
+  An object of type `ComponentLint` that contains all the linting results for the subworkflows.
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/list.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/list.md
new file mode 100644
index 0000000000..2b253918ca
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/list.md
@@ -0,0 +1,99 @@
+# nf_core.pipelines.list
+
+Lists available nf-core pipelines and versions.
+
+### _`class{:python}`_`nf_core.pipelines.list.LocalWorkflow(name){:python}`
+
+Bases: `object`
+
+Class to handle local workflows pulled by nextflow
+
+#### `get_local_nf_workflow_details(){:python}`
+
+Get full details about a local cached workflow
+
+### _`class{:python}`_`nf_core.pipelines.list.RemoteWorkflow(data){:python}`
+
+Bases: `object`
+
+An information container for a remote workflow.
+
+- **Parameters:**
+  **data** (_dict_) – workflow information as retrieved from the GitHub repository REST API request
+  ().
+
+### _`class{:python}`_`nf_core.pipelines.list.Workflows(filter_by=None, sort_by='release', show_archived=False){:python}`
+
+Bases: `object`
+
+Workflow container class.
+
+Used to collect local and remote nf-core pipelines. Pipelines
+can be sorted, filtered and compared.
+
+- **Parameters:**
+  - **filter_by** (_list_) – A list of strings that can be used for filtering.
+  - **sort_by** (_str_) – workflows can be sorted by keywords. Keyword must be one of
+    release (default), name, stars.
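The keyword filtering that such a workflow container performs can be sketched as follows (hypothetical field access; the real class matches keywords against several workflow attributes, not just the name):

```python
def filter_workflows(workflows, filter_by):
    # Keep workflows whose name contains every keyword
    # (case-insensitive); an empty filter keeps everything.
    keywords = [kw.lower() for kw in (filter_by or [])]
    return [wf for wf in workflows
            if all(kw in wf["name"].lower() for kw in keywords)]
```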
+
+#### `compare_remote_local(){:python}`
+
+Matches local to remote workflows.
+
+If a matching remote workflow is found, the local workflow’s Git commit hash is compared
+with the latest one from remote.
+
+A boolean flag in `RemoteWorkflow.local_is_latest` is set to True if the local workflow
+is the latest.
+
+#### `filtered_workflows(){:python}`
+
+Filters remote workflows for keywords.
+
+- **Returns:**
+  Filtered remote workflows.
+- **Return type:**
+  list
+
+#### `get_local_nf_workflows(){:python}`
+
+Retrieves local Nextflow workflows.
+
+Local workflows are stored in the `self.local_workflows` list.
+
+#### `get_remote_workflows(){:python}`
+
+Retrieves remote workflows from [nf-co.re](https://nf-co.re).
+
+Remote workflows are stored in the `self.remote_workflows` list.
+
+#### `print_json(){:python}`
+
+Dump JSON of all parsed information.
+
+#### `print_summary(){:python}`
+
+Prints a summary of all pipelines.
+
+### `nf_core.pipelines.list.get_local_wf(workflow: str | Path, revision=None) → str | None{:python}`
+
+Check if this workflow has a local copy, and use nextflow to pull it if not.
+
+### `nf_core.pipelines.list.list_workflows(filter_by=None, sort_by='release', as_json=False, show_archived=False){:python}`
+
+Prints out a list of all nf-core workflows.
+
+- **Parameters:**
+  - **filter_by** (_list_) – A list of strings that can be used for filtering.
+  - **sort_by** (_str_) – workflows can be sorted by keywords. Keyword must be one of
+    release (default), name, stars.
+  - **as_json** (_bool_) – Set to True to print the list as JSON.
+
+### `nf_core.pipelines.list.pretty_date(time){:python}`
+
+Transforms a datetime object or an int() epoch timestamp into a
+pretty string like ‘an hour ago’, ‘Yesterday’, ‘3 months ago’,
+‘just now’, etc.
+
+Based on
+Adapted by sven1103
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/params-file.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/params-file.md
new file mode 100644
index 0000000000..b34f8ec4cb
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/params-file.md
@@ -0,0 +1,82 @@
+# nf_core.pipelines.params_file
+
+Create a YAML parameter file
+
+### _`class{:python}`_`nf_core.pipelines.params_file.ParamsFileBuilder(pipeline=None, revision=None){:python}`
+
+Bases: `object`
+
+Class to hold the options used to build a parameter file for a pipeline.
+
+- **Parameters:**
+  - **pipeline** (_str_ _,_ _optional_) – Path to a local pipeline or a remote pipeline.
+  - **revision** (_str_ _,_ _optional_) – Revision of the pipeline to use.
+
+#### `format_group(definition, show_hidden=False) → str{:python}`
+
+Format a group of parameters of the schema as commented YAML.
+ +- **Parameters:** + - **definition** (_dict_) – Definition of the group from the schema + - **show_hidden** (_bool_) – Whether to include hidden parameters +- **Returns:** + Formatted output for a group +- **Return type:** + str + +#### `format_param(name: str, properties: Dict, required_properties: List[str] = [], show_hidden: bool = False) → str | None{:python}` + +Format a single parameter of the schema as commented YAML + +- **Parameters:** + - **name** (_str_) – Name of the parameter + - **properties** (_dict_) – Properties of the parameter + - **required_properties** (_list_) – List of required properties + - **show_hidden** (_bool_) – Whether to include hidden parameters +- **Returns:** + Section of a params-file.yml for given parameter + None: If the parameter is skipped because it is hidden and show_hidden is not set +- **Return type:** + str + +#### `generate_params_file(show_hidden: bool = False) → str{:python}` + +Generate the contents of a parameter template file. + +Assumes the pipeline has been fetched (if remote) and the schema loaded. + +- **Parameters:** + **show_hidden** (_bool_) – Whether to include hidden parameters +- **Returns:** + Formatted output for the pipeline schema +- **Return type:** + str + +#### `get_pipeline() → bool | None{:python}` + +Prompt the user for a pipeline name and get the schema + +#### `write_params_file(output_fn: Path = PosixPath('nf-params.yaml'), show_hidden=False, force=False) → bool{:python}` + +Build a template file for the pipeline schema. + +- **Parameters:** + - **output_fn** (_str_ _,_ _optional_) – Filename to write the template to. + - **show_hidden** (_bool_ _,_ _optional_) – Include parameters marked as hidden in the output + - **force** (_bool_ _,_ _optional_) – Whether to overwrite existing output file. 
+
+- **Returns:**
+  True if the template was written successfully, False otherwise
+- **Return type:**
+  bool
+
+### `nf_core.pipelines.params_file._print_wrapped(text, fill_char='-', mode='both', width=80, indent=0, drop_whitespace=True) → str{:python}`
+
+Helper function to format text for the params-file template.
+
+- **Parameters:**
+  - **text** (_str_) – Text to print
+  - **fill_char** (_str_ _,_ _optional_) – Character to use for creating dividers. Defaults to ‘-’.
+  - **mode** (_str_ _,_ _optional_) – Where to place dividers. Defaults to “both”.
+  - **width** (_int_ _,_ _optional_) – Maximum line-width of the output text. Defaults to 80.
+  - **indent** (_int_ _,_ _optional_) – Number of spaces to indent the text. Defaults to 0.
+  - **drop_whitespace** (_bool_ _,_ _optional_) – Whether to drop whitespace from the start and end of lines.
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/schema.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/schema.md
new file mode 100644
index 0000000000..50bb17f98b
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/schema.md
@@ -0,0 +1,180 @@
+# nf_core.pipelines.schema
+
+Code to deal with pipeline JSON Schema
+
+### _`class{:python}`_`nf_core.pipelines.schema.PipelineSchema{:python}`
+
+Bases: `object`
+
+Class to generate a schema object with
+functions to handle pipeline JSON Schema
+
+#### `_update_validation_plugin_from_config() → None{:python}`
+
+#### `add_schema_found_configs(){:python}`
+
+Add anything that’s found in the Nextflow params that’s missing in the pipeline schema.
+Update defaults if they have changed.
+
+#### `build_schema(pipeline_dir, no_prompts, web_only, url){:python}`
+
+Interactively build a new pipeline schema for a pipeline
+
+#### `build_schema_param(p_val){:python}`
+
+Build a pipeline schema dictionary for a param interactively
+
+#### `check_for_input_mimetype(){:python}`
+
+Check that the input parameter has a mimetype
+
+Common mime
types:
+
+- **Returns:**
+  The mimetype of the input parameter
+- **Return type:**
+  mimetype (str)
+- **Raises:**
+  **LookupError** – If the input parameter is not found or defined in the correct place
+
+#### `del_schema_filename() → None{:python}`
+
+#### `get_schema_defaults() → None{:python}`
+
+Generate the set of default input parameters from the schema.
+
+Saves defaults to self.schema_defaults.
+Returns a count of how many parameters were found (with or without a default value).
+
+#### `get_schema_filename() → str{:python}`
+
+#### `get_schema_path(path: str | Path, local_only: bool = False, revision: str | None = None) → None{:python}`
+
+Given a pipeline name, directory, or path, set self.schema_filename.
+
+#### `get_schema_types() → None{:python}`
+
+Get a list of all parameter types in the schema.
+
+#### `get_web_builder_response(){:python}`
+
+Given a URL for a Schema build response, recursively query it until results are ready.
+Once ready, validate the Schema and write it to disk.
+
+#### `get_wf_params(){:python}`
+
+Load the pipeline parameter defaults using nextflow config.
+Strip out only the params. values and ignore anything that is not a flat variable.
+
+#### `launch_web_builder(){:python}`
+
+Send the pipeline schema to the web builder and wait for a response.
+
+#### `load_input_params(params_path){:python}`
+
+Load a parameters file (JSON/YAML) from a given path.
+
+These should be input parameters used to run a pipeline with
+the Nextflow -params-file option.
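The default-collection behaviour described for `get_schema_defaults()` above can be sketched as follows (a hypothetical simplification; real pipeline schemas nest parameters inside grouped sub-schemas, modelled here with a flat `definitions` mapping):

```python
def collect_schema_defaults(schema: dict) -> dict:
    # Gather defaults from top-level properties and from grouped
    # sub-schemas under "definitions".
    defaults = {}
    groups = list(schema.get("definitions", {}).values())
    for section in [schema] + groups:
        for name, prop in section.get("properties", {}).items():
            if "default" in prop:
                defaults[name] = prop["default"]
    return defaults
```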
+
+#### `load_lint_schema(){:python}`
+
+Load and lint a given schema to see if it looks valid.
+
+#### `load_schema(){:python}`
+
+Load a pipeline schema from a file.
+
+#### `make_skeleton_schema(){:python}`
+
+Make a new pipeline schema from the template.
+
+#### `markdown_param_table(properties, required, columns){:python}`
+
+Creates a markdown table for params from the jsonschema properties section.
+
+- **Parameters:**
+  - **properties** (_dict_) – A jsonschema properties dictionary
+  - **required** (_list_) – A list of the required fields.
+    Should come from the same level of the jsonschema as properties
+  - **columns** (_list_) – A list of columns to write
+- **Returns:**
+  A string with the markdown table
+- **Return type:**
+  str
+
+#### `markdown_to_html(markdown_str){:python}`
+
+Convert markdown to HTML.
+
+#### `print_documentation(output_fn=None, format='markdown', force=False, columns=None){:python}`
+
+Prints documentation for the schema.
+
+#### `prompt_remove_schema_notfound_config(p_key){:python}`
+
+Check if a given key is found in the nextflow config params and prompt to remove it if not.
+
+Returns True if it should be removed, False if not.
+
+#### `remove_schema_empty_definitions(){:python}`
+
+Go through the top-level schema and remove definitions that don’t have
+any property attributes.
+
+#### `remove_schema_notfound_configs(){:python}`
+
+Go through the top-level schema and all definitions sub-schemas to remove
+anything that’s not in the nextflow config.
+
+#### `remove_schema_notfound_configs_single_schema(schema){:python}`
+
+Go through a single schema / set of properties and strip out
+anything that’s not in the nextflow config.
+
+Takes: Schema or sub-schema with properties key
+Returns: Cleaned schema / sub-schema
+
+#### `sanitise_param_default(param){:python}`
+
+Given a param, ensure that the default value is the correct variable type.
+
+#### `save_schema(suppress_logging=False){:python}`
+
+Save a pipeline schema to a file.
+
+#### _`property{:python}`_`schema_filename{:python}`_: str_
+
+#### `schema_to_markdown(columns){:python}`
+
+Creates documentation for the schema in Markdown format.
+
+#### `set_schema_filename(schema: str) → None{:python}`
+
+#### `validate_config_default_parameter(param, schema_param, config_default){:python}`
+
+Ensure that default parameters in the nextflow.config are correctly set
+by comparing them to their type in the schema.
+
+#### `validate_default_params(){:python}`
+
+Check that all default parameters in the schema are valid.
+Ignores the ‘required’ flag, as required parameters might have no defaults.
+
+Additionally checks that all parameters have defaults in nextflow.config and that
+these are valid and adhere to guidelines.
+
+#### `validate_params(){:python}`
+
+Check given parameters against a schema and validate.
+
+#### `validate_schema(schema=None){:python}`
+
+Check that the Schema is valid.
+
+Returns: Number of parameters found
+
+#### `validate_schema_title_description(schema=None){:python}`
+
+Extra validation command for linting.
+Checks that the schema “$id”, “title” and “description” attributes match the pipeline config.
diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/sync.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/sync.md
new file mode 100644
index 0000000000..32760f768d
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/sync.md
@@ -0,0 +1,162 @@
+# nf_core.pipelines.sync
+
+Synchronise a pipeline TEMPLATE branch with the template.
+ +### _`class{:python}`_`nf_core.pipelines.sync.PipelineSync(pipeline_dir: str | Path, from_branch: str | None = None, make_pr: bool = False, gh_repo: str | None = None, gh_username: str | None = None, template_yaml_path: str | None = None, force_pr: bool = False){:python}` + +Bases: `object` + +Object to hold syncing information and results. + +- **Parameters:** + - **pipeline_dir** (_str_) – The path to the Nextflow pipeline root directory + - **from_branch** (_str_) – The branch to use to fetch config vars. If not set, will use current active branch + - **make_pr** (_bool_) – Set this to True to create a GitHub pull-request with the changes + - **gh_username** (_str_) – GitHub username + - **gh_repo** (_str_) – GitHub repository name + - **template_yaml_path** (_str_) – Path to template.yml file for pipeline creation settings. DEPRECATED + - **force_pr** (_bool_) – Force the creation of a pull request, even if there are no changes to the template + +#### `pipeline_dir{:python}` + +Path to target pipeline directory + +- **Type:** + str + +#### `from_branch{:python}` + +Repo branch to use when collecting workflow variables. Default: active branch. + +- **Type:** + str + +#### `original_branch{:python}` + +Repo branch that was checked out before we started. + +- **Type:** + str + +#### `made_changes{:python}` + +Whether making the new template pipeline introduced any changes + +- **Type:** + bool + +#### `make_pr{:python}` + +Whether to try to automatically make a PR on GitHub.com + +- **Type:** + bool + +#### `required_config_vars{:python}` + +List of nextflow variables required to make template pipeline + +- **Type:** + list + +#### `gh_username{:python}` + +GitHub username + +- **Type:** + str + +#### `gh_repo{:python}` + +GitHub repository name + +- **Type:** + str + +#### _`static{:python}`_`_parse_json_response(response) → Tuple[Any, str]{:python}` + +Helper method to parse JSON response and create pretty-printed string. 
+
+- **Parameters:**
+  **response** – requests.Response object
+- **Returns:**
+  Tuple of (parsed_json, pretty_printed_str)
+
+#### `checkout_template_branch(){:python}`
+
+Try to check out the origin/TEMPLATE in a new TEMPLATE branch.
+If this fails, try to check out an existing local TEMPLATE branch.
+
+#### `close_open_pr(pr) → bool{:python}`
+
+Given a PR API response, add a comment and close.
+
+#### `close_open_template_merge_prs(){:python}`
+
+Get all template merging branches (starting with ‘nf-core-template-merge-’)
+and check for any open PRs from these branches to self.from_branch.
+If open PRs are found, add a comment and close them.
+
+#### `commit_template_changes(){:python}`
+
+If we have any changes with the new template files, make a git commit.
+
+#### `create_merge_base_branch(){:python}`
+
+Create a new branch from the updated TEMPLATE branch.
+This branch will then be used to create the PR.
+
+#### `delete_template_branch_files(){:python}`
+
+Delete all files in the TEMPLATE branch.
+
+#### `get_wf_config(){:python}`
+
+Check out the target branch if requested and fetch the nextflow config.
+Check that we have the required config variables.
+
+#### `inspect_sync_dir(){:python}`
+
+Takes a look at the target directory for syncing. Checks that it’s a git repo
+and makes sure that there are no uncommitted changes.
+
+#### `make_pull_request(){:python}`
+
+Create a pull request to a base branch (default: dev),
+from a head branch (default: TEMPLATE).
+
+Returns: An instance of class requests.Response
+
+#### `make_template_pipeline(){:python}`
+
+Delete all files and make a fresh template using the workflow variables.
+
+#### `push_merge_branch(){:python}`
+
+Push the newly created merge branch to the remote repository.
+
+#### `push_template_branch(){:python}`
+
+If we made any changes, push the TEMPLATE branch to the default remote
+and try to make a PR.
If we don’t have the auth token, try to figure out a URL +for the PR and print this to the console. + +#### `reset_target_dir(){:python}` + +Reset the target pipeline directory. Check out the original branch. + +#### `sync() → None{:python}` + +Find workflow attributes, create a new template pipeline on TEMPLATE + +### _`exception{:python}`_`nf_core.pipelines.sync.PullRequestExceptionError{:python}` + +Bases: `Exception` + +Exception raised when there was an error creating a Pull-Request on GitHub.com + +### _`exception{:python}`_`nf_core.pipelines.sync.SyncExceptionError{:python}` + +Bases: `Exception` + +Exception raised when there was an error with TEMPLATE branch synchronisation diff --git a/sites/docs/src/content/api_reference/3.1.1/api/pipelines/utils.md b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/utils.md new file mode 100644 index 0000000000..8ae937bff0 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/pipelines/utils.md @@ -0,0 +1 @@ +# nf_core.pipelines.utils diff --git a/sites/docs/src/content/api_reference/3.1.1/api/utils.md b/sites/docs/src/content/api_reference/3.1.1/api/utils.md new file mode 100644 index 0000000000..e1aca14b63 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/api/utils.md @@ -0,0 +1,1838 @@ +# nf_core.utils + +Common utility functions for the nf-core python package. + +### _`class{:python}`_`nf_core.utils.GitHubAPISession{:python}` + +Bases: `CachedSession` + +Class to provide a single session for interacting with the GitHub API for a run. +Inherits the requests_cache.CachedSession and adds additional functionality, +such as automatically setting up GitHub authentication if we can. + +#### `get(url, **kwargs){:python}` + +Initialise the session if we haven’t already, then call the superclass get method. + +#### `lazy_init() → None{:python}` + +Initialise the object. 
+ +Only do this when it’s actually being used (due to global import) + +#### `log_content_headers(request, post_data=None){:python}` + +Try to dump everything to the console, useful when things go wrong. + +#### `request_retry(url, post_data=None){:python}` + +Try to fetch a URL, keep retrying if we get a certain return code. + +Used in nf-core pipelines sync code because we get 403 errors: too many simultaneous requests +See + +#### `safe_get(url){:python}` + +Run a GET request, raise a nice exception with lots of logging if it fails. + +#### `setup_github_auth(auth=None){:python}` + +Try to automatically set up GitHub authentication + +### _`pydantic model{:python}`_`nf_core.utils.NFCoreTemplateConfig{:python}` + +Bases: `BaseModel` + +Template configuration schema + +

+Show JSON schema +```json +{ + "title": "NFCoreTemplateConfig", + "description": "Template configuration schema", + "type": "object", + "properties": { + "org": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Org" + }, + "name": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Name" + }, + "description": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Description" + }, + "author": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Author" + }, + "version": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Version" + }, + "force": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": true, + "title": "Force" + }, + "outdir": { + "anyOf": [ + { + "type": "string" + }, + { + "format": "path", + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Outdir" + }, + "skip_features": { + "anyOf": [ + { + "items": {}, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Skip Features" + }, + "is_nfcore": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Is Nfcore" + } + } +} +``` + +

+* **Fields:** + - [`author (str | None)`](#nf_core.utils.NFCoreTemplateConfig.author) + - [`description (str | None)`](#nf_core.utils.NFCoreTemplateConfig.description) + - [`force (bool | None)`](#nf_core.utils.NFCoreTemplateConfig.force) + - [`is_nfcore (bool | None)`](#nf_core.utils.NFCoreTemplateConfig.is_nfcore) + - [`name (str | None)`](#nf_core.utils.NFCoreTemplateConfig.name) + - [`org (str | None)`](#nf_core.utils.NFCoreTemplateConfig.org) + - [`outdir (str | pathlib.Path | None)`](#nf_core.utils.NFCoreTemplateConfig.outdir) + - [`skip_features (list | None)`](#nf_core.utils.NFCoreTemplateConfig.skip_features) + - [`version (str | None)`](#nf_core.utils.NFCoreTemplateConfig.version) +* **Validators:** + - [`outdir_to_str`](#nf_core.utils.NFCoreTemplateConfig.outdir_to_str) » [`outdir`](#nf_core.utils.NFCoreTemplateConfig.outdir) + +#### _`field{:python}`_`author{:python}`_: str | None_`{:python}`_= None_ + +Pipeline author + +#### _`field{:python}`_`description{:python}`_: str | None_`{:python}`_= None_ + +Pipeline description + +#### _`field{:python}`_`force{:python}`_: bool | None_`{:python}`_= True_ + +Force overwrite of existing files + +#### _`field{:python}`_`is_nfcore{:python}`_: bool | None_`{:python}`_= None_ + +Whether the pipeline is an nf-core pipeline. + +#### _`field{:python}`_`name{:python}`_: str | None_`{:python}`_= None_ + +Pipeline name + +#### _`field{:python}`_`org{:python}`_: str | None_`{:python}`_= None_ + +Organisation name + +#### _`field{:python}`_`outdir{:python}`_: str | Path | None_`{:python}`_= None_ + +Output directory + +- **Validated by:** + - [`outdir_to_str`](#nf_core.utils.NFCoreTemplateConfig.outdir_to_str) + +#### _`field{:python}`_`skip_features{:python}`_: list | None_`{:python}`_= None_ + +Skip features. See for a list of features. 
+ +#### _`field{:python}`_`version{:python}`_: str | None_`{:python}`_= None_ + +Pipeline version + +#### `get(item: str, default: Any = None) → Any{:python}` + +#### _`validator{:python}`_`outdir_to_str{:python}`_»_`{:python}`[_outdir_](#nf_core.utils.NFCoreTemplateConfig.outdir) + +#### `_abc_impl{:python}`_= <\_abc.\_abc_data object>_ + +### _`pydantic model{:python}`_`nf_core.utils.NFCoreYamlConfig{:python}` + +Bases: `BaseModel` + +.nf-core.yml configuration file schema + +

+Show JSON schema +```json +{ + "title": "NFCoreYamlConfig", + "description": ".nf-core.yml configuration file schema", + "type": "object", + "properties": { + "repository_type": { + "anyOf": [ + { + "enum": [ + "pipeline", + "modules" + ], + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Repository Type" + }, + "nf_core_version": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nf Core Version" + }, + "org_path": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Org Path" + }, + "lint": { + "anyOf": [ + { + "$ref": "#/$defs/NFCoreYamlLintConfig" + }, + { + "type": "null" + } + ], + "default": null + }, + "template": { + "anyOf": [ + { + "$ref": "#/$defs/NFCoreTemplateConfig" + }, + { + "type": "null" + } + ], + "default": null + }, + "bump_version": { + "anyOf": [ + { + "additionalProperties": { + "type": "boolean" + }, + "type": "object" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Bump Version" + }, + "update": { + "anyOf": [ + { + "additionalProperties": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "boolean" + }, + { + "additionalProperties": { + "anyOf": [ + { + "type": "string" + }, + { + "additionalProperties": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "boolean" + } + ] + }, + "type": "object" + } + ] + }, + "type": "object" + } + ] + }, + "type": "object" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Update" + } + }, + "$defs": { + "NFCoreTemplateConfig": { + "description": "Template configuration schema", + "properties": { + "org": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Org" + }, + "name": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Name" + }, + "description": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + 
} + ], + "default": null, + "title": "Description" + }, + "author": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Author" + }, + "version": { + "anyOf": [ + { + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Version" + }, + "force": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": true, + "title": "Force" + }, + "outdir": { + "anyOf": [ + { + "type": "string" + }, + { + "format": "path", + "type": "string" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Outdir" + }, + "skip_features": { + "anyOf": [ + { + "items": {}, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Skip Features" + }, + "is_nfcore": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Is Nfcore" + } + }, + "title": "NFCoreTemplateConfig", + "type": "object" + }, + "NFCoreYamlLintConfig": { + "description": "schema for linting config in `.nf-core.yml` should cover:\n\n.. 
code-block:: yaml\n files_unchanged:\n - .github/workflows/branch.yml\n modules_config: False\n modules_config:\n - fastqc\n # merge_markers: False\n merge_markers:\n - docs/my_pdf.pdf\n nextflow_config: False\n nextflow_config:\n - manifest.name\n - config_defaults:\n - params.annotation_db\n - params.multiqc_comment_headers\n - params.custom_table_headers\n # multiqc_config: False\n multiqc_config:\n - report_section_order\n - report_comment\n files_exist:\n - .github/CONTRIBUTING.md\n - CITATIONS.md\n template_strings: False\n template_strings:\n - docs/my_pdf.pdf\n nfcore_components: False", + "properties": { + "files_unchanged": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Files Unchanged" + }, + "modules_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Config" + }, + "merge_markers": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Merge Markers" + }, + "nextflow_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "anyOf": [ + { + "type": "string" + }, + { + "additionalProperties": { + "items": { + "type": "string" + }, + "type": "array" + }, + "type": "object" + } + ] + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nextflow Config" + }, + "multiqc_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Multiqc Config" + }, + "files_exist": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": 
"Files Exist" + }, + "template_strings": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Template Strings" + }, + "readme": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Readme" + }, + "nfcore_components": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nfcore Components" + }, + "actions_ci": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Ci" + }, + "actions_awstest": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Awstest" + }, + "actions_awsfulltest": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Awsfulltest" + }, + "pipeline_todos": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Pipeline Todos" + }, + "plugin_includes": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Plugin Includes" + }, + "pipeline_name_conventions": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Pipeline Name Conventions" + }, + "schema_lint": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Schema Lint" + }, + "schema_params": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Schema Params" + }, + "system_exit": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "System Exit" + }, + "schema_description": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } 
+ ], + "default": null, + "title": "Schema Description" + }, + "actions_schema_validation": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Schema Validation" + }, + "modules_json": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Json" + }, + "modules_structure": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Structure" + }, + "base_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Base Config" + }, + "nfcore_yml": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nfcore Yml" + }, + "version_consistency": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Version Consistency" + }, + "included_configs": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Included Configs" + } + }, + "title": "NFCoreYamlLintConfig", + "type": "object" + } + } +} +``` + +

+* **Fields:** + - [`bump_version (Dict[str, bool] | None)`](#nf_core.utils.NFCoreYamlConfig.bump_version) + - [`lint (nf_core.utils.NFCoreYamlLintConfig | None)`](#nf_core.utils.NFCoreYamlConfig.lint) + - [`nf_core_version (str | None)`](#nf_core.utils.NFCoreYamlConfig.nf_core_version) + - [`org_path (str | None)`](#nf_core.utils.NFCoreYamlConfig.org_path) + - [`repository_type (Literal['pipeline', 'modules'] | None)`](#nf_core.utils.NFCoreYamlConfig.repository_type) + - [`template (nf_core.utils.NFCoreTemplateConfig | None)`](#nf_core.utils.NFCoreYamlConfig.template) + - [`update (Dict[str, str | bool | Dict[str, str | Dict[str, str | bool]]] | None)`](#nf_core.utils.NFCoreYamlConfig.update) + +#### _`field{:python}`_`bump_version{:python}`_: Dict\[str, bool] | None_`{:python}`_= None_ + +Disable bumping of the version for a module/subworkflow (when repository_type is modules). See for more information. + +#### _`field{:python}`_`lint{:python}`_: [NFCoreYamlLintConfig](#nf_core.utils.NFCoreYamlLintConfig) | None_`{:python}`_= None_ + +Pipeline linting configuration, see for examples and documentation + +#### _`field{:python}`_`nf_core_version{:python}`_: str | None_`{:python}`_= None_ + +Version of nf-core/tools used to create/update the pipeline + +#### _`field{:python}`_`org_path{:python}`_: str | None_`{:python}`_= None_ + +Path to the organisation’s modules repository (used for modules repo_type only) + +#### _`field{:python}`_`repository_type{:python}`_: Literal\['pipeline', 'modules'] | None_`{:python}`_= None_ + +Type of repository + +#### _`field{:python}`_`template{:python}`_: [NFCoreTemplateConfig](#nf_core.utils.NFCoreTemplateConfig) | None_`{:python}`_= None_ + +Pipeline template configuration + +#### _`field{:python}`_`update{:python}`_: Dict\[str, str | bool | Dict\[str, str | Dict\[str, str | bool]]] | None_`{:python}`_= None_ + +Disable updating specific modules/subworkflows (when repository_type is pipeline). See for more information. 
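+As a sketch of a matching `.nf-core.yml` entry (the remote URL and component names are hypothetical examples; the nesting follows the `update` field type above), disabling updates for specific modules might look like:
+
+```yaml
+update:
+  # key: modules remote → org path → component name
+  https://github.com/nf-core/modules.git:
+    nf-core:
+      star/align: False # never update this module
+      samtools/sort: "<commit-sha>" # pin to a specific commit
+```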
+ +#### `get(item: str, default: Any = None) → Any{:python}` + +#### `model_dump(**kwargs) → Dict[str, Any]{:python}` + +Usage docs: + +Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. + +- **Parameters:** + - **mode** – The mode in which to_python should run. + If mode is ‘json’, the output will only contain JSON serializable types. + If mode is ‘python’, the output may contain non-JSON-serializable Python objects. + - **include** – A set of fields to include in the output. + - **exclude** – A set of fields to exclude from the output. + - **context** – Additional context to pass to the serializer. + - **by_alias** – Whether to use the field’s alias in the dictionary key if defined. + - **exclude_unset** – Whether to exclude fields that have not been explicitly set. + - **exclude_defaults** – Whether to exclude fields that are set to their default value. + - **exclude_none** – Whether to exclude fields that have a value of None. + - **round_trip** – If True, dumped values should be valid as input for non-idempotent types such as Json\[T]. + - **warnings** – How to handle serialization errors. False/”none” ignores them, True/”warn” logs errors, + “error” raises a \[PydanticSerializationError]\[pydantic_core.PydanticSerializationError]. + - **serialize_as_any** – Whether to serialize fields with duck-typing serialization behavior. +- **Returns:** + A dictionary representation of the model. + +#### `_abc_impl{:python}`_= <\_abc.\_abc_data object>_ + +### _`pydantic model{:python}`_`nf_core.utils.NFCoreYamlLintConfig{:python}` + +Bases: `BaseModel` + +schema for linting config in .nf-core.yml should cover: + +
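+Spelled out as YAML (taken verbatim from the schema description embedded below), the lint configuration this refers to is:
+
+```yaml
+files_unchanged:
+  - .github/workflows/branch.yml
+modules_config: False
+modules_config:
+  - fastqc
+# merge_markers: False
+merge_markers:
+  - docs/my_pdf.pdf
+nextflow_config: False
+nextflow_config:
+  - manifest.name
+  - config_defaults:
+      - params.annotation_db
+      - params.multiqc_comment_headers
+      - params.custom_table_headers
+# multiqc_config: False
+multiqc_config:
+  - report_section_order
+  - report_comment
+files_exist:
+  - .github/CONTRIBUTING.md
+  - CITATIONS.md
+template_strings: False
+template_strings:
+  - docs/my_pdf.pdf
+nfcore_components: False
+```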

+Show JSON schema +```json +{ + "title": "NFCoreYamlLintConfig", + "description": "schema for linting config in `.nf-core.yml` should cover:\n\n.. code-block:: yaml\n files_unchanged:\n - .github/workflows/branch.yml\n modules_config: False\n modules_config:\n - fastqc\n # merge_markers: False\n merge_markers:\n - docs/my_pdf.pdf\n nextflow_config: False\n nextflow_config:\n - manifest.name\n - config_defaults:\n - params.annotation_db\n - params.multiqc_comment_headers\n - params.custom_table_headers\n # multiqc_config: False\n multiqc_config:\n - report_section_order\n - report_comment\n files_exist:\n - .github/CONTRIBUTING.md\n - CITATIONS.md\n template_strings: False\n template_strings:\n - docs/my_pdf.pdf\n nfcore_components: False", + "type": "object", + "properties": { + "files_unchanged": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Files Unchanged" + }, + "modules_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Config" + }, + "merge_markers": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Merge Markers" + }, + "nextflow_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "anyOf": [ + { + "type": "string" + }, + { + "additionalProperties": { + "items": { + "type": "string" + }, + "type": "array" + }, + "type": "object" + } + ] + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nextflow Config" + }, + "multiqc_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Multiqc Config" + }, + "files_exist": { 
+ "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Files Exist" + }, + "template_strings": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Template Strings" + }, + "readme": { + "anyOf": [ + { + "type": "boolean" + }, + { + "items": { + "type": "string" + }, + "type": "array" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Readme" + }, + "nfcore_components": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nfcore Components" + }, + "actions_ci": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Ci" + }, + "actions_awstest": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Awstest" + }, + "actions_awsfulltest": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Awsfulltest" + }, + "pipeline_todos": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Pipeline Todos" + }, + "plugin_includes": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Plugin Includes" + }, + "pipeline_name_conventions": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Pipeline Name Conventions" + }, + "schema_lint": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Schema Lint" + }, + "schema_params": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Schema Params" + }, + "system_exit": { + "anyOf": [ + { + "type": "boolean" + }, + { + 
"type": "null" + } + ], + "default": null, + "title": "System Exit" + }, + "schema_description": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Schema Description" + }, + "actions_schema_validation": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Actions Schema Validation" + }, + "modules_json": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Json" + }, + "modules_structure": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Modules Structure" + }, + "base_config": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Base Config" + }, + "nfcore_yml": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Nfcore Yml" + }, + "version_consistency": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Version Consistency" + }, + "included_configs": { + "anyOf": [ + { + "type": "boolean" + }, + { + "type": "null" + } + ], + "default": null, + "title": "Included Configs" + } + } +} +``` + +

+* **Fields:** + - [`actions_awsfulltest (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.actions_awsfulltest) + - [`actions_awstest (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.actions_awstest) + - [`actions_ci (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.actions_ci) + - [`actions_schema_validation (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.actions_schema_validation) + - [`base_config (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.base_config) + - [`files_exist (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.files_exist) + - [`files_unchanged (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.files_unchanged) + - [`included_configs (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.included_configs) + - [`merge_markers (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.merge_markers) + - [`modules_config (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.modules_config) + - [`modules_json (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.modules_json) + - [`modules_structure (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.modules_structure) + - [`multiqc_config (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.multiqc_config) + - [`nextflow_config (bool | List[str | Dict[str, List[str]]] | None)`](#nf_core.utils.NFCoreYamlLintConfig.nextflow_config) + - [`nfcore_components (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.nfcore_components) + - [`nfcore_yml (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.nfcore_yml) + - [`pipeline_name_conventions (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.pipeline_name_conventions) + - [`pipeline_todos (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.pipeline_todos) + - [`plugin_includes (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.plugin_includes) + - [`readme (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.readme) + - [`schema_description (bool | 
None)`](#nf_core.utils.NFCoreYamlLintConfig.schema_description) + - [`schema_lint (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.schema_lint) + - [`schema_params (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.schema_params) + - [`system_exit (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.system_exit) + - [`template_strings (bool | List[str] | None)`](#nf_core.utils.NFCoreYamlLintConfig.template_strings) + - [`version_consistency (bool | None)`](#nf_core.utils.NFCoreYamlLintConfig.version_consistency) + +#### _`field{:python}`_`actions_awsfulltest{:python}`_: bool | None_`{:python}`_= None_ + +Lint all required files to run full tests on AWS + +#### _`field{:python}`_`actions_awstest{:python}`_: bool | None_`{:python}`_= None_ + +Lint all required files to run tests on AWS + +#### _`field{:python}`_`actions_ci{:python}`_: bool | None_`{:python}`_= None_ + +Lint all required files to use GitHub Actions CI + +#### _`field{:python}`_`actions_schema_validation{:python}`_: bool | None_`{:python}`_= None_ + +Lint GitHub Action workflow files with schema + +#### _`field{:python}`_`base_config{:python}`_: bool | None_`{:python}`_= None_ + +Lint base.config file + +#### _`field{:python}`_`files_exist{:python}`_: bool | List\[str] | None_`{:python}`_= None_ + +List of files that can not exist + +#### _`field{:python}`_`files_unchanged{:python}`_: bool | List\[str] | None_`{:python}`_= None_ + +List of files that should not be changed + +#### _`field{:python}`_`included_configs{:python}`_: bool | None_`{:python}`_= None_ + +Lint for included configs + +#### _`field{:python}`_`merge_markers{:python}`_: bool | List\[str] | None_`{:python}`_= None_ + +List of files that should not contain merge markers + +#### _`field{:python}`_`modules_config{:python}`_: bool | List\[str] | None_`{:python}`_= None_ + +List of modules that should not be changed + +#### _`field{:python}`_`modules_json{:python}`_: bool | None_`{:python}`_= None_ + +Lint modules.json file + +#### 
_`field{:python}`_`modules_structure{:python}`_: bool | None_`{:python}`_= None_
+
+Lint modules structure
+
+#### _`field{:python}`_`multiqc_config{:python}`_: bool | List\[str] | None_`{:python}`_= None_
+
+List of MultiQC config options that should not be changed
+
+#### _`field{:python}`_`nextflow_config{:python}`_: bool | List\[str | Dict\[str, List\[str]]] | None_`{:python}`_= None_
+
+List of Nextflow config files that should not be changed
+
+#### _`field{:python}`_`nfcore_components{:python}`_: bool | None_`{:python}`_= None_
+
+Lint all required files to use nf-core modules and subworkflows
+
+#### _`field{:python}`_`nfcore_yml{:python}`_: bool | None_`{:python}`_= None_
+
+Lint nf-core.yml
+
+#### _`field{:python}`_`pipeline_name_conventions{:python}`_: bool | None_`{:python}`_= None_
+
+Lint for pipeline name conventions
+
+#### _`field{:python}`_`pipeline_todos{:python}`_: bool | None_`{:python}`_= None_
+
+Lint for TODO statements
+
+#### _`field{:python}`_`plugin_includes{:python}`_: bool | None_`{:python}`_= None_
+
+Lint for Nextflow plugin includes
+
+#### _`field{:python}`_`readme{:python}`_: bool | List\[str] | None_`{:python}`_= None_
+
+Lint the README.md file
+
+#### _`field{:python}`_`schema_description{:python}`_: bool | None_`{:python}`_= None_
+
+Check that every parameter in the schema has a description.
+ +#### _`field{:python}`_`schema_lint{:python}`_: bool | None_`{:python}`_= None_ + +Lint nextflow_schema.json file + +#### _`field{:python}`_`schema_params{:python}`_: bool | None_`{:python}`_= None_ + +Lint schema for all params + +#### _`field{:python}`_`system_exit{:python}`_: bool | None_`{:python}`_= None_ + +Lint for System.exit calls in groovy/nextflow code + +#### _`field{:python}`_`template_strings{:python}`_: bool | List\[str] | None_`{:python}`_= None_ + +List of files that can contain template strings + +#### _`field{:python}`_`version_consistency{:python}`_: bool | None_`{:python}`_= None_ + +Lint for version consistency + +#### `get(item: str, default: Any = None) → Any{:python}` + +#### `_abc_impl{:python}`_= <\_abc.\_abc_data object>_ + +### _`class{:python}`_`nf_core.utils.Pipeline(wf_path: Path){:python}` + +Bases: `object` + +Object to hold information about a local pipeline. + +- **Parameters:** + **path** (_str_) – The path to the nf-core pipeline directory. + +#### `conda_config{:python}` + +The parsed conda configuration file content (`environment.yml`). + +- **Type:** + dict + +#### `conda_package_info{:python}` + +The conda package(s) information, based on the API requests to Anaconda cloud. + +- **Type:** + dict + +#### `nf_config{:python}` + +The Nextflow pipeline configuration file content. + +- **Type:** + dict + +#### `files{:python}` + +A list of files found during the linting process. + +- **Type:** + list + +#### `git_sha{:python}` + +The git sha for the repo commit / current GitHub pull-request ($GITHUB_PR_COMMIT) + +- **Type:** + str + +#### `minNextflowVersion{:python}` + +The minimum required Nextflow version to run the pipeline. + +- **Type:** + str + +#### `wf_path{:python}` + +Path to the pipeline directory. + +- **Type:** + str + +#### `pipeline_name{:python}` + +The pipeline name, without the nf-core tag, for example hlatyping. 
+
+- **Type:**
+  str
+
+#### `schema_obj{:python}`
+
+A `PipelineSchema` object
+
+- **Type:**
+  obj
+
+#### `_fp(fn: str | Path) → Path{:python}`
+
+Convenience function to get full path to a file in the pipeline
+
+#### `_load() → bool{:python}`
+
+Run core load functions
+
+#### `_load_conda_environment() → bool{:python}`
+
+Try to load the pipeline environment.yml file, if it exists
+
+#### `list_files() → List[Path]{:python}`
+
+Get a list of all files in the pipeline
+
+#### `load_pipeline_config() → bool{:python}`
+
+Get the nextflow config for this pipeline
+
+Once loaded, set a few convenience reference class attributes
+
+### _`class{:python}`_`nf_core.utils.SingularityCacheFilePathValidator{:python}`
+
+Bases: `Validator`
+
+Validator for the file path specified as the `--singularity-cache-index` argument in nf-core pipelines download
+
+#### `_abc_impl{:python}`_= <\_abc.\_abc_data object>_
+
+#### `validate(value){:python}`
+
+Validate the input.
+If invalid, this should raise a `ValidationError`.
+
+- **Parameters:**
+  **document** – `Document` instance.
+
+### `nf_core.utils.anaconda_package(dep, dep_channels=None){:python}`
+
+Query conda package information.
+
+Sends an HTTP GET request to the Anaconda remote API.
+
+- **Parameters:**
+  - **dep** (_str_) – A conda package name.
+  - **dep_channels** (_list_) – list of conda channels to use
+- **Raises:**
+  - **LookupError** – if the connection fails, times out, or gives an unexpected status code
+  - **ValueError** – if the package name cannot be found (404)
+
+### `nf_core.utils.check_if_outdated(current_version=None, remote_version=None, source_url='https://nf-co.re/tools_version'){:python}`
+
+Check if the current version of nf-core is outdated
+
+### `nf_core.utils.custom_yaml_dumper(){:python}`
+
+Overwrite default PyYAML output to make Prettier YAML linting happy
+
+### `nf_core.utils.determine_base_dir(directory: Path | str = '.') → Path{:python}`
+
+### `nf_core.utils.fetch_remote_version(source_url){:python}`
+
+### `nf_core.utils.fetch_wf_config(wf_path: Path, cache_config: bool = True) → dict{:python}`
+
+Uses Nextflow to retrieve the configuration variables
+from a Nextflow workflow.
+
+- **Parameters:**
+  - **wf_path** (_str_) – Nextflow workflow file system path.
+  - **cache_config** (_bool_) – cache configuration or not (def. True)
+- **Returns:**
+  Workflow configuration settings.
+- **Return type:**
+  dict
+
+### `nf_core.utils.file_md5(fname){:python}`
+
+Calculates the md5sum for a file on the disk.
+
+- **Parameters:**
+  **fname** (_str_) – Path to a local file.
+
+### `nf_core.utils.get_biocontainer_tag(package, version){:python}`
+
+Given a bioconda package and version, looks for Docker and Singularity containers
+using the biocontainers API, e.g.:
+/{tool}/versions/{tool}-{version}
+Returns the most recent container versions by default.
+:param package: A bioconda package name.
+:type package: str
+:param version: Version of the bioconda package
+:type version: str
+
+- **Raises:**
+  - **LookupError** – if the connection fails, times out, or gives an unexpected status code
+  - **ValueError** – if the package name cannot be found (404)
+
+### `nf_core.utils.get_first_available_path(directory: Path | str, paths: List[str]) → Path | None{:python}`
+
+### `nf_core.utils.get_repo_releases_branches(pipeline, wfs){:python}`
+
+Fetches details of a nf-core workflow to download.
+
+- **Parameters:**
+  - **pipeline** (_str_) – GitHub repo username/repo
+  - **wfs** – A nf_core.pipelines.list.Workflows() object, where get_remote_workflows() has been called.
+- **Returns:**
+  Array of releases, Array of branches
+- **Return type:**
+  wf_releases, wf_branches (tuple)
+- **Raises:**
+  **LookupError** – if the pipeline cannot be found
+
+### `nf_core.utils.get_wf_files(wf_path: Path){:python}`
+
+Return a list of all files in a directory (ignores files listed in .gitignore)
+
+### `nf_core.utils.is_file_binary(path){:python}`
+
+Check file path to see if it is a binary file
+
+### `nf_core.utils.is_pipeline_directory(wf_path){:python}`
+
+Checks that the specified directory has the minimum required files
+(`main.nf`, `nextflow.config`) for a pipeline directory
+
+- **Parameters:**
+  **wf_path** (_str_) – The directory to be inspected
+- **Raises:**
+  **UserWarning** – If one of the files is missing
+
+### `nf_core.utils.is_relative_to(path1, path2){:python}`
+
+Checks if a path is relative to another.
+
+Should mimic Path.is_relative_to, which is not available in Python < 3.9
+
+- **Parameters:**
+  - **path1** (_Path | str_) – The path that could be a subpath
+  - **path2** (_Path | str_) – The path that could be the superpath
+
+### `nf_core.utils.load_tools_config(directory: str | Path = '.') → Tuple[Path | None,{:python}`[`NFCoreYamlConfig{:python}`](#nf_core.utils.NFCoreYamlConfig)`| None]{:python}`
+
+Parse the nf-core.yml configuration file
+
+Look for a file called either .nf-core.yml or .nf-core.yaml
+
+Also looks for the deprecated file .nf-core-lint.yml/yaml and issues
+a warning that this file will be deprecated in the future
+
+Returns the loaded config dict or False, if the file couldn’t be loaded
+
+### `nf_core.utils.nested_delitem(d, keys){:python}`
+
+Deletes a key from a nested dictionary
+
+- **Parameters:**
+  - **d** (_dict_) – the nested dictionary to traverse
+  - **keys** (_list_ \*\[\*_Any_ _]_) – A list of keys to iteratively traverse, deleting the final one
+
+### `nf_core.utils.nested_setitem(d, keys, value){:python}`
+
+Sets the value in a nested dict using a list of keys to traverse
+
+- **Parameters:**
+  - **d** (_dict_) – the nested dictionary to traverse
+  - **keys** (_list_ \*\[\*_Any_ _]_) – A list of keys to iteratively traverse
+  - **value** (_Any_) – The value to be set for the last key in the chain
+
+### `nf_core.utils.parse_anaconda_licence(anaconda_response, version=None){:python}`
+
+Given a response from the anaconda API using anaconda_package, parse the software licences.
+
+Returns: Set of licence types
+
+### `nf_core.utils.pip_package(dep){:python}`
+
+Query PyPI package information.
+
+Sends an HTTP GET request to the PyPI remote API.
+
+- **Parameters:**
+  **dep** (_str_) – A PyPI package name.
+- **Raises:**
+  - **LookupError** – if the connection fails or times out
+  - **ValueError** – if the package name cannot be found
+
+### `nf_core.utils.plural_es(list_or_int){:python}`
+
+Return ‘es’ if the input is not one or does not have a length of one.
+
+### `nf_core.utils.plural_s(list_or_int){:python}`
+
+Return an ‘s’ if the input is not one or does not have a length of one.
+
+### `nf_core.utils.plural_y(list_or_int){:python}`
+
+Return ‘ies’ if the input is not one or does not have a length of one, else ‘y’.
+
+### `nf_core.utils.poll_nfcore_web_api(api_url: str, post_data: Dict | None = None) → Dict{:python}`
+
+Poll the nf-core website API
+
+Takes argument api_url for URL
+
+Expects API response to be valid JSON and contain a top-level ‘status’ key.
+
+### `nf_core.utils.prompt_pipeline_release_branch(wf_releases: List[Dict[str, Any]], wf_branches: Dict[str, Any], multiple: bool = False) → Tuple[Any, List[str]]{:python}`
+
+Prompt for pipeline release / branch
+
+- **Parameters:**
+  - **wf_releases** (_array_) – Array of repo releases as returned by the GitHub API
+  - **wf_branches** (_array_) – Array of repo branches, as returned by the GitHub API
+  - **multiple** (_bool_) – Allow selection of multiple releases & branches (for Seqera Platform)
+- **Returns:**
+  Selected release / branch or False if no releases / branches available
+- **Return type:**
+  choice (questionary.Choice or bool)
+
+### `nf_core.utils.prompt_remote_pipeline_name(wfs){:python}`
+
+Prompt for the pipeline name with questionary
+
+- **Parameters:**
+  **wfs** – A nf_core.pipelines.list.Workflows() object, where get_remote_workflows() has been called.
+- **Returns:**
+  GitHub repo - username/repo
+- **Return type:**
+  pipeline (str)
+- **Raises:**
+  **AssertionError** – if the pipeline cannot be found
+
+### `nf_core.utils.rich_force_colors(){:python}`
+
+Check if any environment variables are set to force Rich to use coloured output
+
+### `nf_core.utils.run_cmd(executable: str, cmd: str) → Tuple[bytes, bytes] | None{:python}`
+
+Run a specified command and capture the output. Handle errors nicely.
+
+### `nf_core.utils.set_wd(path: Path) → Generator[None, None, None]{:python}`
+
+Sets the working directory for this context.
+
+- **Parameters:**
+  **path** (_Path_) – Path to the working directory to be used inside this context.
+
+### `nf_core.utils.setup_nfcore_cachedir(cache_fn: str | Path) → Path{:python}`
+
+Sets up local caching for caching files between sessions.
+
+### `nf_core.utils.setup_nfcore_dir() → bool{:python}`
+
+Creates a directory for files that need to be kept between sessions
+
+Currently only used for keeping local copies of modules repos
+
+### `nf_core.utils.setup_requests_cachedir() → Dict[str, Path | timedelta | str]{:python}`
+
+Sets up local caching for faster remote HTTP requests.
+
+Caching directory will be set up in the user’s home directory under
+a .config/nf-core/cache\_\* subdir.
+
+Uses requests_cache monkey patching.
+Also returns the config dict so that we can use the same setup with a Session.
+
+### `nf_core.utils.sort_dictionary(d: Dict) → Dict{:python}`
+
+Sorts a nested dictionary recursively
+
+### `nf_core.utils.strip_ansi_codes(string, replace_with=''){:python}`
+
+Strip ANSI colouring codes from a string to return plain text.
+
+Adapted from a Stack Overflow answer.
+
+### `nf_core.utils.validate_file_md5(file_name, expected_md5hex){:python}`
+
+Validates the md5 checksum of a file on disk.
+
+- **Parameters:**
+  - **file_name** (_str_) – Path to a local file.
+  - **expected_md5hex** (_str_) – The expected md5sum.
+- **Raises:**
+  **IOError** – if the md5sum does not match the remote sum
+
+### `nf_core.utils.wait_cli_function(poll_func: Callable[[], bool], refresh_per_second: int = 20) → None{:python}`
+
+Display a command-line spinner while calling a function repeatedly.
+
+Keep waiting until that function returns True
+
+- **Parameters:**
+  - **poll_func** (_function_) – Function to call
+  - **refresh_per_second** (_int_) – Refresh this many times per second. Default: 20.
+- **Returns:**
+  None. Just sits in an infinite loop until the function returns True.
diff --git a/sites/docs/src/content/api_reference/3.1.1/index.md b/sites/docs/src/content/api_reference/3.1.1/index.md
new file mode 100644
index 0000000000..72ba5a7a8a
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/index.md
+# nf-core/tools documentation
+
+This API documentation is for the [`nf-core/tools`](https://github.com/nf-core/tools) package.
+
+## Contents
+
+- [Pipeline code lint tests]() (run by `nf-core pipelines lint`)
+- [Module code lint tests]() (run by `nf-core modules lint`)
+- [Subworkflow code lint tests]() (run by `nf-core subworkflows lint`)
+- [nf-core/tools Python package API reference]()
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/environment_yml.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/environment_yml.md
new file mode 100644
index 0000000000..35262c559e
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/environment_yml.md
+# environment_yml
+
+#### `ModuleLint.environment_yml(module: NFCoreComponent) → None{:python}`
+
+Lint an `environment.yml` file.
+
+The lint test checks that the `dependencies` section
+in the environment.yml file is valid YAML and that it
+is sorted alphabetically.
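To make the sorting requirement concrete, here is a minimal sketch of such a check. It is illustrative only: the function name and message format are hypothetical, not the actual `ModuleLint.environment_yml` implementation.

```python
# Hypothetical sketch of the alphabetical-sort check, NOT the real
# nf-core code: compare the dependencies list against a
# case-insensitively sorted copy and report out-of-order entries.
def check_dependencies_sorted(dependencies):
    """Return lint-failure messages; an empty list means the check passes."""
    expected = sorted(dependencies, key=str.lower)
    return [
        f"'{found}' found where '{wanted}' was expected"
        for found, wanted in zip(dependencies, expected)
        if found != wanted
    ]

# An unsorted dependencies section produces one message per misplaced entry.
deps = ["bioconda::multiqc=1.21", "bioconda::fastqc=0.12.1"]
for message in check_dependencies_sorted(deps):
    print(message)
```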
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/index.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/index.md new file mode 100644 index 0000000000..85e7a0115f --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/index.md @@ -0,0 +1,13 @@ +# Module Lint Tests + +```none +- [environment_yml](./environment_yml/) +- [main_nf](./main_nf/) +- [meta_yml](./meta_yml/) +- [module_changes](./module_changes/) +- [module_deprecations](./module_deprecations/) +- [module_patch](./module_patch/) +- [module_tests](./module_tests/) +- [module_todos](./module_todos/) +- [module_version](./module_version/) +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/main_nf.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/main_nf.md new file mode 100644 index 0000000000..1e710f0912 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/main_nf.md @@ -0,0 +1,20 @@ +# main_nf + +#### `ModuleLint.main_nf(module: NFCoreComponent, fix_version: bool, registry: str, progress_bar: Progress) → Tuple[List[str], List[str]]{:python}` + +Lint a `main.nf` module file + +Can also be used to lint local module files, +in which case failures will be reported as +warnings. + +The test checks for the following: + +- Software versions and containers are valid +- The module has a process label and it is among + the standard ones. +- If a `meta` map is defined as one of the modules + inputs it should be defined as one of the outputs, + and be correctly configured in the `saveAs` function. 
+- The module script section should contain definitions
+  of `software` and `prefix`
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/meta_yml.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/meta_yml.md
new file mode 100644
index 0000000000..d309777a7a
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/meta_yml.md
+# meta_yml
+
+#### `ModuleLint.meta_yml(module: NFCoreComponent) → None{:python}`
+
+Lint a `meta.yml` file
+
+The lint test checks that the module has
+a `meta.yml` file and that it follows the
+JSON schema defined in the `modules/meta-schema.json`
+file in the nf-core/modules repository.
+
+In addition it checks that the module name
+and module input are consistent between the
+`meta.yml` and the `main.nf`.
+
+If the module has inputs or outputs, they are expected to be
+formatted as:
+
+```groovy
+tuple val(foo) path(bar)
+val foo
+path foo
+```
+
+or permutations of the above.
+
+- **Parameters:**
+  - **module_lint_object** (_ComponentLint_) – The lint object for the module
+  - **module** (_NFCoreComponent_) – The module to lint
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_changes.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_changes.md
new file mode 100644
index 0000000000..3e8a7de3c4
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_changes.md
+# module_changes
+
+#### `ModuleLint.module_changes(module){:python}`
+
+Checks whether installed nf-core modules have changed compared to the
+original repository
+
+Downloads the `main.nf` and `meta.yml` files for every module
+and compares them to the local copies
+
+If the module has a commit SHA entry in the `modules.json`, the file content is
+compared against the files in the remote at this SHA.
+
+Only runs when linting a pipeline, not the modules repository
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_deprecations.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_deprecations.md
new file mode 100644
index 0000000000..763df0ab8a
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_deprecations.md
+# module_deprecations
+
+#### `ModuleLint.module_deprecations(module){:python}`
+
+Check that the modules are up to date with the latest nf-core standard
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_patch.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_patch.md
new file mode 100644
index 0000000000..5d38a40d31
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_patch.md
+# module_patch
+
+#### `ModuleLint.module_patch(module: NFCoreComponent){:python}`
+
+Lint a patch file found in a module
+
+Checks that the file name is well formed, and that
+the patch can be applied in reverse with the correct result.
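The "applied in reverse" idea can be illustrated with a toy line-based patch: reversing a hunk swaps its old and new sides, and reverse-applying the patched file should reproduce the unmodified remote file. This is a conceptual sketch only, not the unified-diff handling that nf-core actually uses.

```python
# Toy illustration of reverse patch application (conceptual only; the
# real lint test works on unified-diff patch files).
def apply_patch(lines, hunks, reverse=False):
    """Apply (line_index, old, new) hunks; swapping old/new reverses the patch."""
    result = list(lines)
    for idx, old, new in hunks:
        if reverse:
            old, new = new, old
        if result[idx] != old:
            raise ValueError(f"context mismatch at line {idx}")
        result[idx] = new
    return result

remote = ["process FOO {", "    container 'img:1.0'", "}"]
hunks = [(1, "    container 'img:1.0'", "    container 'img:2.0'")]
patched = apply_patch(remote, hunks)
# Reverse-applying the same patch must give back the remote file.
assert apply_patch(patched, hunks, reverse=True) == remote
```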
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_tests.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_tests.md
new file mode 100644
index 0000000000..d6f0420237
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_tests.md
+# module_tests
+
+#### `ModuleLint.module_tests(module: NFCoreComponent){:python}`
+
+Lint the tests of a module in `nf-core/modules`
+
+It verifies that the test directory exists
+and contains a `main.nf.test` and a `main.nf.test.snap`
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_todos.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_todos.md
new file mode 100644
index 0000000000..f4db6824f9
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_todos.md
+# module_todos
+
+#### `ModuleLint.module_todos(module){:python}`
+
+Look for TODO statements in the module files
+
+The nf-core module template contains a number of comment lines to help developers
+of new modules know where they need to edit files and add content.
+They typically have the following format:
+
+```groovy
+// TODO nf-core: Make some kind of change to the workflow here
+```
+
+…or in markdown:
+
+```html
+<!-- TODO nf-core: Make some kind of change to the docs here -->
+```
+
+This lint test runs through all files in the module and searches for these lines.
+If any are found they will throw a warning.
+
+:::note
+Note that many GUI code editors have plugins to list all instances of _TODO_
+in a given project directory. This is a very quick and convenient way to get
+started on your pipeline!
+:::
diff --git a/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_version.md b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_version.md
new file mode 100644
index 0000000000..8a2564937a
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/module_lint_tests/module_version.md
+# module_version
+
+#### `ModuleLint.module_version(module: NFCoreComponent){:python}`
+
+Verifies that the module has a version specified in the `modules.json` file
+
+It checks whether the module has an entry in the `modules.json` file
+containing a commit SHA. If that is true, it verifies that there is no
+newer version of the module available.
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awsfulltest.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awsfulltest.md
new file mode 100644
index 0000000000..d9916b1841
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awsfulltest.md
+# actions_awsfulltest
+
+#### `PipelineLint.actions_awsfulltest() → Dict[str, List[str]]{:python}`
+
+Checks that the GitHub Actions awsfulltest workflow is valid.
+
+In addition to small test datasets run on GitHub Actions, we provide the possibility of testing the pipeline on full size datasets on AWS.
+This should ensure that the pipeline runs as expected on AWS and provide a resource estimation.
+
+The GitHub Actions workflow is called `awsfulltest.yml`, and it can be found in the `.github/workflows/` directory.
+
+:::warning
+This workflow incurs AWS costs, therefore it should only be triggered for pipeline releases:
+`release` (after the pipeline release) and `workflow_dispatch`.
+:::
+
+:::note
+You can manually trigger the AWS tests by going to the Actions tab on the pipeline GitHub repository and selecting the
+nf-core AWS full size tests workflow on the left.
+:::
+
+:::note
+For tests on full data prior to release, the [Seqera Platform](https://seqera.io/platform/) launch feature can be employed.
+:::
+
+The `.github/workflows/awsfulltest.yml` file is tested for the following:
+
+- Must be turned on for `workflow_dispatch`.
+- Must be turned on for `release` with `types: [published]`.
+- Should run the profile `test_full`, which should be edited to provide the links to full-size datasets. If it runs the profile `test`, a warning is given.
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awstest.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awstest.md
new file mode 100644
index 0000000000..4e4698c4e9
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_awstest.md
+# actions_awstest
+
+#### `PipelineLint.actions_awstest(){:python}`
+
+Checks that the GitHub Actions awstest workflow is valid.
+
+In addition to small test datasets run on GitHub Actions, we provide the possibility of testing the pipeline on AWS.
+This should ensure that the pipeline runs as expected on AWS (which often has its own unique edge cases).
+
+:::warning
+Running tests on AWS incurs costs, so these tests are not triggered automatically.
+Instead, they use the `workflow_dispatch` trigger, which allows for manual triggering
+of the workflow when testing on AWS is desired.
+:::
+
+:::note
+You can trigger the tests by going to the Actions tab on the pipeline GitHub repository
+and selecting the nf-core AWS test workflow on the left.
+:::
+
+The `.github/workflows/awstest.yml` file is tested for the following:
+
+- Must _not_ be turned on for `push` or `pull_request`.
+- Must be turned on for `workflow_dispatch`.
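The trigger rules above amount to a simple check on the parsed `on:` section of the workflow. The sketch below is illustrative (hypothetical function name, and it glosses over the YAML quirk where a bare `on` key can parse as a boolean), not the actual `PipelineLint.actions_awstest` code.

```python
# Hypothetical sketch of the awstest trigger checks (not nf-core's code).
def check_awstest_triggers(on_section):
    """Given the parsed 'on:' mapping of awstest.yml, return failure messages."""
    failed = []
    # Automatic triggers would run costly AWS tests on every change.
    for trigger in ("push", "pull_request"):
        if trigger in on_section:
            failed.append(f"'{trigger}' trigger must not be enabled")
    # Manual dispatch is the only permitted way to launch the tests.
    if "workflow_dispatch" not in on_section:
        failed.append("'workflow_dispatch' trigger must be enabled")
    return failed

# A workflow that only allows manual dispatch passes the check.
assert check_awstest_triggers({"workflow_dispatch": None}) == []
```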
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_ci.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_ci.md
new file mode 100644
index 0000000000..aec43b98fd
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_ci.md
+# actions_ci
+
+#### `PipelineLint.actions_ci(){:python}`
+
+Checks that the GitHub Actions pipeline CI (Continuous Integration) workflow is valid.
+
+The `.github/workflows/ci.yml` GitHub Actions workflow runs the pipeline on a minimal test
+dataset using `-profile test` to check that no breaking changes have been introduced.
+Final result files are not checked, just that the pipeline exits successfully.
+
+This lint test checks this GitHub Actions workflow file for the following:
+
+- Workflow must be triggered on the following events:
+  ```yaml
+  on:
+    push:
+      branches:
+        - dev
+    pull_request:
+    release:
+      types: [published]
+  ```
+- The minimum Nextflow version specified in the pipeline’s `nextflow.config` matches that defined by `NXF_VER` in the test matrix:
+
+  ```yaml
+  strategy:
+    matrix:
+      # Nextflow versions: check pipeline minimum and current latest
+      NXF_VER: ['19.10.0', '']
+  ```
+
+  :::note
+  These `matrix` variables run the test workflow twice, varying the `NXF_VER` variable each time.
+  This is used in the `nextflow run` commands to test the pipeline with both the latest available version
+  of Nextflow (`''`) and the stated minimum required version.
+ ::: diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_schema_validation.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_schema_validation.md new file mode 100644 index 0000000000..4b68346b57 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/actions_schema_validation.md @@ -0,0 +1,12 @@ +# actions_schema_validation + +#### `PipelineLint.actions_schema_validation() → Dict[str, List[str]]{:python}` + +Checks that the GitHub Action workflow yml/yaml files adhere to the correct schema + +nf-core pipelines use GitHub actions workflows to run CI tests, check formatting and also linting, among others. +These workflows are defined by `yml` scripts in `.github/workflows/`. This lint test verifies that these scripts are valid +by comparing them against the [JSON schema for GitHub workflows](https://json.schemastore.org/github-workflow). + +To pass this test, make sure that all your workflows contain the required properties `on` and `jobs` and that +all other properties are of the correct type, as specified in the schema (link above). diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/base_config.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/base_config.md new file mode 100644 index 0000000000..088c00f81e --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/base_config.md @@ -0,0 +1,16 @@ +# base_config + +#### `PipelineLint.base_config() → Dict[str, List[str]]{:python}` + +Make sure the conf/base.config file follows the nf-core template, especially removed sections. 
+
+:::note
+You can choose to ignore this lint test by editing the file called
+`.nf-core.yml` in the root of your pipeline and setting the test to false:
+
+```yaml
+lint:
+  base_config: False
+```
+
+:::
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_exist.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_exist.md
new file mode 100644
index 0000000000..10279aa788
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_exist.md
+# files_exist
+
+#### `PipelineLint.files_exist() → Dict[str, List[str]]{:python}`
+
+Checks a given pipeline directory for required files.
+
+Iterates through the pipeline’s directory content and checks that specified
+files are either present or absent, as required.
+
+:::note
+This test raises an `AssertionError` if neither `nextflow.config` nor `main.nf` is found.
+If these files are not found then this cannot be a Nextflow pipeline and something has gone badly wrong.
+All lint tests are stopped immediately with a critical error message.
+::: + +Files that _must_ be present: + +```bash +.gitattributes +.gitignore +.nf-core.yml +.editorconfig +.prettierignore +.prettierrc.yml +.github/.dockstore.yml +.github/CONTRIBUTING.md +.github/ISSUE_TEMPLATE/bug_report.yml +.github/ISSUE_TEMPLATE/config.yml +.github/ISSUE_TEMPLATE/feature_request.yml +.github/PULL_REQUEST_TEMPLATE.md +.github/workflows/branch.yml +.github/workflows/ci.yml +.github/workflows/linting_comment.yml +.github/workflows/linting.yml +[LICENSE, LICENSE.md, LICENCE, LICENCE.md] # NB: British / American spelling +assets/email_template.html +assets/email_template.txt +assets/nf-core-PIPELINE_logo_light.png +assets/sendmail_template.txt +conf/modules.config +conf/test.config +conf/test_full.config +CHANGELOG.md +CITATIONS.md +CODE_OF_CONDUCT.md +docs/images/nf-core-PIPELINE_logo_light.png +docs/images/nf-core-PIPELINE_logo_dark.png +docs/output.md +docs/README.md +docs/usage.md +nextflow_schema.json +nextflow.config +README.md +``` + +Files that _should_ be present: + +```bash +main.nf +assets/multiqc_config.yml +conf/base.config +conf/igenomes.config +.github/workflows/awstest.yml +.github/workflows/awsfulltest.yml +ro-crate-metadata.json +``` + +Files that _must not_ be present, due to being renamed or removed in the template: + +```bash +.github/ISSUE_TEMPLATE/bug_report.md +.github/ISSUE_TEMPLATE/feature_request.md +.github/workflows/push_dockerhub.yml +.markdownlint.yml +.nf-core.yaml # NB: Should be yml, not yaml +.yamllint.yml +bin/markdown_to_html.r +conf/aws.config +docs/images/nf-core-PIPELINE_logo.png +lib/Checks.groovy +lib/Completion.groovy +lib/NfcoreTemplate.groovy +lib/Utils.groovy +lib/Workflow.groovy +lib/WorkflowMain.groovy +lib/WorkflowPIPELINE.groovy +lib/nfcore_external_java_deps.jar +parameters.settings.json +pipeline_template.yml # saving information in .nf-core.yml +Singularity +``` + +Files that _should not_ be present: + +```bash +.travis.yml +``` + +:::note +You can configure the `nf-core pipelines lint` tests to 
ignore any of these checks by setting
+the `files_exist` key as follows in your `.nf-core.yml` config file. For example:
+
+```yaml
+lint:
+  files_exist:
+    - assets/multiqc_config.yml
+```
+
+:::
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_unchanged.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_unchanged.md
new file mode 100644
index 0000000000..15f3a9ffa8
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/files_unchanged.md
+# files_unchanged
+
+#### `PipelineLint.files_unchanged() → Dict[str, List[str] | bool]{:python}`
+
+Checks that certain pipeline files are not modified from template output.
+
+Iterates through the pipeline’s directory content and compares specified files
+against output from the template using the pipeline’s metadata. File content
+should not be modified / missing.
+
+Files that must be unchanged:
+
+```default
+.gitattributes
+.prettierrc.yml
+.github/.dockstore.yml
+.github/CONTRIBUTING.md
+.github/ISSUE_TEMPLATE/bug_report.yml
+.github/ISSUE_TEMPLATE/config.yml
+.github/ISSUE_TEMPLATE/feature_request.yml
+.github/PULL_REQUEST_TEMPLATE.md
+.github/workflows/branch.yml
+.github/workflows/linting_comment.yml
+.github/workflows/linting.yml
+assets/email_template.html
+assets/email_template.txt
+assets/nf-core-PIPELINE_logo_light.png
+assets/sendmail_template.txt
+CODE_OF_CONDUCT.md
+docs/images/nf-core-PIPELINE_logo_light.png
+docs/images/nf-core-PIPELINE_logo_dark.png
+docs/README.md
+[LICENSE, LICENSE.md, LICENCE, LICENCE.md] # NB: British / American spelling
+```
+
+Files that can have additional content but must include the template contents:
+
+```default
+.gitignore
+.prettierignore
+```
+
+:::note
+You can configure the `nf-core pipelines lint` tests to ignore any of these checks by setting
+the `files_unchanged` key as follows in your `.nf-core.yml` config file.
For example:
+
+```yaml
+lint:
+  files_unchanged:
+    - .github/workflows/branch.yml
+```
+
+:::
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/included_configs.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/included_configs.md
new file mode 100644
index 0000000000..6dae0b182c
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/included_configs.md
+# included_configs
+
+```{eval-rst}
+.. automethod:: nf_core.pipelines.lint.PipelineLint.included_configs
+```
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/index.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/index.md
new file mode 100644
index 0000000000..676aa279fc
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/index.md
+# Pipeline Lint Tests
+
+```none
+- [actions_awsfulltest](./actions_awsfulltest/)
+- [actions_awstest](./actions_awstest/)
+- [actions_ci](./actions_ci/)
+- [actions_schema_validation](./actions_schema_validation/)
+- [base_config](./base_config/)
+- [files_exist](./files_exist/)
+- [files_unchanged](./files_unchanged/)
+- [included_configs](./included_configs/)
+- [merge_markers](./merge_markers/)
+- [modules_config](./modules_config/)
+- [modules_json](./modules_json/)
+- [modules_structure](./modules_structure/)
+- [multiqc_config](./multiqc_config/)
+- [nextflow_config](./nextflow_config/)
+- [nfcore_yml](./nfcore_yml/)
+- [pipeline_name_conventions](./pipeline_name_conventions/)
+- [pipeline_todos](./pipeline_todos/)
+- [plugin_includes](./plugin_includes/)
+- [readme](./readme/)
+- [schema_description](./schema_description/)
+- [schema_lint](./schema_lint/)
+- [schema_params](./schema_params/)
+- [system_exit](./system_exit/)
+- [template_strings](./template_strings/)
+- [version_consistency](./version_consistency/)
+```
diff --git
a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/merge_markers.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/merge_markers.md
new file mode 100644
index 0000000000..2f6aef0aa4
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/merge_markers.md
+# merge_markers
+
+#### `PipelineLint.merge_markers(){:python}`
+
+Check for remaining merge markers.
+
+This test looks for remaining merge markers in the code, e.g.:
+`>>>>>>>` or `<<<<<<<`
+
+:::note
+You can choose to ignore this lint test by editing the file called
+`.nf-core.yml` in the root of your pipeline and setting the test to false:
+
+```yaml
+lint:
+  merge_markers: False
+```
+
+:::
+
+To disable this test only for specific files, you can specify a list of file paths to ignore.
+For example, to ignore a PDF you added to the docs:
+
+```yaml
+lint:
+  merge_markers:
+    - docs/my_pdf.pdf
+```
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_config.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_config.md
new file mode 100644
index 0000000000..4da63f3d1e
--- /dev/null
+++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_config.md
+# modules_config
+
+#### `PipelineLint.modules_config() → Dict[str, List[str]]{:python}`
+
+Make sure the conf/modules.config file follows the nf-core template, especially removed sections.
+
+:::note
+You can choose to ignore this lint test by editing the file called
+`.nf-core.yml` in the root of your pipeline and setting the test to false:
+
+```yaml
+lint:
+  modules_config: False
+```
+
+:::
+
+To disable this test only for specific modules, you can specify a list of module names.
+ +```yaml +lint: + modules_config: + - fastqc +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_json.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_json.md new file mode 100644 index 0000000000..428ca0417a --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_json.md @@ -0,0 +1,10 @@ +# modules_json + +#### `PipelineLint.modules_json() → Dict[str, List[str]]{:python}` + +Make sure all modules described in the `modules.json` file are actually installed + +Every module installed from `nf-core/modules` must have an entry in the `modules.json` file +with an associated version commit hash. + +- Failure: If module entries are found in `modules.json` for modules that are not installed diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_structure.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_structure.md new file mode 100644 index 0000000000..4e08c24956 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/modules_structure.md @@ -0,0 +1,15 @@ +# modules_structure + +#### `PipelineLint.modules_structure(){:python}` + +Check that the structure of the modules directory in a pipeline is the correct one: + +```bash +modules/nf-core/TOOL/SUBTOOL +``` + +Prior to nf-core/tools release 2.6 the directory structure had an additional level of nesting: + +```bash +modules/nf-core/modules/TOOL/SUBTOOL +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/multiqc_config.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/multiqc_config.md new file mode 100644 index 0000000000..9264973668 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/multiqc_config.md @@ -0,0 +1,42 @@ +# multiqc_config + +#### `PipelineLint.multiqc_config() → Dict[str, List[str]]{:python}` + +Make sure basic multiQC plugins are installed and plots 
are exported +Basic template: + +```yaml +report_comment: > + This report has been generated by the nf-core/quantms + analysis pipeline. For information about how to interpret these results, please see the + documentation. +report_section_order: + software_versions: + order: -1000 + nf-core-quantms-summary: + order: -1001 +export_plots: true +``` + +:::note +You can choose to ignore this lint test by editing the file called +`.nf-core.yml` in the root of your pipeline and setting the test to false: + +```yaml +lint: + multiqc_config: False +``` + +::: + +To disable this test only for specific sections, you can specify a list of section names. +For example: + +```yaml +lint: + multiqc_config: + - report_section_order + - report_comment +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nextflow_config.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nextflow_config.md new file mode 100644 index 0000000000..226276090f --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nextflow_config.md @@ -0,0 +1,133 @@ +# nextflow_config + +#### `PipelineLint.nextflow_config() → Dict[str, List[str]]{:python}` + +Checks the pipeline configuration for required variables. + +All nf-core pipelines are required to be configured with a minimal set of variable +names. This test fails or throws warnings if required variables are not set. + +:::note +These config variables must be set in `nextflow.config` or another config +file imported from there. Any variables set in Nextflow script files (eg. `main.nf`) +are not checked and will be assumed to be missing. +::: + +**The following variables fail the test if missing:** + +- `params.outdir`: A directory in which all pipeline results should be saved +- `manifest.name`: The pipeline name. Should begin with `nf-core/` +- `manifest.description`: A description of the pipeline +- `manifest.version` + - The version of this pipeline. 
This should correspond to a [GitHub release](https://help.github.com/articles/creating-releases/). + - If `--release` is set when running `nf-core pipelines lint`, the version number must not contain the string `dev` + - If `--release` is _not_ set, the version should end in `dev` (warning triggered if not) +- `manifest.nextflowVersion` + - The minimum version of Nextflow required to run the pipeline. + - Should be `>=` or `!>=` and a version number, eg. `manifest.nextflowVersion = '>=0.31.0'` (see [Nextflow documentation](https://www.nextflow.io/docs/latest/config.html#scope-manifest)) + - `>=` warns about old versions but tries to run anyway, `!>=` fails for old versions. Only use the latter if you _know_ that the pipeline will certainly fail before this version. + - This should correspond to the `NXF_VER` version tested by GitHub Actions. +- `manifest.homePage` + - The homepage for the pipeline. Should be the nf-core GitHub repository URL, + so beginning with `https://github.com/nf-core/` +- `timeline.enabled`, `trace.enabled`, `report.enabled`, `dag.enabled` + - The Nextflow timeline, trace, report and DAG should be enabled by default (set to `true`) +- `process.cpus`, `process.memory`, `process.time` + - Default CPUs, memory and time limits for tasks +- `params.input` + - Input parameter to specify input data; specify this to avoid a warning + - Typical usage: + - `params.input`: Input data that is not NGS sequencing data +- `params.custom_config_version` + - Should always be set to the default value `master` +- `params.custom_config_base` + - Should always be set to the default value: + `https://raw.githubusercontent.com/nf-core/configs/${params.custom_config_version}` + +**The following variables throw warnings if missing:** + +- `manifest.mainScript`: The filename of the main pipeline script (should be `main.nf`) +- `timeline.file`, `trace.file`, `report.file`, `dag.file` + - Default filenames for the timeline, trace and report + - The DAG file path 
should end with `.svg` (If Graphviz is not installed, Nextflow will generate a `.dot` file instead) + +**The following variables are deprecated and fail the test if they are still present:** + +- `params.version`: The old method for specifying the pipeline version. Replaced by `manifest.version` +- `params.nf_required_version`: The old method for specifying the minimum Nextflow version. Replaced by `manifest.nextflowVersion` +- `params.container`: The old method for specifying the Docker Hub container address. Replaced by `process.container` +- `igenomesIgnore`: Changed to `igenomes_ignore` +- `params.max_cpus`: Old method of specifying the maximum number of CPUs a process can request. Replaced by the native Nextflow `resourceLimits` directive in config files. +- `params.max_memory`: Old method of specifying the maximum amount of memory a process can request. Replaced by the native Nextflow `resourceLimits` directive. +- `params.max_time`: Old method of specifying the maximum time a process can request. Replaced by the native Nextflow `resourceLimits` directive. + +:::note +The `snake_case` convention should now be used when defining pipeline parameters +::: + +**The following Nextflow syntax is deprecated and fails the test if present:** + +- Process-level configuration syntax still using the old Nextflow syntax, for example: `process.$fastqc` instead of `process withName:'fastqc'`. + +:::note +You can choose to ignore tests for the presence or absence of specific config variables +by creating a file called `.nf-core.yml` in the root of your pipeline and listing +the config variables that should be ignored. 
For example: + +```yaml +lint: + nextflow_config: + - params.input +``` + +::: + +The other checks in this test (deprecated syntax, etc.) cannot be individually identified, +but you can skip the entire test block if you wish: + +```yaml +lint: + nextflow_config: False +``` + +**The configuration should contain the following or the test will fail:** + +- A `test` configuration profile should exist. + +**The default values in `nextflow.config` should match the default values defined in the `nextflow_schema.json`.** + +:::note +You can choose to ignore tests for the default value of a specific parameter +by creating a file called `.nf-core.yml` in the root of your pipeline and listing +the config parameters that should be ignored. For example, to ignore the default value for the input parameter: + +```yaml +lint: + nextflow_config: + - config_defaults: + - params.input +``` + +::: diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nfcore_yml.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nfcore_yml.md new file mode 100644 index 0000000000..5966e48ffe --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/nfcore_yml.md @@ -0,0 +1,12 @@ +# nfcore_yml + +#### `PipelineLint.nfcore_yml() → Dict[str, List[str]]{:python}` + +Repository `.nf-core.yml` tests + +The `.nf-core.yml` contains metadata for nf-core tools to correctly apply its features. + +- Repository type: + - Check that the repository type is set. +- nf-core version: + - Check if the nf-core version is set to the latest version. 
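The `.nf-core.yml` lint overrides used throughout these tests follow a common pattern: setting a test to `False` disables it entirely, while a list under the test name ignores only the listed items. The sketch below illustrates that pattern with a hypothetical helper (`lint_override` is illustrative only, not the actual nf-core/tools implementation; `lint_config` stands for the parsed `lint:` section of `.nf-core.yml`):

```python
# Hypothetical sketch of the `.nf-core.yml` lint-override semantics described
# in these docs. `lint_config` is assumed to be the parsed `lint:` mapping.

def lint_override(lint_config: dict, test_name: str):
    """Return ("skip", None), ("ignore", items), or ("run", None) for a lint test."""
    value = lint_config.get(test_name)
    if value is False:            # e.g. `merge_markers: False` disables the whole test
        return ("skip", None)
    if isinstance(value, list):   # e.g. a list of files/modules/sections to ignore
        return ("ignore", value)
    return ("run", None)          # no override: run the test normally

# e.g. lint_override({"merge_markers": False}, "merge_markers") -> ("skip", None)
```

The same three-way logic covers every test above that documents a `False` switch or an ignore list.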
diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_name_conventions.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_name_conventions.md new file mode 100644 index 0000000000..788d77e41a --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_name_conventions.md @@ -0,0 +1,13 @@ +# pipeline_name_conventions + +#### `PipelineLint.pipeline_name_conventions(){:python}` + +Checks that the pipeline name adheres to nf-core conventions. + +In order to ensure consistent naming, pipeline names should contain only lower-case alphanumeric characters. +Otherwise a warning is displayed. + +:::warning +DockerHub is very picky about image names and doesn’t even allow hyphens (we are `nfcore`). +This is a large part of why we set this rule. +::: diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_todos.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_todos.md new file mode 100644 index 0000000000..ea401161ac --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/pipeline_todos.md @@ -0,0 +1,28 @@ +# pipeline_todos + +#### `PipelineLint.pipeline_todos(root_dir=None){:python}` + +Check for nf-core _TODO_ lines. + +The nf-core workflow template contains a number of comment lines to help developers +of new pipelines know where they need to edit files and add content. +They typically have the following format: + +```groovy +// TODO nf-core: Make some kind of change to the workflow here +``` + +...or in markdown: + +```html +<!-- TODO nf-core: Make some kind of change to the docs here --> +``` + +This lint test runs through all files in the pipeline and searches for these lines. +If any are found they will throw a warning. + +:::note +Note that many GUI code editors have plugins to list all instances of _TODO_ +in a given project directory. This is a very quick and convenient way to get +started on your pipeline! 
+::: diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/plugin_includes.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/plugin_includes.md new file mode 100644 index 0000000000..0e9e58566b --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/plugin_includes.md @@ -0,0 +1,14 @@ +# plugin_includes + +#### `PipelineLint.plugin_includes() → Dict[str, List[str]]{:python}` + +Checks the include statements in all `*.nf` files for plugin includes + +When nf-schema is used in an nf-core pipeline, the include statements of the plugin +functions have to use nf-schema instead of nf-validation and vice versa diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/readme.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/readme.md new file mode 100644 index 0000000000..d642a457a0 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/readme.md @@ -0,0 +1,33 @@ +# readme + +#### `PipelineLint.readme(){:python}` + +Repository `README.md` tests + +The `README.md` files for a project are very important and must meet some requirements: + +- Nextflow badge + - If no Nextflow badge is found, a warning is given + - If a badge is found but the version doesn’t match the minimum version in the config file, the test fails + - Example badge code: + ```md + [![Nextflow](https://img.shields.io/badge/nextflow-%E2%89%A50.27.6-brightgreen.svg)](https://www.nextflow.io/) + ``` + +:::note +This badge is a markdown image `![alt-text]()` _inside_ a markdown link `[markdown image]()`, so it is a bit fiddly to write. 
+::: + +- Zenodo release + - If the pipeline is released but still contains a ‘zenodo.XXXXXXX’ tag, the test fails + +To disable this test, add the following to the pipeline’s `.nf-core.yml` file: + +```yaml +lint: + readme: False +``` + +To disable subsets of these tests, add the following to the pipeline’s `.nf-core.yml` file: + +```yaml +lint: + readme: + - nextflow_badge + - zenodo_release +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_description.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_description.md new file mode 100644 index 0000000000..762ad42ebd --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_description.md @@ -0,0 +1,12 @@ +# schema_description + +#### `PipelineLint.schema_description(){:python}` + +Check that every parameter in the schema has a description. + +The `nextflow_schema.json` pipeline schema should describe every flat parameter. + +It also warns about parameters defined outside of groups. + +- Warning: Parameters in `nextflow_schema.json` without a description +- Warning: Parameters in `nextflow_schema.json` that are defined outside of a group diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_lint.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_lint.md new file mode 100644 index 0000000000..95afa850c5 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_lint.md @@ -0,0 +1,96 @@ +# schema_lint + +#### `PipelineLint.schema_lint(){:python}` + +Pipeline schema syntax + +Pipelines should have a `nextflow_schema.json` file that describes the different +pipeline parameters (eg. `params.something`, `--something`). + +:::note +Reminder: you should generally never need to edit this JSON file by hand. +The `nf-core pipelines schema build` command can create _and edit_ the file for you +to keep it up to date, with a friendly user-interface for customisation. 
+::: + +The lint test checks the schema for the following: + +- Schema should be a valid JSON file +- Schema should adhere to [JSONSchema](https://json-schema.org/), Draft 7 or Draft 2020-12. +- Parameters can be described in two places: + - As `properties` in the top-level schema object + - As `properties` within subschemas listed in a top-level `definitions` (draft 7) or `$defs` (draft 2020-12) object +- The schema must describe at least one parameter +- There must be no duplicate parameter IDs across the schema and definition subschemas +- All subschemas in `definitions` or `$defs` must be referenced in the top-level `allOf` key +- The top-level `allOf` key must not describe any non-existent definitions +- Default parameters in the schema must be valid +- Core top-level schema attributes should exist and be set as follows: + - `$schema`: `https://json-schema.org/draft-07/schema` or `https://json-schema.org/draft/2020-12/schema` + - `$id`: URL to the raw schema file, eg. `https://raw.githubusercontent.com/YOURPIPELINE/master/nextflow_schema.json` + - `title`: `YOURPIPELINE pipeline parameters` + - `description`: The pipeline config `manifest.description` +- That the `input` property is defined and has a mimetype. A list of common mimetypes can be found [here](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types). 
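Two of the checks listed above, the duplicate-parameter-ID check and the `allOf`/subschema cross-reference, can be sketched as follows. This is a simplified illustration operating on the parsed `nextflow_schema.json` dict, not the actual nf-core/tools implementation:

```python
# Simplified sketch of two schema_lint checks: duplicate parameter IDs, and
# allOf <-> definitions/$defs cross-referencing. `schema` is assumed to be the
# parsed nextflow_schema.json; not the real nf-core implementation.

def check_schema(schema: dict) -> list:
    errors = []
    defs_key = "$defs" if "$defs" in schema else "definitions"
    groups = schema.get(defs_key, {})

    # Collect parameter IDs from the top level and every subschema
    seen = set()
    for props in [schema.get("properties", {})] + [
        g.get("properties", {}) for g in groups.values()
    ]:
        for param in props:
            if param in seen:
                errors.append(f"Duplicate parameter: {param}")
            seen.add(param)

    # Every subschema must be referenced in allOf, and vice versa
    refs = {ref["$ref"].split("/")[-1] for ref in schema.get("allOf", [])}
    for name in groups:
        if name not in refs:
            errors.append(f"Subschema not in allOf: {name}")
    for name in refs - set(groups):
        errors.append(f"allOf references missing subschema: {name}")
    return errors
```

Running this over a well-formed schema returns an empty list; a schema that defines the same parameter twice, or a subschema missing from `allOf`, produces one error per violation.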
+ +For example, an _extremely_ minimal schema could look like this (draft 7): + +```json +{ + "$schema": "https://json-schema.org/draft-07/schema", + "$id": "https://raw.githubusercontent.com/YOURPIPELINE/master/nextflow_schema.json", + "title": "YOURPIPELINE pipeline parameters", + "description": "This pipeline is for testing", + "properties": { + "first_param": { "type": "string" } + }, + "definitions": { + "my_first_group": { + "properties": { + "second_param": { "type": "string" } + } + } + }, + "allOf": [{ "$ref": "#/definitions/my_first_group" }] +} +``` + +Or this (draft 2020-12): + +```json +{ + "$schema": "https://json-schema.org/draft/2020-12/schema", + "$id": "https://raw.githubusercontent.com/YOURPIPELINE/master/nextflow_schema.json", + "title": "YOURPIPELINE pipeline parameters", + "description": "This pipeline is for testing", + "properties": { + "first_param": { "type": "string" } + }, + "$defs": { + "my_first_group": { + "properties": { + "second_param": { "type": "string" } + } + } + }, + "allOf": [{ "$ref": "#/$defs/my_first_group" }] +} +``` + +:::note +You can check your pipeline schema without having to run the entire pipeline lint +by running `nf-core pipelines schema lint` instead of `nf-core pipelines lint` +::: diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_params.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_params.md new file mode 100644 index 0000000000..7d9440fa15 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/schema_params.md @@ -0,0 +1,11 @@ +# schema_params + +#### `PipelineLint.schema_params(){:python}` + +Check that the schema describes all flat params in the pipeline. + +The `nextflow_schema.json` pipeline schema should describe every flat parameter +returned from the `nextflow config` command (params that are objects or more complex structures are ignored). 
+ +- Failure: If parameters are found in `nextflow_schema.json` that are not in the pipeline config +- Warning: If parameters are found in the pipeline config that are not in `nextflow_schema.json` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/system_exit.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/system_exit.md new file mode 100644 index 0000000000..dc54311edc --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/system_exit.md @@ -0,0 +1,10 @@ +# system_exit + +#### `PipelineLint.system_exit(){:python}` + +Check for `System.exit` calls in Groovy/Nextflow code + +Calls to `System.exit(1)` should be replaced by throwing errors + +This lint test looks for all calls to `System.exit` +in any file with the `.nf` or `.groovy` extension diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/template_strings.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/template_strings.md new file mode 100644 index 0000000000..364674d06f --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/template_strings.md @@ -0,0 +1,37 @@ +# template_strings + +#### `PipelineLint.template_strings(){:python}` + +Check for template placeholders. + +The `nf-core pipelines create` pipeline template uses +[Jinja](https://jinja.palletsprojects.com/en/2.11.x/) behind the scenes. + +This lint test fails if any Jinja template variables such as +`{{ pipeline_name }}` are found in your pipeline code. + +Finding a placeholder like this means that something was probably copied and pasted +from the template without being properly rendered for your pipeline. + +This test ignores any double-brackets prefixed with a dollar sign, such as +`${{ secrets.AWS_ACCESS_KEY_ID }}` as these placeholders are used in GitHub Actions workflows. 
+ +:::note +You can choose to ignore this lint test by editing the file called +`.nf-core.yml` in the root of your pipeline and setting the test to false: + +```yaml +lint: + template_strings: False +``` + +::: + +To disable this test only for specific files, you can specify a list of file paths to ignore. +For example, to ignore a PDF you added to the docs: + +```yaml +lint: + template_strings: + - docs/my_pdf.pdf +``` diff --git a/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/version_consistency.md b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/version_consistency.md new file mode 100644 index 0000000000..0e201600e0 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/pipeline_lint_tests/version_consistency.md @@ -0,0 +1,23 @@ +# version_consistency + +#### `PipelineLint.version_consistency(){:python}` + +Pipeline and container version number consistency. + +:::note +This test only runs when the `--release` flag is set for `nf-core pipelines lint`, +or `$GITHUB_REF` is equal to `main`. +::: + +This lint test fetches the pipeline version number from three possible locations: + +- The pipeline config, `manifest.version` +- The docker container in the pipeline config, `process.container` + - Some pipelines may not have this set on a pipeline level. If it is not found, it is ignored. +- `$GITHUB_REF`, if it looks like a release tag (`refs/tags/`) + +The test then checks that: + +- The container name has a tag specified (eg. 
`nfcore/pipeline:version`) +- The pipeline version number is numeric (contains only numbers and dots) +- The version numbers all match one another diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/index.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/index.md new file mode 100644 index 0000000000..b64f553db0 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/index.md @@ -0,0 +1,10 @@ +# Subworkflow Lint Tests + +- [main_nf](./main_nf/) +- [meta_yml](./meta_yml/) +- [subworkflow_changes](./subworkflow_changes/) +- [subworkflow_tests](./subworkflow_tests/) +- [subworkflow_todos](./subworkflow_todos/) +- [subworkflow_version](./subworkflow_version/) diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/main_nf.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/main_nf.md new file mode 100644 index 0000000000..3f205e942f --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/main_nf.md @@ -0,0 +1,16 @@ +# main_nf + +#### `SubworkflowLint.main_nf(subworkflow: NFCoreComponent) → Tuple[List[str], List[str]]{:python}` + +Lint a `main.nf` subworkflow file + +Can also be used to lint local subworkflow files, +in which case failures will be reported as +warnings. 
+ +The test checks for the following: + +- A subworkflow SHOULD import at least two modules +- All included modules or subworkflows are used and their names are used for versions.yml +- The workflow name is all capital letters +- The subworkflow emits a software version diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/meta_yml.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/meta_yml.md new file mode 100644 index 0000000000..2ab290d54b --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/meta_yml.md @@ -0,0 +1,17 @@ +# meta_yml + +#### `SubworkflowLint.meta_yml(subworkflow){:python}` + +Lint a `meta.yml` file + +The lint test checks that the subworkflow has +a `meta.yml` file and that it follows the +JSON schema defined in the `subworkflows/yaml-schema.json` +file in the nf-core/modules repository. + +In addition it checks that the subworkflow name +and subworkflow input is consistent between the +`meta.yml` and the `main.nf`. + +Checks that all input and output channels are specified in `meta.yml`. +Checks that all included components in `main.nf` are specified in `meta.yml`. diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_changes.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_changes.md new file mode 100644 index 0000000000..cda417f1b6 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_changes.md @@ -0,0 +1,14 @@ +# subworkflow_changes + +#### `SubworkflowLint.subworkflow_changes(subworkflow){:python}` + +Checks whether installed nf-core subworkflow have changed compared to the +original repository + +Downloads the `main.nf` and `meta.yml` files for every subworkflow +and compares them to the local copies + +If the subworkflow has a commit SHA entry in the `modules.json`, the file content is +compared against the files in the remote at this SHA. 
+ +Only runs when linting a pipeline, not the modules repository diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_tests.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_tests.md new file mode 100644 index 0000000000..1b9c543a6a --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_tests.md @@ -0,0 +1,10 @@ +# subworkflow_tests + +#### `SubworkflowLint.subworkflow_tests(subworkflow: NFCoreComponent){:python}` + +Lint the tests of a subworkflow in `nf-core/modules` + +It verifies that the test directory exists +and contains a `main.nf.test` and a `main.nf.test.snap` + +Additionally, checks that all included components in test `main.nf` are specified in `test.yml` diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_todos.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_todos.md new file mode 100644 index 0000000000..e17f92c01d --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_todos.md @@ -0,0 +1,28 @@ +# subworkflow_todos + +#### `SubworkflowLint.subworkflow_todos(subworkflow){:python}` + +Look for TODO statements in the subworkflow files + +The nf-core subworkflow template contains a number of comment lines to help developers +of new subworkflows know where they need to edit files and add content. +They typically have the following format: + +```groovy +// TODO nf-core: Make some kind of change to the workflow here +``` + +...or in markdown: + +```html +<!-- TODO nf-core: Make some kind of change to the docs here --> +``` + +This lint test runs through all files in the subworkflow and searches for these lines. +If any are found they will throw a warning. + +:::note +Note that many GUI code editors have plugins to list all instances of _TODO_ +in a given project directory. This is a very quick and convenient way to get +started on your pipeline! 
+::: diff --git a/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_version.md b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_version.md new file mode 100644 index 0000000000..9b398909f7 --- /dev/null +++ b/sites/docs/src/content/api_reference/3.1.1/subworkflow_lint_tests/subworkflow_version.md @@ -0,0 +1,9 @@ +# subworkflow_version + +#### `SubworkflowLint.subworkflow_version(subworkflow){:python}` + +Verifies that the subworkflow has a version specified in the `modules.json` file + +It checks whether the subworkflow has an entry in the `modules.json` file +containing a commit SHA. If that is true, it verifies that there are no +newer versions of the subworkflow available.
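The first half of this check, verifying that every installed component has a recorded commit SHA in `modules.json`, can be sketched as below. The nested layout (`repos` → repo URL → component type → organisation → component entry) is an assumption about the `modules.json` shape for illustration, and `missing_shas` is a hypothetical helper, not the actual nf-core/tools implementation:

```python
# Hypothetical sketch of the modules.json SHA check described above: every
# installed component must have an entry with a `git_sha`. The nested layout
# assumed here ("repos" -> repo URL -> component type -> org -> entry) is an
# illustration, not the real nf-core implementation.

def missing_shas(modules_json: dict, component_type: str = "subworkflows") -> list:
    """Return the names of components without a recorded commit SHA."""
    missing = []
    for repo in modules_json.get("repos", {}).values():
        for org in repo.get(component_type, {}).values():
            for name, entry in org.items():
                if not entry.get("git_sha"):
                    missing.append(name)
    return missing
```

The second half of the check (whether a newer version exists) would then compare each recorded SHA against the latest commit touching that component in the remote repository.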