Not a valid path value type: java.util.LinkedHashMap ([id:spikein_fasta]) #187
Comments
I had a very similar issue that occurred when I let the workflow index a given FASTA file, which seems to be the default for other workflows when giving a …
I have tested it as @cjfields mentioned, using an existing bowtie2 index instead of having the pipeline generate one itself. It runs well now and has passed the alignment step.
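For reference, this workaround amounts to pointing the pipeline at pre-built indices instead of a FASTA. A sketch of what the invocation might look like; the index parameters shown (`--bowtie2`, `--spikein_bowtie2`) are assumptions based on the cutandrun parameter docs, so verify them against your pipeline version:

```bash
# Sketch only: the --bowtie2 / --spikein_bowtie2 parameter names are assumptions,
# check the nf-core/cutandrun docs for your version before using.
nextflow run nf-core/cutandrun -r 3.1 \
    -profile singularity \
    --input samplesheet.csv \
    --outdir results \
    --bowtie2 /path/to/target_bowtie2_index/ \
    --spikein_bowtie2 /path/to/spikein_bowtie2_index/
```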
Hi @Gin-Wang @cjfields, I have managed to reproduce and fix the error. It was a channel format issue; it's fixed in the commit below and will be in the next release. luslab@cdfc356
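For readers hitting the same error in their own pipelines, this class of channel format bug usually looks like the following: a channel emits `[meta, file]` tuples, but the process input declares a bare `path`, so Nextflow receives the meta map (a `LinkedHashMap`) where it expects a file. A minimal sketch, not the actual cutandrun code (process and channel names are hypothetical):

```nextflow
// Minimal sketch, NOT the actual cutandrun code: reproduces the class of
// channel format bug described above, assuming DSL2.
nextflow.enable.dsl = 2

process ALIGN {
    input:
    path fasta   // expects a bare file; a [meta, file] tuple does not fit

    output:
    stdout

    script:
    "echo aligning against ${fasta}"
}

workflow {
    // The channel emits [meta, file] tuples, e.g. [[id:'spikein_fasta'], spikein.fa]
    // ('spikein.fa' is a placeholder path)
    ch_fasta = Channel.of([[id: 'spikein_fasta'], file('spikein.fa')])

    // BUG: the whole tuple reaches the `path` input, so Nextflow tries to
    // stage the meta map as a file and throws:
    //   Not a valid path value type: java.util.LinkedHashMap ([id:spikein_fasta])
    // ALIGN(ch_fasta)

    // FIX: reshape the channel so only the file reaches the process
    ALIGN(ch_fasta.map { meta, fasta -> fasta })
}
```

The alternative fix, when the process actually needs the metadata, is to declare the input as `tuple val(meta), path(fasta)` so the tuple shape matches.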
Awesome, thanks @chris-cheshire!
Hi @chris-cheshire, I'm getting the same error trying to run the scnanoseq pipeline.
Description of the bug
I ran into errors using the cutandrun pipeline. With version 3.1 it runs fine through TRIMGALORE and FASTQC, then 'NFCORE_CUTANDRUN:CUTANDRUN:ALIGN_BOWTIE2:BOWTIE2_TARGET_ALIGN (2)' fails with: Not a valid path value type: java.util.LinkedHashMap ([id:spikein_fasta]).
The error also occurred in version 3.0, where the pipeline could not run at all. All input files are reachable and readable by anyone. Strangely, NFCORE_CUTANDRUN:CUTANDRUN:ALIGN_BOWTIE2:BOWTIE2_SPIKEIN_ALIGN completes successfully, yet the pipeline still reports 'Not a valid path value type' for the spike-in FASTA.
I have posted this in the Slack channels, and several people have hit the same error. I wonder whether this is a bug or just a mistake in my command line.
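For anyone debugging a similar failure: the error means the process received the meta map itself ([id:spikein_fasta] is the Groovy rendering of a LinkedHashMap) where a file path was expected, and it only surfaces when the affected process is invoked, which is why the spike-in alignment can complete before the target alignment fails. A quick way to inspect what a suspect channel actually carries, with `ch_spikein_fasta` as a hypothetical channel name:

```nextflow
// Debugging sketch: print each item a channel emits, with its runtime type,
// before it reaches a process. `ch_spikein_fasta` is a hypothetical name.
ch_spikein_fasta.view { item -> "spikein item: ${item} (${item.getClass().name})" }
```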
Here are the parameters I'm trying:
Command used and terminal output
Relevant files
pipeline_report.txt
nextflow.log
System information
No response