Hello,
I have an application where I run NGSCheckMate iteratively on the same dataset as new data is added. It is easiest for me to use the fastq workflow, but it takes a considerable amount of time as the amount of data grows. Is there a way to run the program using the .ncm files generated in previous runs?
Thanks!
Hello,
Thank you for using NGSCheckMate.
You can use the "vaf_ncm.py" script to run NGSCheckMate on existing "*.ncm" files.
Usage: python vaf_ncm.py -f -I <INPUT_DIR> -O <OUTPUT_DIR> [-N PREFIX]
Hi @sejooning,
I tried using "vaf_ncm.py" on previously generated ".ncm" files, and it ran without any error messages. However, some of the samples that had high correlations when run with "ncm_fastq.py" now show zero correlation. These samples should be highly correlated, so I wonder whether something is going wrong with the "vaf_ncm.py" correlation output. It also did not generate the "output_matched.txt" file.
Thanks for any advice you can provide.