I'm trying to fetch a lot of records from an indexed VCF, so I thought I'd use a thread pool executor:
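Here's a minimal sketch of the pattern (assuming the pysam `VariantFile` API, which the `reopen=True` keyword and the htslib error suggest; the path and coordinate list are placeholders rather than my real PGS Catalog / 1000 Genomes inputs):

```python
from concurrent.futures import ThreadPoolExecutor

import pysam

# Placeholder path: a bgzipped, tabix-indexed VCF
VCF_PATH = "genotypes.vcf.gz"

# Placeholder (chrom, start, stop) regions; in practice these are parsed
# from a PGS Catalog scoring file
regions = [("22", 16050074, 16050075), ("22", 16050114, 16050115)]

# One shared handle for the whole pool
vcf = pysam.VariantFile(VCF_PATH)

def fetch_region(region):
    chrom, start, stop = region
    # reopen=True so each iterator gets its own file handle, as the docs
    # recommend when fetching from multiple threads
    return [rec.pos for rec in vcf.fetch(chrom, start, stop, reopen=True)]

with ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(fetch_region, regions))
```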
I'm using coordinates from a file in the PGS Catalog for testing on a bgzipped + indexed copy of 1000 Genomes.

I saw in the docs that I should be careful to use `reopen=True` when I call `fetch()` because of the threaded execution, but I get a lot of errors in my terminal:

`[E::hts_open_format] Failed to open file <path>: Too many open files`

Creating a new file in the worker thread and closing it prevents this error (sketch below), but I'd expect `reopen=True` to be OK.

Am I doing something obviously wrong with my thread pool executor approach? Thanks for your time 🚀
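For completeness, the workaround that avoids the error looks roughly like this (again a sketch assuming pysam; `VCF_PATH` is the same placeholder as above):

```python
import pysam

VCF_PATH = "genotypes.vcf.gz"  # same placeholder as above

def fetch_region(region):
    chrom, start, stop = region
    # Open a fresh handle inside the worker and close it explicitly, so the
    # file descriptor is released as soon as the task finishes instead of
    # accumulating until the process hits its open-file limit.
    vcf = pysam.VariantFile(VCF_PATH)
    try:
        return [rec.pos for rec in vcf.fetch(chrom, start, stop)]
    finally:
        vcf.close()
```

My guess is that each `reopen=True` fetch keeps its duplicated handle alive until the iterator is garbage-collected, so many concurrent fetches pile up descriptors, but I may be misreading how the file handles are managed.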