
Remove defaults conda channel from CI #8840

Open · wants to merge 3 commits into main
Conversation

@jrbourbeau (Member) commented Aug 23, 2024

Let's see if this works

xref dask/community#396

cc @jacobtomlinson

@jacobtomlinson (Member) left a comment

Do we need to explicitly use nodefaults (as dask/dask does) to ensure it doesn't use defaults even if it is configured in the .condarc?
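For reference, the nodefaults approach mentioned here amounts to listing `nodefaults` as a channel in the environment file, which tells conda to drop the implicit `defaults` channel even when a user's `~/.condarc` configures it. A minimal sketch (the file name and pinned dependencies below are illustrative, not taken from this PR):

```yaml
# continuous_integration/environment.yaml (illustrative)
name: dask-distributed
channels:
  - conda-forge
  - nodefaults   # special marker: ignore the implicit "defaults" channel,
                 # even if it is configured in ~/.condarc
dependencies:
  - python=3.10
  - dask
  - distributed
```

Without the `nodefaults` entry, conda may still append `defaults` from the runner's `.condarc`, so packages can silently resolve from Anaconda's channels instead of conda-forge.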

@github-actions bot (Contributor) commented Aug 23, 2024

Unit Test Results

See test report for an extended history of previous test failures. This is useful for diagnosing flaky tests.

25 files ±0 · 25 suites ±0 · 9h 45m 50s ⏱️ (-27m 29s)
4 118 tests ±0: 3 999 ✅ (-4) · 111 💤 (±0) · 8 ❌ (+4)
46 466 runs (-1 105): 44 403 ✅ (-1 056) · 2 055 💤 (-53) · 8 ❌ (+4)

For more details on these failures, see this check.

Results for commit 9046b2d. ± Comparison against base commit ea7d35c.

♻️ This comment has been updated with latest results.

@jrbourbeau (Member, Author) left a comment

> Do we need to explicitly use nodefaults (as dask/dask does) to ensure it doesn't use defaults even if it is configured in the .condarc?

Not totally sure. Windows builds started crashing due to the lack of torch / torchvision builds on conda-forge, so my guess is we're not attempting to pull anything from the defaults channel. Regardless, happy to add nodefaults if it's needed.

Our Windows Python 3.10 build is crashing with the following error:

Fatal Python error: Aborted

Thread 0x000011bc (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 324 in wait
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\queue.py", line 180 in get
  File "D:\a\distributed\distributed\distributed\threadpoolexecutor.py", line 53 in _worker
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 953 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Thread 0x000006fc (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 324 in wait
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\queue.py", line 180 in get
  File "D:\a\distributed\distributed\distributed\threadpoolexecutor.py", line 53 in _worker
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 953 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Current thread 0x000019b4 (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\scipy\linalg\_basic.py", line 358 in solve_triangular
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask\array\utils.py", line 567 in scipy_linalg_safe
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask\array\utils.py", line 571 in solve_triangular_safe
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask\array\linalg.py", line 970 in _solve_triangular_lower
  File "D:\a\distributed\distributed\distributed\worker.py", line 3005 in apply_function_simple
  File "D:\a\distributed\distributed\distributed\worker.py", line 2968 in apply_function
  File "D:\a\distributed\distributed\distributed\utils.py", line 1521 in <lambda>
  File "D:\a\distributed\distributed\distributed\_concurrent_futures_thread.py", line 65 in run
  File "D:\a\distributed\distributed\distributed\threadpoolexecutor.py", line 57 in _worker
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 953 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Thread 0x00000920 (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 324 in wait
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 607 in wait
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1376 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Thread 0x00000704 (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\concurrent\futures\thread.py", line 81 in _worker
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 953 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Thread 0x00001174 (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\concurrent\futures\thread.py", line 81 in _worker
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 953 in run
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 1016 in _bootstrap_inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\threading.py", line 973 in _bootstrap

Thread 0x000001c8 (most recent call first):
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\selectors.py", line 315 in _select
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\selectors.py", line 324 in select
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\asyncio\base_events.py", line 1871 in _run_once
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\asyncio\base_events.py", line 603 in run_forever
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\asyncio\base_events.py", line 636 in run_until_complete
  File "D:\a\distributed\distributed\distributed\compatibility.py", line 236 in asyncio_run
  File "D:\a\distributed\distributed\distributed\utils_test.py", line 377 in _run_and_close_tornado
  File "D:\a\distributed\distributed\distributed\utils_test.py", line 1087 in test_func
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\contextlib.py", line 79 in inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\contextlib.py", line 79 in inner
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\python.py", line 159 in pytest_pyfunc_call
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_callers.py", line 103 in _multicall
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_manager.py", line 120 in _hookexec
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_hooks.py", line 513 in __call__
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\python.py", line 1627 in runtest
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 174 in pytest_runtest_call
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_callers.py", line 103 in _multicall
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_manager.py", line 120 in _hookexec
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_hooks.py", line 513 in __call__
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 242 in <lambda>
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 341 in from_call
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 241 in call_and_report
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 132 in runtestprotocol
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\runner.py", line 113 in pytest_runtest_protocol
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_callers.py", line 103 in _multicall
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_manager.py", line 120 in _hookexec
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_hooks.py", line 513 in __call__
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\main.py", line 362 in pytest_runtestloop
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_callers.py", line 103 in _multicall
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_manager.py", line 120 in _hookexec
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_hooks.py", line 513 in __call__
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\main.py", line 337 in _main
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\main.py", line 283 in wrap_session
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\main.py", line 330 in pytest_cmdline_main
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_callers.py", line 103 in _multicall
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_manager.py", line 120 in _hookexec
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pluggy\_hooks.py", line 513 in __call__
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\config\__init__.py", line 175 in main
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\_pytest\config\__init__.py", line 201 in console_main
  File "C:\Users\runneradmin\miniconda3\envs\dask-distributed\Scripts\pytest-script.py", line 9 in <module>

Extension modules: yaml._yaml, cytoolz.utils, cytoolz.itertoolz, cytoolz.functoolz, cytoolz.dicttoolz, cytoolz.recipes, psutil._psutil_windows, markupsafe._speedups, tornado.speedups, msgpack._cmsgpack, lz4._version, lz4.block._block, zstandard.backend_c, numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, crick.space_saving, crick.stats, crick.tdigest, cython.cimports.libc.math, scipy._lib._ccallback_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._decomp_lu_cython, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.linalg._propack._spropack, scipy.sparse.linalg._propack._dpropack, scipy.sparse.linalg._propack._cpropack, scipy.sparse.linalg._propack._zpropack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy._lib._uarray._uarray, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, scipy.fftpack.convolve, _brotli, _cffi_backend, gssapi.raw._enum_extensions.ext_dce, gssapi.raw._enum_extensions.ext_iov_mic, gssapi.raw.oids, gssapi.raw.types, gssapi.raw.cython_converters, gssapi.raw.misc, gssapi.raw.names, gssapi.raw.creds, gssapi.raw.chan_bindings, gssapi.raw.sec_contexts, gssapi.raw.message, 
gssapi.raw.exceptions, gssapi.raw.ext_s4u, gssapi.raw.ext_cred_store, gssapi.raw.ext_rfc4178, gssapi.raw.ext_rfc5587, gssapi.raw.ext_rfc5588, gssapi.raw.ext_rfc5801, gssapi.raw.ext_cred_imp_exp, gssapi.raw.mech_krb5, gssapi.raw.ext_password, gssapi.raw.ext_dce_aead, gssapi.raw.ext_dce, gssapi.raw.ext_iov_mic, gssapi.raw.ext_krb5, gssapi.raw.ext_rfc6680, gssapi.raw.ext_rfc6680_comp_oid, gssapi.raw.ext_ggf, gssapi.raw.ext_set_cred_opt, mmapfile, win32api, win32ui, win32security, pyarrow.lib, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pyarrow._compute, pandas._libs.ops, pandas._libs.hashing, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.internals, pandas._libs.indexing, pandas._libs.index, pandas._libs.writers, pandas._libs.join, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.utils, h5py.h5t, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5r, h5py._proxy, h5py._conv, h5py.h5z, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, lz4.frame._frame, pyarrow._fs, pyarrow._hdfs, pyarrow._gcsfs, pyarrow._s3fs, pyarrow._acero, pyarrow._csv, pyarrow._json, pyarrow._dataset, pyarrow._dataset_orc, pyarrow._parquet, 
pyarrow._parquet_encryption, pyarrow._dataset_parquet_encryption, pyarrow._dataset_parquet, cftime._cftime, netCDF4._netCDF4, torch._C, torch._C._fft, torch._C._linalg, torch._C._nested, torch._C._nn, torch._C._sparse, torch._C._special, multidict._multidict, yarl._quoting_c, aiohttp._helpers, aiohttp._http_writer, aiohttp._http_parser, aiohttp._websocket, frozenlist._frozenlist (total: 195)
distributed/tests/test_client.py::test_recreate_error_array

I'm supportive of sticking with just conda-forge, but I won't have time to debug this crash for a while. @jacobtomlinson if you (or anyone else) wants to take over, please feel free.
