[tune](deps): Bump flaml from 0.5.2 to 0.9.0 in /python/requirements/tune #13
Conversation
Bumps [flaml](https://github.com/microsoft/FLAML) from 0.5.2 to 0.9.0.
- [Release notes](https://github.com/microsoft/FLAML/releases)
- [Commits](https://github.com/microsoft/FLAML/commits/v0.9.0)

---
updated-dependencies:
- dependency-name: flaml
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
Dependabot tried to update this pull request, but something went wrong. We're looking into it, but in the meantime you can retry the update by commenting @dependabot rebase.
We encountered a SIGSEGV when running the Python test `python/ray/tests/test_failure_2.py::test_list_named_actors_timeout`. The stack is:

```
#0  0x00007fffed30f393 in std::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(std::string const&) () from /lib64/libstdc++.so.6
#1  0x00007fffee707649 in ray::RayLog::GetLoggerName() () from /home/admin/dev/Arc/merge/ray/python/ray/_raylet.so
#2  0x00007fffee70aa90 in ray::SpdLogMessage::Flush() () from /home/admin/dev/Arc/merge/ray/python/ray/_raylet.so
#3  0x00007fffee70af28 in ray::RayLog::~RayLog() () from /home/admin/dev/Arc/merge/ray/python/ray/_raylet.so
#4  0x00007fffee2b570d in ray::asio::testing::(anonymous namespace)::DelayManager::Init() [clone .constprop.0] () from /home/admin/dev/Arc/merge/ray/python/ray/_raylet.so
#5  0x00007fffedd0d95a in _GLOBAL__sub_I_asio_chaos.cc () from /home/admin/dev/Arc/merge/ray/python/ray/_raylet.so
#6  0x00007ffff7fe282a in call_init.part () from /lib64/ld-linux-x86-64.so.2
#7  0x00007ffff7fe2931 in _dl_init () from /lib64/ld-linux-x86-64.so.2
#8  0x00007ffff7fe674c in dl_open_worker () from /lib64/ld-linux-x86-64.so.2
#9  0x00007ffff7b82e79 in _dl_catch_exception () from /lib64/libc.so.6
#10 0x00007ffff7fe5ffe in _dl_open () from /lib64/ld-linux-x86-64.so.2
#11 0x00007ffff7d5f39c in dlopen_doit () from /lib64/libdl.so.2
#12 0x00007ffff7b82e79 in _dl_catch_exception () from /lib64/libc.so.6
#13 0x00007ffff7b82f13 in _dl_catch_error () from /lib64/libc.so.6
#14 0x00007ffff7d5fb09 in _dlerror_run () from /lib64/libdl.so.2
#15 0x00007ffff7d5f42a in dlopen@@GLIBC_2.2.5 () from /lib64/libdl.so.2
#16 0x00007fffef04d330 in py_dl_open (self=<optimized out>, args=<optimized out>) at /tmp/python-build.20220507135524.257789/Python-3.7.11/Modules/_ctypes/callproc.c:1369
```

The root cause: when `_raylet.so` is loaded, `static DelayManager _delay_manager` is initialized and `RAY_LOG(ERROR) << "RAY_testing_asio_delay_us is set to " << delay_env;` is executed. However, the static variables declared in `logging.cc` are not initialized yet (in this case, `std::string RayLog::logger_name_ = "ray_log_sink"`). It's better not to rely on the initialization order of static variables in different compilation units, because that order is not guaranteed. I propose changing all `RAY_LOG`s in `DelayManager::Init()` to `std::cerr`; see the sketch below.

The crash happens in Ant's internal codebase; I'm not sure why this test case passes in the community version.

BTW, I've tried different approaches:

1. Using a static local variable in `get_delay_us` and removing the global variable. This doesn't work because `init()` needs to access the variable as well.
2. Defining the global variable as type `std::unique_ptr<DelayManager>` and initializing it in `get_delay_us`. This works, but it requires a lock to be thread-safe.
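To make the failure mode and the proposed fix concrete, here is a minimal, self-contained C++ sketch. It is not the actual Ray source; the class and member names (`DelayManager`, `Init`, `_delay_manager`, the `RAY_testing_asio_delay_us` environment variable) follow the comment above, and everything else is illustrative.

```cpp
#include <cstdlib>
#include <iostream>

// In the real code, RAY_LOG(ERROR) eventually reads a static defined in
// logging.cc (e.g. std::string RayLog::logger_name_ = "ray_log_sink").
// Whether that static is constructed before the object below is unspecified
// across translation units, which is what triggered the SIGSEGV.

namespace {

class DelayManager {
 public:
  DelayManager() { Init(); }

  void Init() {
    const char *delay_env = std::getenv("RAY_testing_asio_delay_us");
    if (delay_env == nullptr) {
      return;
    }
    // Proposed fix: write directly to std::cerr instead of RAY_LOG(ERROR),
    // so this constructor does not depend on statics from another
    // compilation unit having been initialized first.
    std::cerr << "RAY_testing_asio_delay_us is set to " << delay_env << std::endl;
  }
};

// Constructed while the shared library is loaded (during dlopen), i.e.
// before main() and possibly before the statics in logging.cc.
DelayManager _delay_manager;

}  // namespace

int main() { return 0; }
```

Approach 2 from the list above (a lazily created `std::unique_ptr<DelayManager>` in `get_delay_us`) would also sidestep the ordering problem, but, as noted, it needs a lock if `get_delay_us` can be called from multiple threads before the manager exists.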
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)