Performance issue in v2.1.0? #81
Hi. Thank you for the report.

```shell
git clone https://github.com/simon-weber/python-libfaketime.git
cd python-libfaketime
git clone https://github.com/simon-weber/libfaketime.git libfaketime/vendor/libfaketime
make -C libfaketime/vendor/libfaketime
```

Then install this version in your project environment and run your test suite:

```shell
cd your/project/path
pip install -U --force-reinstall the/path/to/python-libfaketime
pytest
```
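To double-check that the locally built copy is the one actually being imported, a quick sanity check (this is just a suggested verification, not part of the steps above):

```shell
python -c "import libfaketime; print(libfaketime.__file__)"
```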
@azmeuk thanks, I did what you asked, and I no longer observe the performance regression.
I've run into the same issue. I could work around it by setting FAKETIME_FORCE_MONOTONIC_FIX=0.
@vlaci what is your OS? Does setting FAKETIME_FORCE_MONOTONIC_FIX=0 also fix things for you? If so, I don't know if we would want to enable that workaround by default, as it is discouraged by the libfaketime documentation.
@azmeuk I can confirm: I just tested v2.1.0 with FAKETIME_FORCE_MONOTONIC_FIX=0, and that brings performance back to a normal level. Without that exported variable, performance becomes a disaster.
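For anyone who wants to try the same workaround, this is the shape of it (a minimal sketch; FAKETIME_FORCE_MONOTONIC_FIX is libfaketime's own variable, the pytest invocation is just an example):

```shell
export FAKETIME_FORCE_MONOTONIC_FIX=0
pytest
```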
@simon-weber do you have any opinion about disabling the monotonic fix by default?
I am on NixOS (Linux) with glibc 2.39 |
It looks like this may be a regression in libfaketime 0.9.10. Maybe we should downgrade our vendored libfaketime instead? From a quick look over the FORCE_MONOTONIC_FIX discussions, I'm not sure about the safety of disabling it.
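For reference, a local downgrade test could look something like this (a sketch against the vendored checkout from the instructions above; the vX.Y.Z tag names are an assumption about libfaketime's tagging scheme):

```shell
git -C libfaketime/vendor/libfaketime fetch --tags
git -C libfaketime/vendor/libfaketime checkout v0.9.8   # or v0.9.9, to bisect the regression
make -C libfaketime/vendor/libfaketime
```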
@jgb What error message do you see?
Sorry, I must have done something wrong initially; using your command, it installed just fine.
Thank you both for your feedback.
The current … However, benchmark.py with python-libfaketime 2.0.0 shows better results than 2.1.0 (something like 40%), so the good news is that this is reproducible. It seems the fault lies either in the libfaketime 0.9.7 to 0.9.8+ upgrade, or more probably in my recent changes. Just to be sure, can you check whether libfaketime 0.9.8 improves performance in comparison to 0.9.9 with …
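To make the comparison concrete, a micro-benchmark along these lines should show the gap (a minimal sketch, not the repository's actual benchmark.py; fake_time and reexec_if_needed are python-libfaketime's documented entry points):

```python
import timeit
from datetime import datetime

from libfaketime import fake_time, reexec_if_needed

# On Linux this re-execs the process with LD_PRELOAD pointing at the
# vendored libfaketime, so it must run before any faked calls.
reexec_if_needed()

def tick():
    with fake_time("2020-01-01 00:00:00"):
        datetime.now()

# Run this under python-libfaketime 2.0.0 and 2.1.0 and compare the numbers.
print(timeit.timeit(tick, number=1000))
```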
Do the tests you are referring to also fail with python-libfaketime 2.1.0 (i.e. with libfaketime 0.9.10), or just with #82 (i.e. with libfaketime 0.9.9)?
I assume you wanted me to check #83. It was more of a hassle to install, as I needed to explicitly set …
I didn't actually manage to test this: v2.1.0 brings my whole machine to a halt, so I killed the pytest processes after a few hours.
@jgb do you also see nominal performance with #83? @simon-weber what do you think of all of this? Should we downgrade?
Hm. Maybe we can get away with fixing this via FAKETIME_FORCE_MONOTONIC_FIX? I see comments in the libfaketime thread about it breaking things for Java, but maybe we don't need to worry about that since we're only running against Python.
Worth a try!
Any update on this? Can I do something to help?
Want to open a PR and test it out? I think we just need to add that env var for both platforms here. |
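A rough idea of the change, for discussion (the function name here is hypothetical, not python-libfaketime's actual internals):

```python
import os

# Hypothetical sketch: fold the workaround into whatever environment
# python-libfaketime exports alongside LD_PRELOAD / DYLD_INSERT_LIBRARIES.
def _faketime_env(base_env=None):
    env = dict(base_env if base_env is not None else os.environ)
    # Disabling libfaketime's monotonic fix restored performance in the
    # reports above; only apply it if the caller hasn't chosen a value.
    env.setdefault("FAKETIME_FORCE_MONOTONIC_FIX", "0")
    return env
```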
Hello,
when upgrading from v2.0.0 to v2.1.0, our pytest suite, which usually takes around 10 minutes, runs for many hours without finishing.
I've narrowed it down: upgrading to v2.1.0 of python-libfaketime seems to be the cause.
Any idea what might be going on here?
Greetings,
jgb