
performance issue in v2.1.0 ? #81

Open
jgb opened this issue May 24, 2024 · 21 comments · May be fixed by #82 or #83

Comments


jgb commented May 24, 2024

Hello,

When upgrading from v2.0.0 to v2.1.0, our pytest suite, which usually takes around 10 minutes, runs for many hours without finishing.
I've narrowed it down: upgrading to v2.1.0 of libfaketime seems to be the cause.

Any idea what might be going on here?

Greetings,

jgb

Contributor

azmeuk commented May 24, 2024

Hi. Thank you for the report.
For additional context, on which system do you run your test suite? Do you observe the same behavior on different systems?
We updated the underlying libfaketime version and also changed some of our own code. To determine which change is causing the issue, could you try running your test suite with the current python-libfaketime codebase but the previous libfaketime version?

git clone https://github.com/simon-weber/python-libfaketime.git
cd python-libfaketime
git clone https://github.com/simon-weber/libfaketime.git libfaketime/vendor/libfaketime
make -C libfaketime/vendor/libfaketime

Then install this version in your project environment and run your testsuite:

cd your/project/path
pip install the/path/to/python-libfaketime -U --force
pytest

Author

jgb commented May 24, 2024

@azmeuk thanks, I did what you asked, and with that setup I don't observe the performance regression.
This is on Ubuntu 24.04 LTS with python 3.12.


vlaci commented Jun 7, 2024

I've run into the same issue. I could work around it by setting FAKETIME_FORCE_MONOTONIC_FIX to 0.
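For a pytest suite, one way to apply this workaround is to export the variable before libfaketime is loaded, e.g. at the very top of a conftest.py. This is a sketch of the idea, assuming the environment variable is read by the preloaded library at startup:

```python
# conftest.py (sketch): disable libfaketime's monotonic fix.
# Set the variable as early as possible, before anything imports
# or re-execs under libfaketime, so the preloaded library sees it.
import os

os.environ["FAKETIME_FORCE_MONOTONIC_FIX"] = "0"
```

Equivalently, it can be set in the shell before running the suite: `FAKETIME_FORCE_MONOTONIC_FIX=0 pytest`.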

Contributor

azmeuk commented Jun 9, 2024

@vlaci what is your OS?
@jgb do you confirm FAKETIME_FORCE_MONOTONIC_FIX makes a difference for you?

If so, I don't know if we would want to enable it by default, as it is discouraged by the libfaketime documentation:

  Please try to avoid compiling with FORCE_MONOTONIC_FIX on platforms that
  do not need it. While it won't make a difference in most cases, depending
  on the specific FAKETIME settings in use, it would cause certain
  intercepted functions such as pthread_cond_timedwait() return with a
  time-out too early or too late, which could break some applications.
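The pattern the warning describes is any timed wait on a condition variable. As an illustrative Python analogue (the libfaketime docs mention the C-level pthread_cond_timedwait, which backs such waits), this is the kind of code whose timeout behavior could be skewed by FORCE_MONOTONIC_FIX:

```python
# Illustrative example: a timed condition-variable wait. Under a
# miscompiled or force-enabled monotonic fix, intercepted waits like
# this may time out earlier or later than the requested 0.1 seconds.
import threading

cond = threading.Condition()
with cond:
    # Wait up to 0.1s for a notification that never comes;
    # wait() returns False when the timeout elapses.
    notified = cond.wait(timeout=0.1)
```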

Author

jgb commented Jun 10, 2024

@azmeuk I can confirm, just tested v2.1.0 with FAKETIME_FORCE_MONOTONIC_FIX=0, that brings performance back to a normal level. Without that exported variable, performance becomes a disaster.

Contributor

azmeuk commented Jun 10, 2024

@simon-weber do you have any opinion about disabling FAKETIME_FORCE_MONOTONIC_FIX by default?


vlaci commented Jun 10, 2024

  @vlaci what is your OS? @jgb do you confirm FAKETIME_FORCE_MONOTONIC_FIX makes a difference for you?

  If so, I don't know if we would want to enable it by default, as it is discouraged by the libfaketime documentation:

  Please try to avoid compiling with FORCE_MONOTONIC_FIX on platforms that
  do not need it. While it won't make a difference in most cases, depending
  on the specific FAKETIME settings in use, it would cause certain
  intercepted functions such as pthread_cond_timedwait() return with a
  time-out too early or too late, which could break some applications.

I am on NixOS (Linux) with glibc 2.39

@simon-weber
Owner

It looks like this may be a regression in libfaketime 0.9.10. Maybe we downgrade our vendored libfaketime instead? From a quick look over the FORCE_MONOTONIC_FIX discussions I'm not sure about the safety of disabling it.

azmeuk linked a pull request Jul 12, 2024 that will close this issue
Contributor

azmeuk commented Jul 12, 2024

@jgb @vlaci can you check if #82 solves the issue for your use cases?

Author

jgb commented Jul 17, 2024

  @jgb @vlaci can you check if #82 solves the issue for your use cases?

Hello, I tried, but it fails to build/install...

Contributor

azmeuk commented Aug 5, 2024

@jgb What error message do you see?
Did you install with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head?

Author

jgb commented Aug 6, 2024

  @jgb What error message do you see? Did you install with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head?

Sorry, I must have done something wrong initially; using your command it installed just fine.
I tried it just now: the result isn't as bad as it was, but it still takes my test suite from 5 minutes to 12+ minutes.
It also makes about 20 of my tests fail that don't fail with v2.0.0; those failing tests are all related to selenium + chrome...


vlaci commented Aug 6, 2024

  @jgb @vlaci can you check if #82 solves the issue for your use cases?

I can confirm that the PR works. It does look a bit slower; in my case, it's about 10%.

azmeuk linked a pull request Aug 6, 2024 that will close this issue
Contributor

azmeuk commented Aug 6, 2024

Thank you both for your feedback.

  Tried it out just now, even though the result isn't as bad as it was, it still makes my test suite go from taking 5 minutes before, and now 12+ minutes.

  I can confirm that the PR works. It indeed looks a bit slower. In my case, it's about 10%

The current benchmark.py script does not show different behavior for python-libfaketime 2.1.0 with libfaketime 0.9.8, 0.9.9, or 0.9.10 from the master branch. I could not test 0.9.7 because it would not compile on my machine, which is unfortunate, since it is the closest version to simon's libfaketime fork that python-libfaketime previously used.

However, benchmark.py with python-libfaketime 2.0.0 shows better results than 2.1.0 (something like 40%), so the good news is that the issue is reproducible. The fault likely lies either in the libfaketime 0.9.7 to 0.9.8+ upgrade or, more probably, in my recent changes.
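This is not the repository's actual benchmark.py, but a minimal sketch of the kind of measurement involved: timing repeated clock lookups, which is exactly where a libfaketime interception slowdown would surface.

```python
# Minimal timing sketch (illustrative, not the project's benchmark.py):
# measure how long repeated datetime.now() calls take. Each call goes
# through libfaketime's intercepted clock functions when preloaded.
import time
from datetime import datetime

def measure(n=100_000):
    """Return the wall-clock seconds taken by n datetime.now() calls."""
    start = time.perf_counter()
    for _ in range(n):
        datetime.now()
    return time.perf_counter() - start

print(f"{measure():.3f}s for 100k datetime.now() calls")
```

Running this once under v2.0.0 and once under v2.1.0 in the same environment would make the regression directly comparable.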

Just to be sure, can you check whether libfaketime 0.9.8 improves performance compared to 0.9.9 with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head 🙏 ?

  Also it makes about 20 of my tests fail, which don't fail using v2.0.0 and those failing tests are all related to selenium + chrome...

Do the tests you are referring to also fail with python-libfaketime 2.1.0 (i.e. with libfaketime 0.9.10) or just with #82 (i.e. with libfaketime 0.9.9)?


vlaci commented Aug 7, 2024

  Just to be sure, can you check if libfaketime 0.9.8 improve perfs in comparison to 0.9.9 with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head 🙏 ?

I assume you wanted me to check #83.

It was more of a hassle to install, as I needed to explicitly set CFLAGS=-Wno-error=unused-variable to make it build. The performance seems to be back to normal though.

Author

jgb commented Aug 7, 2024

  Do the tests you are referring to also fail with python-libfaketime 2.1.0 (i.e. with libfaketime 0.9.10) or just with #82 (i.e. with libfaketime 0.9.9)?

I didn't actually manage to test this: v2.1.0 brings my whole machine to a halt, I killed the pytest processes after a few hours.

Contributor

azmeuk commented Aug 16, 2024

  It was more of a hassle to install, as I needed to explicitly set CFLAGS=-Wno-error=unused-variable to make it build. The performance seems to be back to normal though.

@jgb do you also see nominal performance with #83?

@simon-weber what do you think of all of this? Should we downgrade?

Author

jgb commented Aug 16, 2024

  CFLAGS=-Wno-error=unused-variable

Hi @azmeuk, I just tested #83 and I can confirm it's slower by a few orders of magnitude compared to v2.0.0.
Not as slow as v2.1.0, but still so slow that it becomes unworkable.

@simon-weber
Owner

Hm. Maybe we can get away with fixing this via FAKETIME_FORCE_MONOTONIC_FIX? I see comments in the libfaketime thread about it breaking things for Java, but maybe we don't need to worry about that since we're only running against Python.

Author

jgb commented Nov 20, 2024

  Hm. Maybe we can get away with fixing this via FAKETIME_FORCE_MONOTONIC_FIX? I see comments in the libfaketime thread about it breaking things for Java, but maybe we don't need to worry about that since we're only running against Python.

Worth a try! Any update on this? Can I do something to help?

@simon-weber
Owner

Want to open a PR and test it out? I think we just need to add that env var for both platforms here.
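The proposed change could look roughly like this. It is a sketch only: the function name, structure, and library paths below are illustrative placeholders, not python-libfaketime's real internals. The idea is that wherever the per-platform preload variables (LD_PRELOAD on Linux, DYLD_INSERT_LIBRARIES on macOS) are assembled, the FAKETIME_FORCE_MONOTONIC_FIX override is added alongside them:

```python
# Hypothetical sketch of the proposed fix (names and paths are
# placeholders, not the library's actual code): export the monotonic
# fix override together with the platform's preload variables.
import sys

def build_faketime_env():
    """Return the env vars needed to preload libfaketime (sketch)."""
    if sys.platform == "darwin":
        env = {
            "DYLD_INSERT_LIBRARIES": "/path/to/libfaketime.dylib",
            "DYLD_FORCE_FLAT_NAMESPACE": "1",
        }
    else:
        env = {"LD_PRELOAD": "/path/to/libfaketime.so"}
    # Disable the monotonic fix by default to avoid the slowdown
    # discussed in this issue.
    env["FAKETIME_FORCE_MONOTONIC_FIX"] = "0"
    return env
```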
