
fix(logging): queue_listener handler for Python >= 3.12 #3185

Merged

Conversation

jderrien
Contributor

@jderrien jderrien commented Mar 9, 2024

Description

  • Fix the queue_listener handler for Python 3.12

Python 3.12 introduced a new way to configure QueueHandler and QueueListener via logging.config.dictConfig(). This is described in the Python logging documentation.

The listener still needs to be started and stopped, as before. To do so, I've introduced LoggingQueueListener.

And as stated in the doc:

Any custom queue handler and listener classes will need to be defined with the same initialization signatures as QueueHandler and QueueListener.
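The approach can be sketched roughly as follows. This is a hedged illustration, not the actual Litestar code: the `my_app.logging` import path is hypothetical, and the exact behavior of Litestar's LoggingQueueListener may differ. The sketch keeps the same `__init__` signature as QueueListener (as the quoted documentation requires) and starts the listener thread on construction, so a listener built by dictConfig is running immediately.

```python
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

class LoggingQueueListener(QueueListener):
    """Sketch of a QueueListener that starts itself.

    dictConfig requires the same __init__ signature as QueueListener;
    starting the thread here means the listener runs as soon as
    dictConfig builds it. It still must be stopped on shutdown.
    """

    def __init__(self, queue, *handlers, respect_handler_level=False):
        super().__init__(queue, *handlers, respect_handler_level=respect_handler_level)
        self.start()

# Python 3.12+: dictConfig understands "queue", "listener" and "handlers"
# keys on a QueueHandler entry and wires everything together.
LOGGING_CONFIG = {
    "version": 1,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
        "queue_listener": {
            "class": "logging.handlers.QueueHandler",
            "queue": {"()": "queue.Queue", "maxsize": -1},
            "listener": "my_app.logging.LoggingQueueListener",  # hypothetical import path
            "handlers": ["console"],
        },
    },
    "root": {"level": "INFO", "handlers": ["queue_listener"]},
}
```

As I understand the 3.12 behavior, passing such a config to logging.config.dictConfig() creates the queue, instantiates the listener with the queue and the resolved handlers, and attaches it as the handler's `listener` attribute, but does not start it; a self-starting subclass like the one above removes that manual step.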

Closes

Closes #2954


codecov bot commented Mar 9, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 98.22%. Comparing base (e763cea) to head (e0b4aa9).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3185      +/-   ##
==========================================
- Coverage   98.23%   98.22%   -0.01%     
==========================================
  Files         320      320              
  Lines       14446    14447       +1     
  Branches     2298     2297       -1     
==========================================
  Hits        14191    14191              
- Misses        113      114       +1     
  Partials      142      142              


@provinzkraut provinzkraut force-pushed the fix-logging-queue-listener-handler branch from 179cd06 to d3f4dd0 Compare March 10, 2024 08:12
@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch 5 times, most recently from 22c8a8d to e3dd511 Compare March 11, 2024 19:47
@jderrien jderrien marked this pull request as ready for review March 11, 2024 19:53
@jderrien jderrien requested review from a team as code owners March 11, 2024 19:53
@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch 3 times, most recently from 024c8e4 to 26599cb Compare March 12, 2024 15:20
Member

@provinzkraut provinzkraut left a comment

@JacobCoffee Can you take a look at this? Looks good from my side.

@provinzkraut
Member

@jderrien Thanks for the PR :)
Can you expand the PR description a bit, e.g. how this change fixes the referenced issue?

@jderrien
Contributor Author

@jderrien Thanks for the PR :) Can you expand the PR description a bit, e.g. how this change fixes the referenced issue?

I've updated the description; don't hesitate to comment on the PR if you need more explanations. :)

Also, please wait before merging this. It seems some test jobs hung on GitHub Actions. The suite runs fine locally since I added cleanup_logging(); I'm not sure what's going on. :/

@jderrien
Contributor Author

It seems some test jobs hung on GitHub Actions

https://github.com/litestar-org/litestar/actions/runs/8238535303
https://github.com/litestar-org/litestar/actions/runs/8250579528

The test suite on Python 3.12 takes ~10 min and gets killed.

While I didn't have this issue on macOS, I'll try to run the test suite in a Linux container tomorrow. Hopefully I'll be able to reproduce the issue.

@peterschutt
Contributor

I did some testing yesterday and found it to be flaky: sometimes it would hang and sometimes not. I re-requested the 3.12 tests in CI and they passed that time, which confirms the flakiness. Unfortunately, when it did hang I'd get different behaviors after hitting ^C: sometimes the tests would continue and pass, sometimes I'd get errors out of pytest plugins, and sometimes errors out of low-level threading code, but nothing that actually pointed to any of our code :/

@jderrien
Contributor Author

@peterschutt Thank you for checking. I cannot re-run the test suite on GitHub Actions myself; I suppose this is expected (lack of permissions), isn't it?

I tried to run the tests locally on my Mac M1 Pro with act and the flag --container-architecture linux/amd64. It hangs from time to time, but ^C doesn't show me any particular error:

| ........................................................................ [ 99%]


^C[test (3.12)/Test/test]   ❌  Failure - Main Test with coverage
[test (3.12)/Test/test] Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.43/containers/a847accdb2090b23495b7f949038c8ac2ce5f2009bd4bf66e1a26f8bef671e95/archive?path=%2Fvar%2Frun%2Fact%2Fworkflow%2Fpathcmd.txt": context canceled
[test (3.12)/Test/test] Cleaning up container for job test
[test (3.12)/Test/test] 🏁  Job succeeded
Error: context canceled

The tool might hide the error.

Then I tried to run the test suite on macOS with -n auto --cov. I guess the problem might appear when using these flags (one, the other, or both). I've been able to get a "hang" after 16 runs. I reproduced the problem twice, with the same error:

^C/Users/jderrien/projects/litestar/.venv/lib/python3.12/site-packages/_pytest/main.py:325: PluggyTeardownRaisedWarning: A plugin raised an exception during an old-style hookwrapper teardown.
Plugin: _cov, Hook: pytest_runtestloop
KeyboardInterrupt:
For more information see https://pluggy.readthedocs.io/en/stable/api_reference.html#pluggy.PluggyTeardownRaisedWarning
  config.hook.pytest_runtestloop(session=session)

litestar/logging/config.py: outdated review thread (resolved)
Contributor

@peterschutt peterschutt left a comment

Flaky test.

Just adding this so we don't accidentally merge.

@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch 3 times, most recently from 390f6c9 to 9c785a7 Compare March 15, 2024 15:41
@jderrien
Contributor Author

Flaky test.

Just adding this so we don't accidentally merge.

@guacs advised me to use GitHub Actions directly on my fork, so I did.

I worked on a specific branch; the only change is this commit.

On these 25 runs (26 but the first one failed due to a config mistake), everything looks good, at least on Python 3.12.

There were 3 failures on Python 3.11 on tests/unit/test_channels/test_plugin.py::test_create_ws_route_handlers_arbitrary_channels_allowed (which is already marked as flaky).

The problem seems to be solved.

Member

@JacobCoffee JacobCoffee left a comment

lgtm aside from weird page load

litestar/logging/standard.py: 5 outdated review threads (resolved)
@peterschutt
Contributor

Flaky test.
Just adding this so we don't accidentally merge.

@guacs advised me to use GitHub Actions directly on my fork, so I did.

I worked on a specific branch; the only change is this commit.

On these 25 runs (26 but the first one failed due to a config mistake), everything looks good, at least on Python 3.12.

There were 3 failures on Python 3.11 on tests/unit/test_channels/test_plugin.py::test_create_ws_route_handlers_arbitrary_channels_allowed (which is already marked as flaky).

The problem seems to be solved.

Fantastic! Thanks very much!

@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch from 6af2103 to 7f7b2ca Compare March 18, 2024 21:39
@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch from 7f7b2ca to 0eebbb9 Compare March 20, 2024 16:04
@provinzkraut
Member

@JacobCoffee Good to go from your end?

@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch from 0eebbb9 to a49a278 Compare March 22, 2024 10:50
@jderrien jderrien force-pushed the fix-logging-queue-listener-handler branch from a49a278 to e0b4aa9 Compare March 22, 2024 10:58

Documentation preview will be available shortly at https://litestar-org.github.io/litestar-docs-preview/3185

@JacobCoffee JacobCoffee merged commit be9af3f into litestar-org:main Mar 22, 2024
20 checks passed
@jderrien jderrien deleted the fix-logging-queue-listener-handler branch March 23, 2024 12:31
Development

Successfully merging this pull request may close these issues.

Bug: Logs and Stack traces are not visible OOTB in Python3.12
4 participants