Standardise loggers in python lambda functions #645
Conversation
LGTM
logger = logging.getLogger(__name__)
logging.basicConfig(
    level=logging.INFO,
    force=True,
This will remove any preconfigured handlers...
Is that what we want?
The accepted answer claims that the default AWS handler "might also add some metadata to the record if available", which sounds like it could be useful?
Happy to accept the change. Longer term we should probably switch to the lambda power tools...
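For illustration, a rough sketch of the force=True behaviour (the handler name in the comments is approximate, not taken from this repo):

import logging

# In the Lambda runtime the root logger already has a handler installed by
# AWS, which is what can attach extra metadata such as the request ID.
root = logging.getLogger()
print(root.handlers)  # e.g. [<LambdaLoggerHandler ...>] on Lambda, [] locally

# force=True removes and closes any existing root handlers before adding the
# basicConfig StreamHandler, so that preconfigured AWS handler is dropped.
logging.basicConfig(level=logging.INFO, force=True)
print(root.handlers)  # now only the StreamHandler that basicConfig added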
Oh those docs are actually very handy!
Hmm, yes, you might be right: going by the docs excerpt below, the logger may in fact set the requestId as well.
Using structured JSON logs with Python
If you select JSON for your function’s log format, Lambda will send logs output by the Python standard logging library to CloudWatch as structured JSON. Each JSON log object contains at least four key value pairs with the following keys:
"timestamp" - the time the log message was generated
"level" - the log level assigned to the message
"message" - the contents of the log message
"requestId" - the unique request ID for the function invocation
We could rearrange from:
import logging
logging.basicConfig(
    level=logging.INFO,
    force=True,
    format='%(asctime)s %(message)s'
)
logger = logging.getLogger()
To just:
import logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
And then we still get the same log levels, just lose the formatting a bit
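For example, a minimal handler module using that simplified setup might look like this (module and handler names are just illustrative):

import logging

# Rely on the handler the Lambda runtime preconfigures; only set the level.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    logger.info("received event")
    return {"statusCode": 200}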
If that works locally...
I think the issue was that in the local env there may not be a log handler at all, hence the fallback to basicConfig if no handler was found.
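Roughly, that fallback pattern would be something like this (a sketch, not what the PR currently does):

import logging

root = logging.getLogger()
if root.hasHandlers():
    # On Lambda: reuse the preconfigured handler and just set the level.
    root.setLevel(logging.INFO)
else:
    # Locally: no handler is configured yet, so fall back to basicConfig.
    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s %(message)s',
    )

logger = logging.getLogger(__name__)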
I think the issue was that in the local env there may not be a log handler at all,
Yes, the answer that looks for handlers didn't seem very fun.
Something worth considering is this too - https://docs.powertools.aws.dev/lambda/python/latest/core/logger/
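For context, the Powertools Logger from that page is used roughly like this (service name and handler are illustrative):

from aws_lambda_powertools import Logger

logger = Logger(service="example-service")

@logger.inject_lambda_context
def handler(event, context):
    # Powertools emits structured JSON and injects the Lambda context,
    # including the request ID, into each log record.
    logger.info("processing event")
    return {"statusCode": 200}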
Yes, the power tools logging support. That's what I meant earlier.
(but there are also other reasons to use the power tools)
Resolve this PR soon too, @alexiswl...
On the backlog, closing PR for now
Resolves #644