Getting "Can not find message descriptor by type_url" error when calling client.logging_api.write_entries() #945
After a few tests I have managed to create a minimal code sample that demonstrates this behavior. Run the following code to see the error:

```python
import sys

from google.cloud import logging_v2

TEST_ENTRY = {
    "logName": "placeholder",
    "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "authenticationInfo": {
            "principalEmail": "service-org-12345@gcp-sa-scc-notification.iam.gserviceaccount.com"
        },
        "authorizationInfo": [
            {
                "granted": True,
                "permission": "bigquery.tables.update",
                "resource": "projects/someproject/datasets/sccFindings/tables/findings",
                "resourceAttributes": {},
            }
        ],
        "serviceData": {
            "@type": "type.googleapis.com/google.cloud.bigquery.logging.v1.AuditData",
            "tableUpdateRequest": {
                "resource": {
                    "info": {},
                    "schemaJson": "{}",
                    "tableName": {
                        "datasetId": "sccFindings",
                        "projectId": "someproject",
                        "tableId": "findings",
                    },
                    "updateTime": "2024-08-20T15:01:48.399Z",
                    "view": {},
                }
            },
        },
        "methodName": "google.cloud.bigquery.v2.TableService.PatchTable",
        "requestMetadata": {
            "callerIp": "private",
            "destinationAttributes": {},
            "requestAttributes": {},
        },
        "resourceName": "projects/someproject/datasets/sccFindings/tables/findings",
        "serviceName": "bigquery.googleapis.com",
        "status": {},
    },
    "resource": {
        "labels": {
            "dataset_id": "sccFindings",
            "project_id": "someproject",
        },
        "type": "bigquery_dataset",
    },
    "severity": "NOTICE",
}


def main():
    client = logging_v2.Client()
    TEST_ENTRY["logName"] = f"projects/{client.project}/logs/test_writing_logs"
    logs = [TEST_ENTRY]
    client.logging_api.write_entries(logs)


# Start script
if __name__ == "__main__":
    try:
        main()
    except Exception as err:
        print(f"Task failed: {err}")
        sys.exit(1)
```
If the protobuf payload uses … Looks like the assumption mentioned in the code comment does not hold for the …
Changing the deprecated field …
See b/374328640 for additional information.
@minherz I've been looking into the failure message, and it looks like it's being raised in …
From what our code does, it looks like you can add the following in your code to resolve the issue: …
The only issue is what the correct value of …
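For intuition only: when the protobuf runtime deserializes an `Any` payload, it resolves the `@type` URL against a registry of known message descriptors and fails when the lookup misses. The failure mode can be mimicked with a plain dictionary (stdlib-only sketch; the names `TYPE_REGISTRY` and `find_descriptor` are illustrative, not the real protobuf API):

```python
# Illustrative sketch: a type_url lookup against a descriptor registry
# fails when the message type was never registered. TYPE_REGISTRY and
# find_descriptor are hypothetical names, not protobuf's API.
TYPE_REGISTRY = {
    "type.googleapis.com/google.protobuf.Struct": "<Struct descriptor>",
}


def find_descriptor(type_url: str) -> str:
    try:
        return TYPE_REGISTRY[type_url]
    except KeyError:
        # Error text modeled on the message reported in this issue.
        raise TypeError(f"Can not find message descriptor by type_url: {type_url}")


# A registered type resolves; an unregistered one raises.
find_descriptor("type.googleapis.com/google.protobuf.Struct")
try:
    find_descriptor("type.googleapis.com/google.cloud.bigquery.logging.v1.AuditData")
except TypeError as err:
    print(err)
```

Registering the missing type in the registry (whatever the correct registration call turns out to be) would make the lookup succeed, which is why the resolution suggested above works.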
Thank you, @gkevinzheng. The matter is that this particular type is part of the official Google Cloud collection of types. I am unsure whether the proposed solution has to be implemented in this client library or whether it addresses the bundle that the library is already using. For what it's worth, I think that, given that the protoPayload type of payload is used only for logs generated by Google Cloud services, the customers of this library should not be expected to register, just in case, each protobuf type from Google Cloud that they plan to use with this library.
@minherz Sorry for the late reply. I've looked at the issue, and I think it would be more appropriate to file an issue in the Protobuf repository. Although this issue popped up for the customer while they interacted with the logging library, the root cause seems to be within …
@gkevinzheng I respectfully disagree. If you carefully read the description of the issue and review the minimal code that I provided, you will see that it happens due to the protobuf types created specifically for the log entry structure. If you would rather readdress this issue in the protobuf repository, please create a generic code sample that uses the protobuf package and results in the described behavior, so it can be reported in that issue. There is a workaround that fixes the problem with the "retired" field in the log entry. For some reason the PR with the workaround has not been reviewed yet:

```python
if "protoPayload" in log:
    payload = log.get("protoPayload")
    if "serviceData" in payload:
        # the following line changes the place of metadata in the dictionary
        payload["metadata"] = payload.pop("serviceData")
```

Of course the final decision is up to you as the maintainer of this library. This is definitely a corner case in which a user tries to write a log entry with a specific protobuf payload. Please consider that, left unattended, this behavior is hard to debug and will end up crashing the user's application.
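The workaround above can be wrapped in a small self-contained helper and exercised against a plain dict, with no Google Cloud dependency. This is only a sketch of the snippet from the thread; the function name `move_service_data_to_metadata` is mine, not from the unreviewed PR:

```python
def move_service_data_to_metadata(log: dict) -> dict:
    """Workaround from the thread: relocate the deprecated serviceData
    field to metadata inside protoPayload, mutating the dict in place."""
    if "protoPayload" in log:
        payload = log.get("protoPayload")
        if "serviceData" in payload:
            # the following line changes the place of metadata in the dictionary
            payload["metadata"] = payload.pop("serviceData")
    return log


# Exercise against a minimal entry shaped like the repro above.
entry = {
    "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "serviceData": {
            "@type": "type.googleapis.com/google.cloud.bigquery.logging.v1.AuditData",
        },
    }
}
move_service_data_to_metadata(entry)
print("serviceData" in entry["protoPayload"])  # False
print("metadata" in entry["protoPayload"])     # True
```

A dict-only test like this is also one answer to the question of replicating the problem in a unit test without a real client.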
@minherz The code you provided fixes the issue, right? Is there a proto object I could put in a unit test that replicates this without having to use … I would like to add this workaround into …
Environment details

- CloudRun v2 running a job as described in the Import logs from storage to logging solution architecture.
- google-cloud-logging version: 3.5.*

Steps to reproduce

Code of the solution can be found at python-docs-samples/logging/import-logs.
See below for the minimal code sample that reproduces the problem.

Code example

Stack trace