display logs for verifications/analyses that take the form of a job #3641
I scoop up logs from all pods in a cluster and send them to a central store where they're queryable by pod name (Loki). Having the name of the job (or the child pod itself) that Argo Rollouts kicks off would be great.
In our case, we use the jobs to run Robot Framework tests.
I'm not sure how that isn't the same use case, unless perhaps what's being stored is not raw logs but something different, like a report formatted as HTML. Can you clarify?
If you use something like Allure and generate an HTML report that gets stored in S3, you could link to it (though I'm not sure how you'd handle auth for that).
Yup, in my case it's an HTML report generated by Robot Framework, but having the Robot Framework logs is also part of the requirement (the logs and the final report are two separate things). Edit: an external URL plus auth for logs (I don't know whether a single call works with GitLab Pages). Edit 2: lol, "freshly merged" in GitLab: Programmatic access to private pages
As proposed, this is strictly about logs and not reports. Displaying reports within the UI or CLI does not make sense in the same way that displaying raw logs would. Linking to external reports in a separate browser tab seems useful and doable, but it also seems like a separate feature from what's proposed here, and I'd rather keep the scope of this issue narrow. (Feel free to open a separate issue.)
There are some minor details to work out here, but in general you could expect it to work the same as everything else in Kargo: permissions are described using pure Kubernetes RBAC. The API server may have unconditional access to all the logs, but the endpoint that proxies them would perform a SubjectAccessReview to verify the user's authority to view them.
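A minimal sketch of the kind of check described above, assuming the log proxy builds a Kubernetes `SubjectAccessReview` from the requesting user's identity. The field names follow the standard `authorization.k8s.io/v1` API, but the specific resource attributes (group, resource, verb) are illustrative assumptions, not Kargo's actual mapping:

```python
# Hypothetical illustration: construct the SubjectAccessReview body a log
# proxy could submit to the Kubernetes API server before serving logs.
# The resourceAttributes values below are assumptions for illustration.
def build_access_review(username: str, groups: list[str], namespace: str) -> dict:
    return {
        "apiVersion": "authorization.k8s.io/v1",
        "kind": "SubjectAccessReview",
        "spec": {
            "user": username,
            "groups": groups,
            "resourceAttributes": {
                # Assumed: viewing verification logs maps to "get" on
                # AnalysisRun resources in the relevant project namespace.
                "group": "argoproj.io",
                "resource": "analysisruns",
                "verb": "get",
                "namespace": namespace,
            },
        },
    }

review = build_access_review("jane@example.com", ["dev-team"], "guestbook")
# The proxy would POST this to the API server and only stream the logs
# if the returned status.allowed is true.
```

The point of routing the check through a SubjectAccessReview is that the proxy never needs its own authorization rules; the cluster's existing RBAC policy is the single source of truth.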
We're not going to get into the business of forwarding and storing logs, but @jessesuen has proposed that we allow operators to specify a URL template that can produce URLs as a function of stage, promotion name, freight name, analysis run name, etc., along with auth headers.
This can be used to enable a new API server endpoint to act as a proxy for logs that are accessible via an HTTPS GET request.
The UI, and possibly the CLI, can make use of the new endpoint to expose verifications/analyses to users.
Ultimately, this would mean that if you can manage to ship your logs to object storage (for instance), Kargo can display them.
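The template-plus-proxy idea above might be sketched roughly like this, assuming a simple `${var}` placeholder syntax. The template variable names (`project`, `stage`, `analysisRun`, `container`) and the example URL are illustrative assumptions based on the fields mentioned in the proposal, not a committed Kargo API:

```python
from string import Template

# Hypothetical operator-supplied template; the variable names are assumptions
# drawn from the fields mentioned in the proposal (stage, analysis run, etc.).
LOG_URL_TEMPLATE = Template(
    "https://logs.example.com/${project}/${stage}/${analysisRun}/${container}.log"
)

def render_log_url(project: str, stage: str, analysis_run: str, container: str) -> str:
    """Expand the operator-configured template for one analysis run's logs."""
    return LOG_URL_TEMPLATE.substitute(
        project=project,
        stage=stage,
        analysisRun=analysis_run,
        container=container,
    )

url = render_log_url("guestbook", "test", "analysis-abc123", "main")
# The API server endpoint would issue an HTTPS GET to this URL, attaching
# operator-configured auth headers, and stream the body back to the UI/CLI.
```

Because Kargo only renders a URL and proxies the response, any backend that can serve logs over authenticated HTTPS GET (object storage, Loki, a static site) would work without Kargo storing anything itself.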