Query Environment Information for Workflow Jobs #119

Open
AdnaneKhan opened this issue Nov 4, 2023 · 6 comments
@AdnaneKhan

Is your feature request related to a problem? Please describe.

Many workflows that would otherwise be vulnerable to pwn requests or injection use a deployment environment with required approvals to gate a job from running. Usually this manifests as a single job at the start of the workflow that runs in the environment, with all other jobs depending on that check succeeding.

It is possible to query a repository's environments and their protection rules via the REST API without authentication. Adding this feature would make it possible to update Cypher queries to reduce false positives.

Describe the solution you'd like

I'd like to see an Environment graph object attached to each job. The environment object should track the environment name and whether the protection_rules array contains one or more entries of type required_reviewers.

Here is an example of a repository that uses deployment environments: https://api.github.com/repos/netflix/mantis/environments
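As a rough illustration (not Raven code; it just assumes the response shape returned by the endpoint above, where each environment carries a protection_rules array whose entries have a type field), the required-reviewers flag could be derived like this:

```python
import requests

def get_environment_protection(owner: str, repo: str) -> dict[str, bool]:
    """Map each environment name to whether it has a required_reviewers rule."""
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/environments",
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    resp.raise_for_status()
    flags = {}
    for env in resp.json().get("environments", []):
        rules = env.get("protection_rules", [])
        flags[env["name"]] = any(r.get("type") == "required_reviewers" for r in rules)
    return flags

# Works unauthenticated for public repositories (rate-limited), e.g.:
# get_environment_protection("netflix", "mantis")
```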

Describe alternatives you've considered

None; this is pretty clear-cut, because environment gating with required approvals currently requires manual verification to confirm a detection is not a false positive.

Additional context

I mentioned this in an earlier issue (#111); this issue covers adding the environment check.

I'm actually working on implementing this and will have a PR open soon!

@elad-pticha
Contributor

This is a good idea!
Shouldn't the environment be a job property rather than an entirely new node?
Or maybe each job could have a review_required boolean property so we can filter out all jobs that require an approval.

WDYT?
@oreenlivnicode @AdnaneKhan
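As an illustration of that filtering idea, here is a sketch (assuming a review_required boolean on Job nodes and the standard neo4j Python driver; the connection settings and schema are hypothetical, not Raven's actual ones):

```python
from neo4j import GraphDatabase

# Hypothetical connection settings; adjust to the local Raven/Neo4j setup.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "neo4j"))

# Keep only jobs that are NOT gated behind a required approval.
query = """
MATCH (j:Job)
WHERE coalesce(j.review_required, false) = false
RETURN j.name AS name
"""

with driver.session() as session:
    for record in session.run(query):
        print(record["name"])

driver.close()
```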

@oreenlivnicode
Contributor

Are there additional significant details within the protection rules, @AdnaneKhan? If there are, we should consider representing them as individual nodes; otherwise, I agree we can track them as a Boolean property associated with a Job.

Implementing this could be beneficial in reducing false positives. However, if an exploit is indeed present, we will still report it. The information about the protection rules would be used as context for the disclosure process.

@AdnaneKhan
Author

> Are there additional significant details within the protection rules, @AdnaneKhan? If there are, we should consider representing them as individual nodes; otherwise, I agree we can track them as a Boolean property associated with a Job.
>
> Implementing this could be beneficial in reducing false positives. However, if an exploit is indeed present, we will still report it. The information about the protection rules would be used as context for the disclosure process.

There are two classes of information that are relevant from an exploitability standpoint.

I think adding it as a node with the required-approvals property to start will allow Raven to better handle future checks or conditions that GitHub adds to environments.

@AdnaneKhan
Author

Also, I'm curious where in the code it would be best to add the query to the environments API endpoint.

Should Raven make the call when it creates a job from a dict and the environment field is present, or at the same time it pulls the workflow from the contents API?

@oreenlivnicode
Contributor

In Raven's current architecture, the GitHub API queries only take place in the downloader, so the call should happen when it pulls the workflow.

If you want to pass metadata about the workflow / composite action to the indexer, you will have to add it as a field and value on the object's Redis hash (DB 1 and 2), similar to the way we implemented url or is_public.
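To make that concrete, here is a rough sketch of the downloader/indexer hand-off (not Raven's actual code; the key name, field name, and DB number are illustrative, following the url / is_public pattern mentioned above) using redis-py:

```python
import json
import redis

# Illustrative: DB 1 holding workflow objects, per the "db 1 and 2" note above.
r = redis.Redis(host="localhost", port=6379, db=1)

workflow_key = "workflow:org/repo/.github/workflows/deploy.yml"  # hypothetical key

# Downloader side: store environment metadata next to existing fields
# such as url and is_public.
environments = {"deploy": {"required_reviewers": True}}
r.hset(workflow_key, "environments", json.dumps(environments))

# Indexer side: read it back when building the graph objects.
raw = r.hget(workflow_key, "environments")
env_metadata = json.loads(raw) if raw else {}
```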


@AdnaneKhan
Author

> In Raven's current architecture, the GitHub API queries only take place in the downloader, so the call should happen when it pulls the workflow.
>
> If you want to pass metadata about the workflow / composite action to the indexer, you will have to add it as a field and value on the object's Redis hash (DB 1 and 2), similar to the way we implemented url or is_public.

Thanks for the details! I'd like to avoid adding an API call for every workflow, since most probably don't use environments and that would slow the overall run down. I could add a quick check to see whether environment: is present in the workflow before querying (something like the sketch below), but that seems a bit messy to me. Thoughts?
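For what it's worth, that pre-check could stay fairly small; here is a sketch (assuming the workflow YAML is already in hand as a string, using PyYAML):

```python
import yaml

def workflow_uses_environments(workflow_yaml: str) -> bool:
    """Return True if any job in the workflow declares an `environment` key."""
    try:
        workflow = yaml.safe_load(workflow_yaml) or {}
    except yaml.YAMLError:
        return False
    jobs = workflow.get("jobs") or {}
    return any(isinstance(job, dict) and "environment" in job for job in jobs.values())

# Only hit the environments API when at least one job is environment-gated:
# if workflow_uses_environments(content):
#     env_flags = get_environment_protection(owner, repo)  # sketch from above
```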
