Updating staging with master #3341

Closed
wants to merge 24 commits
Changes from all commits (24 commits)
bfdf300
Add default value to optional fields status and template_version
bichitra95 Jul 2, 2024
864ee9f
Making type optional and auto-detecting by qName
bichitra95 Jul 3, 2024
c03fbd3
Merge pull request #3295 from atlanhq/DQ-278
bichitra95 Jul 5, 2024
bc5e933
fix: Ignore classification option validation when propogate is false
krsoninikhil Jul 5, 2024
cd6aa88
Contract version 2.0
bichitra95 Jul 5, 2024
b255bd7
GRC-25 Remove MEDIUM severity from trivy code scanning alerts (#3308)
checkaayush Jul 8, 2024
36e0b18
chore: add git action for jira id in PR title
jblaze2908 Jul 8, 2024
17b7f67
Merge pull request #3315 from atlanhq/AM-1406
jblaze2908 Jul 9, 2024
29440e0
Merge pull request #3259 from atlanhq/DG-1530_2
hr2904 Jul 10, 2024
8104930
DQ-306 Make data_source optional requirement
bichitra95 Jul 10, 2024
1cb3862
MESH-40 Fixed the TODOs left by previous part of the task.
hr2904 Jul 10, 2024
e31e6b9
Use previous values for missing restrict option
krsoninikhil Jul 10, 2024
b10a995
Merge pull request #3328 from atlanhq/DQ-306
bichitra95 Jul 10, 2024
9ec0b79
Merge pull request #3301 from atlanhq/dg-1671
PRATHAM2002-DS Jul 10, 2024
c1147c0
Merge pull request #3332 from atlanhq/DG-1530_BE-2
hr2904 Jul 15, 2024
ee17588
Merge pull request #3311 from atlanhq/ns/fix/DG-1682-propogate-classi…
krsoninikhil Jul 16, 2024
5ef2e27
update pat token
hitk6 Jul 17, 2024
3c9565d
Merge pull request #3344 from atlanhq/update-pat-token-master
arniesaha Jul 18, 2024
a213e6b
Removed verified status check to allow update entity to publish
bichitra95 Jul 19, 2024
c83f968
Add BAD_REQUEST error code
bichitra95 Jul 19, 2024
424203a
LIN-974 : [master] allow upstream & downstream expand for cyclic asse…
rmovaliya Jul 24, 2024
18d0d1c
LIN-974 : [master] add horizontal pagination node count (#3334)
rmovaliya Jul 24, 2024
649cb0f
Merge pull request #3346 from atlanhq/DQ-302-contract-versioning-2-0
bichitra95 Jul 24, 2024
acfc261
Merge pull request #3305 from atlanhq/DQ-295-type-optional
bichitra95 Jul 24, 2024
10 changes: 5 additions & 5 deletions .github/workflows/chart-release-dispatcher.yaml
@@ -29,7 +29,7 @@ jobs:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
with:
token: ${{ secrets.my_pat }}
token: ${{ secrets.ORG_PAT_GITHUB }}
ref: ${{ steps.extract_branch.outputs.branch }}
fetch-depth: 0

@@ -50,10 +50,10 @@ jobs:
- name: Get PR url and PR User
id: get_pr_url_user
run: |
head_sha=$(curl -s -H "Authorization: Bearer ${{ secrets.my_pat }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/repos/${{ github.repository }}/actions/runs/${{ github.event.workflow_run.id }}/jobs" | jq -r '.jobs[0].head_sha')
head_sha=$(curl -s -H "Authorization: Bearer ${{ secrets.ORG_PAT_GITHUB }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/repos/${{ github.repository }}/actions/runs/${{ github.event.workflow_run.id }}/jobs" | jq -r '.jobs[0].head_sha')
echo "Head SHA: $head_sha"
pr_url=$(curl -s -H "Authorization: Bearer ${{ secrets.my_pat }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/search/issues?q=sha:$head_sha+type:pr" | jq -r '.items[0].html_url')
pr_user=$(curl -s -H "Authorization: Bearer ${{ secrets.my_pat }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/search/issues?q=sha:$head_sha+type:pr" | jq -r '.items[0].user.login')
pr_url=$(curl -s -H "Authorization: Bearer ${{ secrets.ORG_PAT_GITHUB }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/search/issues?q=sha:$head_sha+type:pr" | jq -r '.items[0].html_url')
pr_user=$(curl -s -H "Authorization: Bearer ${{ secrets.ORG_PAT_GITHUB }}" -H "Accept: application/vnd.github.v3+json" "https://api.github.com/search/issues?q=sha:$head_sha+type:pr" | jq -r '.items[0].user.login')
echo "pr_url=$pr_url" >> $GITHUB_OUTPUT
echo "pr_user=$pr_user" >> $GITHUB_OUTPUT

@@ -65,7 +65,7 @@ jobs:
- name: Repository Dispatch
uses: peter-evans/repository-dispatch@v2
with:
token: ${{ secrets.my_pat }}
token: ${{ secrets.ORG_PAT_GITHUB }}
repository: ${{ matrix.repo }}
event-type: dispatch_chart_release_workflow
client-payload: |-
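For context, the get_pr_url_user step above resolves which pull request produced the triggering run: the jobs API returns the run's head SHA, and the issue-search API maps that SHA to a PR URL and author. A rough Java equivalent of those two calls is sketched below, purely for illustration; the class, token variable, and JSON handling are assumptions, while the workflow itself does this with curl, jq, and the ORG_PAT_GITHUB secret.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative sketch of the two GitHub API calls made in get_pr_url_user.
// GITHUB_TOKEN stands in for the ORG_PAT_GITHUB secret used by the workflow.
public class PrLookup {
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    static String get(String url, String token) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + token)
                .header("Accept", "application/vnd.github.v3+json")
                .build();
        return HTTP.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }

    public static void main(String[] args) throws Exception {
        String token = System.getenv("GITHUB_TOKEN");
        String repo  = "atlanhq/atlas";   // ${{ github.repository }} in the workflow
        String runId = "123456789";       // ${{ github.event.workflow_run.id }} in the workflow

        // 1. Jobs of the triggering run; the workflow extracts .jobs[0].head_sha with jq.
        String jobsJson = get("https://api.github.com/repos/" + repo
                + "/actions/runs/" + runId + "/jobs", token);
        String headSha = "deadbeef";      // placeholder; parse it from jobsJson in real use

        // 2. Search PRs by that commit SHA; the workflow reads .items[0].html_url
        //    and .items[0].user.login from this response.
        String searchJson = get("https://api.github.com/search/issues?q=sha:"
                + headSha + "+type:pr", token);
        System.out.println(searchJson);
    }
}
```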
14 changes: 14 additions & 0 deletions .github/workflows/github-actions-pr-jira.yaml
@@ -0,0 +1,14 @@
name: GitHub-Jira Link Action
run-name: ${{ github.actor }} is ensuring Jira ID is present in PR title
on:
pull_request:
types: [opened, edited, synchronize, reopened]
branches: [main, staging, master, beta, develop, prod, development]

jobs:
Enforce-GitHub-Jira-Link-Action:
runs-on: ubuntu-latest
if: ${{ !contains(fromJson('["main", "staging", "master", "beta", "develop", "prod", "development"]'), github.event.pull_request.head.ref) }}
steps:
- name: Enforce Pull Request Title includes Jira Issue Key
uses: ryanvade/[email protected]
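The new workflow requires a Jira issue key in the PR title for pull requests targeting the listed long-lived branches, and skips PRs whose head branch is itself one of those branches. The exact rule is implemented by ryanvade's action; the sketch below only illustrates the kind of title check involved, assuming a conventional PROJECT-123 key format, which is an assumption rather than the action's actual pattern.

```java
import java.util.regex.Pattern;

// Hypothetical stand-in for the title validation performed by the Jira-title action;
// the real matching rules live in the action itself.
public class JiraTitleCheck {
    // Conventional Jira issue key: uppercase project key, dash, issue number.
    private static final Pattern JIRA_KEY = Pattern.compile("[A-Z][A-Z0-9]+-\\d+");

    public static boolean hasJiraKey(String prTitle) {
        return prTitle != null && JIRA_KEY.matcher(prTitle).find();
    }

    public static void main(String[] args) {
        System.out.println(hasJiraKey("DQ-306 Make data_source optional requirement")); // true
        System.out.println(hasJiraKey("update pat token"));                             // false
    }
}
```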
2 changes: 1 addition & 1 deletion .github/workflows/main-ecr.yml
@@ -196,4 +196,4 @@ jobs:
${{ steps.login-ecr.outputs.registry }}/atlanhq/${{ github.event.repository.name }}:${{ steps.get_branch.outputs.branch }}-${{ steps.semver_tag.outputs.new_tag }}
build-args: |
ACCESS_TOKEN_USR=$GITHUB_ACTOR
ACCESS_TOKEN_PWD=${{ secrets.my_pat }}
ACCESS_TOKEN_PWD=${{ secrets.ORG_PAT_GITHUB }}
6 changes: 3 additions & 3 deletions .github/workflows/maven.yml
@@ -58,7 +58,7 @@ jobs:
[{
"id": "github",
"username": "atlan-ci",
"password": "${{ secrets.my_pat }}"
"password": "${{ secrets.ORG_PAT_GITHUB }}"
}]

- name: Build with Maven
@@ -77,7 +77,7 @@ jobs:
shell: bash

- name: Get version tag
run: echo "##[set-output name=version;]$(echo `git ls-remote https://${{ secrets.my_pat }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ steps.get_branch.outputs.branch }} | awk '{ print $1}' | cut -c1-7`)abcd"
run: echo "##[set-output name=version;]$(echo `git ls-remote https://${{ secrets.ORG_PAT_GITHUB }}@github.com/atlanhq/${REPOSITORY_NAME}.git ${{ steps.get_branch.outputs.branch }} | awk '{ print $1}' | cut -c1-7`)abcd"
id: get_version

- name: Set up Buildx
@@ -89,7 +89,7 @@ jobs:
with:
registry: ghcr.io
username: $GITHUB_ACTOR
password: ${{ secrets.my_pat }}
password: ${{ secrets.ORG_PAT_GITHUB }}

- name: Build and push
id: docker_build
2 changes: 1 addition & 1 deletion .github/workflows/trivy-docker-scan.yml
@@ -29,7 +29,7 @@ jobs:
output: 'trivy-results-docker.sarif'
exit-code: '1'
#ignore-unfixed: true
severity: 'CRITICAL,HIGH,MEDIUM'
severity: 'CRITICAL,HIGH'

- name: Upload Trivy Docker Scan Results To GitHub Security tab
uses: github/codeql-action/upload-sarif@v2
@@ -159,6 +159,8 @@ public final class Constants {
/**
* SQL property keys.
*/

public static final String SQL_ENTITY_TYPE = "SQL";
public static final String CONNECTION_ENTITY_TYPE = "Connection";
public static final String QUERY_ENTITY_TYPE = "Query";
public static final String QUERY_FOLDER_ENTITY_TYPE = "Folder";
@@ -171,6 +171,8 @@ public static class LineageInfoOnDemand {
boolean hasMoreOutputs;
int inputRelationsCount;
int outputRelationsCount;
int totalInputRelationsCount;
int totalOutputRelationsCount;
boolean isInputRelationsReachedLimit;
boolean isOutputRelationsReachedLimit;
@JsonProperty
@@ -188,13 +190,15 @@ public LineageInfoOnDemand(LineageOnDemandConstraints onDemandConstraints) {
this.hasMoreOutputs = false;
this.inputRelationsCount = 0;
this.outputRelationsCount = 0;
this.totalInputRelationsCount = 0;
this.totalOutputRelationsCount = 0;
this.isInputRelationsReachedLimit = false;
this.isOutputRelationsReachedLimit = false;
this.hasUpstream = false;
this.hasDownstream = false;
this.fromCounter = 0;
}

public boolean isInputRelationsReachedLimit() {
return isInputRelationsReachedLimit;
}
@@ -243,10 +247,18 @@ public void setHasDownstream(boolean hasDownstream) {
this.hasDownstream = hasDownstream;
}

public int getFromCounter() {
return fromCounter;
public int getTotalInputRelationsCount() {
return totalInputRelationsCount;
}

public void setTotalInputRelationsCount(int count) {this.totalInputRelationsCount = count;}

public int getTotalOutputRelationsCount() {
return totalOutputRelationsCount;
}

public void setTotalOutputRelationsCount(int count) {this.totalOutputRelationsCount = count;}

public void incrementFromCounter() {
fromCounter++;
}
@@ -255,6 +267,10 @@ public int getInputRelationsCount() {
return inputRelationsCount;
}

public int getFromCounter() {
return fromCounter;
}

public void incrementInputRelationsCount() {
this.inputRelationsCount++;
if (inputRelationsCount == onDemandConstraints.getInputRelationsLimit()) {
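The hunk above adds totalInputRelationsCount and totalOutputRelationsCount next to the existing per-request counters, so a lineage response can report how many relations exist on a node even when traversal stops at the configured limit. A stripped-down sketch of how the per-request counter, the reached-limit flag, and the new total interact is below; the class shape and the limit value are illustrative, not the real LineageInfoOnDemand (which takes its limit from LineageOnDemandConstraints).

```java
// Simplified model of the counters used in LineageInfoOnDemand; the limit of 3 is illustrative.
public class LineageCounters {
    private final int inputRelationsLimit = 3;
    private int inputRelationsCount = 0;        // relations actually returned in this response
    private int totalInputRelationsCount = 0;   // relations that exist on the vertex (new field)
    private boolean inputRelationsReachedLimit = false;

    public void incrementInputRelationsCount() {
        inputRelationsCount++;
        if (inputRelationsCount == inputRelationsLimit) {
            inputRelationsReachedLimit = true;  // signals the client there is more to page through
        }
    }

    public void setTotalInputRelationsCount(int count) {
        totalInputRelationsCount = count;
    }

    public static void main(String[] args) {
        LineageCounters c = new LineageCounters();
        c.setTotalInputRelationsCount(10);      // e.g. ten upstream edges exist in the graph
        for (int i = 0; i < 3; i++) {
            c.incrementInputRelationsCount();   // only three are included before the limit hits
        }
        System.out.println(c.inputRelationsReachedLimit + ", " + c.inputRelationsCount
                + " of " + c.totalInputRelationsCount);
    }
}
```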
@@ -343,7 +343,7 @@ private void traverseEdgesOnDemand(Iterator<AtlasEdge> processEdges, boolean isI
}

boolean isInputEdge = processEdge.getLabel().equalsIgnoreCase(PROCESS_INPUTS_EDGE);
if (incrementAndCheckIfRelationsLimitReached(processEdge, isInputEdge, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction)) {
if (incrementAndCheckIfRelationsLimitReached(processEdge, isInputEdge, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction, new HashSet<>())) {
break;
} else {
addEdgeToResult(processEdge, ret, atlasLineageOnDemandContext, nextLevel, traversalOrder);
@@ -387,7 +387,7 @@ private void traverseEdgesOnDemand(AtlasVertex datasetVertex, boolean isInput, i
continue;
}

if (incrementAndCheckIfRelationsLimitReached(incomingEdge, !isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction)) {
if (incrementAndCheckIfRelationsLimitReached(incomingEdge, !isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction, visitedVertices)) {
LineageInfoOnDemand entityOnDemandInfo = ret.getRelationsOnDemand().get(baseGuid);
if (entityOnDemandInfo == null)
continue;
@@ -414,7 +414,7 @@
if (checkForOffset(outgoingEdge, processVertex, atlasLineageOnDemandContext, ret)) {
continue;
}
if (incrementAndCheckIfRelationsLimitReached(outgoingEdge, isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction)) {
if (incrementAndCheckIfRelationsLimitReached(outgoingEdge, isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, direction, visitedVertices)) {
String processGuid = AtlasGraphUtilsV2.getIdFromVertex(processVertex);
LineageInfoOnDemand entityOnDemandInfo = ret.getRelationsOnDemand().get(processGuid);
if (entityOnDemandInfo == null)
@@ -597,10 +597,8 @@ private static String getId(AtlasVertex vertex) {
return vertex.getIdForDisplay();
}

private boolean incrementAndCheckIfRelationsLimitReached(AtlasEdge atlasEdge, boolean isInput, AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasLineageOnDemandInfo ret, int depth, AtomicInteger entitiesTraversed, AtlasLineageOnDemandInfo.LineageDirection direction) {
private boolean incrementAndCheckIfRelationsLimitReached(AtlasEdge atlasEdge, boolean isInput, AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasLineageOnDemandInfo ret, int depth, AtomicInteger entitiesTraversed, AtlasLineageOnDemandInfo.LineageDirection direction, Set<String> visitedVertices) {
AtlasPerfMetrics.MetricRecorder metricRecorder = RequestContext.get().startMetricRecord("incrementAndCheckIfRelationsLimitReached");
if (lineageContainsVisitedEdgeV2(ret, atlasEdge))
return false;

AtlasVertex inVertex = isInput ? atlasEdge.getOutVertex() : atlasEdge.getInVertex();
String inGuid = AtlasGraphUtilsV2.getIdFromVertex(inVertex);
@@ -613,7 +611,7 @@ private boolean incrementAndCheckIfRelationsLimitReached(AtlasEdge atlasEdge, bo
LineageInfoOnDemand inLineageInfo = ret.getRelationsOnDemand().containsKey(inGuid) ? ret.getRelationsOnDemand().get(inGuid) : new LineageInfoOnDemand(inGuidLineageConstraints);
LineageInfoOnDemand outLineageInfo = ret.getRelationsOnDemand().containsKey(outGuid) ? ret.getRelationsOnDemand().get(outGuid) : new LineageInfoOnDemand(outGuidLineageConstraints);

setHorizontalPaginationFlags(isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, inVertex, inGuid, outVertex, outGuid, inLineageInfo, outLineageInfo);
setHorizontalPaginationFlags(isInput, atlasLineageOnDemandContext, ret, depth, entitiesTraversed, inVertex, inGuid, outVertex, outGuid, inLineageInfo, outLineageInfo, visitedVertices);

boolean hasRelationsLimitReached = setVerticalPaginationFlags(entitiesTraversed, inLineageInfo, outLineageInfo);
if (!hasRelationsLimitReached) {
@@ -640,9 +638,9 @@ private boolean setVerticalPaginationFlags(AtomicInteger entitiesTraversed, Line
return hasRelationsLimitReached;
}

private void setHorizontalPaginationFlags(boolean isInput, AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasLineageOnDemandInfo ret, int depth, AtomicInteger entitiesTraversed, AtlasVertex inVertex, String inGuid, AtlasVertex outVertex, String outGuid, LineageInfoOnDemand inLineageInfo, LineageInfoOnDemand outLineageInfo) {
boolean isOutVertexVisited = ret.getRelationsOnDemand().containsKey(outGuid);
boolean isInVertexVisited = ret.getRelationsOnDemand().containsKey(inGuid);
private void setHorizontalPaginationFlags(boolean isInput, AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasLineageOnDemandInfo ret, int depth, AtomicInteger entitiesTraversed, AtlasVertex inVertex, String inGuid, AtlasVertex outVertex, String outGuid, LineageInfoOnDemand inLineageInfo, LineageInfoOnDemand outLineageInfo, Set<String> visitedVertices) {
boolean isOutVertexVisited = visitedVertices.contains(getId(outVertex));
boolean isInVertexVisited = visitedVertices.contains(getId(inVertex));
if (depth == 1 || entitiesTraversed.get() == getLineageMaxNodeAllowedCount()-1) { // is the vertex a leaf?
if (isInput && ! isOutVertexVisited)
setHasUpstream(atlasLineageOnDemandContext, outVertex, outLineageInfo);
@@ -652,24 +650,27 @@ else if (!isInput && ! isInVertexVisited)
}

private void setHasDownstream(AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasVertex inVertex, LineageInfoOnDemand inLineageInfo) {
List<AtlasEdge> filteredEdges = getFilteredAtlasEdges(inVertex, PROCESS_INPUTS_EDGE, atlasLineageOnDemandContext);
if (!filteredEdges.isEmpty())
List<AtlasEdge> filteredEdges = getFilteredAtlasEdges(inVertex, IN, PROCESS_INPUTS_EDGE, atlasLineageOnDemandContext);
if (!filteredEdges.isEmpty()) {
inLineageInfo.setHasDownstream(true);
inLineageInfo.setTotalOutputRelationsCount(filteredEdges.size());
}
}

private void setHasUpstream(AtlasLineageOnDemandContext atlasLineageOnDemandContext, AtlasVertex outVertex, LineageInfoOnDemand outLineageInfo) {
List<AtlasEdge> filteredEdges = getFilteredAtlasEdges(outVertex, PROCESS_OUTPUTS_EDGE, atlasLineageOnDemandContext);
if (!filteredEdges.isEmpty())
List<AtlasEdge> filteredEdges = getFilteredAtlasEdges(outVertex, IN, PROCESS_OUTPUTS_EDGE, atlasLineageOnDemandContext);
if (!filteredEdges.isEmpty()) {
outLineageInfo.setHasUpstream(true);
outLineageInfo.setTotalInputRelationsCount(filteredEdges.size());
}
}

private List<AtlasEdge> getFilteredAtlasEdges(AtlasVertex outVertex, String processEdgeLabel, AtlasLineageOnDemandContext atlasLineageOnDemandContext) {
private List<AtlasEdge> getFilteredAtlasEdges(AtlasVertex outVertex, AtlasEdgeDirection direction, String processEdgeLabel, AtlasLineageOnDemandContext atlasLineageOnDemandContext) {
List<AtlasEdge> filteredEdges = new ArrayList<>();
Iterable<AtlasEdge> edges = outVertex.getEdges(IN, processEdgeLabel);
Iterable<AtlasEdge> edges = outVertex.getEdges(direction, processEdgeLabel);
for (AtlasEdge edge : edges) {
if (edgeMatchesEvaluation(edge, atlasLineageOnDemandContext)) {
filteredEdges.add(edge);
break;
}
}
return filteredEdges;
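The traversal changes above thread a visitedVertices set into incrementAndCheckIfRelationsLimitReached and setHorizontalPaginationFlags, so hasUpstream/hasDownstream are decided from vertices already seen in this walk rather than from the relationsOnDemand map. The sketch below shows the general pattern of carrying an explicit visited set through a recursive traversal; vertices are plain strings and the graph is a map here, whereas the real code operates on AtlasVertex and AtlasEdge.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustration of passing one visited-vertex set through an entire traversal,
// as the lineage code now does; not the actual Atlas traversal.
public class VisitedSetTraversal {
    private final Map<String, List<String>> adjacency = new HashMap<>();

    void traverse(String start, int maxDepth) {
        Set<String> visitedVertices = new HashSet<>();  // created once per request
        walk(start, maxDepth, visitedVertices);
    }

    private void walk(String vertex, int depth, Set<String> visitedVertices) {
        if (depth == 0 || !visitedVertices.add(vertex)) {
            return;                                     // depth exhausted or vertex already visited
        }
        System.out.println("visiting " + vertex);
        for (String next : adjacency.getOrDefault(vertex, List.of())) {
            // the lineage code consults the set at this point to decide whether the
            // neighbour's upstream/downstream flags still need to be computed
            walk(next, depth - 1, visitedVertices);
        }
    }

    public static void main(String[] args) {
        VisitedSetTraversal t = new VisitedSetTraversal();
        t.adjacency.put("A", List.of("B", "C"));
        t.adjacency.put("B", List.of("A"));             // cycle back to A is absorbed by the set
        t.traverse("A", 3);                             // prints A, B, C once each
    }
}
```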
@@ -540,7 +540,8 @@ public static List<String> getPropagatedVerticesIds (AtlasVertex classificationV
}

public static boolean hasEntityReferences(AtlasVertex classificationVertex) {
return classificationVertex.hasEdges(AtlasEdgeDirection.IN, CLASSIFICATION_LABEL);
Iterator edgeIterator = classificationVertex.query().direction(AtlasEdgeDirection.IN).label(CLASSIFICATION_LABEL).edges(1).iterator();
return edgeIterator != null && edgeIterator.hasNext();
}

public static List<AtlasVertex> getAllPropagatedEntityVertices(AtlasVertex classificationVertex) {
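hasEntityReferences now asks the graph for at most one incoming CLASSIFICATION_LABEL edge and simply checks whether the iterator yields anything, rather than using the previous hasEdges call. The sketch below shows that "fetch at most one, then hasNext" idiom with an ordinary Java iterator; the query chain in the comment is the one from the diff, and everything else is illustrative.

```java
import java.util.Iterator;
import java.util.List;

// Shape of the existence check used in hasEntityReferences: request at most one
// matching element, then only test whether the iterator produces anything.
public class ExistenceCheck {
    static boolean hasAny(Iterable<String> source) {
        Iterator<String> it = source.iterator();
        return it != null && it.hasNext();   // true as soon as a single element exists
    }

    public static void main(String[] args) {
        // In the diff the "source" is:
        //   classificationVertex.query().direction(AtlasEdgeDirection.IN)
        //       .label(CLASSIFICATION_LABEL).edges(1).iterator()
        // which caps the result at one edge before iteration begins.
        System.out.println(hasAny(List.of("edge-1")));  // true
        System.out.println(hasAny(List.of()));          // false
    }
}
```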
@@ -3590,6 +3590,7 @@ public void updateClassifications(EntityMutationContext context, String guid, Li
if (CollectionUtils.isEmpty(classifications)) {
throw new AtlasBaseException(AtlasErrorCode.INVALID_CLASSIFICATION_PARAMS, "update", guid);
}
entityRetriever.verifyClassificationsPropagationMode(classifications);

AtlasVertex entityVertex = AtlasGraphUtilsV2.findByGuid(this.graph, guid);

@@ -3711,7 +3712,21 @@ public void updateClassifications(EntityMutationContext context, String guid, Li
Boolean updatedRestrictPropagationThroughLineage = classification.getRestrictPropagationThroughLineage();
Boolean currentRestrictPropagationThroughHierarchy = currentClassification.getRestrictPropagationThroughHierarchy();
Boolean updatedRestrictPropagationThroughHierarchy = classification.getRestrictPropagationThroughHierarchy();
String propagationMode = entityRetriever.determinePropagationMode(updatedRestrictPropagationThroughLineage, updatedRestrictPropagationThroughHierarchy);
if (updatedRestrictPropagationThroughLineage == null) {
updatedRestrictPropagationThroughLineage = currentRestrictPropagationThroughLineage;
classification.setRestrictPropagationThroughLineage(updatedRestrictPropagationThroughLineage);
}
if (updatedRestrictPropagationThroughHierarchy == null) {
updatedRestrictPropagationThroughHierarchy = currentRestrictPropagationThroughHierarchy;
classification.setRestrictPropagationThroughHierarchy(updatedRestrictPropagationThroughHierarchy);
}

String propagationMode = CLASSIFICATION_PROPAGATION_MODE_DEFAULT;
if (updatedTagPropagation) {
// determinePropagationMode also validates the propagation restriction option values
propagationMode = entityRetriever.determinePropagationMode(updatedRestrictPropagationThroughLineage, updatedRestrictPropagationThroughHierarchy);
}

if ((!Objects.equals(updatedRemovePropagations, currentRemovePropagations) ||
!Objects.equals(currentTagPropagation, updatedTagPropagation) ||
!Objects.equals(currentRestrictPropagationThroughLineage, updatedRestrictPropagationThroughLineage)) &&
@@ -3733,7 +3748,6 @@ public void updateClassifications(EntityMutationContext context, String guid, Li
if (updatedTagPropagation) {
if (updatedRestrictPropagationThroughLineage != null && !currentRestrictPropagationThroughLineage && updatedRestrictPropagationThroughLineage) {
deleteDelegate.getHandler().removeTagPropagation(classificationVertex);

}
if (updatedRestrictPropagationThroughHierarchy != null && !currentRestrictPropagationThroughHierarchy && updatedRestrictPropagationThroughHierarchy) {
deleteDelegate.getHandler().removeTagPropagation(classificationVertex);
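The updateClassifications changes keep the currently stored restrict-propagation options when a request omits them, and derive (and thereby validate) a propagation mode only when tag propagation is actually enabled. Below is a simplified sketch of that fallback logic; the mode names and the determinePropagationMode body are stand-ins for the real EntityGraphRetriever behaviour, not copies of it.

```java
// Simplified outline of the fallback added to updateClassifications: missing Boolean
// options inherit the stored values, and the propagation mode is only derived
// (which also validates the combination) when propagation is turned on.
public class PropagationOptions {
    static final String PROPAGATION_MODE_DEFAULT = "DEFAULT";  // stand-in for the real constant

    static String resolveMode(Boolean updatedThroughLineage, Boolean currentThroughLineage,
                              Boolean updatedThroughHierarchy, Boolean currentThroughHierarchy,
                              boolean tagPropagationEnabled) {
        if (updatedThroughLineage == null) {
            updatedThroughLineage = currentThroughLineage;      // fall back to stored value
        }
        if (updatedThroughHierarchy == null) {
            updatedThroughHierarchy = currentThroughHierarchy;  // fall back to stored value
        }
        String mode = PROPAGATION_MODE_DEFAULT;
        if (tagPropagationEnabled) {
            mode = determinePropagationMode(updatedThroughLineage, updatedThroughHierarchy);
        }
        return mode;
    }

    // Hypothetical stand-in for entityRetriever.determinePropagationMode(...).
    static String determinePropagationMode(Boolean throughLineage, Boolean throughHierarchy) {
        if (Boolean.TRUE.equals(throughLineage))   return "RESTRICT_LINEAGE";
        if (Boolean.TRUE.equals(throughHierarchy)) return "RESTRICT_HIERARCHY";
        return PROPAGATION_MODE_DEFAULT;
    }

    public static void main(String[] args) {
        // Request omits both options; they inherit stored values (lineage restriction on).
        System.out.println(resolveMode(null, true, null, false, true));   // RESTRICT_LINEAGE
        // Propagation disabled: nothing is derived or validated, the default mode is kept.
        System.out.println(resolveMode(null, true, null, false, false));  // DEFAULT
    }
}
```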