Integrate source performance harness with datadog #31410

Merged: 38 commits, Oct 17, 2023
Commits (38)
7e185e1  empty line to create a PR (xiaohansong, Oct 11, 2023)
9023148  Automated Commit - Formatting Changes (xiaohansong, Oct 11, 2023)
f167679  attempt a fix (xiaohansong, Oct 11, 2023)
dfc9c67  typo (xiaohansong, Oct 11, 2023)
8542cac  add logs (xiaohansong, Oct 11, 2023)
21cabaa  try adding path directly (xiaohansong, Oct 11, 2023)
86b6ca9  whoami? (xiaohansong, Oct 11, 2023)
406da8b  export path in write part (xiaohansong, Oct 11, 2023)
9dd5c7c  add secrets (xiaohansong, Oct 12, 2023)
3d98ddd  try different way to create cred (xiaohansong, Oct 12, 2023)
0fb5fef  Merge remote-tracking branch 'origin/master' into xiaohan/harness-fix (xiaohansong, Oct 12, 2023)
ba4f4f3  remove unnecessary printout (xiaohansong, Oct 12, 2023)
6a97539  make comment optional (xiaohansong, Oct 12, 2023)
4d42500  Merge branch 'master' into xiaohan/harness-fix (xiaohansong, Oct 12, 2023)
0adb090  update postgres catalog (xiaohansong, Oct 12, 2023)
c4a2bd3  Automated Commit - Formatting Changes (xiaohansong, Oct 12, 2023)
1ed2779  Merge branch 'master' into xiaohan/harness-fix (xiaohansong, Oct 12, 2023)
86f5f05  add more logs, suspecting it's not reading from the latest catalog (xiaohansong, Oct 12, 2023)
bd758f3  more logs (xiaohansong, Oct 12, 2023)
637a9df  remove gitref (xiaohansong, Oct 12, 2023)
2ab9f44  need to check local branch and master branch to do git diff (xiaohansong, Oct 12, 2023)
59eb0be  remove origin (xiaohansong, Oct 12, 2023)
a08ad54  need to fetch all in perf-test step (xiaohansong, Oct 12, 2023)
daa81f2  add back ref (xiaohansong, Oct 12, 2023)
4d07a76  remove logs (xiaohansong, Oct 12, 2023)
ceade6d  Merge branch 'master' into xiaohan/harness-fix (xiaohansong, Oct 13, 2023)
beee3ee  formatting (xiaohansong, Oct 13, 2023)
5d7fc32  datadog test (xiaohansong, Oct 13, 2023)
e0a6dc2  insert real datadog metrics (xiaohansong, Oct 13, 2023)
f06d2e0  Merge remote-tracking branch 'origin/master' into xiaohan/datadog-perf (xiaohansong, Oct 13, 2023)
2176e08  Automated Commit - Formatting Changes (xiaohansong, Oct 13, 2023)
04a1d97  format and env (xiaohansong, Oct 13, 2023)
1d52f3f  Merge branch 'master' into xiaohan/datadog-perf (xiaohansong, Oct 16, 2023)
e0d46d6  test without export envsubst would still work or not (xiaohansong, Oct 16, 2023)
5888e9a  export changed names only (xiaohansong, Oct 16, 2023)
dcc7d8f  revert (xiaohansong, Oct 16, 2023)
c0a2686  one more try (xiaohansong, Oct 16, 2023)
f04d291  updated comment (xiaohansong, Oct 16, 2023)
1 change: 1 addition & 0 deletions .github/workflows/connector-performance-command.yml
@@ -180,6 +180,7 @@ jobs:
export STREAM_NUMBER=$STREAM_NUMBER
export SYNC_MODE=$SYNC_MODE
export HARNESS=$HARNESS_TYPE
export DD_API_KEY=${{ secrets.DD_API_KEY }}
Contributor comment:

You can set this env var in the env: section of this step

envsubst < ./tools/bin/run-harness-process.yaml | kubectl create -f -
echo "harness is ${{ steps.which-harness.outputs.harness_type }}"
POD=$(kubectl get pod -l app=performance-harness -o jsonpath="{.items[0].metadata.name}")
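
As a hedged sketch of the reviewer's suggestion above, the key could be provided through the step-level env: block instead of an inline export. The step name and surrounding layout below are assumed, not taken from this workflow; only the env: block is the point being illustrated.

- name: Run performance harness   # hypothetical step name
  env:
    DD_API_KEY: ${{ secrets.DD_API_KEY }}   # set by the runner, inherited by envsubst, no export needed
  run: |
    export STREAM_NUMBER=$STREAM_NUMBER
    export SYNC_MODE=$SYNC_MODE
    export HARNESS=$HARNESS_TYPE
    envsubst < ./tools/bin/run-harness-process.yaml | kubectl create -f -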
@@ -16,4 +16,5 @@ dependencies {
implementation 'org.apache.commons:commons-lang3:3.11'
implementation 'io.airbyte:airbyte-commons-worker:0.42.0'
implementation 'io.airbyte.airbyte-config:config-models:0.42.0'
implementation 'com.datadoghq:datadog-api-client:2.16.0'
}
@@ -78,6 +78,7 @@ public static void main(final String[] args) {
try {
final PerformanceTest test = new PerformanceTest(
image,
dataset,
config.toString(),
catalog.toString());
test.runTest();
@@ -4,6 +4,15 @@

package io.airbyte.integrations.source_performance;

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.MetricsApi;
import com.datadog.api.client.v2.model.MetricIntakeType;
import com.datadog.api.client.v2.model.MetricPayload;
import com.datadog.api.client.v2.model.MetricPoint;
import com.datadog.api.client.v2.model.MetricResource;
import com.datadog.api.client.v2.model.MetricSeries;
import com.datadog.api.client.v2.model.IntakePayloadAccepted;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
@@ -26,6 +35,9 @@
import java.net.InetAddress;
import java.nio.file.Path;
import java.time.Duration;
import java.time.Instant;
import java.time.OffsetDateTime;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.Set;
@@ -43,17 +55,24 @@ public class PerformanceTest {

public static final double MEGABYTE = Math.pow(1024, 2);
private final String imageName;
private final String dataset;
private final JsonNode config;
private final ConfiguredAirbyteCatalog catalog;

PerformanceTest(final String imageName, final String config, final String catalog) throws JsonProcessingException {
PerformanceTest(final String imageName, final String dataset, final String config, final String catalog) throws JsonProcessingException {
final ObjectMapper mapper = new ObjectMapper();
this.imageName = imageName;
this.dataset = dataset;
this.config = mapper.readTree(config);
this.catalog = Jsons.deserialize(catalog, ConfiguredAirbyteCatalog.class);
}

void runTest() throws Exception {

// Initialize datadog.
ApiClient defaultClient = ApiClient.getDefaultApiClient();
MetricsApi apiInstance = new MetricsApi(defaultClient);

KubePortManagerSingleton.init(PORTS);

final KubernetesClient fabricClient = new DefaultKubernetesClient();
@@ -105,8 +124,45 @@ void runTest() throws Exception {
final var totalMB = totalBytes / MEGABYTE;
final var totalTimeSecs = (end - start) / 1000.0;
final var rps = counter / totalTimeSecs;
log.info("total secs: {}. total MB read: {}, rps: {}, throughput: {}", totalTimeSecs, totalMB, rps, totalMB / totalTimeSecs);
final var throughput = totalMB / totalTimeSecs;
log.info("total secs: {}. total MB read: {}, rps: {}, throughput: {}", totalTimeSecs, totalMB, rps, throughput);
source.close();

final long reportingTimeInEpochSeconds = OffsetDateTime.now().toInstant().getEpochSecond();

List<MetricResource> metricResources = List.of(
new MetricResource().name("github").type("runner"),
new MetricResource().name(imageName).type("image"),
new MetricResource().name(dataset).type("dataset"));
MetricPayload body =
Contributor comment:
amazing!

new MetricPayload()
.series(
List.of(
new MetricSeries()
.metric("connectors.performance.rps")
.type(MetricIntakeType.GAUGE)
.points(
Collections.singletonList(
new MetricPoint()
.timestamp(reportingTimeInEpochSeconds)
.value(rps)))
.resources(metricResources),
new MetricSeries()
.metric("connectors.performance.throughput")
.type(MetricIntakeType.GAUGE)
.points(
Collections.singletonList(
new MetricPoint()
.timestamp(reportingTimeInEpochSeconds)
.value(throughput)))
.resources(metricResources)
));
try {
IntakePayloadAccepted result = apiInstance.submitMetrics(body);
System.out.println(result);
} catch (ApiException e) {
log.error("Exception when calling MetricsApi#submitMetrics.", e);
}
}

private static <V0, V1> V0 convertProtocolObject(final V1 v1, final Class<V0> klass) {
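
As an aside, and not part of this PR's diff: since the pod only receives a real DD_API_KEY when the workflow secret is configured, the submission above could plausibly be guarded so credential-less runs skip the call instead of logging an ApiException. A minimal sketch, reusing the names from the diff (apiInstance, body, log):

final String ddApiKey = System.getenv("DD_API_KEY");
if (ddApiKey == null || ddApiKey.isBlank()) {
  // Hypothetical guard: no key configured, so skip the Datadog call entirely.
  log.info("DD_API_KEY not set; skipping Datadog metric submission.");
} else {
  try {
    final IntakePayloadAccepted result = apiInstance.submitMetrics(body);
    log.info("Datadog metric intake response: {}", result);
  } catch (final ApiException e) {
    log.error("Exception when calling MetricsApi#submitMetrics.", e);
  }
}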
5 changes: 5 additions & 0 deletions tools/bin/run-harness-process.yaml
@@ -20,6 +20,11 @@ spec:
requests:
cpu: "2.5"
memory: "2Gi"
env:
- name: DD_API_KEY
value: $DD_API_KEY
- name: DD_SITE
value: "datadoghq.com"
volumes:
- name: secrets-volume
hostPath:
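
As a hedged aside (not part of this PR), the API key could presumably also be read from a Kubernetes Secret rather than templated into the pod spec with envsubst; a minimal sketch with an assumed Secret name and key:

env:
  - name: DD_API_KEY
    valueFrom:
      secretKeyRef:
        name: datadog-secrets   # hypothetical Secret name
        key: dd-api-key         # hypothetical key within that Secret
  - name: DD_SITE
    value: "datadoghq.com"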