Update Logs code ownership to specific teams (#2585)
Co-authored-by: ci.datadog-api-spec <[email protected]>
api-clients-generation-pipeline[bot] and ci.datadog-api-spec authored Nov 26, 2024
1 parent d6c4360 commit 60b8333
Showing 6 changed files with 75 additions and 75 deletions.
8 changes: 4 additions & 4 deletions .apigentools-info
@@ -4,13 +4,13 @@
     "spec_versions": {
         "v1": {
             "apigentools_version": "1.6.6",
-            "regenerated": "2024-11-25 19:59:33.247243",
-            "spec_repo_commit": "3c840607"
+            "regenerated": "2024-11-26 13:36:15.509612",
+            "spec_repo_commit": "cf1aa5ea"
         },
         "v2": {
             "apigentools_version": "1.6.6",
-            "regenerated": "2024-11-25 19:59:33.266976",
-            "spec_repo_commit": "3c840607"
+            "regenerated": "2024-11-26 13:36:15.529715",
+            "spec_repo_commit": "cf1aa5ea"
         }
     }
 }
@@ -10,70 +10,70 @@ Feature: Logs Indexes
And a valid "appKeyAuth" key in the system
And an instance of "LogsIndexes" API

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "Invalid Parameter Error" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 400 Invalid Parameter Error

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "OK" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get all indexes returns "OK" response
Given new "ListLogIndexes" request
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get an index returns "Not Found" response
Given new "GetLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
When the request is sent
Then the response status is 404 Not Found

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get an index returns "OK" response
Given new "GetLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Get indexes order returns "OK" response
Given new "GetLogsIndexOrder" request
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update an index returns "Invalid Parameter Error" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 400 Invalid Parameter Error

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update an index returns "OK" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15}
When the request is sent
Then the response status is 200 OK

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update indexes order returns "Bad Request" response
Given new "UpdateLogsIndexOrder" request
And body with value {"index_names": ["main", "payments", "web"]}
When the request is sent
Then the response status is 400 Bad Request

-@generated @skip @team:DataDog/logs-backend
+@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update indexes order returns "OK" response
Given new "UpdateLogsIndexOrder" request
And body with value {"index_names": ["main", "payments", "web"]}
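The Create/Update index scenarios above post a body with a `daily_limit` of 300,000,000 and a `daily_limit_warning_threshold_percentage` of 70. A minimal local sketch, reproducing that Gherkin body as a Python dict and deriving the ingestion count at which the warning would fire (this is purely illustrative; it does not call the Datadog API, and the derived-threshold arithmetic is an assumption about how the percentage is applied):

```python
# Illustrative sketch: the "Create an index" request body from the scenarios
# above, as a Python dict. No Datadog API call is made here.
import json

body = {
    "daily_limit": 300_000_000,
    "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"},
    "daily_limit_warning_threshold_percentage": 70,
    "exclusion_filters": [
        {"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}
    ],
    "filter": {"query": "source:python"},
    "name": "main",
    "num_flex_logs_retention_days": 360,
    "num_retention_days": 15,
}

# Assumed semantics: the warning triggers once daily ingestion reaches
# threshold% of daily_limit -> 70% of 300,000,000 = 210,000,000 logs.
warning_at = body["daily_limit"] * body["daily_limit_warning_threshold_percentage"] // 100
print(warning_at)  # 210000000

# The body serializes to the same JSON shape used in the feature file.
wire = json.dumps(body)
```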
12 changes: 6 additions & 6 deletions src/test/resources/com/datadog/api/client/v2/api/logs.feature
@@ -105,44 +105,44 @@ Feature: Logs
Then the response status is 200 OK
And the response has 3 items

-@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
And request contains "Content-Encoding" parameter with value "deflate"
When the request is sent
Then the response status is 202 Request accepted for processing (always 202 empty JSON).

-@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@integration-only @skip-terraform-config @skip-validation @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
And request contains "Content-Encoding" parameter with value "gzip"
When the request is sent
Then the response status is 202 Request accepted for processing (always 202 empty JSON).
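The two scenarios above submit the same log payload under different `Content-Encoding` values. A local sketch of how those two encodings relate, using Python's standard `zlib` and `gzip` modules on the exact payload from the feature file (request preparation only; no intake endpoint is contacted):

```python
# Local sketch: the same JSON log payload compressed two ways, matching the
# "deflate" and "gzip" Content-Encoding scenarios. No network request is made.
import gzip
import json
import zlib

payload = [{
    "ddsource": "nginx",
    "ddtags": "env:staging,version:5.1",
    "hostname": "i-012345678",
    "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
    "service": "payment",
}]
raw = json.dumps(payload).encode("utf-8")

# Content-Encoding: deflate -> zlib-wrapped DEFLATE stream
deflate_body = zlib.compress(raw)
# Content-Encoding: gzip -> gzip container around the same compressed data
gzip_body = gzip.compress(raw)

# Both bodies decompress back to the original request bytes.
assert zlib.decompress(deflate_body) == raw
assert gzip.decompress(gzip_body) == raw
```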

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Bad Request" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 400 Bad Request

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Payload Too Large" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 413 Payload Too Large

-@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@generated @skip @team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Request Timeout" response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment"}]
When the request is sent
Then the response status is 408 Request Timeout

-@team:DataDog/event-platform-intake @team:DataDog/logs-backend
+@team:DataDog/event-platform-intake @team:DataDog/logs-backend @team:DataDog/logs-ingestion
Scenario: Send logs returns "Request accepted for processing (always 202 empty JSON)." response
Given new "SubmitLog" request
And body with value [{"ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment", "status": "info"}]
