Commit
Merge branch 'master' into master
Salil999 authored Oct 25, 2024
2 parents 65ac8d5 + 7e98dc5 commit 16610c9
Showing 16 changed files with 223 additions and 8 deletions.
1 change: 1 addition & 0 deletions .github/CODEOWNERS
@@ -38,6 +38,7 @@
/census/ @sankalp04 [email protected] @DataDog/ecosystems-review
/cfssl/ @JeanFred
/cloudnatix/ @junm-cloudnatix @kenji-cloudnatix @somik-cloudnatix @rohit-cloudnatix
/cloudquery-cloud/ @cloudquery/cloudquery-cloud
/cloudsmith/ @cloudsmith [email protected] @DataDog/ecosystems-review
/cloudzero/ @ben-dalton @mattyellen @egafford @alinaquinones
/cockroachdb_dedicated/ @DataDog/saas-integrations
7 changes: 7 additions & 0 deletions cloudquery-cloud/CHANGELOG.md
@@ -0,0 +1,7 @@
# CHANGELOG - CloudQuery Cloud

## 1.0.0 / 2024-10-01

***Added***:

* Initial Release
41 changes: 41 additions & 0 deletions cloudquery-cloud/README.md
@@ -0,0 +1,41 @@
# CloudQuery Cloud

## Overview

[CloudQuery][1] is an open-source, high-performance data integration framework built for developers, with support for a wide range of plugins.

CloudQuery extracts, transforms, and loads configuration from cloud APIs to a variety of supported destinations such as databases, data lakes, or streaming platforms for further analysis.

[CloudQuery Cloud][2] lets you get started with CloudQuery and sync data from source to destination without deploying your own infrastructure. Integrated OAuth authentication also makes connecting to sources and destinations easier: select a source and a destination plugin, and CloudQuery takes care of the rest.
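The extract-transform-load flow described above can be sketched in miniature. This is purely illustrative Python, not CloudQuery's implementation; the sample records, field names, and in-memory "table" are all hypothetical:

```python
from typing import Iterator

# Extract: hypothetical pages of monitor records from a cloud API.
def extract() -> Iterator[dict]:
    api_pages = [
        [{"id": 1, "name": "cpu-high", "muted": False}],
        [{"id": 2, "name": "disk-low", "muted": True}],
    ]
    for page in api_pages:
        yield from page

# Transform: normalize each record into a destination row.
def transform(record: dict) -> tuple:
    return (record["id"], record["name"], int(record["muted"]))

# Load: here, just append rows to an in-memory "table".
def load(rows, table: list) -> None:
    table.extend(rows)

table: list[tuple] = []
load((transform(r) for r in extract()), table)
print(table)  # [(1, 'cpu-high', 0), (2, 'disk-low', 1)]
```

In a real sync the load step would write to a database, data lake, or streaming platform instead of a Python list.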

## Setup

### Installation

1. Sign up for free at [cloud.cloudquery.io][2].
2. In Datadog, navigate to the CloudQuery Cloud integration tile.
3. Click **Connect Accounts**.
4. You are redirected to CloudQuery to log in.
5. Navigate to the **Sources** page and add a Datadog source.
6. Under the **Authentication** section, click **Authenticate** to grant access to your Datadog account via the OAuth2 flow.
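Under the hood, the **Authenticate** button starts a standard OAuth2 authorization-code flow. A sketch of the authorization URL such a flow constructs (illustrative only: the endpoint path, scope subset, and state value are assumptions; the `client_id` and `redirect_uri` come from this integration's `oauth_clients.json`):

```python
from urllib.parse import urlencode

# Assumed authorization endpoint; the real endpoint is owned by Datadog.
AUTHORIZE_ENDPOINT = "https://app.datadoghq.com/oauth2/v1/authorize"

params = {
    "client_id": "85f67abe-7fd9-11ef-ac1f-da7ad0900002",  # from oauth_clients.json
    "redirect_uri": "https://cloud.cloudquery.io/auth/connector",
    "response_type": "code",                # authorization-code grant
    "scope": "monitors_read metrics_read",  # subset of the requested read scopes
    "state": "random-csrf-token",           # hypothetical anti-CSRF value
}
url = f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"
print(url)
```

After you approve access, Datadog redirects back to the `redirect_uri` with a one-time code that CloudQuery exchanges for an access token.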

For more information about using CloudQuery Cloud, refer to the [quickstart guide][3].

### Configuration

See the [CloudQuery Datadog source documentation][4] for detailed configuration options.

## Uninstallation

1. Navigate to the **Sources** page under [CloudQuery Cloud][2] and find the Datadog source you previously set up.
2. Under the **Edit source** tab, click the **Delete this source** button.

## Support

For support, contact [CloudQuery][1] or join the [CloudQuery Community][5].

[1]: https://www.cloudquery.io/
[2]: https://cloud.cloudquery.io/
[3]: https://docs.cloudquery.io/docs/quickstart/cloudquery-cloud
[4]: https://hub.cloudquery.io/plugins/source/cloudquery/datadog/latest/docs
[5]: https://community.cloudquery.io/
43 changes: 43 additions & 0 deletions cloudquery-cloud/assets/oauth_clients.json
@@ -0,0 +1,43 @@
{
"integration": {
"scopes": [
"code_analysis_read",
"synthetics_private_location_read",
"data_scanner_read",
"user_access_read",
"apm_api_catalog_read",
"teams_read",
"security_monitoring_signals_read",
"ci_visibility_read",
"monitors_read",
"apm_service_catalog_read",
"apm_read",
"slos_read",
"synthetics_read",
"hosts_read",
"security_pipelines_read",
"security_monitoring_rules_read",
"security_monitoring_suppressions_read",
"metrics_read",
"incident_read",
"synthetics_global_variable_read",
"security_monitoring_filters_read",
"security_monitoring_findings_read",
"continuous_profiler_pgo_read",
"workflows_read",
"cloud_cost_management_read",
"events_read",
"cases_read",
"usage_read",
"dashboards_read"
],
"client_role": "integration",
"name": "CloudQuery Sync (OAuth)",
"onboarding_url": "https://cloud.cloudquery.io/",
"description": "Simple, Fast and Extensible Data Movement",
"redirect_uris": [
"https://cloud.cloudquery.io/auth/connector"
],
"id": "85f67abe-7fd9-11ef-ac1f-da7ad0900002"
}
}
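One property of the scope list above is worth noting: every requested permission is read-only. A quick sanity check (a sketch; the scope list is copied verbatim from the JSON above):

```python
# Scopes copied from assets/oauth_clients.json above.
scopes = [
    "code_analysis_read", "synthetics_private_location_read", "data_scanner_read",
    "user_access_read", "apm_api_catalog_read", "teams_read",
    "security_monitoring_signals_read", "ci_visibility_read", "monitors_read",
    "apm_service_catalog_read", "apm_read", "slos_read", "synthetics_read",
    "hosts_read", "security_pipelines_read", "security_monitoring_rules_read",
    "security_monitoring_suppressions_read", "metrics_read", "incident_read",
    "synthetics_global_variable_read", "security_monitoring_filters_read",
    "security_monitoring_findings_read", "continuous_profiler_pgo_read",
    "workflows_read", "cloud_cost_management_read", "events_read",
    "cases_read", "usage_read", "dashboards_read",
]

# The integration never requests write access: every scope ends in "_read".
assert all(s.endswith("_read") for s in scopes)
print(f"{len(scopes)} scopes, all read-only")
```

This matters for review: granting the OAuth client cannot mutate monitors, dashboards, or any other Datadog resource.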
82 changes: 82 additions & 0 deletions cloudquery-cloud/images/cloudquery_logo_svg_light_background.svg
36 changes: 36 additions & 0 deletions cloudquery-cloud/manifest.json
@@ -0,0 +1,36 @@
{
"manifest_version": "2.0.0",
"app_uuid": "727e53a9-0bbe-4277-8ed0-9e9425fe34de",
"app_id": "cloudquery-cloud",
"display_on_public_website": true,
"tile": {
"overview": "README.md#Overview",
"media": [],
"configuration": "README.md#Setup",
"support": "README.md#Support",
"uninstallation": "README.md#Uninstallation",
"changelog": "CHANGELOG.md",
"description": "Simple, Fast and Extensible Data Movement",
"title": "CloudQuery Cloud",
"classifier_tags": [
"Supported OS::Linux",
"Supported OS::Windows",
"Supported OS::macOS",
"Category::Developer Tools",
"Offering::Integration",
"Category::Cloud",
"Queried Data Type::Metrics",
"Queried Data Type::Logs",
"Queried Data Type::Events"
]
},
"assets": {
"oauth": "assets/oauth_clients.json"
},
"author": {
"support_email": "[email protected]",
"name": "CloudQuery",
"homepage": "https://cloud.cloudquery.io/",
"sales_email": "[email protected]"
}
}
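A tile manifest like the one above must reference its assets and docs consistently. A minimal self-check along those lines (an illustrative sketch, not Datadog's actual validator; the required-key set is an assumption):

```python
import re

# Trimmed copy of the manifest above.
manifest = {
    "manifest_version": "2.0.0",
    "app_id": "cloudquery-cloud",
    "tile": {
        "title": "CloudQuery Cloud",
        "overview": "README.md#Overview",
        "changelog": "CHANGELOG.md",
    },
    "assets": {"oauth": "assets/oauth_clients.json"},
    "author": {"name": "CloudQuery", "support_email": "[email protected]"},
}

# app_id is lowercase words separated by hyphens (matches the directory name).
assert re.fullmatch(r"[a-z0-9]+(-[a-z0-9]+)*", manifest["app_id"])

# Assumed minimal set of top-level keys an integration tile needs.
for key in ("manifest_version", "app_id", "tile", "assets", "author"):
    assert key in manifest, f"missing {key}"

# Tile doc references point at Markdown files (anchors stripped first).
for ref in (manifest["tile"]["overview"], manifest["tile"]["changelog"]):
    assert ref.split("#")[0].endswith(".md")

print("manifest OK")
```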
Binary file not shown.
Binary file not shown.
4 changes: 2 additions & 2 deletions f5-distributed-cloud/manifest.json
@@ -24,12 +24,12 @@
{
"media_type": "image",
"caption": "The F5 Distributed Cloud Services WAF events dashboard.",
"image_url": "images/F5DCS-WAF-Events-Overview.png"
"image_url": "images/waf_events_overview.png"
},
{
"media_type": "image",
"caption": "The F5 Distributed Cloud Services BOT Defense events overview dashboard.",
"image_url": "images/F5DCS-BOT-Defense-Events-Overview.png"
"image_url": "images/defense_events_overview.png"
}
],
"classifier_tags": [
6 changes: 4 additions & 2 deletions gigamon/README.md
@@ -15,8 +15,8 @@ GigaVUE V Series Node is a virtual machine running in the customer's infrastructure
a. Alias: Name of the exporter (String).
b. Ingestor: Specify the Port as "514" and Type as "ami".
c. Cloud Tool Exports: Create a new exporter tool by selecting '+' and add details as shown in the following diagrams:
![1](https://raw.githubusercontent.com/DataDog/integrations-extras/master/gigamon/images/images/gigamon1.png)
![2](https://raw.githubusercontent.com/DataDog/integrations-extras/master/gigamon/images/images/gigamon2.png)
![AMI exporter][6]
![Cloud Tools Exporter][7]


## Data Collected
@@ -36,4 +36,6 @@ Need help? Contact [Gigamon Support][5].
[3]: https://docs.gigamon.com/doclib66/Content/GigaVUE_Cloud_Suites.html?tocpath=GigaVUE%20Cloud%20Suite%7C_____0
[4]: https://docs.gigamon.com/doclib66/Content/GV-GigaSMART/Application%20Protocol%20Bundle.html
[5]: https://www.gigamon.com/support/support-and-services/contact-support.html
[6]: https://raw.githubusercontent.com/DataDog/integrations-extras/master/gigamon/images/gigamon1.png
[7]: https://raw.githubusercontent.com/DataDog/integrations-extras/master/gigamon/images/gigamon2.png

10 changes: 6 additions & 4 deletions hikaricp/README.md
@@ -14,11 +14,13 @@ To install the HikariCP check on your host:
1. Install the [developer toolkit][10] on any machine.

2. Run `ddev release build hikaricp` to build the package.
2. Clone the [integrations-extras][12] repo and navigate into the directory.

3. [Download the Datadog Agent][2].
3. Run `ddev release build hikaricp` to build the package.

4. Upload the build artifact to any host with an Agent and
4. [Download the Datadog Agent][2].

5. Upload the build artifact to any host with an Agent and run `datadog-agent integration install -w path/to/hikaricp/dist/<ARTIFACT_NAME>.whl`.

@@ -61,4 +63,4 @@ Need help? Contact [Datadog support][9].
[9]: https://docs.datadoghq.com/help/
[10]: https://docs.datadoghq.com/developers/integrations/python/
[11]: https://github.com/DataDog/integrations-extras/blob/master/hikaricp/assets/service_checks.json

[12]: https://github.com/DataDog/integrations-extras
