fix(cdp): update batch export instructions #10208

Open · wants to merge 1 commit into master
contents/docs/cdp/batch-exports/redshift.md (4 changes: 2 additions & 2 deletions)
@@ -13,8 +13,8 @@ Batch exports can be used to export data to Redshift, Amazon's data warehouse pr
## Creating the batch export

1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already.
-2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance.
-3. Click "Create export workflow".
+2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance.
+3. Click "New destination".
4. Select **Redshift** as the batch export type.
5. Fill in the necessary [configuration details](#redshift-configuration).
6. Finalize the creation by clicking on "Create".
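As an aside to the UI steps in this hunk, the same Redshift destination can usually be created programmatically. The Python sketch below is a hypothetical example only: the `POST /api/projects/:project_id/batch_exports/` path, the `interval` values, and the `destination.config` field names are assumptions to be checked against the PostHog API reference, and all credentials shown are placeholders.

```python
# Hypothetical sketch: creating a Redshift batch export via PostHog's REST API
# instead of the UI steps above. Endpoint path and payload fields are assumptions;
# verify them against the PostHog API reference before relying on this.
import requests

POSTHOG_HOST = "https://us.posthog.com"   # or your self-hosted instance
PROJECT_ID = "12345"                      # placeholder project ID
API_KEY = "phx_personal_api_key"          # personal API key with write access

payload = {
    "name": "Redshift export",
    "interval": "hour",                   # assumed allowed values, e.g. "hour" / "day"
    "destination": {
        "type": "Redshift",
        "config": {                       # illustrative config fields
            "user": "posthog",
            "password": "********",
            "host": "cluster.abc123.us-east-1.redshift.amazonaws.com",
            "port": 5439,
            "database": "analytics",
            "schema": "public",
            "table_name": "posthog_events",
        },
    },
}

resp = requests.post(
    f"{POSTHOG_HOST}/api/projects/{PROJECT_ID}/batch_exports/",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

The UI flow documented in the diff remains the canonical path; the sketch simply mirrors steps 4–6 in code.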
contents/docs/cdp/batch-exports/s3.md (4 changes: 2 additions & 2 deletions)
@@ -13,8 +13,8 @@ With batch exports, data can be exported to an S3 bucket.
## Creating the batch export

1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already.
-2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance.
-3. Click "Create export workflow".
+2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance.
+3. Click "New destination".
4. Select **S3** as the batch export type.
5. Fill in the necessary [configuration details](#s3-configuration).
6. Finalize the creation by clicking on "Create".
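After an S3 batch export run completes, it can be useful to confirm that files actually landed in the bucket. The snippet below is a minimal, hypothetical check using boto3; the bucket name and key prefix are placeholders for whatever was entered in the destination configuration.

```python
# Quick sanity check (not part of the docs being changed above): list objects
# written by a completed S3 batch export run. Bucket and prefix are placeholders.
import boto3

s3 = boto3.client("s3")  # credentials come from the usual AWS credential chain

response = s3.list_objects_v2(
    Bucket="my-posthog-exports",   # assumed bucket name
    Prefix="posthog-events/",      # assumed key prefix from the export config
)

for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```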
contents/docs/cdp/batch-exports/snowflake.md (4 changes: 2 additions & 2 deletions)
@@ -13,8 +13,8 @@ With batch exports, data can be exported to a Snowflake database table.
## Creating the batch export

1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already.
-2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance.
-3. Click "Create export workflow".
+2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance.
+3. Click "New destination".
4. Select **Snowflake** as the batch export destination.
5. Fill in the necessary [configuration details](#snowflake-configuration).
6. Finalize the creation by clicking on "Create".
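Similarly, once the Snowflake destination is running, a quick row count confirms data is arriving. This is a hedged sketch using the `snowflake-connector-python` package; the connection parameters and the `events` table name are placeholders for the values chosen during configuration.

```python
# Optional follow-up check: count rows in the Snowflake table the batch export
# writes to. All connection parameters and the table name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="POSTHOG_EXPORT",
    password="********",
    account="myorg-myaccount",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="POSTHOG",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM events")  # assumed target table name
    print("rows exported:", cur.fetchone()[0])
finally:
    conn.close()
```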