From c02a7fb2b4fb1defb0be131674cbcf77e6f4c4f2 Mon Sep 17 00:00:00 2001 From: Marcus Hof <13001502+MarconLP@users.noreply.github.com> Date: Sun, 22 Dec 2024 21:18:59 +0100 Subject: [PATCH 1/4] update batch export instructions --- contents/docs/cdp/batch-exports/redshift.md | 4 ++-- contents/docs/cdp/batch-exports/s3.md | 4 ++-- contents/docs/cdp/batch-exports/snowflake.md | 4 ++-- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/contents/docs/cdp/batch-exports/redshift.md b/contents/docs/cdp/batch-exports/redshift.md index a6f1a011bc80..1fdb68739899 100644 --- a/contents/docs/cdp/batch-exports/redshift.md +++ b/contents/docs/cdp/batch-exports/redshift.md @@ -13,8 +13,8 @@ Batch exports can be used to export data to Redshift, Amazon's data warehouse pr ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance. -3. Click "Create export workflow". +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. +3. Click "New destination". 4. Select **Redshift** as the batch export type. 5. Fill in the necessary [configuration details](#redshift-configuration). 6. Finalize the creation by clicking on "Create". diff --git a/contents/docs/cdp/batch-exports/s3.md b/contents/docs/cdp/batch-exports/s3.md index 81856fbf684d..08bf568ae677 100644 --- a/contents/docs/cdp/batch-exports/s3.md +++ b/contents/docs/cdp/batch-exports/s3.md @@ -13,8 +13,8 @@ With batch exports, data can be exported to an S3 bucket. ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance. -3. Click "Create export workflow". +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. +3. Click "New destination". 4. Select **S3** as the batch export type. 5. Fill in the necessary [configuration details](#s3-configuration). 6. Finalize the creation by clicking on "Create". diff --git a/contents/docs/cdp/batch-exports/snowflake.md b/contents/docs/cdp/batch-exports/snowflake.md index 9766e4b19946..07b9072eaf9b 100644 --- a/contents/docs/cdp/batch-exports/snowflake.md +++ b/contents/docs/cdp/batch-exports/snowflake.md @@ -13,8 +13,8 @@ With batch exports, data can be exported to a Snowflake database table. ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the exports tab in your PostHog instance. -3. Click "Create export workflow". +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. +3. Click "New destination". 4. Select **Snowflake** as the batch export destination. 5. Fill in the necessary [configuration details](#snowflake-configuration). 6. Finalize the creation by clicking on "Create".
From 1a9e590717965c9048963c89d4fafb31705e36a6 Mon Sep 17 00:00:00 2001 From: Marcus Hof <13001502+MarconLP@users.noreply.github.com> Date: Mon, 23 Dec 2024 17:13:36 +0100 Subject: [PATCH 2/4] Update contents/docs/cdp/batch-exports/redshift.md Co-authored-by: Ian Vanagas <34755028+ivanagas@users.noreply.github.com> --- contents/docs/cdp/batch-exports/redshift.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/contents/docs/cdp/batch-exports/redshift.md b/contents/docs/cdp/batch-exports/redshift.md index 1fdb68739899..d22c3f8c2992 100644 --- a/contents/docs/cdp/batch-exports/redshift.md +++ b/contents/docs/cdp/batch-exports/redshift.md @@ -13,9 +13,9 @@ Batch exports can be used to export data to Redshift, Amazon's data warehouse pr ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. -3. Click "New destination". -4. Select **Redshift** as the batch export type. +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the **Destinations** tab in your PostHog instance. +3. Search for **Redshift**. +4. Click the **+ Create** button. 5. Fill in the necessary [configuration details](#redshift-configuration). 6. Finalize the creation by clicking on "Create". 7. Done! The batch export will schedule its first run on the start of the next period. 
From d5fbbeb736c50d8cea3a7b903e65e68721a2a70a Mon Sep 17 00:00:00 2001 From: Marcus Hof <13001502+MarconLP@users.noreply.github.com> Date: Mon, 23 Dec 2024 17:13:47 +0100 Subject: [PATCH 3/4] Update contents/docs/cdp/batch-exports/s3.md Co-authored-by: Ian Vanagas <34755028+ivanagas@users.noreply.github.com> --- contents/docs/cdp/batch-exports/s3.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/contents/docs/cdp/batch-exports/s3.md b/contents/docs/cdp/batch-exports/s3.md index 08bf568ae677..afd87ce69cbe 100644 --- a/contents/docs/cdp/batch-exports/s3.md +++ b/contents/docs/cdp/batch-exports/s3.md @@ -13,9 +13,9 @@ With batch exports, data can be exported to an S3 bucket. ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. -3. Click "New destination". -4. Select **S3** as the batch export type. +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the **Destinations** tab in your PostHog instance. +3. Search for **S3**. +4. Click the **+ Create** button. 5. Fill in the necessary [configuration details](#s3-configuration). 6. Finalize the creation by clicking on "Create". 7. Done! The batch export will schedule its first run on the start of the next period. 
From 0d2d2648a51a890a6081c299515e910c059e4421 Mon Sep 17 00:00:00 2001 From: Marcus Hof <13001502+MarconLP@users.noreply.github.com> Date: Mon, 23 Dec 2024 17:13:58 +0100 Subject: [PATCH 4/4] Update contents/docs/cdp/batch-exports/snowflake.md Co-authored-by: Ian Vanagas <34755028+ivanagas@users.noreply.github.com> --- contents/docs/cdp/batch-exports/snowflake.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/contents/docs/cdp/batch-exports/snowflake.md b/contents/docs/cdp/batch-exports/snowflake.md index 07b9072eaf9b..66babd680790 100644 --- a/contents/docs/cdp/batch-exports/snowflake.md +++ b/contents/docs/cdp/batch-exports/snowflake.md @@ -13,9 +13,9 @@ With batch exports, data can be exported to a Snowflake database table. ## Creating the batch export 1. Subscribe to data pipelines add-on in [your billing settings](https://us.posthog.com/organization/billing) if you haven't already. -2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the destinations tab in your PostHog instance. -3. Click "New destination". -4. Select **Snowflake** as the batch export destination. +2. Click [Data pipelines](https://app.posthog.com/pipeline) in the navigation and go to the **Destinations** tab in your PostHog instance. +3. Search for **Snowflake**. +4. Click the **+ Create** button. 5. Fill in the necessary [configuration details](#snowflake-configuration). 6. Finalize the creation by clicking on "Create". 7. Done! The batch export will schedule its first run on the start of the next period.
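The UI flow these patches document (pick a destination type, fill in its configuration, create the export) can also be expressed as a request to PostHog's batch exports REST API. The sketch below is illustrative only: the endpoint path, field names (`destination`, `type`, `config`, `interval`), and all config values are assumptions based on the docs being edited, not part of this patch series.

```python
import json

# Hypothetical sketch of creating an S3 batch export programmatically,
# mirroring the UI steps in the patches above. Host, project id, and
# every config value below are placeholders, not real settings.
POSTHOG_HOST = "https://us.posthog.com"  # assumption: US Cloud instance
PROJECT_ID = 123                         # placeholder project id

payload = {
    "name": "S3 hourly export",
    "destination": {
        "type": "S3",                    # the destination searched for in step 3
        "config": {
            "bucket_name": "my-bucket",  # placeholder bucket
            "region": "us-east-1",
            "prefix": "posthog/",
        },
    },
    # Per step 7, the first run is scheduled at the start of the next period.
    "interval": "hour",
}

url = f"{POSTHOG_HOST}/api/projects/{PROJECT_ID}/batch_exports/"
body = json.dumps(payload)
# A real call would POST `body` to `url` with a personal API key, e.g.
#   Authorization: Bearer <POSTHOG_PERSONAL_API_KEY>
print(url)
```

This only builds and prints the request; actually creating the export still requires the data pipelines add-on from step 1 and a valid API key.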