` should point to where your Airbyte instance will be available, including the http/https protocol.
-#### Deploying Airbyte Enterprise with Okta
+## Deploying Airbyte Enterprise with Okta
-Your Okta app is now set up and you're ready to deploy Airbyte with SSO! Take note of the following configuration values, as you will need them to configure Airbyte to use your new Okta SSO app integration:
+Once your Okta app is set up, you're ready to deploy Airbyte with SSO. Take note of the following configuration values, as you will need them to configure Airbyte to use your new Okta SSO app integration:
- Okta domain ([how to find your Okta domain](https://developer.okta.com/docs/guides/find-your-domain/main/))
- App integration name
diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
index 794abaa46ab7..49a663b451c9 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
@@ -21,7 +21,7 @@ To set up email notifications:
2. Click **Notifications**.
-3. Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent to the owner of this workspace. The owner of your workspace is noted at the top of the page.
+3. Toggle which messages you'd like to receive from Airbyte. All email notifications are sent to the creator of the workspace by default. To change the recipient, edit and save the **notification email recipient**. To send email notifications to more than one recipient, enter an email distribution list (e.g., a Google Group) as the recipient.
4. Click **Save changes**.
diff --git a/docs/cloud/managing-airbyte-cloud/manage-credits.md b/docs/cloud/managing-airbyte-cloud/manage-credits.md
index fd07cbf0a421..040a083e58d5 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-credits.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-credits.md
@@ -17,12 +17,12 @@ To buy credits:
:::note
Purchase limits:
- * Minimum: 100 credits
+ * Minimum: 20 credits
* Maximum: 2,500 credits
:::
- To buy more credits or a subscription plan, reach out to [Sales](https://airbyte.com/talk-to-sales).
+ To buy more credits or a custom plan, reach out to [Sales](https://airbyte.com/talk-to-sales).
5. Fill out the payment information.
@@ -40,10 +40,31 @@ To buy credits:
:::
+## Automatic reload of credits (Beta)
+
+You can enroll in automatic top-ups of your credit balance. This beta feature is for those who prefer not to add credits manually each time.
+
+To enroll, [email us](mailto:natalie@airbyte.io) with:
+
+1. A link to the workspace you'd like to enable this feature for.
+2. **Recharge threshold**: The credit balance below which an automatic top-up occurs.
+3. **Recharge balance**: The credit balance to refill to.
+
+As an example, if the recharge threshold is 10 credits and the recharge balance is 30 credits, then anytime your workspace's credit balance dips below 10 credits, Airbyte automatically brings the balance back to 30 credits by charging the difference between your current balance and 30 credits.
+
+Continuing the example above:
+1. The credit balance dips to 3 credits.
+2. 27 credits are automatically charged to the card on file and added to the balance.
+3. The ending credit balance is 30 credits.
+
+Note that the difference between the recharge balance and the recharge threshold must be at least 20 credits, as our minimum purchase is 20 credits.
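The top-up rule above can be sketched as follows. This is a hypothetical illustration of the arithmetic only (the function name and signature are not an Airbyte API):

```python
def top_up_amount(balance: float, threshold: float, recharge_balance: float) -> float:
    """Return how many credits to charge, or 0 if no top-up is due."""
    if balance >= threshold:
        # Balance is at or above the recharge threshold: no charge.
        return 0.0
    # Charge the difference needed to restore the recharge balance.
    return recharge_balance - balance

# Example from the text: threshold 10, recharge balance 30, balance dips to 3.
# 30 - 3 = 27 credits are charged, restoring the balance to 30.
```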
+
+If you are enrolled and want to change your limits or cancel your enrollment, [email us](mailto:natalie@airbyte.io).
+
## View invoice history
1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Billing** in the navigation bar.
2. Click **Invoice History**. You will be redirected to a Stripe portal.
-3. Enter the email address used to make the purchase to see your invoice history. [Email us](mailto:ar@airbyte.io) for an invoice.
\ No newline at end of file
+3. Enter the email address used to make the purchase to see your invoice history. [Email us](mailto:ar@airbyte.io) for an invoice.
diff --git a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
index d0ab753e6616..bbc2211fd2e6 100644
--- a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
+++ b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
@@ -9,7 +9,7 @@ Understanding the following limitations will help you more effectively manage Ai
* Max number of days with consecutive sync failures before a connection is paused: 14 days
* Max number of streams that can be returned by a source in a discover call: 1K
* Max number of streams that can be configured to sync in a single connection: 1K
-* Size of a single record: 100MB
+* Size of a single record: 20MB
* Shortest sync schedule: Every 60 min
* Schedule accuracy: +/- 30 min
diff --git a/docs/connector-development/README.md b/docs/connector-development/README.md
index 9a011cb3f4e3..b19dac14552c 100644
--- a/docs/connector-development/README.md
+++ b/docs/connector-development/README.md
@@ -53,7 +53,7 @@ If you are building a connector in any of the following languages/frameworks, th
#### Sources
* **Python Source Connector**
-* [**Singer**](https://singer.io)**-based Python Source Connector**. [Singer.io](https://singer.io/) is an open source framework with a large community and many available connectors \(known as taps & targets\). To build an Airbyte connector from a Singer tap, wrap the tap in a thin Python package to make it Airbyte Protocol-compatible. See the [Github Connector](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-github-singer) for an example of an Airbyte Connector implemented on top of a Singer tap.
+* [**Singer**](https://singer.io)**-based Python Source Connector**. [Singer.io](https://singer.io/) is an open source framework with a large community and many available connectors \(known as taps & targets\). To build an Airbyte connector from a Singer tap, wrap the tap in a thin Python package to make it Airbyte Protocol-compatible. See the [Github Connector](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-github) for an example of an Airbyte Connector implemented on top of a Singer tap.
* **Generic Connector**: This template provides a basic starting point for any language.
#### Destinations
diff --git a/docs/connector-development/connector-builder-ui/incremental-sync.md b/docs/connector-development/connector-builder-ui/incremental-sync.md
index c6f780fe12cc..5801267fea9d 100644
--- a/docs/connector-development/connector-builder-ui/incremental-sync.md
+++ b/docs/connector-development/connector-builder-ui/incremental-sync.md
@@ -11,7 +11,6 @@ To use incremental syncs, the API endpoint needs to fulfill the following requir
- Records contain a top-level date/time field that defines when this record was last updated (the "cursor field")
- If the record's cursor field is nested, you can use an "Add Field" transformation to copy it to the top-level, and a Remove Field to remove it from the object. This will effectively move the field to the top-level of the record
- It's possible to filter/request records by the cursor field
-- The records are sorted in ascending order based on their cursor field
The knowledge of a cursor value also allows the Airbyte system to automatically keep a history of changes to records in the destination. To learn more about how different modes of incremental syncs, check out the [Incremental Sync - Append](/understanding-airbyte/connections/incremental-append/) and [Incremental Sync - Append + Deduped](/understanding-airbyte/connections/incremental-append-deduped) pages.
@@ -59,19 +58,13 @@ As this fulfills the requirements for incremental syncs, we can configure the "I
-This API orders records by default from new to old, which is not optimal for a reliable sync as the last encountered cursor value will be the most recent date even if some older records did not get synced (for example if a sync fails halfway through). It's better to start with the oldest records and work your way up to make sure that all older records are synced already once a certain date is encountered on a record. In this case the API can be configured to behave like this by setting an additional parameter:
-
-- Add a new "Query Parameter" near the top of the page
-- Set the key to `order-by`
-- Set the value to `oldest`
-
Setting the start date in the "Testing values" to a date in the past like **2023-04-09T00:00:00Z** results in the following request:
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-04-09T00:00:00Z&to-date={`now`}'
+curl 'https://content.guardianapis.com/search?from-date=2023-04-09T00:00:00Z&to-date={`now`}'
-The last encountered date will be saved as part of the connection - when the next sync is running, it picks up from the last record. Let's assume the last ecountered article looked like this:
+The most recent encountered date will be saved as part of the connection - when the next sync runs, it picks up from that date as the new start date. Let's assume the last encountered article looked like this:
{`{
@@ -86,7 +79,7 @@ The last encountered date will be saved as part of the connection - when the nex
Then when a sync is triggered for the same connection the next day, the following request is made:
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-04-15T07:30:58Z&to-date={``}'
+curl 'https://content.guardianapis.com/search?from-date=2023-04-15T07:30:58Z&to-date={``}'
The `from-date` is set to the cutoff date of articles synced already and the `to-date` is set to the current date.
@@ -118,9 +111,9 @@ The "cursor granularity" also needs to be set to an ISO 8601 duration - it repre
For example if the "Step" is set to 10 days (`P10D`) and the "Cursor granularity" set to second (`PT1S`) for the Guardian articles stream described above and a longer time range, then the following requests will be performed:
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-01-01T00:00:00Z&to-date=2023-01-10T00:00:00Z'{`\n`}
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-01-10T00:00:00Z&to-date=2023-01-20T00:00:00Z'{`\n`}
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-01-20T00:00:00Z&to-date=2023-01-30T00:00:00Z'{`\n`}
+curl 'https://content.guardianapis.com/search?from-date=2023-01-01T00:00:00Z&to-date=2023-01-10T00:00:00Z'{`\n`}
+curl 'https://content.guardianapis.com/search?from-date=2023-01-10T00:00:00Z&to-date=2023-01-20T00:00:00Z'{`\n`}
+curl 'https://content.guardianapis.com/search?from-date=2023-01-20T00:00:00Z&to-date=2023-01-30T00:00:00Z'{`\n`}
...
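The windowing behavior above can be sketched in Python. This is an illustrative approximation, not connector code, and the exact window endpoints in a real sync also depend on the configured cursor granularity:

```python
from datetime import datetime, timedelta

def slice_windows(start: datetime, end: datetime, step_days: int):
    """Split [start, end] into consecutive time windows of step_days each."""
    windows = []
    lower = start
    while lower < end:
        # Each window spans at most step_days; the final window is clamped to end.
        upper = min(lower + timedelta(days=step_days), end)
        windows.append((lower, upper))
        lower = upper
    return windows

# With a 10-day step, a month-long range yields three windows, one request each.
windows = slice_windows(datetime(2023, 1, 1), datetime(2023, 1, 31), 10)
```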
@@ -157,7 +150,7 @@ Reiterating the example from above with a "Lookback window" of 2 days configured
Then when a sync is triggered for the same connection the next day, the following request is made:
-curl 'https://content.guardianapis.com/search?order-by=oldest&from-date=2023-04-13T07:30:58Z&to-date={``}'
+curl 'https://content.guardianapis.com/search?from-date=2023-04-13T07:30:58Z&to-date={``}'
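The lookback computation can be sketched as below, assuming (hypothetically) that the saved cursor is a datetime and the lookback window a whole number of days:

```python
from datetime import datetime, timedelta

def next_start_date(saved_cursor: datetime, lookback_days: int) -> datetime:
    """Start the next sync lookback_days before the saved cursor so that
    recently changed records are re-fetched."""
    return saved_cursor - timedelta(days=lookback_days)

# A saved cursor of 2023-04-15T07:30:58Z with a 2-day lookback window
# produces a from-date of 2023-04-13T07:30:58Z, as in the request above.
start = next_start_date(datetime(2023, 4, 15, 7, 30, 58), 2)
```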
## Custom parameter injection
diff --git a/docs/deploying-airbyte/on-kubernetes-via-helm.md b/docs/deploying-airbyte/on-kubernetes-via-helm.md
index 9f8f04dab34c..a79f74e46e9d 100644
--- a/docs/deploying-airbyte/on-kubernetes-via-helm.md
+++ b/docs/deploying-airbyte/on-kubernetes-via-helm.md
@@ -118,7 +118,7 @@ helm install --values path/to/values.yaml %release_name% airbyte/airbyte
### (Early Access) Airbyte Enterprise deployment
-[Airbyte Enterprise](/airbyte-enterprise) is in an early access stage, so this section will likely evolve. That said, if you have an Airbyte Enterprise license key and wish to install Airbyte Enterprise via helm, follow these steps:
+[Airbyte Enterprise](/airbyte-enterprise) is in an early access stage for select priority users. Once you [are qualified for an Airbyte Enterprise license key](https://airbyte.com/company/talk-to-sales), you can install Airbyte Enterprise via helm by following these steps:
1. Checkout the latest revision of the [airbyte-platform repository](https://github.com/airbytehq/airbyte-platform)
diff --git a/docs/integrations/destinations/bigquery.md b/docs/integrations/destinations/bigquery.md
index 84b48838823a..2f0c4fd686f0 100644
--- a/docs/integrations/destinations/bigquery.md
+++ b/docs/integrations/destinations/bigquery.md
@@ -88,7 +88,7 @@ Airbyte converts any invalid characters into `_` characters when writing data. H
## Data type map
| Airbyte type | BigQuery type |
-| :---------------------------------- | :------------ |
+|:------------------------------------|:--------------|
| DATE | DATE |
| STRING (BASE64) | STRING |
| NUMBER | FLOAT |
@@ -126,6 +126,13 @@ Now that you have set up the BigQuery destination connector, check out the follo
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:-----------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 2.1.2 | 2023-10-10 | [\#31194](https://github.com/airbytehq/airbyte/pull/31194) | Deallocate unused per stream buffer memory when empty |
+| 2.1.1 | 2023-10-10 | [\#31083](https://github.com/airbytehq/airbyte/pull/31083) | Fix precision of numeric values in async destinations |
+| 2.1.0 | 2023-10-09 | [\#31149](https://github.com/airbytehq/airbyte/pull/31149) | No longer fail syncs when PKs are null - try to dedupe anyway |
+| 2.0.26 | 2023-10-09 | [\#31198](https://github.com/airbytehq/airbyte/pull/31198) | Clarify configuration groups |
+| 2.0.25 | 2023-10-09 | [\#31185](https://github.com/airbytehq/airbyte/pull/31185) | Increase staging file upload timeout to 5 minutes |
+| 2.0.24 | 2023-10-06 | [\#31139](https://github.com/airbytehq/airbyte/pull/31139) | Bump CDK version |
+| 2.0.23 | 2023-10-06 | [\#31129](https://github.com/airbytehq/airbyte/pull/31129) | Reduce async buffer size |
| 2.0.22 | 2023-10-04 | [\#31082](https://github.com/airbytehq/airbyte/pull/31082) | Revert null PK checks |
| 2.0.21 | 2023-10-03 | [\#31028](https://github.com/airbytehq/airbyte/pull/31028) | Update timeout |
| 2.0.20 | 2023-09-26 | [\#30779](https://github.com/airbytehq/airbyte/pull/30779) | Final table PK columns become non-null and skip check for null PKs in raw records (performance) |
diff --git a/docs/integrations/destinations/chroma.md b/docs/integrations/destinations/chroma.md
index 6efd9f3d478b..7ebc7b78ca1d 100644
--- a/docs/integrations/destinations/chroma.md
+++ b/docs/integrations/destinations/chroma.md
@@ -75,5 +75,6 @@ You should now have all the requirements needed to configure Chroma as a destina
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :--------------------------------------------------------- | :----------------------------------------- |
+| 0.0.3 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size |
| 0.0.2 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK |
| 0.0.1 | 2023-09-08 | [#30023](https://github.com/airbytehq/airbyte/pull/30023) | 🎉 New Destination: Chroma (Vector Database) |
diff --git a/docs/integrations/destinations/milvus.md b/docs/integrations/destinations/milvus.md
index 8ff7e93304fb..9a9a7808601a 100644
--- a/docs/integrations/destinations/milvus.md
+++ b/docs/integrations/destinations/milvus.md
@@ -105,6 +105,7 @@ vector_store.similarity_search("test")
| Version | Date | Pull Request | Subject |
|:--------| :--------- |:--------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
+| 0.0.4 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size |
| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK |
| 0.0.2 | 2023-08-25 | [#30689](https://github.com/airbytehq/airbyte/pull/30689) | Update CDK to support azure OpenAI embeddings and text splitting options, make sure primary key field is not accidentally set, promote to certified |
| 0.0.1 | 2023-08-12 | [#29442](https://github.com/airbytehq/airbyte/pull/29442) | Milvus connector with some embedders |
diff --git a/docs/integrations/destinations/pinecone.md b/docs/integrations/destinations/pinecone.md
index 6ac74fd933da..3d4153f66ecf 100644
--- a/docs/integrations/destinations/pinecone.md
+++ b/docs/integrations/destinations/pinecone.md
@@ -74,6 +74,7 @@ OpenAI and Fake embeddings produce vectors with 1536 dimensions, and the Cohere
| Version | Date | Pull Request | Subject |
|:--------| :--------- |:--------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
+| 0.0.15 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size |
| 0.0.14 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK |
| 0.0.13 | 2023-09-26 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Allow more text splitting options |
| 0.0.12 | 2023-09-25 | [#30649](https://github.com/airbytehq/airbyte/pull/30649) | Fix bug with stale documents left on starter pods |
diff --git a/docs/integrations/destinations/qdrant.md b/docs/integrations/destinations/qdrant.md
index 2a49d11b2fa3..eb55b7a1669b 100644
--- a/docs/integrations/destinations/qdrant.md
+++ b/docs/integrations/destinations/qdrant.md
@@ -70,6 +70,7 @@ You should now have all the requirements needed to configure Qdrant as a destina
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :--------------------------------------------------------- | :----------------------------------------- |
+| 0.0.4 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size |
| 0.0.3 | 2023-09-29 | [#30820](https://github.com/airbytehq/airbyte/pull/30820) | Update CDK |
| 0.0.2 | 2023-09-25 | [#30689](https://github.com/airbytehq/airbyte/pull/30689) | Update CDK to support Azure OpenAI embeddings and text splitting options |
| 0.0.1 | 2023-09-22 | [#30332](https://github.com/airbytehq/airbyte/pull/30332) | 🎉 New Destination: Qdrant (Vector Database) |
diff --git a/docs/integrations/destinations/redshift.md b/docs/integrations/destinations/redshift.md
index 4dfedfd02f34..bd92c29289e5 100644
--- a/docs/integrations/destinations/redshift.md
+++ b/docs/integrations/destinations/redshift.md
@@ -141,7 +141,7 @@ Each stream will be output into its own raw table in Redshift. Each table will c
## Data type mapping
| Redshift Type | Airbyte Type | Notes |
-| :-------------------- | :------------------------ | :---- |
+|:----------------------|:--------------------------|:------|
| `boolean` | `boolean` | |
| `int` | `integer` | |
| `float` | `number` | |
@@ -156,6 +156,10 @@ Each stream will be output into its own raw table in Redshift. Each table will c
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:-----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 0.6.9 | 2023-10-10 | [\#31083](https://github.com/airbytehq/airbyte/pull/31083) | Fix precision of numeric values in async destinations |
+| 0.6.8 | 2023-10-10 | [\#31218](https://github.com/airbytehq/airbyte/pull/31218) | Clarify configuration groups |
+| 0.6.7 | 2023-10-06 | [\#31153](https://github.com/airbytehq/airbyte/pull/31153) | Increase jvm GC retries |
+| 0.6.6 | 2023-10-06 | [\#31129](https://github.com/airbytehq/airbyte/pull/31129) | Reduce async buffer size |
| 0.6.5 | 2023-08-18 | [\#28619](https://github.com/airbytehq/airbyte/pull/29640) | Fix duplicate staging object names in concurrent environment (e.g. async) |
| 0.6.4 | 2023-08-10 | [\#28619](https://github.com/airbytehq/airbyte/pull/28619) | Use async method for staging |
| 0.6.3 | 2023-08-07 | [\#29188](https://github.com/airbytehq/airbyte/pull/29188) | Internal code refactoring |
@@ -197,7 +201,7 @@ Each stream will be output into its own raw table in Redshift. Each table will c
| 0.3.33 | 2022-05-04 | [\#12601](https://github.com/airbytehq/airbyte/pull/12601) | Apply buffering strategy for S3 staging |
| 0.3.32 | 2022-04-20 | [\#12085](https://github.com/airbytehq/airbyte/pull/12085) | Fixed bug with switching between INSERT and COPY config |
| 0.3.31 | 2022-04-19 | [\#12064](https://github.com/airbytehq/airbyte/pull/12064) | Added option to support SUPER datatype in \_airbyte_raw\*\*\* table |
-| 0.3.29 | 2022-04-05 | [\#11729](https://github.com/airbytehq/airbyte/pull/11729) | Fixed bug with dashes in schema name | |
+| 0.3.29 | 2022-04-05 | [\#11729](https://github.com/airbytehq/airbyte/pull/11729) | Fixed bug with dashes in schema name |
| 0.3.28 | 2022-03-18 | [\#11254](https://github.com/airbytehq/airbyte/pull/11254) | Fixed missing records during S3 staging |
| 0.3.27 | 2022-02-25 | [\#10421](https://github.com/airbytehq/airbyte/pull/10421) | Refactor JDBC parameters handling |
| 0.3.25 | 2022-02-14 | [\#9920](https://github.com/airbytehq/airbyte/pull/9920) | Updated the size of staging files for S3 staging. Also, added closure of S3 writers to staging files when data has been written to an staging file. |
diff --git a/docs/integrations/destinations/snowflake.md b/docs/integrations/destinations/snowflake.md
index e32e5b1652c2..1c8f51791b1e 100644
--- a/docs/integrations/destinations/snowflake.md
+++ b/docs/integrations/destinations/snowflake.md
@@ -157,7 +157,7 @@ Navigate to the Airbyte UI to set up Snowflake as a destination. You can authent
### Login and Password
| Field | Description |
-| ----------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+|-------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Host](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html) | The host domain of the snowflake instance (must include the account, region, cloud environment, and end with snowflakecomputing.com). Example: `accountname.us-east-2.aws.snowflakecomputing.com` |
| [Role](https://docs.snowflake.com/en/user-guide/security-access-control-overview.html#roles) | The role you created in Step 1 for Airbyte to access Snowflake. Example: `AIRBYTE_ROLE` |
| [Warehouse](https://docs.snowflake.com/en/user-guide/warehouses-overview.html#overview-of-warehouses) | The warehouse you created in Step 1 for Airbyte to sync data into. Example: `AIRBYTE_WAREHOUSE` |
@@ -170,7 +170,7 @@ Navigate to the Airbyte UI to set up Snowflake as a destination. You can authent
### OAuth 2.0
| Field | Description |
-| :---------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
+|:------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Host](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html) | The host domain of the snowflake instance (must include the account, region, cloud environment, and end with snowflakecomputing.com). Example: `accountname.us-east-2.aws.snowflakecomputing.com` |
| [Role](https://docs.snowflake.com/en/user-guide/security-access-control-overview.html#roles) | The role you created in Step 1 for Airbyte to access Snowflake. Example: `AIRBYTE_ROLE` |
| [Warehouse](https://docs.snowflake.com/en/user-guide/warehouses-overview.html#overview-of-warehouses) | The warehouse you created in Step 1 for Airbyte to sync data into. Example: `AIRBYTE_WAREHOUSE` |
@@ -207,7 +207,7 @@ Navigate to the Airbyte UI to set up Snowflake as a destination. You can authent
To use AWS S3 as the cloud storage, enter the information for the S3 bucket you created in Step 2:
| Field | Description |
-| ------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+|--------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| S3 Bucket Name | The name of the staging S3 bucket (Example: `airbyte.staging`). Airbyte will write files to this bucket and read them via statements on Snowflake. |
| S3 Bucket Region | The S3 staging bucket region used. |
| S3 Key Id \* | The Access Key ID granting access to the S3 staging bucket. Airbyte requires Read and Write permissions for the bucket. |
@@ -220,7 +220,7 @@ To use AWS S3 as the cloud storage, enter the information for the S3 bucket you
To use a Google Cloud Storage bucket, enter the information for the bucket you created in Step 2:
| Field | Description |
-| ------------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+|--------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| GCP Project ID | The name of the GCP project ID for your credentials. (Example: `my-project`) |
| GCP Bucket Name | The name of the staging bucket. Airbyte will write files to this bucket and read them via statements on Snowflake. (Example: `airbyte-staging`) |
| Google Application Credentials | The contents of the JSON key file that has read/write permissions to the staging GCS bucket. You will separately need to grant bucket access to your Snowflake GCP service account. See the [Google Cloud docs](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) for more information on how to generate a JSON key for your service account. |
@@ -230,7 +230,7 @@ To use a Google Cloud Storage bucket, enter the information for the bucket you c
Airbyte outputs each stream into its own table with the following columns in Snowflake:
| Airbyte field | Description | Column type |
-| -------------------- | -------------------------------------------------------------- | ------------------------ |
+|----------------------|----------------------------------------------------------------|--------------------------|
| \_airbyte_ab_id | A UUID assigned to each processed event | VARCHAR |
| \_airbyte_emitted_at | A timestamp for when the event was pulled from the data source | TIMESTAMP WITH TIME ZONE |
| \_airbyte_data | A JSON blob with the event data. | VARIANT |
@@ -271,6 +271,12 @@ Otherwise, make sure to grant the role the required permissions in the desired n
| Version | Date | Pull Request | Subject |
|:----------------|:-----------|:-----------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 3.2.2 | 2023-10-10 | [\#31194](https://github.com/airbytehq/airbyte/pull/31194) | Deallocate unused per stream buffer memory when empty |
+| 3.2.1 | 2023-10-10 | [\#31083](https://github.com/airbytehq/airbyte/pull/31083) | Fix precision of numeric values in async destinations |
+| 3.2.0 | 2023-10-09 | [\#31149](https://github.com/airbytehq/airbyte/pull/31149) | No longer fail syncs when PKs are null - try to dedupe anyway |
+| 3.1.22 | 2023-10-06 | [\#31153](https://github.com/airbytehq/airbyte/pull/31153) | Increase jvm GC retries |
+| 3.1.21 | 2023-10-06 | [\#31139](https://github.com/airbytehq/airbyte/pull/31139) | Bump CDK version |
+| 3.1.20 | 2023-10-06 | [\#31129](https://github.com/airbytehq/airbyte/pull/31129) | Reduce async buffer size |
| 3.1.19 | 2023-10-04 | [\#31082](https://github.com/airbytehq/airbyte/pull/31082) | Revert null PK checks |
| 3.1.18 | 2023-10-01 | [\#30779](https://github.com/airbytehq/airbyte/pull/30779) | Final table PK columns become non-null and skip check for null PKs in raw records (performance) |
| 3.1.17 | 2023-09-29 | [\#30938](https://github.com/airbytehq/airbyte/pull/30938) | Upgrade snowflake-jdbc driver |
diff --git a/docs/integrations/destinations/weaviate.md b/docs/integrations/destinations/weaviate.md
index 92e5dcf3a2c8..bc23c60823eb 100644
--- a/docs/integrations/destinations/weaviate.md
+++ b/docs/integrations/destinations/weaviate.md
@@ -77,12 +77,13 @@ If a class doesn't exist in the schema of the cluster, it will be created using
You can also create the class in Weaviate in advance if you need more control over the schema in Weaviate. In this case, the text properties `_ab_stream` and `_ab_record_id` need to be created for bookkeeping reasons. In case a sync is run in `Overwrite` mode, the class will be deleted and recreated.
-As properties have to start will a lowercase letter in Weaviate, field names might be updated during the loading process.
+As properties have to start with a lowercase letter in Weaviate, field names might be updated during the loading process. The field names `id`, `_id` and `_additional` are reserved keywords in Weaviate, so they will be renamed to `raw_id`, `raw__id` and `raw_additional` respectively.
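The renaming rule above amounts to the following sketch (a hypothetical illustration of the described behavior, not the connector's actual implementation):

```python
# Weaviate-reserved field names and their renamed counterparts, per the text.
RESERVED = {"id": "raw_id", "_id": "raw__id", "_additional": "raw_additional"}

def safe_property_name(field: str) -> str:
    """Map a record field name to a Weaviate-safe property name."""
    if field in RESERVED:
        return RESERVED[field]
    # Weaviate properties must start with a lowercase letter.
    return field[0].lower() + field[1:] if field else field
```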
## Changelog
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :--------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------- |
+| 0.2.1 | 2023-10-04 | [#31075](https://github.com/airbytehq/airbyte/pull/31075) | Fix OpenAI embedder batch size and conflict field name handling |
| 0.2.0 | 2023-09-22 | [#30151](https://github.com/airbytehq/airbyte/pull/30151) | Add embedding capabilities, overwrite and dedup support and API key auth mode, make certified. π¨ Breaking changes - check migrations guide. |
| 0.1.1 | 2022-02-08 | [\#22527](https://github.com/airbytehq/airbyte/pull/22527) | Multiple bug fixes: Support String based IDs, arrays of uknown type and additionalProperties of type object and array of objects |
| 0.1.0 | 2022-12-06 | [\#20094](https://github.com/airbytehq/airbyte/pull/20094) | Add Weaviate destination |
diff --git a/docs/integrations/getting-started/source-facebook-marketing.md b/docs/integrations/getting-started/source-google-ads.md
similarity index 98%
rename from docs/integrations/getting-started/source-facebook-marketing.md
rename to docs/integrations/getting-started/source-google-ads.md
index cb0303519372..f1558cddf335 100644
--- a/docs/integrations/getting-started/source-facebook-marketing.md
+++ b/docs/integrations/getting-started/source-google-ads.md
@@ -1,4 +1,4 @@
-# Getting Started: Source Facebook Marketing
+# Getting Started: Source Google Ads
## Requirements
diff --git a/docs/integrations/sources/airtable-migrations.md b/docs/integrations/sources/airtable-migrations.md
new file mode 100644
index 000000000000..66a0d6526f01
--- /dev/null
+++ b/docs/integrations/sources/airtable-migrations.md
@@ -0,0 +1,4 @@
+# Airtable Migration Guide
+
+## Upgrading to 4.0.0
+Columns with formulas are narrowed from `array` to `string` or `number`. You may need to refresh the connection schema (with reset) and run a sync.
\ No newline at end of file
diff --git a/docs/integrations/sources/airtable.md b/docs/integrations/sources/airtable.md
index 59d427c2f6a3..d485d07710ec 100644
--- a/docs/integrations/sources/airtable.md
+++ b/docs/integrations/sources/airtable.md
@@ -2,10 +2,6 @@
This page contains the setup guide and reference information for the [Airtable](https://airtable.com/api) source connector.
-:::caution
-Currently, this source connector works with `Standard` subscription plan only. `Enterprise` level accounts are not supported yet.
-:::
-
## Prerequisites
* An active Airtable account
@@ -15,9 +11,23 @@ Currently, this source connector works with `Standard` subscription plan only. `
- `schema.bases:read`
## Setup guide
-
### Step 1: Set up Airtable
+
+#### For Airbyte Open Source:
+1. Go to https://airtable.com/create/tokens to create a new token.
+ ![Generate new Token](../../.gitbook/assets/source/airtable/generate_new_token.png)
+2. Add the following scopes and press the `Create Token` button:
+ - `data.records:read`
+ - `data.recordComments:read`
+ - `schema.bases:read`
+
+ ![Add Scopes](../../.gitbook/assets/source/airtable/add_scopes.png)
+3. Save the token from the popup window.
+
+
+### Step 2: Set up Airtable connector in Airbyte
+
### For Airbyte Cloud:
@@ -51,51 +61,52 @@ Please keep in mind that if you start syncing a table via Airbyte, then rename i
The airtable source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes):
-| Feature | Supported?\(Yes/No\) | Notes |
-|:------------------|:---------------------|:------|
-| Full Refresh Sync | Yes | |
-| Incremental Sync | No | |
-
+- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/glossary#full-refresh-sync)
+- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append)
## Supported Tables
-This source allows you to pull all available tables and bases using `Metadata API` for a given authenticated user. In case you rename or add a column to any existing table, you will need to recreate the source to update the Airbyte catalog.
+This source allows you to pull all available tables and bases using `Metadata API` for a given authenticated user. In case you rename or add a column to any existing table, you will need to recreate the source to update the Airbyte catalog.
+
+### Performance Considerations
+
+See information about rate limits [here](https://airtable.com/developers/web/api/rate-limits).
## Data type map
-| Integration Type | Airbyte Type | Nullable |
-|:------------------------|:--------------------------------|----------|
-| `multipleAttachments` | `string` | Yes |
-| `autoNumber` | `string` | Yes |
-| `barcode` | `string` | Yes |
-| `button` | `string` | Yes |
-| `checkbox` | `boolean` | Yes |
-| `singleCollaborator` | `string` | Yes |
-| `count` | `number` | Yes |
-| `createdBy` | `string` | Yes |
-| `createdTime` | `datetime`, `format: date-time` | Yes |
-| `currency` | `number` | Yes |
-| `email` | `string` | Yes |
-| `date` | `string`, `format: date` | Yes |
-| `duration` | `number` | Yes |
-| `lastModifiedBy` | `string` | Yes |
-| `lastModifiedTime` | `datetime`, `format: date-time` | Yes |
-| `multipleRecordLinks` | `array with strings` | Yes |
-| `multilineText` | `string` | Yes |
-| `multipleCollaborators` | `array with strings` | Yes |
-| `multipleSelects` | `array with strings` | Yes |
-| `number` | `number` | Yes |
-| `percent` | `number` | Yes |
-| `phoneNumber` | `string` | Yes |
-| `rating` | `number` | Yes |
-| `richText` | `string` | Yes |
-| `singleLineText` | `string` | Yes |
-| `externalSyncSource` | `string` | Yes |
-| `url` | `string` | Yes |
-| `formula` | `array with any` | Yes |
-| `lookup` | `array with any` | Yes |
-| `multipleLookupValues` | `array with any` | Yes |
-| `rollup` | `array with any` | Yes |
+| Integration Type | Airbyte Type | Nullable |
+|:------------------------|:---------------------------------------|----------|
+| `multipleAttachments` | `string` | Yes |
+| `autoNumber` | `string` | Yes |
+| `barcode` | `string` | Yes |
+| `button` | `string` | Yes |
+| `checkbox` | `boolean` | Yes |
+| `singleCollaborator` | `string` | Yes |
+| `count` | `number` | Yes |
+| `createdBy` | `string` | Yes |
+| `createdTime` | `datetime`, `format: date-time` | Yes |
+| `currency` | `number` | Yes |
+| `email` | `string` | Yes |
+| `date` | `string`, `format: date` | Yes |
+| `duration` | `number` | Yes |
+| `lastModifiedBy` | `string` | Yes |
+| `lastModifiedTime` | `datetime`, `format: date-time` | Yes |
+| `multipleRecordLinks` | `array with strings` | Yes |
+| `multilineText` | `string` | Yes |
+| `multipleCollaborators` | `array with strings` | Yes |
+| `multipleSelects` | `array with strings` | Yes |
+| `number` | `number` | Yes |
+| `percent` | `number` | Yes |
+| `phoneNumber` | `string` | Yes |
+| `rating` | `number` | Yes |
+| `richText` | `string` | Yes |
+| `singleLineText` | `string` | Yes |
+| `externalSyncSource` | `string` | Yes |
+| `url` | `string` | Yes |
+| `formula` | `string`, `number` or `array with any` | Yes |
+| `lookup` | `array with any` | Yes |
+| `multipleLookupValues` | `array with any` | Yes |
+| `rollup` | `array with any` | Yes |
* All the fields are `nullable` by default, meaning that the field could be empty.
* The `array with any` - represents the classic array with one of the other Airtable data types inside, such as:
@@ -103,24 +114,24 @@ This source allows you to pull all available tables and bases using `Metadata AP
- number/integer
- nested lists/objects
-### Performance Considerations (Airbyte Open-Source)
-
-See information about rate limits [here](https://airtable.com/developers/web/api/rate-limits).
-
## Changelog
-| Version | Date | Pull Request | Subject |
-|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------|
-| 3.0.1 | 2023-05-10 | [25946](https://github.com/airbytehq/airbyte/pull/25946) | Skip stream if it does not appear in catalog |
-| 3.0.0 | 2023-03-20 | [22704](https://github.com/airbytehq/airbyte/pull/22704) | Fix for stream name uniqueness |
-| 2.0.4 | 2023-03-15 | [24093](https://github.com/airbytehq/airbyte/pull/24093) | Update spec and doc |
-| 2.0.3 | 2023-02-02 | [22311](https://github.com/airbytehq/airbyte/pull/22311) | Fix for `singleSelect` types when discovering the schema |
-| 2.0.2 | 2023-02-01 | [22245](https://github.com/airbytehq/airbyte/pull/22245) | Fix for empty `result` object when discovering the schema |
-| 2.0.1 | 2023-02-01 | [22224](https://github.com/airbytehq/airbyte/pull/22224) | Fixed broken `API Key` authentication |
-| 2.0.0 | 2023-01-27 | [21962](https://github.com/airbytehq/airbyte/pull/21962) | Added casting of native Airtable data types to JsonSchema types |
-| 1.0.2 | 2023-01-25 | [20934](https://github.com/airbytehq/airbyte/pull/20934) | Added `OAuth2.0` authentication support |
-| 1.0.1 | 2023-01-10 | [21215](https://github.com/airbytehq/airbyte/pull/21215) | Fix field names |
-| 1.0.0 | 2022-12-22 | [20846](https://github.com/airbytehq/airbyte/pull/20846) | Migrated to Metadata API for dynamic schema generation |
-| 0.1.3 | 2022-10-26 | [18491](https://github.com/airbytehq/airbyte/pull/18491) | Improve schema discovery logic |
-| 0.1.2 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy |
-| 0.1.1 | 2021-12-06 | [8425](https://github.com/airbytehq/airbyte/pull/8425) | Update title, description fields in spec |
+| Version | Date | Pull Request | Subject |
+|:--------|:-----------|:---------------------------------------------------------|:---------------------------------------------------------------------------------------|
+| 4.1.2 | 2023-10-10 | [31215](https://github.com/airbytehq/airbyte/pull/31215) | Exclude bases without permission |
+| 4.1.1 | 2023-10-10 | [31119](https://github.com/airbytehq/airbyte/pull/31119) | Add user-friendly error message when refresh token has expired |
+| 4.1.0 | 2023-10-10 | [31044](https://github.com/airbytehq/airbyte/pull/31044) | Add source table name to output records |
+| 4.0.0 | 2023-10-09 | [31181](https://github.com/airbytehq/airbyte/pull/31181) | Additional schema processing for the FORMULA schema type: Convert to simple data types |
+| 3.0.1 | 2023-05-10 | [25946](https://github.com/airbytehq/airbyte/pull/25946) | Skip stream if it does not appear in catalog |
+| 3.0.0 | 2023-03-20 | [22704](https://github.com/airbytehq/airbyte/pull/22704) | Fix for stream name uniqueness |
+| 2.0.4 | 2023-03-15 | [24093](https://github.com/airbytehq/airbyte/pull/24093) | Update spec and doc |
+| 2.0.3 | 2023-02-02 | [22311](https://github.com/airbytehq/airbyte/pull/22311) | Fix for `singleSelect` types when discovering the schema |
+| 2.0.2 | 2023-02-01 | [22245](https://github.com/airbytehq/airbyte/pull/22245) | Fix for empty `result` object when discovering the schema |
+| 2.0.1 | 2023-02-01 | [22224](https://github.com/airbytehq/airbyte/pull/22224) | Fixed broken `API Key` authentication |
+| 2.0.0 | 2023-01-27 | [21962](https://github.com/airbytehq/airbyte/pull/21962) | Added casting of native Airtable data types to JsonSchema types |
+| 1.0.2 | 2023-01-25 | [20934](https://github.com/airbytehq/airbyte/pull/20934) | Added `OAuth2.0` authentication support |
+| 1.0.1 | 2023-01-10 | [21215](https://github.com/airbytehq/airbyte/pull/21215) | Fix field names |
+| 1.0.0 | 2022-12-22 | [20846](https://github.com/airbytehq/airbyte/pull/20846) | Migrated to Metadata API for dynamic schema generation |
+| 0.1.3 | 2022-10-26 | [18491](https://github.com/airbytehq/airbyte/pull/18491) | Improve schema discovery logic |
+| 0.1.2 | 2022-04-30 | [12500](https://github.com/airbytehq/airbyte/pull/12500) | Improve input configuration copy |
+| 0.1.1 | 2021-12-06 | [8425](https://github.com/airbytehq/airbyte/pull/8425) | Update title, description fields in spec |
diff --git a/docs/integrations/sources/alloydb.md b/docs/integrations/sources/alloydb.md
deleted file mode 100644
index 9cb17080c38b..000000000000
--- a/docs/integrations/sources/alloydb.md
+++ /dev/null
@@ -1,364 +0,0 @@
-# AlloyDB for PostgreSQL
-
-This page contains the setup guide and reference information for the AlloyDB for PostgreSQL.
-
-## Prerequisites
-
-- For Airbyte Open Source users, [upgrade](https://docs.airbyte.com/operator-guides/upgrading-airbyte/) your Airbyte platform to version `v0.40.0-alpha` or newer
-- For Airbyte Cloud (and optionally for Airbyte Open Source), ensure SSL is enabled in your environment
-
-## Setup guide
-
-## When to use AlloyDB with CDC
-
-Configure AlloyDB with CDC if:
-
-- You need a record of deletions
-- Your table has a primary key but doesn't have a reasonable cursor field for incremental syncing (`updated_at`). CDC allows you to sync your table incrementally
-
-If your goal is to maintain a snapshot of your table in the destination but the limitations prevent you from using CDC, consider using [non-CDC incremental sync](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) and occasionally reset the data and re-sync.
-
-If your dataset is small and you just want a snapshot of your table in the destination, consider using [Full Refresh replication](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite) for your table instead of CDC.
-
-### Step 1: (Optional) Create a dedicated read-only user
-
-We recommend creating a dedicated read-only user for better permission control and auditing. Alternatively, you can use an existing AlloyDB user in your database.
-
-To create a dedicated user, run the following command:
-
-```
-CREATE USER PASSWORD 'your_password_here';
-```
-
-Grant access to the relevant schema:
-
-```
-GRANT USAGE ON SCHEMA TO
-```
-
-:::note
-To replicate data from multiple AlloyDB schemas, re-run the command to grant access to all the relevant schemas. Note that you'll need to set up multiple Airbyte sources connecting to the same AlloyDB database on multiple schemas.
-:::
-
-Grant the user read-only access to the relevant tables:
-
-```
-GRANT SELECT ON ALL TABLES IN SCHEMA TO ;
-```
-
-Allow user to see tables created in the future:
-
-```
-ALTER DEFAULT PRIVILEGES IN SCHEMA GRANT SELECT ON TABLES TO ;
-```
-
-Additionally, if you plan to configure CDC for the AlloyDB source connector, grant `REPLICATION` permissions to the user:
-
-```
-ALTER USER REPLICATION;
-```
-
-**Syncing a subset of columnsβ**
-
-Currently, there is no way to sync a subset of columns using the AlloyDB source connector:
-
-- When setting up a connection, you can only choose which tables to sync, but not columns.
-- If the user can only access a subset of columns, the connection check will pass. However, the data sync will fail with a permission denied exception.
-
-The workaround for partial table syncing is to create a view on the specific columns, and grant the user read access to that view:
-
-```
-CREATE VIEW as SELECT FROM ;
-```
-
-```
-GRANT SELECT ON TABLE IN SCHEMA to ;
-```
-
-**Note:** The workaround works only for non-CDC setups since CDC requires data to be in tables and not views.
-This issue is tracked in [#9771](https://github.com/airbytehq/airbyte/issues/9771).
-
-### Step 2: Set up the AlloyDB connector in Airbyte
-
-1. Log into your [Airbyte Cloud](https://cloud.airbyte.com/workspaces) or Airbyte Open Source account.
-2. Click **Sources** and then click **+ New source**.
-3. On the Set up the source page, select **AlloyDB** from the Source type dropdown.
-4. Enter a name for your source.
-5. For the **Host**, **Port**, and **DB Name**, enter the hostname, port number, and name for your AlloyDB database.
-6. List the **Schemas** you want to sync.
- :::note
- The schema names are case sensitive. The 'public' schema is set by default. Multiple schemas may be used at one time. No schemas set explicitly - will sync all of existing.
- :::
-7. For **User** and **Password**, enter the username and password you created in [Step 1](#step-1-optional-create-a-dedicated-read-only-user).
-8. To customize the JDBC connection beyond common options, specify additional supported [JDBC URL parameters](https://jdbc.postgresql.org/documentation/head/connect.html) as key-value pairs separated by the symbol & in the **JDBC URL Parameters (Advanced)** field.
-
- Example: key1=value1&key2=value2&key3=value3
-
- These parameters will be added at the end of the JDBC URL that the AirByte will use to connect to your AlloyDB database.
-
- The connector now supports `connectTimeout` and defaults to 60 seconds. Setting connectTimeout to 0 seconds will set the timeout to the longest time available.
-
- **Note:** Do not use the following keys in JDBC URL Params field as they will be overwritten by Airbyte:
- `currentSchema`, `user`, `password`, `ssl`, and `sslmode`.
-
- :::warning
- This is an advanced configuration option. Users are advised to use it with caution.
- :::
-
-9. For Airbyte Open Source, toggle the switch to connect using SSL. Airbyte Cloud uses SSL by default.
-10. For Replication Method, select Standard or [Logical CDC](https://www.postgresql.org/docs/10/logical-replication.html) from the dropdown. Refer to [Configuring AlloyDB connector with Change Data Capture (CDC)](#configuring-alloydb-connector-with-change-data-capture-cdc) for more information.
-11. For SSH Tunnel Method, select:
- - No Tunnel for a direct connection to the database
- - SSH Key Authentication to use an RSA Private as your secret for establishing the SSH tunnel
- - Password Authentication to use a password as your secret for establishing the SSH tunnel
- Refer to [Connect via SSH Tunnel](#connect-via-ssh-tunnelβ) for more information.
-12. Click **Set up source**.
-
-### Connect via SSH Tunnelβ
-
-You can connect to a AlloyDB instance via an SSH tunnel.
-
-When using an SSH tunnel, you are configuring Airbyte to connect to an intermediate server (also called a bastion server) that has direct access to the database. Airbyte connects to the bastion and then asks the bastion to connect directly to the server.
-
-To connect to a AlloyDB instance via an SSH tunnel:
-
-1. While [setting up](#setup-guide) the AlloyDB source connector, from the SSH tunnel dropdown, select:
- - SSH Key Authentication to use an RSA Private as your secret for establishing the SSH tunnel
- - Password Authentication to use a password as your secret for establishing the SSH Tunnel
-2. For **SSH Tunnel Jump Server Host**, enter the hostname or IP address for the intermediate (bastion) server that Airbyte will connect to.
-3. For **SSH Connection Port**, enter the port on the bastion server. The default port for SSH connections is 22.
-4. For **SSH Login Username**, enter the username to use when connecting to the bastion server. **Note:** This is the operating system username and not the AlloyDB username.
-5. For authentication:
- - If you selected **SSH Key Authentication**, set the **SSH Private Key** to the [RSA Private Key](#generating-an-rsa-private-keyβ) that you are using to create the SSH connection.
- - If you selected **Password Authentication**, enter the password for the operating system user to connect to the bastion server. **Note:** This is the operating system password and not the AlloyDB password.
-
-#### Generating an RSA Private Keyβ
-
-The connector expects an RSA key in PEM format. To generate this key, run:
-
-```
-ssh-keygen -t rsa -m PEM -f myuser_rsa
-```
-
-The command produces the private key in PEM format and the public key remains in the standard format used by the `authorized_keys` file on your bastion server. Add the public key to your bastion host to the user you want to use with Airbyte. The private key is provided via copy-and-paste to the Airbyte connector configuration screen to allow it to log into the bastion server.
-
-## Configuring AlloyDB connector with Change Data Capture (CDC)
-
-Airbyte uses [logical replication](https://www.postgresql.org/docs/10/logical-replication.html) of the Postgres write-ahead log (WAL) to incrementally capture deletes using a replication plugin. To learn more how Airbyte implements CDC, refer to [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc/)
-
-### CDC Considerations
-
-- Incremental sync is only supported for tables with primary keys. For tables without primary keys, use [Full Refresh sync](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite).
-- Data must be in tables and not views.
-- The modifications you want to capture must be made using `DELETE`/`INSERT`/`UPDATE`. For example, changes made using `TRUNCATE`/`ALTER` will not appear in logs and therefore in your destination.
-- Schema changes are not supported automatically for CDC sources. Reset and resync data if you make a schema change.
-- The records produced by `DELETE` statements only contain primary keys. All other data fields are unset.
-- Log-based replication only works for master instances of AlloyDB.
-- Using logical replication increases disk space used on the database server. The additional data is stored until it is consumed.
- - Set frequent syncs for CDC to ensure that the data doesn't fill up your disk space.
- - If you stop syncing a CDC-configured AlloyDB instance with Airbyte, delete the replication slot. Otherwise, it may fill up your disk space.
-
-### Setting up CDC for AlloyDB
-
-Airbyte requires a replication slot configured only for its use. Only one source should be configured that uses this replication slot. See Setting up CDC for AlloyDB for instructions.
-
-#### Step 2: Select a replication pluginβ
-
-We recommend using a [pgoutput](https://www.postgresql.org/docs/9.6/logicaldecoding-output-plugin.html) plugin (the standard logical decoding plugin in AlloyDB). If the replication table contains multiple JSON blobs and the table size exceeds 1 GB, we recommend using a [wal2json](https://github.com/eulerto/wal2json) instead. Note that wal2json may require additional installation for Bare Metal, VMs (EC2/GCE/etc), Docker, etc. For more information read the [wal2json documentation](https://github.com/eulerto/wal2json).
-
-#### Step 3: Create replication slotβ
-
-To create a replication slot called `airbyte_slot` using pgoutput, run:
-
-```
-SELECT pg_create_logical_replication_slot('airbyte_slot', 'pgoutput');
-```
-
-To create a replication slot called `airbyte_slot` using wal2json, run:
-
-```
-SELECT pg_create_logical_replication_slot('airbyte_slot', 'wal2json');
-```
-
-#### Step 4: Create publications and replication identities for tablesβ
-
-For each table you want to replicate with CDC, add the replication identity (the method of distinguishing between rows) first:
-
-To use primary keys to distinguish between rows, run:
-
-```
-ALTER TABLE tbl1 REPLICA IDENTITY DEFAULT;
-```
-
-After setting the replication identity, run:
-
-```
-CREATE PUBLICATION airbyte_publication FOR TABLE ;`
-```
-
-The publication name is customizable. Refer to the [Postgres docs](https://www.postgresql.org/docs/10/sql-alterpublication.html) if you need to add or remove tables from your publication in the future.
-
-:::note
-You must add the replication identity before creating the publication. Otherwise, `ALTER`/`UPDATE`/`DELETE` statements may fail if AlloyDB cannot determine how to uniquely identify rows.
-Also, the publication should include all the tables and only the tables that need to be synced. Otherwise, data from these tables may not be replicated correctly.
-:::
-
-:::warning
-The Airbyte UI currently allows selecting any tables for CDC. If a table is selected that is not part of the publication, it will not be replicated even though it is selected. If a table is part of the publication but does not have a replication identity, that replication identity will be created automatically on the first run if the Airbyte user has the necessary permissions.
-:::
-
-#### Step 5: [Optional] Set up initial waiting time
-
-:::warning
-This is an advanced feature. Use it if absolutely necessary.
-:::
-
-The AlloyDB connector may need some time to start processing the data in the CDC mode in the following scenarios:
-
-- When the connection is set up for the first time and a snapshot is needed
-- When the connector has a lot of change logs to process
-
-The connector waits for the default initial wait time of 5 minutes (300 seconds). Setting the parameter to a longer duration will result in slower syncs, while setting it to a shorter duration may cause the connector to not have enough time to create the initial snapshot or read through the change logs. The valid range is 120 seconds to 1200 seconds.
-
-If you know there are database changes to be synced, but the connector cannot read those changes, the root cause may be insufficient waiting time. In that case, you can increase the waiting time (example: set to 600 seconds) to test if it is indeed the root cause. On the other hand, if you know there are no database changes, you can decrease the wait time to speed up the zero record syncs.
-
-#### Step 6: Set up the AlloyDB source connector
-
-In [Step 2](#step-2-set-up-the-alloydb-connector-in-airbyte) of the connector setup guide, enter the replication slot and publication you just created.
-
-## Supported sync modes
-
-The AlloyDB source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes):
-
-- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/)
-- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append)
-- [Incremental Sync - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append)
-- [Incremental Sync - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped)
-
-## Supported cursors
-
-- `TIMESTAMP`
-- `TIMESTAMP_WITH_TIMEZONE`
-- `TIME`
-- `TIME_WITH_TIMEZONE`
-- `DATE`
-- `BIT`
-- `BOOLEAN`
-- `TINYINT/SMALLINT`
-- `INTEGER`
-- `BIGINT`
-- `FLOAT/DOUBLE`
-- `REAL`
-- `NUMERIC/DECIMAL`
-- `CHAR/NCHAR/NVARCHAR/VARCHAR/LONGVARCHAR`
-- `BINARY/BLOB`
-
-## Data type mapping
-
-The AlloyDb is a fully managed PostgreSQL-compatible database service.
-
-According to Postgres [documentation](https://www.postgresql.org/docs/14/datatype.html), Postgres data types are mapped to the following data types when synchronizing data. You can check the test values examples [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-postgres/src/test-integration/java/io/airbyte/integrations/io/airbyte/integration_tests/sources/PostgresSourceDatatypeTest.java). If you can't find the data type you are looking for or have any problems feel free to add a new test!
-
-| Postgres Type | Resulting Type | Notes |
-| :------------------------------------ | :------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `bigint` | number | |
-| `bigserial`, `serial8` | number | |
-| `bit` | string | Fixed-length bit string (e.g. "0100"). |
-| `bit varying`, `varbit` | string | Variable-length bit string (e.g. "0100"). |
-| `boolean`, `bool` | boolean | |
-| `box` | string | |
-| `bytea` | string | Variable length binary string with hex output format prefixed with "\x" (e.g. "\x6b707a"). |
-| `character`, `char` | string | |
-| `character varying`, `varchar` | string | |
-| `cidr` | string | |
-| `circle` | string | |
-| `date` | string | Parsed as ISO8601 date time at midnight. CDC mode doesn't support era indicators. Issue: [#14590](https://github.com/airbytehq/airbyte/issues/14590) |
-| `double precision`, `float`, `float8` | number | `Infinity`, `-Infinity`, and `NaN` are not supported and converted to `null`. Issue: [#8902](https://github.com/airbytehq/airbyte/issues/8902). |
-| `hstore` | string | |
-| `inet` | string | |
-| `integer`, `int`, `int4` | number | |
-| `interval` | string | |
-| `json` | string | |
-| `jsonb` | string | |
-| `line` | string | |
-| `lseg` | string | |
-| `macaddr` | string | |
-| `macaddr8` | string | |
-| `money` | number | |
-| `numeric`, `decimal` | number | `Infinity`, `-Infinity`, and `NaN` are not supported and converted to `null`. Issue: [#8902](https://github.com/airbytehq/airbyte/issues/8902). |
-| `path` | string | |
-| `pg_lsn` | string | |
-| `point` | string | |
-| `polygon` | string | |
-| `real`, `float4` | number | |
-| `smallint`, `int2` | number | |
-| `smallserial`, `serial2` | number | |
-| `serial`, `serial4` | number | |
-| `text` | string | |
-| `time` | string | Parsed as a time string without a time-zone in the ISO-8601 calendar system. |
-| `timetz` | string | Parsed as a time string with time-zone in the ISO-8601 calendar system. |
-| `timestamp` | string | Parsed as a date-time string without a time-zone in the ISO-8601 calendar system. |
-| `timestamptz` | string | Parsed as a date-time string with time-zone in the ISO-8601 calendar system. |
-| `tsquery` | string | |
-| `tsvector` | string | |
-| `uuid` | string | |
-| `xml` | string | |
-| `enum` | string | |
-| `tsrange` | string | |
-| `array` | array | E.g. "[\"10001\",\"10002\",\"10003\",\"10004\"]". |
-| composite type | string | |
-
-## Limitations
-
-- The AlloyDB source connector currently does not handle schemas larger than 4MB.
-- The AlloyDB source connector does not alter the schema present in your database. Depending on the destination connected to this source, however, the schema may be altered. See the destination's documentation for more details.
-- The following two schema evolution actions are currently supported:
- - Adding/removing tables without resetting the entire connection at the destination
- Caveat: In the CDC mode, adding a new table to a connection may become a temporary bottleneck. When a new table is added, the next sync job takes a full snapshot of the new table before it proceeds to handle any changes.
- - Resetting a single table within the connection without resetting the rest of the destination tables in that connection
-- Changing a column data type or removing a column might break connections.
-
-## Changelog
-
-| Version | Date | Pull Request | Subject |
-|:--------|:-----------| :------------------------------------------------------- |:------------------------------------------------------------------------------------------------------------------------------------------|
-| 3.1.8 | 2023-09-20 | [30125](https://github.com/airbytehq/airbyte/pull/30125) | Improve initial load performance for older versions of PostgreSQL |
-| 3.1.5 | 2023-08-22 | [29534](https://github.com/airbytehq/airbyte/pull/29534) | Support "options" JDBC URL parameter |
-| 3.1.3 | 2023-08-03 | [28708](https://github.com/airbytehq/airbyte/pull/28708) | Enable checkpointing snapshots in CDC connections |
-| 3.1.2 | 2023-08-01 | [28954](https://github.com/airbytehq/airbyte/pull/28954) | Fix an issue that prevented use of tables with names containing uppercase letters |
-| 3.1.1 | 2023-07-31 | [28892](https://github.com/airbytehq/airbyte/pull/28892) | Fix an issue that prevented use of cursor columns with names containing uppercase letters |
-| 3.1.0 | 2023-07-25 | [28339](https://github.com/airbytehq/airbyte/pull/28339) | Checkpointing initial load for incremental syncs: enabled for xmin and cursor based only. |
-| 2.0.28 | 2023-04-26 | [25401](https://github.com/airbytehq/airbyte/pull/25401) | CDC : Upgrade Debezium to version 2.2.0 |
-| 2.0.23 | 2023-04-19 | [24582](https://github.com/airbytehq/airbyte/pull/24582) | CDC : Enable frequent state emission during incremental syncs + refactor for performance improvement |
-| 2.0.22 | 2023-04-17 | [25220](https://github.com/airbytehq/airbyte/pull/25220) | Logging changes : Log additional metadata & clean up noisy logs |
-| 2.0.21 | 2023-04-12 | [25131](https://github.com/airbytehq/airbyte/pull/25131) | Make Client Certificate and Client Key always show |
-| 2.0.19 | 2023-04-11 | [24656](https://github.com/airbytehq/airbyte/pull/24656) | CDC minor refactor |
-| 2.0.17 | 2023-04-05 | [24622](https://github.com/airbytehq/airbyte/pull/24622) | Allow streams not in CDC publication to be synced in Full-refresh mode |
-| 2.0.15 | 2023-04-04 | [24833](https://github.com/airbytehq/airbyte/pull/24833) | Disallow the "disable" SSL Modes; fix Debezium retry policy configuration |
-| 2.0.13 | 2023-03-28 | [24166](https://github.com/airbytehq/airbyte/pull/24166) | Fix InterruptedException bug during Debezium shutdown |
-| 2.0.11 | 2023-03-27 | [24529](https://github.com/airbytehq/airbyte/pull/24373) | Preparing the connector for CDC checkpointing |
-| 2.0.10 | 2023-03-24 | [24529](https://github.com/airbytehq/airbyte/pull/24529) | Set SSL Mode to required on strict-encrypt variant |
-| 2.0.9 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting |
-| 2.0.6 | 2023-03-21 | [24271](https://github.com/airbytehq/airbyte/pull/24271) | Fix NPE in CDC mode |
-| 2.0.3 | 2023-03-21 | [24147](https://github.com/airbytehq/airbyte/pull/24275) | Fix error with CDC checkpointing |
-| 2.0.2 | 2023-03-13 | [23112](https://github.com/airbytehq/airbyte/pull/21727) | Add state checkpointing for CDC sync. |
-| 2.0.1 | 2023-03-08 | [23596](https://github.com/airbytehq/airbyte/pull/23596) | For network isolation, source connector accepts a list of hosts it is allowed to connect |
-| 2.0.0 | 2023-03-06 | [23112](https://github.com/airbytehq/airbyte/pull/23112) | Upgrade Debezium version to 2.1.2 |
-| 1.0.51 | 2023-03-02 | [23642](https://github.com/airbytehq/airbyte/pull/23642) | Revert : Support JSONB datatype for Standard sync mode |
-| 1.0.49 | 2023-02-27 | [21695](https://github.com/airbytehq/airbyte/pull/21695) | Support JSONB datatype for Standard sync mode |
-| 1.0.48 | 2023-02-24 | [23383](https://github.com/airbytehq/airbyte/pull/23383) | Fixed bug with non readable double-quoted values within a database name or column name |
-| 1.0.47 | 2023-02-22 | [22221](https://github.com/airbytehq/airbyte/pull/23138) | Fix previous versions which doesn't verify privileges correctly, preventing CDC syncs to run. |
-| 1.0.46 | 2023-02-21 | [23105](https://github.com/airbytehq/airbyte/pull/23105) | Include log levels and location information (class, method and line number) with source connector logs published to Airbyte Platform. |
-| 1.0.45 | 2023-02-09 | [22221](https://github.com/airbytehq/airbyte/pull/22371) | Ensures that user has required privileges for CDC syncs. |
-| | 2023-02-15 | [23028](https://github.com/airbytehq/airbyte/pull/23028) | |
-| 1.0.44 | 2023-02-06 | [22221](https://github.com/airbytehq/airbyte/pull/22221) | Exclude new set of system tables when using `pg_stat_statements` extension. |
-| 1.0.43 | 2023-02-06 | [21634](https://github.com/airbytehq/airbyte/pull/21634) | Improve Standard sync performance by caching objects. |
-| 1.0.36 | 2023-01-24 | [21825](https://github.com/airbytehq/airbyte/pull/21825) | Put back the original change that will cause an incremental sync to error if table contains a NULL value in cursor column. |
-| 1.0.35 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources |
-| 1.0.34 | 2022-12-13 | [20378](https://github.com/airbytehq/airbyte/pull/20378) | Improve descriptions |
-| 1.0.17 | 2022-10-31 | [18538](https://github.com/airbytehq/airbyte/pull/18538) | Encode database name |
-| 1.0.16 | 2022-10-25 | [18256](https://github.com/airbytehq/airbyte/pull/18256) | Disable allow and prefer ssl modes in CDC mode |
-| | 2022-10-13 | [15535](https://github.com/airbytehq/airbyte/pull/16238) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode |
-| 1.0.15 | 2022-10-11 | [17782](https://github.com/airbytehq/airbyte/pull/17782) | Align with Postgres source v.1.0.15 |
-| 1.0.0 | 2022-09-15 | [16776](https://github.com/airbytehq/airbyte/pull/16776) | Align with strict-encrypt version |
-| 0.1.0 | 2022-09-05 | [16323](https://github.com/airbytehq/airbyte/pull/16323) | Initial commit. Based on source-postgres v.1.0.7 |
diff --git a/docs/integrations/sources/amazon-ads.md b/docs/integrations/sources/amazon-ads.md
index 680307eac499..4d860dc68f95 100644
--- a/docs/integrations/sources/amazon-ads.md
+++ b/docs/integrations/sources/amazon-ads.md
@@ -66,6 +66,7 @@ This source is capable of syncing the following streams:
* [Sponsored Display Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Ad%20groups)
* [Sponsored Display Product Ads](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Product%20ads)
* [Sponsored Display Targetings](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Targeting)
+* [Sponsored Display Creatives](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Creatives)
* [Sponsored Display Budget Rules](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi/prod#/BudgetRules/GetSDBudgetRulesForAdvertiser)
* [Sponsored Products Campaigns](https://advertising.amazon.com/API/docs/en-us/sponsored-display/3-0/openapi#/Campaigns)
* [Sponsored Products Ad groups](https://advertising.amazon.com/API/docs/en-us/sponsored-products/2-0/openapi#/Ad%20groups)
@@ -109,6 +110,7 @@ Information about expected report generation waiting time you may find [here](ht
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------|
+| 3.4.0 | 2023-06-09 | [26203](https://github.com/airbytehq/airbyte/pull/26203) | Add Stream `DisplayCreatives` |
| 3.3.0 | 2023-09-22 | [30679](https://github.com/airbytehq/airbyte/pull/30679) | Fix unexpected column for `SponsoredProductCampaigns` and `SponsoredBrandsKeywords` |
| 3.2.0 | 2023-09-18 | [30517](https://github.com/airbytehq/airbyte/pull/30517) | Add suggested streams; fix unexpected column issue |
| 3.1.2 | 2023-08-16 | [29233](https://github.com/airbytehq/airbyte/pull/29233) | Add filter for Marketplace IDs |
diff --git a/docs/integrations/sources/apify-dataset-migrations.md b/docs/integrations/sources/apify-dataset-migrations.md
index 9e1c23898419..e2c4a948a077 100644
--- a/docs/integrations/sources/apify-dataset-migrations.md
+++ b/docs/integrations/sources/apify-dataset-migrations.md
@@ -1,5 +1,9 @@
# Apify Dataset Migration Guide
+## Upgrading to 2.0.0
+
+Major update: the old, broken Item Collection stream has been removed and replaced with a new Item Collection (WCC) stream specific to datasets produced by the [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor. Please update your connector configuration. Note: since the schema of an Apify Dataset depends on the Actor that produced it, we cannot provide a general stream with a static schema for retrieving data from a dataset.
+
## Upgrading to 1.0.0
A major update fixing the data ingestion to properly retrieve data from Apify.
diff --git a/docs/integrations/sources/apify-dataset.md b/docs/integrations/sources/apify-dataset.md
index 5d3e5fbb4359..69f060a2ab12 100644
--- a/docs/integrations/sources/apify-dataset.md
+++ b/docs/integrations/sources/apify-dataset.md
@@ -6,49 +6,64 @@ description: Web scraping and automation platform.
## Overview
-[Apify](https://www.apify.com) is a web scraping and web automation platform providing both ready-made and custom solutions, an open-source [SDK](https://sdk.apify.com/) for web scraping, proxies, and many other tools to help you build and run web automation jobs at scale.
+[Apify](https://apify.com/) is a web scraping and web automation platform providing both ready-made and custom solutions, an open-source [JavaScript SDK](https://docs.apify.com/sdk/js/) and [Python SDK](https://docs.apify.com/sdk/python/) for web scraping, proxies, and many other tools to help you build and run web automation jobs at scale.
-The results of a scraping job are usually stored in [Apify Dataset](https://docs.apify.com/storage/dataset). This Airbyte connector allows you to automatically sync the contents of a dataset to your chosen destination using Airbyte.
+The results of a scraping job are usually stored in the [Apify Dataset](https://docs.apify.com/storage/dataset). This Airbyte connector provides streams to work with the datasets, including syncing their contents to your chosen destination using Airbyte.
To sync data from a dataset, all you need to know is its ID. You will find it in [Apify console](https://my.apify.com/) under storages.
+Currently, only datasets produced by the Website Content Crawler Actor are supported. Streams for other Actors, as well as a stream for general datasets (with a dynamic schema), will be added soon.
+
### Running Airbyte sync from Apify webhook
-When your Apify job \(aka [actor run](https://docs.apify.com/actors/running)\) finishes, it can trigger an Airbyte sync by calling the Airbyte [API](https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/connections/sync) manual connection trigger \(`POST /v1/connections/sync`\). The API can be called from Apify [webhook](https://docs.apify.com/webhooks) which is executed when your Apify run finishes.
+When your Apify job (aka [Actor run](https://docs.apify.com/platform/actors/running)) finishes, it can trigger an Airbyte sync by calling the Airbyte [API](https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/connections/sync) manual connection trigger (`POST /v1/connections/sync`). The API can be called from Apify [webhook](https://docs.apify.com/platform/integrations/webhooks) which is executed when your Apify run finishes.
![](../../.gitbook/assets/apify_trigger_airbyte_connection.png)
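As a minimal sketch of the trigger described above (the sync endpoint path comes from the linked Airbyte API docs; the host URL and connection ID are placeholders you would replace with your own deployment's values), the service your Apify webhook points at could build the manual-sync call like this:

```python
import json
from urllib import request

# Placeholders: adjust to your own Airbyte deployment and connection.
AIRBYTE_SYNC_URL = "http://localhost:8000/api/v1/connections/sync"
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"

def build_sync_request(connection_id: str, url: str = AIRBYTE_SYNC_URL) -> request.Request:
    """Build the POST /v1/connections/sync request that starts a manual sync."""
    body = json.dumps({"connectionId": connection_id}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (e.g. from the webhook handler) would look like:
# with request.urlopen(build_sync_request(CONNECTION_ID)) as resp:
#     print(resp.status)
```

If your Airbyte instance requires authentication, you would also attach the appropriate credentials header before sending.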
-### Output schema
-
-Since the dataset items do not have strongly typed schema, they are synced as objects stored in the `data` field, without any assumption on their content.
-
### Features
-| Feature | Supported? |
-| :------------------------ | :--------------- |
-| Full Refresh Sync | Yes |
-| Incremental Sync | Yes |
+| Feature | Supported? |
+| :---------------- | :--------- |
+| Full Refresh Sync | Yes |
+| Incremental Sync | Yes |
### Performance considerations
The Apify dataset connector uses [Apify Python Client](https://docs.apify.com/apify-client-python) under the hood and should handle any API limitations under normal usage.
-## Getting started
+## Streams
+
+### `dataset_collection`
+
+- Calls `api.apify.com/v2/datasets` ([docs](https://docs.apify.com/api/v2#/reference/datasets/dataset-collection/get-list-of-datasets))
+- Properties:
+ - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations))
+
+### `dataset`
-### Requirements
+- Calls `https://api.apify.com/v2/datasets/{datasetId}` ([docs](https://docs.apify.com/api/v2#/reference/datasets/dataset/get-dataset))
+- Properties:
+ - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations))
+ - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset))
-* Apify [token](https://console.apify.com/account/integrations) token
-* Parameter clean: true or false
+### `item_collection_website_content_crawler`
-### Changelog
+- Calls `api.apify.com/v2/datasets/{datasetId}/items` ([docs](https://docs.apify.com/api/v2#/reference/datasets/item-collection/get-items))
+- Properties:
+ - Apify Personal API token (you can find it [here](https://console.apify.com/account/integrations))
+ - Dataset ID (check the [docs](https://docs.apify.com/platform/storage/dataset))
+- Limitations:
+ - Currently works only for the datasets produced by [Website Content Crawler](https://apify.com/apify/website-content-crawler).
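To illustrate what this stream calls under the hood (the endpoint path is from the Apify API v2 docs linked above; the dataset ID and token below are placeholders), one page of dataset items can be requested like this:

```python
from urllib import parse

API_BASE = "https://api.apify.com/v2"

def build_items_url(dataset_id: str, token: str, offset: int = 0, limit: int = 100) -> str:
    """Build the GET /v2/datasets/{datasetId}/items URL for one page of items."""
    query = parse.urlencode(
        {"token": token, "format": "json", "offset": offset, "limit": limit}
    )
    return f"{API_BASE}/datasets/{dataset_id}/items?{query}"

# Fetching build_items_url("<dataset-id>", "<apify-token>") returns a JSON array
# of items; advance `offset` by `limit` until an empty page comes back.
```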
-| Version | Date | Pull Request | Subject |
-| :-------- | :---------- | :------------------------------------------------------------ | :-------------------------------------------------------------------------- |
-| 1.0.0 | 2023-08-25 | [29859](https://github.com/airbytehq/airbyte/pull/29859) | Migrate to lowcode |
-| 0.2.0 | 2022-06-20 | [28290](https://github.com/airbytehq/airbyte/pull/28290) | Make connector work with platform changes not syncing empty stream schemas. |
-| 0.1.11 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. |
-| 0.1.9 | 2022-04-05 | [PR\#11712](https://github.com/airbytehq/airbyte/pull/11712) | No changes from 0.1.4. Used connector to test publish workflow changes. |
-| 0.1.4 | 2021-12-23 | [PR\#8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications |
-| 0.1.2 | 2021-11-08 | [PR\#7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies |
-| 0.1.0 | 2021-07-29 | [PR\#5069](https://github.com/airbytehq/airbyte/pull/5069) | Initial version of the connector |
+## Changelog
+| Version | Date | Pull Request | Subject |
+| :------ | :--------- | :----------------------------------------------------------- | :-------------------------------------------------------------------------- |
+| 2.0.0 | 2023-09-18 | [30428](https://github.com/airbytehq/airbyte/pull/30428) | Fix broken stream, manifest refactor |
+| 1.0.0 | 2023-08-25 | [29859](https://github.com/airbytehq/airbyte/pull/29859) | Migrate to lowcode |
+| 0.2.0   | 2023-06-20 | [28290](https://github.com/airbytehq/airbyte/pull/28290)     | Make connector work with platform changes that no longer sync empty stream schemas. |
+| 0.1.11 | 2022-04-27 | [12397](https://github.com/airbytehq/airbyte/pull/12397) | No changes. Used connector to test publish workflow changes. |
+| 0.1.9 | 2022-04-05 | [PR\#11712](https://github.com/airbytehq/airbyte/pull/11712) | No changes from 0.1.4. Used connector to test publish workflow changes. |
+| 0.1.4 | 2021-12-23 | [PR\#8434](https://github.com/airbytehq/airbyte/pull/8434) | Update fields in source-connectors specifications |
+| 0.1.2 | 2021-11-08 | [PR\#7499](https://github.com/airbytehq/airbyte/pull/7499) | Remove base-python dependencies |
+| 0.1.0 | 2021-07-29 | [PR\#5069](https://github.com/airbytehq/airbyte/pull/5069) | Initial version of the connector |
diff --git a/docs/integrations/sources/auth0.md b/docs/integrations/sources/auth0.md
index e6e39fc7d5e5..72ca3fb0c755 100644
--- a/docs/integrations/sources/auth0.md
+++ b/docs/integrations/sources/auth0.md
@@ -57,6 +57,7 @@ The connector is restricted by Auth0 [rate limits](https://auth0.com/docs/troubl
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------- |
+| 0.5.0 | 2023-10-11 | [30467](https://github.com/airbytehq/airbyte/pull/30467) | Use Python base image |
| 0.4.1 | 2023-08-24 | [29804](https://github.com/airbytehq/airbyte/pull/29804) | Fix low code migration bugs |
| 0.4.0 | 2023-08-03 | [28972](https://github.com/airbytehq/airbyte/pull/28972) | Migrate to Low-Code CDK |
| 0.3.0 | 2023-06-20 | [29001](https://github.com/airbytehq/airbyte/pull/29001) | Add Organizations, OrganizationMembers, OrganizationMemberRoles streams |
diff --git a/docs/integrations/sources/bing-ads-migrations.md b/docs/integrations/sources/bing-ads-migrations.md
new file mode 100644
index 000000000000..3d378d2517ad
--- /dev/null
+++ b/docs/integrations/sources/bing-ads-migrations.md
@@ -0,0 +1,10 @@
+# Bing Ads Migration Guide
+
+## Upgrading to 1.0.0
+
+This version update only affects the geographic performance report streams.
+
+Version 1.0.0 prevents data loss by removing the primary keys from the `GeographicPerformanceReportMonthly`, `GeographicPerformanceReportWeekly`, `GeographicPerformanceReportDaily`, and `GeographicPerformanceReportHourly` streams.
+Because multiple records can share the same primary key, deduplication in incremental append+dedup mode could previously cause data loss.
+
+For the changes to take effect, please reset your data and refresh the stream schemas after applying the upgrade.
\ No newline at end of file
diff --git a/docs/integrations/sources/bing-ads.md b/docs/integrations/sources/bing-ads.md
index 3f4453fbca2b..f5d8bbc64b88 100644
--- a/docs/integrations/sources/bing-ads.md
+++ b/docs/integrations/sources/bing-ads.md
@@ -123,6 +123,7 @@ The Bing Ads API limits the number of requests for all Microsoft Advertising cli
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
+| 1.0.0 | 2023-10-11 | [31277](https://github.com/airbytehq/airbyte/pull/31277) | Remove primary keys from the geographic performance reports. |
| 0.2.3 | 2023-09-28 | [30834](https://github.com/airbytehq/airbyte/pull/30834) | Wrap auth error with the config error. |
| 0.2.2 | 2023-09-27 | [30791](https://github.com/airbytehq/airbyte/pull/30791) | Fix missing fields for geographic performance reports. |
| 0.2.1 | 2023-09-04 | [30128](https://github.com/airbytehq/airbyte/pull/30128) | Add increasing download timeout if ReportingDownloadException occurs |
diff --git a/docs/integrations/sources/chargebee.md b/docs/integrations/sources/chargebee.md
index 81a64c2a707c..f369a67b5d67 100644
--- a/docs/integrations/sources/chargebee.md
+++ b/docs/integrations/sources/chargebee.md
@@ -4,7 +4,11 @@ This page contains the setup guide and reference information for the Chargebee s
## Prerequisites
-To set up the Chargebee source connector, you'll need the [Chargebee API key](https://apidocs.chargebee.com/docs/api?prod_cat_ver=2#api_authentication) and the [Product Catalog version](https://apidocs.chargebee.com/docs/api?prod_cat_ver=2).
+To set up the Chargebee source connector, you will need a valid [Chargebee API key](https://apidocs.chargebee.com/docs/api?prod_cat_ver=2#api_authentication) and the [Product Catalog version](https://www.chargebee.com/docs/1.0/upgrade-product-catalog.html) of the Chargebee site you are syncing data from.
+
+:::info
+All Chargebee sites created from May 5, 2021 onward will have [Product Catalog 2.0](https://www.chargebee.com/docs/2.0/product-catalog.html) enabled by default. Sites created prior to this date will use [Product Catalog 1.0](https://www.chargebee.com/docs/1.0/product-catalog.html).
+:::
## Set up the Chargebee connector in Airbyte
@@ -26,45 +30,38 @@ The Chargebee source connector supports the following [sync modes](https://docs.
* [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append)
* [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append)
-## Supported Streams
-
-* [Subscriptions](https://apidocs.chargebee.com/docs/api/subscriptions?prod_cat_ver=2#list_subscriptions)
-* [Customers](https://apidocs.chargebee.com/docs/api/customers?prod_cat_ver=2#list_customers)
-* [Invoices](https://apidocs.chargebee.com/docs/api/invoices?prod_cat_ver=2#list_invoices)
-* [Orders](https://apidocs.chargebee.com/docs/api/orders?prod_cat_ver=2#list_orders)
-* [Plans](https://apidocs.chargebee.com/docs/api/plans?prod_cat_ver=1&lang=curl#list_plans)
-* [Addons](https://apidocs.chargebee.com/docs/api/addons?prod_cat_ver=1&lang=curl#list_addons)
-* [Items](https://apidocs.chargebee.com/docs/api/items?prod_cat_ver=2#list_items)
-* [Item Prices](https://apidocs.chargebee.com/docs/api/item_prices?prod_cat_ver=2#list_item_prices)
-* [Attached Items](https://apidocs.chargebee.com/docs/api/attached_items?prod_cat_ver=2#list_attached_items)
-
-Some streams are available only for specific on Product Catalog versions:
-
-1. Available in `Product Catalog 1.0` and `Product Catalog 2.0`:
- * Customers
- * Events
- * Invoices
- * Credit Notes
- * Orders
- * Coupons
- * Subscriptions
- * Transactions
-2. Available only in `Product Catalog 1.0`:
- * Plans
- * Addons
-3. Available only in `Product Catalog 2.0`:
- * Items
- * Item Prices
- * Attached Items
-
-Note that except the `Attached Items` stream, all the streams listed above are incremental streams, which means they:
-
-* Read only new records
-* Output only new records
-
-The `Attached Items` stream is also incremental but it reads _all_ records and outputs only new records, which is why syncing the `Attached Items` stream, even in incremental mode, is expensive in terms of your Chargebee API quota.
-
-Generally speaking, it incurs a number of API calls equal to the total number of attached items in your chargebee instance divided by 100, regardless of how many `AttachedItems` were actually changed or synced in a particular sync job.
+## Supported streams
+
+Most streams are supported regardless of your Chargebee site's [Product Catalog version](https://www.chargebee.com/docs/1.0/upgrade-product-catalog.html), with a few version-specific exceptions.
+
+| Stream | Product Catalog 1.0 | Product Catalog 2.0 |
+|------------------------|---------------------|---------------------|
+| [Addons](https://apidocs.chargebee.com/docs/api/addons?prod_cat_ver=1) | ✔ | |
+| [Attached Items](https://apidocs.chargebee.com/docs/api/attached_items?prod_cat_ver=2) | | ✔ |
+| [Contacts](https://apidocs.chargebee.com/docs/api/customers?lang=curl#list_of_contacts_for_a_customer) | ✔ | ✔ |
+| [Coupons](https://apidocs.chargebee.com/docs/api/coupons) | ✔ | ✔ |
+| [Credit Notes](https://apidocs.chargebee.com/docs/api/credit_notes) | ✔ | ✔ |
+| [Customers](https://apidocs.chargebee.com/docs/api/customers) | ✔ | ✔ |
+| [Events](https://apidocs.chargebee.com/docs/api/events) | ✔ | ✔ |
+| [Gifts](https://apidocs.chargebee.com/docs/api/gifts) | ✔ | ✔ |
+| [Hosted Pages](https://apidocs.chargebee.com/docs/api/hosted_pages) | ✔ | ✔ |
+| [Invoices](https://apidocs.chargebee.com/docs/api/invoices) | ✔ | ✔ |
+| [Items](https://apidocs.chargebee.com/docs/api/items?prod_cat_ver=2) | | ✔ |
+| [Item Prices](https://apidocs.chargebee.com/docs/api/item_prices?prod_cat_ver=2) | | ✔ |
+| [Orders](https://apidocs.chargebee.com/docs/api/orders) | ✔ | ✔ |
+| [Payment Sources](https://apidocs.chargebee.com/docs/api/payment_sources) | ✔ | ✔ |
+| [Plans](https://apidocs.chargebee.com/docs/api/plans?prod_cat_ver=1) | ✔ | |
+| [Promotional Credits](https://apidocs.chargebee.com/docs/api/promotional_credits) | ✔ | ✔ |
+| [Quotes](https://apidocs.chargebee.com/docs/api/quotes) | ✔ | ✔ |
+| [Quote Line Groups](https://apidocs.chargebee.com/docs/api/quote_line_groups) | ✔ | ✔ |
+| [Subscriptions](https://apidocs.chargebee.com/docs/api/subscriptions) | ✔ | ✔ |
+| [Transactions](https://apidocs.chargebee.com/docs/api/transactions) | ✔ | ✔ |
+| [Unbilled Charges](https://apidocs.chargebee.com/docs/api/unbilled_charges) | ✔ | ✔ |
+| [Virtual Bank Accounts](https://apidocs.chargebee.com/docs/api/virtual_bank_accounts) | ✔ | ✔ |
+
+:::note
+When using incremental sync mode, the `Attached Items` stream behaves differently than the other streams. Whereas other incremental streams read and output _only new_ records, the `Attached Items` stream reads _all_ records but only outputs _new_ records, making it more demanding on your Chargebee API quota. Each sync incurs API calls equal to the total number of attached items in your Chargebee instance divided by 100, regardless of the actual number of `Attached Items` changed or synced.
+:::
## Performance considerations
diff --git a/docs/integrations/sources/e2e-test-cloud.md b/docs/integrations/sources/e2e-test-cloud.md
index f83816364112..be70af977245 100644
--- a/docs/integrations/sources/e2e-test-cloud.md
+++ b/docs/integrations/sources/e2e-test-cloud.md
@@ -30,5 +30,6 @@ The OSS and Cloud variants have the same version number. The Cloud variant was i
| Version | Date | Pull request | Notes |
|---------|------------|----------------------------------------------------------|-----------------------------------------------------|
+| 2.1.5   | 2023-10-06 | [31092](https://github.com/airbytehq/airbyte/pull/31092)  | Bring in changes from OSS                           |
| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Fix inheritance between e2e-test and e2e-test-cloud |
| 0.1.0 | 2021-07-23 | [9720](https://github.com/airbytehq/airbyte/pull/9720) | Initial release. |
diff --git a/docs/integrations/sources/e2e-test.md b/docs/integrations/sources/e2e-test.md
index ad3de2470800..9988c9f497a7 100644
--- a/docs/integrations/sources/e2e-test.md
+++ b/docs/integrations/sources/e2e-test.md
@@ -70,15 +70,16 @@ This mode is also excluded from the Cloud variant of this connector.
The OSS and Cloud variants have the same version number. The Cloud variant was initially released at version `1.0.0`.
-| Version | Date | Pull request | Notes |
-|---------|------------| ----------------------------------------------------------------------------------------------------------------- |-------------------------------------------------------------------------------------------------------|
-| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Add speed benchmark mode to e2e test |
-| 2.1.3 | 2022-08-25 | [15591](https://github.com/airbytehq/airbyte/pull/15591) | Declare supported sync modes in catalogs |
-| 2.1.1 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors |
-| 2.1.0 | 2021-02-12 | [\#10298](https://github.com/airbytehq/airbyte/pull/10298) | Support stream duplication to quickly create a multi-stream catalog. |
-| 2.0.0 | 2021-02-01 | [\#9954](https://github.com/airbytehq/airbyte/pull/9954) | Remove legacy modes. Use more efficient Json generator. |
-| 1.0.1 | 2021-01-29 | [\#9745](https://github.com/airbytehq/airbyte/pull/9745) | Integrate with Sentry. |
-| 1.0.0 | 2021-01-23 | [\#9720](https://github.com/airbytehq/airbyte/pull/9720) | Add new continuous feed mode that supports arbitrary catalog specification. Initial release to cloud. |
-| 0.1.2 | 2022-10-18 | [\#18100](https://github.com/airbytehq/airbyte/pull/18100) | Set supported sync mode on streams |
-| 0.1.1 | 2021-12-16 | [\#8217](https://github.com/airbytehq/airbyte/pull/8217) | Fix sleep time in infinite feed mode. |
+| Version | Date | Pull request | Notes |
+|---------|------------| ------------------------------------------------------------------ |-------------------------------------------------------------------------------------------------------|
+| 2.1.5 | 2023-10-04 | [31092](https://github.com/airbytehq/airbyte/pull/31092) | Bump jsonschemafriend dependency version to fix bug |
+| 2.1.4 | 2023-03-01 | [23656](https://github.com/airbytehq/airbyte/pull/23656) | Add speed benchmark mode to e2e test |
+| 2.1.3 | 2022-08-25 | [15591](https://github.com/airbytehq/airbyte/pull/15591) | Declare supported sync modes in catalogs |
+| 2.1.1 | 2022-06-17 | [13864](https://github.com/airbytehq/airbyte/pull/13864) | Updated stacktrace format for any trace message errors |
+| 2.1.0 | 2021-02-12 | [\#10298](https://github.com/airbytehq/airbyte/pull/10298) | Support stream duplication to quickly create a multi-stream catalog. |
+| 2.0.0 | 2021-02-01 | [\#9954](https://github.com/airbytehq/airbyte/pull/9954) | Remove legacy modes. Use more efficient Json generator. |
+| 1.0.1 | 2021-01-29 | [\#9745](https://github.com/airbytehq/airbyte/pull/9745) | Integrate with Sentry. |
+| 1.0.0 | 2021-01-23 | [\#9720](https://github.com/airbytehq/airbyte/pull/9720) | Add new continuous feed mode that supports arbitrary catalog specification. Initial release to cloud. |
+| 0.1.2 | 2022-10-18 | [\#18100](https://github.com/airbytehq/airbyte/pull/18100) | Set supported sync mode on streams |
+| 0.1.1 | 2021-12-16 | [\#8217](https://github.com/airbytehq/airbyte/pull/8217) | Fix sleep time in infinite feed mode. |
| 0.1.0 | 2021-07-23 | [\#3290](https://github.com/airbytehq/airbyte/pull/3290) [\#4939](https://github.com/airbytehq/airbyte/pull/4939) | Initial release. |
diff --git a/docs/integrations/sources/everhour.md b/docs/integrations/sources/everhour.md
index 3475806ec1f0..d8a99925fc5e 100644
--- a/docs/integrations/sources/everhour.md
+++ b/docs/integrations/sources/everhour.md
@@ -1,6 +1,6 @@
# Everhour
-This page contains the setup guide and reference information for the Everhour source connector.
+This page contains the setup guide and reference information for the [Everhour](https://everhour.com/) source connector.
## Prerequisites
@@ -25,5 +25,4 @@ This project supports the following streams:
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:-------------------------------------------------------------------------------|
-
| 0.1.0 | 2023-02-28 | [23593](https://github.com/airbytehq/airbyte/pull/23593) | Initial Release |
diff --git a/docs/integrations/sources/facebook-marketing.md b/docs/integrations/sources/facebook-marketing.md
index f0a6fb317fa5..818b8996f07d 100644
--- a/docs/integrations/sources/facebook-marketing.md
+++ b/docs/integrations/sources/facebook-marketing.md
@@ -178,6 +178,8 @@ The Facebook Marketing connector uses the `lookback_window` parameter to repeate
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 1.1.16 | 2023-10-11 | [31284](https://github.com/airbytehq/airbyte/pull/31284) | Fix error occurring when trying to access the `funding_source_details` field of the `AdAccount` stream |
+| 1.1.15 | 2023-10-06 | [31132](https://github.com/airbytehq/airbyte/pull/31132) | Fix permission error for `AdAccount` stream |
| 1.1.14  | 2023-09-26 | [30758](https://github.com/airbytehq/airbyte/pull/30758)  | Exception should not be raised if a stream is not found |
| 1.1.13 | 2023-09-22 | [30706](https://github.com/airbytehq/airbyte/pull/30706) | Performance testing - include socat binary in docker image |
| 1.1.12 | 2023-09-22 | [30655](https://github.com/airbytehq/airbyte/pull/30655) | Updated doc; improved schema for custom insight streams; updated SAT or custom insight streams; removed obsolete optional max_batch_size option from spec |
diff --git a/docs/integrations/sources/github.md b/docs/integrations/sources/github.md
index 12230912edc3..99aed3ba0d76 100644
--- a/docs/integrations/sources/github.md
+++ b/docs/integrations/sources/github.md
@@ -45,7 +45,12 @@ Log into [GitHub](https://github.com) and then generate a [personal access token
- **For Airbyte Open Source**: Authenticate with **Personal Access Token**.
-6. **GitHub Repositories** - List of GitHub organizations/repositories, e.g. `airbytehq/airbyte` for single repository, `airbytehq/airbyte airbytehq/another-repo` for multiple repositories. If you want to specify the organization to receive data from all its repositories, then you should specify it according to the following example: `airbytehq/*`. Repositories with the wrong name, or repositories that do not exist, or have the wrong name format are not allowed.
+6. **GitHub Repositories** - List of GitHub organizations/repositories, e.g. `airbytehq/airbyte` for a single repository, or `airbytehq/airbyte airbytehq/another-repo` for multiple repositories. To receive data from all repositories in an organization, use a wildcard, e.g. `airbytehq/*`.
+
+:::caution
+Repositories that do not exist or that have an invalid name format will be skipped with a `WARN` message in the logs.
+:::
+
7. **Start date (Optional)** - The date from which you'd like to replicate data for streams. If the date is not set, all data will be replicated. Using for streams: `Comments`, `Commit comment reactions`, `Commit comments`, `Commits`, `Deployments`, `Events`, `Issue comment reactions`, `Issue events`, `Issue milestones`, `Issue reactions`, `Issues`, `Project cards`, `Project columns`, `Projects`, `Pull request comment reactions`, `Pull requests`, `Pull request stats`, `Releases`, `Review comments`, `Reviews`, `Stargazers`, `Workflow runs`, `Workflows`.
8. **Branch (Optional)** - List of GitHub repository branches to pull commits for, e.g. `airbytehq/airbyte/master`. If no branches are specified for a repository, the default branch will be pulled. (e.g. `airbytehq/airbyte/master airbytehq/airbyte/my-branch`).
9. **Max requests per hour (Optional)** - The GitHub API allows for a maximum of 5000 requests per hour (15000 for Github Enterprise). You can specify a lower value to limit your use of the API quota.
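The `org/repo` and `org/*` input format described in step 6 can be sketched with a small validator. This is an illustrative sketch, not the connector's actual code; the function name and the exact character classes in the pattern are assumptions.

```python
import re

# Hypothetical validator for the space-separated repository list: entries are
# "org/repo", or "org/*" to mean every repository in the organization.
REPO_PATTERN = re.compile(r"^[\w.-]+/(?:[\w.-]+|\*)$")

def parse_repositories(value: str) -> list[str]:
    """Split the config string and keep only well-formed entries.

    Per the docs, badly formatted entries are skipped with a WARN log
    rather than failing the sync.
    """
    repos = value.split()
    invalid = [r for r in repos if not REPO_PATTERN.match(r)]
    if invalid:
        print(f"WARN: skipping invalid repository entries: {invalid}")
    return [r for r in repos if REPO_PATTERN.match(r)]

print(parse_repositories("airbytehq/airbyte airbytehq/* bad//name"))
```

A wildcard entry like `airbytehq/*` passes the format check; actually expanding it to concrete repositories requires a call to the GitHub API at sync time.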
@@ -103,6 +108,7 @@ This connector outputs the following incremental streams:
- [Review comments](https://docs.github.com/en/rest/pulls/comments?apiVersion=2022-11-28#list-review-comments-in-a-repository)
- [Reviews](https://docs.github.com/en/rest/pulls/reviews?apiVersion=2022-11-28#list-reviews-for-a-pull-request)
- [Stargazers](https://docs.github.com/en/rest/activity/starring?apiVersion=2022-11-28#list-stargazers)
+- [WorkflowJobs](https://docs.github.com/pt/rest/actions/workflow-jobs?apiVersion=2022-11-28#list-jobs-for-a-workflow-run)
- [WorkflowRuns](https://docs.github.com/en/rest/actions/workflow-runs?apiVersion=2022-11-28#list-workflow-runs-for-a-repository)
- [Workflows](https://docs.github.com/en/rest/actions/workflows?apiVersion=2022-11-28#list-repository-workflows)
@@ -158,6 +164,8 @@ The GitHub connector should not run into GitHub API limitations under normal usa
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 1.5.1 | 2023-10-12 | [31307](https://github.com/airbytehq/airbyte/pull/31307) | Increase backoff_time for stream `ContributorActivity` |
+| 1.5.0 | 2023-10-11 | [31300](https://github.com/airbytehq/airbyte/pull/31300) | Update Schemas: Add date-time format to fields |
| 1.4.6 | 2023-10-04 | [31056](https://github.com/airbytehq/airbyte/pull/31056) | Migrate spec properties' `repository` and `branch` type to \ |
| 1.4.5 | 2023-10-02 | [31023](https://github.com/airbytehq/airbyte/pull/31023) | Increase backoff for stream `Contributor Activity` |
| 1.4.4 | 2023-10-02 | [30971](https://github.com/airbytehq/airbyte/pull/30971) | Mark `start_date` as optional. |
diff --git a/docs/integrations/sources/gitlab.md b/docs/integrations/sources/gitlab.md
index caf45350163f..dd1a84b8fd19 100644
--- a/docs/integrations/sources/gitlab.md
+++ b/docs/integrations/sources/gitlab.md
@@ -94,6 +94,7 @@ This connector outputs the following streams:
- [Group and Project members](https://docs.gitlab.com/ee/api/members.html)
- [Tags](https://docs.gitlab.com/ee/api/tags.html)
- [Releases](https://docs.gitlab.com/ee/api/releases/index.html)
+- [Deployments](https://docs.gitlab.com/ee/api/deployments/index.html)
- [Group Labels](https://docs.gitlab.com/ee/api/group_labels.html)
- [Project Labels](https://docs.gitlab.com/ee/api/labels.html)
- [Epics](https://docs.gitlab.com/ee/api/epics.html) \(only available for GitLab Ultimate and GitLab.com Gold accounts\)
@@ -111,6 +112,7 @@ Gitlab has the [rate limits](https://docs.gitlab.com/ee/user/gitlab_com/index.ht
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :------------------------------------------------------- | :----------------------------------------------------------------------------------------- |
+| 1.7.0 | 2023-08-08 | [29203](https://github.com/airbytehq/airbyte/pull/29203) | Add Deployments stream |
| 1.6.0 | 2023-06-30 | [27869](https://github.com/airbytehq/airbyte/pull/27869) | Add `shared_runners_setting` field to groups |
| 1.5.1 | 2023-06-24 | [27679](https://github.com/airbytehq/airbyte/pull/27679) | Fix formatting |
| 1.5.0 | 2023-06-15 | [27392](https://github.com/airbytehq/airbyte/pull/27392) | Make API URL an optional parameter in spec. |
diff --git a/docs/integrations/sources/google-sheets.md b/docs/integrations/sources/google-sheets.md
index 7194cde645b2..92b8ec1ed73a 100644
--- a/docs/integrations/sources/google-sheets.md
+++ b/docs/integrations/sources/google-sheets.md
@@ -132,6 +132,7 @@ Airbyte batches requests to the API in order to efficiently pull data and respec
| Version | Date | Pull Request | Subject |
|---------|------------|----------------------------------------------------------|-----------------------------------------------------------------------------------|
+| 0.3.10 | 2023-09-27 | [30487](https://github.com/airbytehq/airbyte/pull/30487) | Fix bug causing rows to be skipped when batch size increased due to rate limits. |
| 0.3.9 | 2023-09-25 | [30749](https://github.com/airbytehq/airbyte/pull/30749) | Performance testing - include socat binary in docker image |
| 0.3.8 | 2023-09-25 | [30747](https://github.com/airbytehq/airbyte/pull/30747) | Performance testing - include socat binary in docker image |
| 0.3.7 | 2023-08-25 | [29826](https://github.com/airbytehq/airbyte/pull/29826) | Remove row batch size from spec, add auto increase this value when rate limits |
diff --git a/docs/integrations/sources/linkedin-ads.md b/docs/integrations/sources/linkedin-ads.md
index 0b752afae000..fde3a37c227b 100644
--- a/docs/integrations/sources/linkedin-ads.md
+++ b/docs/integrations/sources/linkedin-ads.md
@@ -171,6 +171,7 @@ After 5 unsuccessful attempts - the connector will stop the sync operation. In s
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------|
+| 0.6.2 | 2023-08-23 | [31221](https://github.com/airbytehq/airbyte/pull/31221) | Increase max time between messages to 24 hours |
| 0.6.1 | 2023-08-23 | [29600](https://github.com/airbytehq/airbyte/pull/29600) | Update field descriptions |
| 0.6.0 | 2023-08-22 | [29721](https://github.com/airbytehq/airbyte/pull/29721) | Add `Conversions` stream |
| 0.5.0 | 2023-08-14 | [29175](https://github.com/airbytehq/airbyte/pull/29175) | Add Custom report Constructor |
diff --git a/docs/integrations/sources/mysql.md b/docs/integrations/sources/mysql.md
index a6dd79349942..13f7644be39b 100644
--- a/docs/integrations/sources/mysql.md
+++ b/docs/integrations/sources/mysql.md
@@ -264,6 +264,8 @@ WHERE actor_definition_id ='435bb9a5-7887-4809-aa58-28c27df0d7ad' AND (configura
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:-----------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
+| 3.1.3 | 2023-10-11 | [31322](https://github.com/airbytehq/airbyte/pull/31322) | Correct previous release |
+| 3.1.2 | 2023-09-29 | [30806](https://github.com/airbytehq/airbyte/pull/30806) | Cap log line length to 32KB to prevent loss of records |
| 3.1.1 | 2023-09-26 | [30744](https://github.com/airbytehq/airbyte/pull/30744) | Update MySQL JDBC connection configs to keep default auto-commit behavior |
| 3.1.0 | 2023-09-21 | [30270](https://github.com/airbytehq/airbyte/pull/30270) | Enhanced Standard Sync with initial load via Primary Key with a switch to cursor for incremental syncs |
| 3.0.9 | 2023-09-20 | [30620](https://github.com/airbytehq/airbyte/pull/30620) | Airbyte Certified MySQL Source connector |
diff --git a/docs/integrations/sources/notion-migrations.md b/docs/integrations/sources/notion-migrations.md
new file mode 100644
index 000000000000..d08cd0e331e0
--- /dev/null
+++ b/docs/integrations/sources/notion-migrations.md
@@ -0,0 +1,11 @@
+# Notion Migration Guide
+
+## Upgrading to 2.0.0
+
+Version 2.0.0 introduces a number of changes to the JSON schemas of all streams. These changes reflect updates to the Notion API. Some breaking changes have been introduced that will affect the Blocks, Databases, and Pages streams.
+
+- The type of the `rich_text` property in the Pages stream has been updated from an object to an array of `rich_text` objects
+- The type of the `phone_number` property in the Pages stream has been updated from a string to an object
+- The deprecated `text` property in content blocks has been renamed to `rich_text`. This change affects the Blocks, Databases and Pages streams.
+
+A full schema refresh and data reset are required when upgrading to this version.
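The `rich_text` change above can be illustrated with a minimal before/after sketch. The field names come from the migration notes; the surrounding record shapes and the helper function are assumptions for illustration only, not connector code.

```python
# Old Pages record: `rich_text` is a single object.
old_record = {"properties": {"Name": {"rich_text": {"plain_text": "Hi"}}}}
# New (2.0.0) Pages record: `rich_text` is an array of rich_text objects.
new_record = {"properties": {"Name": {"rich_text": [{"plain_text": "Hi"}]}}}

def migrate_rich_text(record: dict) -> dict:
    """Wrap single rich_text objects into the new array form."""
    for prop in record["properties"].values():
        rt = prop.get("rich_text")
        if isinstance(rt, dict):
            prop["rich_text"] = [rt]
    return record

print(migrate_rich_text(old_record) == new_record)
```

Because the stored shape of existing records no longer matches the new schema, a schema refresh and data reset (rather than an in-place migration like this sketch) is what the upgrade actually requires.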
diff --git a/docs/integrations/sources/notion.inapp.md b/docs/integrations/sources/notion.inapp.md
deleted file mode 100644
index 019885f44abf..000000000000
--- a/docs/integrations/sources/notion.inapp.md
+++ /dev/null
@@ -1,33 +0,0 @@
-## Prerequisite
-
-* Access to a Notion workspace
-β
-## Setup guide
-
-1. Enter a **Source name** to help you identify this source in Airbyte.
-2. Choose the method of authentication:
-
-
-:::note
-We highly recommend using OAuth2.0 authorization to connect to Notion, as this method significantly simplifies the setup process. If you use OAuth2.0 authorization, you do _not_ need to create and configure a new integration in Notion. Instead, you can authenticate your Notion account directly in Airbyte Cloud.
-:::
-
-- **OAuth2.0** (Recommended): Click **Authenticate your Notion account**. When the popup appears, click **Select pages**. Check the pages you want to give Airbyte access to, and click **Allow access**.
-- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your Notion integration's page. For more information on how to create and configure an integration in Notion, refer to our
-[full documentation](https://docs.airbyte.io/integrations/sources/notion#setup-guide).
-
-
-
-- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your Notion integration's page.
-- **OAuth2.0**: Copy and paste the Client ID, Client Secret and Access Token you acquired.
-
-To obtain the necessary authorization credentials, you need to create and configure an integration in Notion. For more information on how to create and configure an integration in Notion, refer to our
-[full documentation](https://docs.airbyte.io/integrations/sources/notion#setup-guide).
-
-
-3. Enter the **Start Date** using the provided datepicker, or by programmatically entering a UTC date and time in the format: `YYYY-MM-DDTHH:mm:ss.SSSZ`. All data generated after this date will be replicated.
-4. Click **Set up source** and wait for the tests to complete.
-β
-
-For detailed information on supported sync modes, supported streams, performance considerations, refer to the
-[full documentation for Notion](https://docs.airbyte.com/integrations/sources/notion).
diff --git a/docs/integrations/sources/notion.md b/docs/integrations/sources/notion.md
index 1cb3e43f6af6..aa07862e0649 100644
--- a/docs/integrations/sources/notion.md
+++ b/docs/integrations/sources/notion.md
@@ -4,7 +4,7 @@ This page contains the setup guide and reference information for the Notion sour
## Prerequisites
-- Access to a Notion workspace
+- Access to a [Notion](https://notion.so/login) workspace
## Setup guideβ
@@ -14,14 +14,12 @@ To authenticate the Notion source connector, you need to use **one** of the foll
- Access Token
:::note
-**For Airbyte Cloud users:** We highly recommend using OAuth2.0 authorization to connect to Notion, as this method significantly simplifies the setup process. If you use OAuth2.0 authorization in Airbyte Cloud, you do **not** need to create and configure a new integration in Notion. Instead, you can proceed straight to
-[setting up the connector in Airbyte](#step-3-set-up-the-notion-connector-in-airbyte).
+**For Airbyte Cloud users:** We highly recommend using OAuth2.0 authorization to connect to Notion, as this method significantly simplifies the setup process. If you use OAuth2.0 authorization in Airbyte Cloud, you do **not** need to create and configure a new integration in Notion. Instead, you can proceed straight to [setting up the connector in Airbyte](#step-3-set-up-the-notion-connector-in-airbyte).
:::
-We have provided a quick setup guide for creating an integration in Notion below. If you would like more detailed information and context on Notion integrations, or experience any difficulties with the integration setup process, please refer to the
-[official Notion documentation](https://developers.notion.com/docs).
+We have provided a quick setup guide for creating an integration in Notion below. If you would like more detailed information and context on Notion integrations, or experience any difficulties with the integration setup process, please refer to the [official Notion documentation](https://developers.notion.com/docs).
-### Step 1: Create an integration in Notionβ
+### Step 1: Create an integration in Notion and set capabilities
1. Log in to your Notion workspace and navigate to the [My integrations](https://www.notion.so/my-integrations) page. Select **+ New integration**.
@@ -29,24 +27,27 @@ We have provided a quick setup guide for creating an integration in Notion below
You must be the owner of the Notion workspace to create a new integration associated with it.
:::
-2. Enter a **Name** for your integration. Make sure you have selected the workspace containing your data to replicate from the **Associated workspace** dropdown menu, and click **Submit**.
-3. In the navbar, select **Capabilities** and make sure to check the **Read content** checkbox to authorize Airbyte to read the content of your pages. You may also wish to check the **Read comments** box, as well as set a User capability to allow access to user information. For more details on the capabilities you can enable, please refer to the [Notion documentation on capabilities](https://developers.notion.com/reference/capabilities).
+2. Enter a **Name** for your integration. Make sure you have selected the correct workspace from the **Associated workspace** dropdown menu, and click **Submit**.
+3. In the navbar, select [**Capabilities**](https://developers.notion.com/reference/capabilities). Check the following capabilities based on your use case:
-### Step 2: Set permissions and acquire authorization credentials
+- [**Read content**](https://developers.notion.com/reference/capabilities#content-capabilities): required for all connections.
+- [**Read comments**](https://developers.notion.com/reference/capabilities#comment-capabilities): required if you wish to sync the `Comments` stream.
+- [**Read user information**](https://developers.notion.com/reference/capabilities#user-capabilities) (either with or without emails): required if you wish to sync the `Users` stream.
+
+### Step 2: Share pages and acquire authorization credentials
#### Access Token (Cloud and Open Source)
-If you are authenticating via Access Token, you will need to manually set permissions for each page you want to share with Airbyte.
+If you are authenticating via Access Token, you will need to manually share each page you want to sync with Airbyte.
1. Navigate to the page(s) you want to share with Airbyte. Click the **β’β’β’** menu at the top right of the page, select **Add connections**, and choose the integration you created in Step 1.
-2. Once you have selected all the pages to share, you can find and copy the Access Token from the **Secrets** tab of your Notion integration's page. Then proceed to
- [setting up the connector in Airbyte](#step-2-set-up-the-notion-connector-in-airbyte).
+2. Once you have selected all the pages to share, you can find and copy the Access Token from the **Secrets** tab of your Notion integration's page. Then proceed to [setting up the connector in Airbyte](#step-2-set-up-the-notion-connector-in-airbyte).
#### OAuth2.0 (Open Source only)
-If you are authenticating via OAuth2.0 for Airbyte Open Source, you will need to make your integration public and acquire your Client ID, Client Secret and Access Token.
+If you are authenticating via OAuth2.0 for **Airbyte Open Source**, you will need to make your integration public and acquire your Client ID, Client Secret and Access Token.
1. Navigate to the **Distribution** tab in your integration page, and toggle the switch to make the integration public.
2. Fill out the required fields in the **Organization information** and **OAuth Domain & URIs** section, then click **Submit**.
@@ -65,39 +66,37 @@ If you are authenticating via OAuth2.0 for Airbyte Open Source, you will need to
#### Authentication for Airbyte Cloud
- **OAuth2.0** (Recommended): Click **Authenticate your Notion account**. When the popup appears, click **Select pages**. Check the pages you want to give Airbyte access to, and click **Allow access**.
-- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your Notion integration's page.
+- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your private integration's page.
#### Authentication for Airbyte Open Source
-- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your Notion integration's page.
-- **OAuth2.0**: Copy and paste the Client ID, Client Secret and Access Token you acquired.
+- **Access Token**: Copy and paste the Access Token found in the **Secrets** tab of your private integration's page.
+- **OAuth2.0**: Copy and paste the Client ID, Client Secret and Access Token you acquired after setting up your public integration.
-6. Enter the **Start Date** using the provided datepicker, or by programmatically entering a UTC date and time in the format: `YYYY-MM-DDTHH:mm:ss.SSSZ`. All data generated after this date will be replicated.
+6. (Optional) Enter a **Start Date** using the provided datepicker, or by programmatically entering a UTC date and time in the format `YYYY-MM-DDTHH:mm:ss.SSSZ`. When using incremental syncs, only data generated after this date will be replicated. If left blank, Airbyte will default the start date to two years before the current date.
7. Click **Set up source** and wait for the tests to complete.
## Supported sync modes
The Notion source connector supports the following [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes):
-- [Full Refresh - Overwrite](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-overwrite/)
-- [Full Refresh - Append](https://docs.airbyte.com/understanding-airbyte/connections/full-refresh-append)
-- [Incremental - Append](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append) (partially)
-- [Incremental - Append + Deduped](https://docs.airbyte.com/understanding-airbyte/connections/incremental-append-deduped)
+| Stream | Full Refresh (Overwrite/Append) | Incremental (Append/Append + Deduped) |
+|-----------|:------------:|:-----------:|
+| Blocks | ✓ | ✓ |
+| Comments | ✓ | ✓ |
+| Databases | ✓ | ✓ |
+| Pages | ✓ | ✓ |
+| Users | ✓ | |
## Supported Streams
-The Notion source connector supports the following streams. For more information, see the [Notion API](https://developers.notion.com/reference/intro).
-
-- [blocks](https://developers.notion.com/reference/retrieve-a-block)
-- [databases](https://developers.notion.com/reference/retrieve-a-database)
-- [pages](https://developers.notion.com/reference/retrieve-a-page)
-- [users](https://developers.notion.com/reference/get-user)
-
-:::note
-
-The users stream does not support Incremental - Append sync mode.
+The Notion source connector supports the following streams:
-:::
+- [Blocks](https://developers.notion.com/reference/retrieve-a-block)
+- [Comments](https://developers.notion.com/reference/retrieve-a-comment)
+- [Databases](https://developers.notion.com/reference/retrieve-a-database)
+- [Pages](https://developers.notion.com/reference/retrieve-a-page)
+- [Users](https://developers.notion.com/reference/get-users)
## Performance considerations
@@ -107,7 +106,12 @@ The connector is restricted by Notion [request limits](https://developers.notion
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------- |
-| 1.1.2 | 2023-08-30 | [29999](https://github.com/airbytehq/airbyte/pull/29999) | Update error handling during connection check
+| 2.0.0 | 2023-10-09 | [30587](https://github.com/airbytehq/airbyte/pull/30587) | Source-wide schema update |
+| 1.3.0 | 2023-10-09 | [30324](https://github.com/airbytehq/airbyte/pull/30324) | Add `Comments` stream |
+| 1.2.2 | 2023-10-09 | [30780](https://github.com/airbytehq/airbyte/pull/30780) | Update Start Date in config to optional field |
+| 1.2.1 | 2023-10-08 | [30750](https://github.com/airbytehq/airbyte/pull/30750) | Add availability strategy |
+| 1.2.0 | 2023-10-04 | [31053](https://github.com/airbytehq/airbyte/pull/31053) | Add undeclared fields for blocks and pages streams |
+| 1.1.2 | 2023-08-30 | [29999](https://github.com/airbytehq/airbyte/pull/29999) | Update error handling during connection check |
| 1.1.1 | 2023-06-14 | [26535](https://github.com/airbytehq/airbyte/pull/26535) | Migrate from deprecated `authSpecification` to `advancedAuth` |
| 1.1.0 | 2023-06-08 | [27170](https://github.com/airbytehq/airbyte/pull/27170) | Fix typo in `blocks` schema |
| 1.0.9 | 2023-06-08 | [27062](https://github.com/airbytehq/airbyte/pull/27062) | Skip streams with `invalid_start_cursor` error |
diff --git a/docs/integrations/sources/pokeapi.md b/docs/integrations/sources/pokeapi.md
index 797a58d71857..dd6605758c33 100644
--- a/docs/integrations/sources/pokeapi.md
+++ b/docs/integrations/sources/pokeapi.md
@@ -36,6 +36,8 @@ The PokΓ©API uses the same [JSONSchema](https://json-schema.org/understanding-js
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------- |
+| 0.2.0 | 2023-10-02 | [30969](https://github.com/airbytehq/airbyte/pull/30969) | Migrated to low-code |
| 0.1.5 | 2022-05-18 | [12942](https://github.com/airbytehq/airbyte/pull/12942) | Fix example inputs |
| 0.1.4 | 2021-12-07 | [8582](https://github.com/airbytehq/airbyte/pull/8582) | Update connector fields title/description |
| 0.1.3 | 2021-12-03 | [8432](https://github.com/airbytehq/airbyte/pull/8432) | Migrate from base_python to CDK, add SAT tests. |
diff --git a/docs/integrations/sources/postgres.md b/docs/integrations/sources/postgres.md
index 564d461619fd..5af77f7d88ed 100644
--- a/docs/integrations/sources/postgres.md
+++ b/docs/integrations/sources/postgres.md
@@ -291,6 +291,8 @@ According to Postgres [documentation](https://www.postgresql.org/docs/14/datatyp
| Version | Date | Pull Request | Subject |
|---------|------------|----------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| 3.1.11 | 2023-10-11 | [31322](https://github.com/airbytehq/airbyte/pull/31322) | Correct previous release |
+| 3.1.10 | 2023-09-29 | [30806](https://github.com/airbytehq/airbyte/pull/30806) | Cap log line length to 32KB to prevent loss of records. |
| 3.1.9 | 2023-09-25 | [30534](https://github.com/airbytehq/airbyte/pull/30534) | Fix JSONB[] column type handling bug. |
| 3.1.8 | 2023-09-20 | [30125](https://github.com/airbytehq/airbyte/pull/30125) | Improve initial load performance for older versions of PostgreSQL. |
| 3.1.7 | 2023-09-05 | [29672](https://github.com/airbytehq/airbyte/pull/29672) | Handle VACUUM happening during initial sync |
diff --git a/docs/integrations/sources/slack.md b/docs/integrations/sources/slack.md
index 1996d2791733..9eaea88e101d 100644
--- a/docs/integrations/sources/slack.md
+++ b/docs/integrations/sources/slack.md
@@ -92,7 +92,7 @@ We recommend creating a restricted, read-only key specifically for Airbyte acces
3. **Required** Enter your `start_date`.
4. **Required** Enter your `lookback_window`, which corresponds to amount of days in the past from which you want to sync data.
5. Toggle `join_channels`, if you want to join all channels or to sync data only from channels the bot is already in. If not set, you'll need to manually add the bot to all the channels from which you'd like to sync messages.
-6. Enter your `channel_filter`, this should be list of channel names (without leading '#' char) that limits the channels from which you'd like to sync. If no channels are specified, Airbyte will replicate all data.
+6. Enter your `channel_filter`. This should be a list of channel names (without the leading '#' character) that limits the channels from which you'd like to sync. If no channels are specified, Airbyte will replicate all data.
7. Enter your `api_token`.
8. Click **Set up source**.
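The `channel_filter` convention in step 6 (names without a leading '#') can be sketched with a small normalizer. This helper is hypothetical, for illustration only; its name and behavior are assumptions, not the connector's code.

```python
def normalize_channel_filter(channels: list[str]) -> list[str]:
    # Strip a leading '#' defensively (the docs say to omit it) and drop
    # blank entries so they don't silently filter out every channel.
    return [c.lstrip("#").strip() for c in channels if c.strip("# ")]

print(normalize_channel_filter(["#general", "random", " "]))
```

An empty result after normalization would mean no filter at all, in which case Airbyte replicates data from every channel it can access.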
@@ -137,6 +137,7 @@ It is recommended to sync required channels only, this can be done by specifying
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:------------------------------------------------------------------------------------|
+| 0.3.4 | 2023-10-06 | [31134](https://github.com/airbytehq/airbyte/pull/31134) | Update CDK and remove non iterable return from records |
| 0.3.3 | 2023-09-28 | [30580](https://github.com/airbytehq/airbyte/pull/30580) | Add `bot_id` field to threads schema |
| 0.3.2 | 2023-09-20 | [30613](https://github.com/airbytehq/airbyte/pull/30613) | Set default value for channel_filters during discover |
| 0.3.1 | 2023-09-19 | [30570](https://github.com/airbytehq/airbyte/pull/30570) | Use default availability strategy |
diff --git a/docs/integrations/sources/snowflake.md b/docs/integrations/sources/snowflake.md
index 0e93412be2e1..f79171a7fe0d 100644
--- a/docs/integrations/sources/snowflake.md
+++ b/docs/integrations/sources/snowflake.md
@@ -4,7 +4,7 @@
The Snowflake source allows you to sync data from Snowflake. It supports both Full Refresh and Incremental syncs. You can choose if this connector will copy only the new or updated data, or all rows in the tables and columns you set up for replication, every time a sync is run.
-This Snowflake source connector is built on top of the source-jdbc code base and is configured to rely on JDBC 3.13.22 [Snowflake driver](https://github.com/snowflakedb/snowflake-jdbc) as described in Snowflake [documentation](https://docs.snowflake.com/en/user-guide/jdbc.html).
+This Snowflake source connector is built on top of the source-jdbc code base and is configured to rely on JDBC 3.14.1 [Snowflake driver](https://github.com/snowflakedb/snowflake-jdbc) as described in Snowflake [documentation](https://docs.snowflake.com/en/user-guide/jdbc.html).
#### Resulting schema
@@ -121,41 +121,42 @@ To read more please check official [Snowflake documentation](https://docs.snowfl
## Changelog
-| Version | Date | Pull Request | Subject |
-| :------ | :--------- | :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------- |
-| 0.2.0 | 2023-06-26 | [27737](https://github.com/airbytehq/airbyte/pull/27737) | License Update: Elv2 |
-| 0.1.36 | 2023-06-20 | [27212](https://github.com/airbytehq/airbyte/pull/27212) | Fix silent exception swallowing in StreamingJdbcDatabase |
-| 0.1.35 | 2023-06-14 | [27335](https://github.com/airbytehq/airbyte/pull/27335) | Remove noisy debug logs |
-| 0.1.34 | 2023-03-30 | [24693](https://github.com/airbytehq/airbyte/pull/24693) | Fix failure with TIMESTAMP_WITH_TIMEZONE column being used as cursor |
-| 0.1.33 | 2023-03-29 | [24667](https://github.com/airbytehq/airbyte/pull/24667) | Fix bug which wont allow TIMESTAMP_WITH_TIMEZONE column to be used as a cursor |
-| 0.1.32 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting |
-| 0.1.31 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to |
-| 0.1.30 | 2023-02-21 | [22358](https://github.com/airbytehq/airbyte/pull/22358) | Improved handling of big integer cursor type values. |
-| 0.1.29 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources. |
-| 0.1.28 | 2023-01-06 | [20465](https://github.com/airbytehq/airbyte/pull/20465) | Improve the schema config field to only discover tables from the specified scehma and make the field optional |
-| 0.1.27 | 2022-12-14 | [20407](https://github.com/airbytehq/airbyte/pull/20407) | Fix an issue with integer values converted to floats during replication |
-| 0.1.26 | 2022-11-10 | [19314](https://github.com/airbytehq/airbyte/pull/19314) | Set application id in JDBC URL params based on OSS/Cloud environment |
-| 0.1.25 | 2022-11-10 | [15535](https://github.com/airbytehq/airbyte/pull/15535) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode |
-| 0.1.24 | 2022-09-26 | [17144](https://github.com/airbytehq/airbyte/pull/17144) | Fixed bug with incorrect date-time datatypes handling |
-| 0.1.23 | 2022-09-26 | [17116](https://github.com/airbytehq/airbyte/pull/17116) | added connection string identifier |
-| 0.1.22 | 2022-09-21 | [16766](https://github.com/airbytehq/airbyte/pull/16766) | Update JDBC Driver version to 3.13.22 |
-| 0.1.21 | 2022-09-14 | [15668](https://github.com/airbytehq/airbyte/pull/15668) | Wrap logs in AirbyteLogMessage |
-| 0.1.20 | 2022-09-01 | [16258](https://github.com/airbytehq/airbyte/pull/16258) | Emit state messages more frequently |
-| 0.1.19 | 2022-08-19 | [15797](https://github.com/airbytehq/airbyte/pull/15797) | Allow using role during oauth |
-| 0.1.18 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field |
-| 0.1.17 | 2022-08-09 | [15314](https://github.com/airbytehq/airbyte/pull/15314) | Discover integer columns as integers rather than floats |
-| 0.1.16 | 2022-08-04 | [15314](https://github.com/airbytehq/airbyte/pull/15314) | (broken, do not use) Discover integer columns as integers rather than floats |
-| 0.1.15 | 2022-07-22 | [14828](https://github.com/airbytehq/airbyte/pull/14828) | Source Snowflake: Source/Destination doesn't respect DATE data type |
-| 0.1.14 | 2022-07-22 | [14714](https://github.com/airbytehq/airbyte/pull/14714) | Clarified error message when invalid cursor column selected |
-| 0.1.13 | 2022-07-14 | [14574](https://github.com/airbytehq/airbyte/pull/14574) | Removed additionalProperties:false from JDBC source connectors |
-| 0.1.12 | 2022-04-29 | [12480](https://github.com/airbytehq/airbyte/pull/12480) | Query tables with adaptive fetch size to optimize JDBC memory consumption |
-| 0.1.11 | 2022-04-27 | [10953](https://github.com/airbytehq/airbyte/pull/10953) | Implement OAuth flow |
-| 0.1.9 | 2022-02-21 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Fixed cursor for old connectors that use non-microsecond format. Now connectors work with both formats |
-| 0.1.8 | 2022-02-18 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Updated timestamp transformation with microseconds |
-| 0.1.7 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option |
-| 0.1.6 | 2022-01-25 | [9623](https://github.com/airbytehq/airbyte/pull/9623) | Add jdbc_url_params support for optional JDBC parameters |
-| 0.1.5 | 2022-01-19 | [9567](https://github.com/airbytehq/airbyte/pull/9567) | Added parameter for keeping JDBC session alive |
-| 0.1.4 | 2021-12-30 | [9203](https://github.com/airbytehq/airbyte/pull/9203) | Update connector fields title/description |
-| 0.1.3 | 2021-01-11 | [9304](https://github.com/airbytehq/airbyte/pull/9304) | Upgrade version of JDBC driver |
-| 0.1.2 | 2021-10-21 | [7257](https://github.com/airbytehq/airbyte/pull/7257) | Fixed parsing of extreme values for FLOAT and NUMBER data types |
-| 0.1.1 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator |
+| Version | Date | Pull Request | Subject |
+|:--------|:-----------|:---------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
+| 0.2.1 | 2023-10-11 | [31252](https://github.com/airbytehq/airbyte/pull/31252) | Snowflake JDBC version upgrade |
+| 0.2.0 | 2023-06-26 | [27737](https://github.com/airbytehq/airbyte/pull/27737) | License Update: Elv2 |
+| 0.1.36 | 2023-06-20 | [27212](https://github.com/airbytehq/airbyte/pull/27212) | Fix silent exception swallowing in StreamingJdbcDatabase |
+| 0.1.35 | 2023-06-14 | [27335](https://github.com/airbytehq/airbyte/pull/27335) | Remove noisy debug logs |
+| 0.1.34 | 2023-03-30 | [24693](https://github.com/airbytehq/airbyte/pull/24693) | Fix failure with TIMESTAMP_WITH_TIMEZONE column being used as cursor |
+| 0.1.33  | 2023-03-29 | [24667](https://github.com/airbytehq/airbyte/pull/24667)  | Fix bug which wouldn't allow a TIMESTAMP_WITH_TIMEZONE column to be used as a cursor |
+| 0.1.32 | 2023-03-22 | [20760](https://github.com/airbytehq/airbyte/pull/20760) | Removed redundant date-time datatypes formatting |
+| 0.1.31 | 2023-03-06 | [23455](https://github.com/airbytehq/airbyte/pull/23455) | For network isolation, source connector accepts a list of hosts it is allowed to connect to |
+| 0.1.30 | 2023-02-21 | [22358](https://github.com/airbytehq/airbyte/pull/22358) | Improved handling of big integer cursor type values. |
+| 0.1.29 | 2022-12-14 | [20436](https://github.com/airbytehq/airbyte/pull/20346) | Consolidate date/time values mapping for JDBC sources. |
+| 0.1.28  | 2023-01-06 | [20465](https://github.com/airbytehq/airbyte/pull/20465)  | Improve the schema config field to only discover tables from the specified schema and make the field optional |
+| 0.1.27 | 2022-12-14 | [20407](https://github.com/airbytehq/airbyte/pull/20407) | Fix an issue with integer values converted to floats during replication |
+| 0.1.26 | 2022-11-10 | [19314](https://github.com/airbytehq/airbyte/pull/19314) | Set application id in JDBC URL params based on OSS/Cloud environment |
+| 0.1.25 | 2022-11-10 | [15535](https://github.com/airbytehq/airbyte/pull/15535) | Update incremental query to avoid data missing when new data is inserted at the same time as a sync starts under non-CDC incremental mode |
+| 0.1.24 | 2022-09-26 | [17144](https://github.com/airbytehq/airbyte/pull/17144) | Fixed bug with incorrect date-time datatypes handling |
+| 0.1.23  | 2022-09-26 | [17116](https://github.com/airbytehq/airbyte/pull/17116)  | Added connection string identifier |
+| 0.1.22 | 2022-09-21 | [16766](https://github.com/airbytehq/airbyte/pull/16766) | Update JDBC Driver version to 3.13.22 |
+| 0.1.21 | 2022-09-14 | [15668](https://github.com/airbytehq/airbyte/pull/15668) | Wrap logs in AirbyteLogMessage |
+| 0.1.20 | 2022-09-01 | [16258](https://github.com/airbytehq/airbyte/pull/16258) | Emit state messages more frequently |
+| 0.1.19 | 2022-08-19 | [15797](https://github.com/airbytehq/airbyte/pull/15797) | Allow using role during oauth |
+| 0.1.18 | 2022-08-18 | [14356](https://github.com/airbytehq/airbyte/pull/14356) | DB Sources: only show a table can sync incrementally if at least one column can be used as a cursor field |
+| 0.1.17 | 2022-08-09 | [15314](https://github.com/airbytehq/airbyte/pull/15314) | Discover integer columns as integers rather than floats |
+| 0.1.16 | 2022-08-04 | [15314](https://github.com/airbytehq/airbyte/pull/15314) | (broken, do not use) Discover integer columns as integers rather than floats |
+| 0.1.15 | 2022-07-22 | [14828](https://github.com/airbytehq/airbyte/pull/14828) | Source Snowflake: Source/Destination doesn't respect DATE data type |
+| 0.1.14 | 2022-07-22 | [14714](https://github.com/airbytehq/airbyte/pull/14714) | Clarified error message when invalid cursor column selected |
+| 0.1.13 | 2022-07-14 | [14574](https://github.com/airbytehq/airbyte/pull/14574) | Removed additionalProperties:false from JDBC source connectors |
+| 0.1.12 | 2022-04-29 | [12480](https://github.com/airbytehq/airbyte/pull/12480) | Query tables with adaptive fetch size to optimize JDBC memory consumption |
+| 0.1.11 | 2022-04-27 | [10953](https://github.com/airbytehq/airbyte/pull/10953) | Implement OAuth flow |
+| 0.1.9 | 2022-02-21 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Fixed cursor for old connectors that use non-microsecond format. Now connectors work with both formats |
+| 0.1.8 | 2022-02-18 | [10242](https://github.com/airbytehq/airbyte/pull/10242) | Updated timestamp transformation with microseconds |
+| 0.1.7 | 2022-02-14 | [10256](https://github.com/airbytehq/airbyte/pull/10256) | Add `-XX:+ExitOnOutOfMemoryError` JVM option |
+| 0.1.6 | 2022-01-25 | [9623](https://github.com/airbytehq/airbyte/pull/9623) | Add jdbc_url_params support for optional JDBC parameters |
+| 0.1.5 | 2022-01-19 | [9567](https://github.com/airbytehq/airbyte/pull/9567) | Added parameter for keeping JDBC session alive |
+| 0.1.4 | 2021-12-30 | [9203](https://github.com/airbytehq/airbyte/pull/9203) | Update connector fields title/description |
+| 0.1.3 | 2021-01-11 | [9304](https://github.com/airbytehq/airbyte/pull/9304) | Upgrade version of JDBC driver |
+| 0.1.2 | 2021-10-21 | [7257](https://github.com/airbytehq/airbyte/pull/7257) | Fixed parsing of extreme values for FLOAT and NUMBER data types |
+| 0.1.1 | 2021-08-13 | [4699](https://github.com/airbytehq/airbyte/pull/4699) | Added json config validator |
diff --git a/docs/integrations/sources/square.md b/docs/integrations/sources/square.md
index 2dfa772284b3..fe9a86105d42 100644
--- a/docs/integrations/sources/square.md
+++ b/docs/integrations/sources/square.md
@@ -69,6 +69,7 @@ The Square source connector supports the following [ sync modes](https://docs.ai
- [Customers](https://developer.squareup.com/explorer/square/customers-api/list-customers)
- [Shifts](https://developer.squareup.com/reference/square/labor-api/search-shifts)
- [Orders](https://developer.squareup.com/reference/square/orders-api/search-orders)
+- [Cash drawers](https://developer.squareup.com/explorer/square/cash-drawers-api/list-cash-drawer-shifts)
## Connector-specific features & highlights
@@ -96,7 +97,9 @@ Exponential [Backoff](https://developer.squareup.com/forums/t/current-square-api
## Changelog
| Version | Date | Pull Request | Subject |
-| :------ | :--------- | :------------------------------------------------------- | :------------------------------------------------------------------------ |
+| :------ |:-----------| :------------------------------------------------------- |:--------------------------------------------------------------------------|
+| 1.2.0 | 2023-10-10 | [31065](https://github.com/airbytehq/airbyte/pull/31065) | Add new stream Cash drawers shifts |
+| 1.1.3 | 2023-10-10 | [30960](https://github.com/airbytehq/airbyte/pull/30960) | Update `airbyte-cdk` version to `>=0.51.31` |
| 1.1.2 | 2023-07-10 | [28019](https://github.com/airbytehq/airbyte/pull/28019) | fix display order of spec fields |
| 1.1.1 | 2023-06-28 | [27762](https://github.com/airbytehq/airbyte/pull/27762) | Update following state breaking changes |
| 1.1.0 | 2023-05-24 | [26485](https://github.com/airbytehq/airbyte/pull/26485) | Remove deprecated authSpecification in favour of advancedAuth |
diff --git a/docs/integrations/sources/stripe.md b/docs/integrations/sources/stripe.md
index 7c4478034289..bb73919c3236 100644
--- a/docs/integrations/sources/stripe.md
+++ b/docs/integrations/sources/stripe.md
@@ -192,6 +192,7 @@ The Stripe connector should not run into Stripe API limitations under normal usa
| Version | Date | Pull Request | Subject |
|:--------|:-----------|:---------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
+| 4.4.0 | 2023-10-04 | [31046](https://github.com/airbytehq/airbyte/pull/31046) | Added margins field to invoice_line_items stream. |
| 4.3.1 | 2023-09-27 | [30800](https://github.com/airbytehq/airbyte/pull/30800) | Handle permission issues a non breaking |
| 4.3.0 | 2023-09-26 | [30752](https://github.com/airbytehq/airbyte/pull/30752) | Do not sync upcoming invoices, extend stream schemas |
| 4.2.0 | 2023-09-21 | [30660](https://github.com/airbytehq/airbyte/pull/30660) | Fix updated state for the incremental syncs |
diff --git a/docs/integrations/sources/zendesk-chat.md b/docs/integrations/sources/zendesk-chat.md
index eab821953fd8..31cfcba59e2c 100644
--- a/docs/integrations/sources/zendesk-chat.md
+++ b/docs/integrations/sources/zendesk-chat.md
@@ -80,6 +80,7 @@ The connector is restricted by Zendesk's [requests limitation](https://developer
| Version | Date | Pull Request | Subject |
| :------ | :--------- | :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------------------- |
+| 0.2.0 | 2023-10-11 | [30526](https://github.com/airbytehq/airbyte/pull/30526) | Use the python connector base image, remove dockerfile and implement build_customization.py |
| 0.1.14 | 2023-02-10 | [24190](https://github.com/airbytehq/airbyte/pull/24190) | Fix remove too high min/max from account stream |
| 0.1.13 | 2023-02-10 | [22819](https://github.com/airbytehq/airbyte/pull/22819) | Specified date formatting in specification |
| 0.1.12 | 2023-01-27 | [22026](https://github.com/airbytehq/airbyte/pull/22026) | Set `AvailabilityStrategy` for streams explicitly to `None` |
diff --git a/docs/snowflake-native-apps/facebook-marketing.md b/docs/snowflake-native-apps/facebook-marketing.md
new file mode 100644
index 000000000000..f461d3cdf2df
--- /dev/null
+++ b/docs/snowflake-native-apps/facebook-marketing.md
@@ -0,0 +1,207 @@
+# Facebook Marketing Connector
+
+The Facebook Marketing Connector by Airbyte is a Snowflake Native Application that allows you to extract data from your Facebook Marketing account and load records into a Snowflake database of your choice.
+
+:::info
+The Snowflake Native Apps platform is new and rapidly evolving. The Facebook Marketing Connector by Airbyte is in _private preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a [full table refresh](../understanding-airbyte/connections/full-refresh-overwrite.md) without deduplication is supported.
+:::
+
+# Getting started
+
+## Prerequisites
+A Facebook Marketing account with permission to access data from accounts you want to sync.
+
+## Installing the App
+
+:::warning
+Do not refresh the Apps page while the application is being installed. This may cause installation to fail.
+:::
+
+1. Log into your Snowflake account.
+2. On the left sidebar, click `Marketplace`.
+3. Search for `Facebook Marketing Connector` by Airbyte or navigate to https://app.snowflake.com/marketplace/listing/GZTYZ9BCRT8/airbyte-facebook-marketing-connector-by-airbyte
+4. Click `Request`. This will send a request that we will manually service as soon as we can.
+5. On the left sidebar, click `Apps`.
+6. Under the `Recently Shared with You` section, you should see the `Facebook Marketing Connector by Airbyte`. Click `Get`.
+7. Expand `Options`.
+ 1. You can rename the application or leave the default. This is how you will reference the application from a worksheet.
+ 2. Specify the warehouse that the application will be installed to.
+8. Click `Get`.
+9. Wait for the application to install. Once complete, the pop-up window should automatically close.
+
+You should now see the Facebook Marketing Connector by Airbyte application under `Installed Apps`. You may need to refresh the page.
+
+## Facebook Marketing Account
+In order for the Facebook Marketing Connector by Airbyte to query Facebook's APIs, you will need an account with the right permissions. Please follow the [Facebook Marketing authentication guide](https://docs.airbyte.com/integrations/sources/facebook-marketing#for-airbyte-open-source-generate-an-access-token-and-request-a-rate-limit-increase) for further information.
+
+## Snowflake Native App Authorizations
+
+:::note
+By default, the app will be installed using the name `AIRBYTE_FACEBOOK_MARKETING`, but if you renamed the app during installation, you will have to use that name as a reference.
+:::
+
+1. Create the database that will house the authorization secret. This database can be different from the database where the sync will output records.
+```
+CREATE DATABASE <database_name>;
+USE <database_name>;
+```
+
+2. The native app will validate the output database and create it if it does not exist. To do so, the app needs permission to create databases on your account:
+```
+GRANT CREATE DATABASE ON ACCOUNT TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+```
+
+3. You will need to allow outgoing network traffic based on the domain of the source. In the case of Facebook Marketing, simply run:
+```
+CREATE OR REPLACE NETWORK RULE facebook_marketing_apis_network_rule
+ MODE = EGRESS
+ TYPE = HOST_PORT
+ VALUE_LIST = ('graph.facebook.com');
+```
+
+:::note
+As of 2023-09-13, the [Snowflake documentation](https://docs.snowflake.com/en/sql-reference/sql/create-external-access-integration) mentions that direct external access is a preview feature and that it is `available to all accounts on AWS` which might restrict the number of users able to use the connector.
+:::
+
+4. Once you have external access configured, you need to define your authorization/authentication. Provide the credentials to the app as follows:
+```
+CREATE OR REPLACE SECRET integration_facebook_marketing_oauth
+ TYPE = GENERIC_STRING
+ SECRET_STRING = '{
+ "access_token": ""
+ }';
+```
+... where `access_token` is a valid access token for the Facebook Marketing API. For more information, see the [Facebook Marketing authentication guide](https://docs.airbyte.com/integrations/sources/facebook-marketing#for-airbyte-open-source-generate-an-access-token-and-request-a-rate-limit-increase).
+
+5. Once the network rule and the secret are defined in Snowflake, you need to make them available to the app by using an external access integration.
+```
+CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION integration_facebook_marketing
+ ALLOWED_NETWORK_RULES = (facebook_marketing_apis_network_rule)
+ ALLOWED_AUTHENTICATION_SECRETS = (integration_facebook_marketing_oauth)
+ ENABLED = true;
+```
+
+6. Grant permission for the app to access the integration.
+```
+GRANT USAGE ON INTEGRATION integration_facebook_marketing TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+```
+
+7. Grant permissions for the app to access the database that houses the secret and read the secret.
+```
+GRANT USAGE ON DATABASE <database_name> TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+GRANT USAGE ON SCHEMA <database_name>.<schema_name> TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+GRANT READ ON SECRET integration_facebook_marketing_oauth TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+```
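+
+As a worked illustration, here is what steps 6 and 7 might look like with hypothetical identifiers (`AIRBYTE_AUTH_DB` and its `PUBLIC` schema stand in for the database and schema you created in step 1):
+```
+-- Hypothetical names; substitute your own database and schema
+GRANT USAGE ON INTEGRATION integration_facebook_marketing TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+GRANT USAGE ON DATABASE AIRBYTE_AUTH_DB TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+GRANT USAGE ON SCHEMA AIRBYTE_AUTH_DB.PUBLIC TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+GRANT READ ON SECRET AIRBYTE_AUTH_DB.PUBLIC.integration_facebook_marketing_oauth TO APPLICATION AIRBYTE_FACEBOOK_MARKETING;
+```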
+
+
+## Configure a connection
+Once this is all set up, you can configure a connection. To do so, use the Streamlit app by going to the `Apps` section and selecting `AIRBYTE_FACEBOOK_MARKETING`. You will have to accept the Anaconda terms in order to use Streamlit.
+
+Once you have access to the app, select `New Connection` and fill in the following fields:
+
+---
+
+`Secret`
+
+The fully qualified name of the secret, prefixed by the database and schema that contain it. Based on the previous steps: `<database_name>.<schema_name>.integration_facebook_marketing_oauth`.
+
+---
+
+`External Access Integration`
+
+Name of the Snowflake integration where the secret and network rules are configured. Based on the previous steps: `integration_facebook_marketing`.
+
+---
+
+`account_id`
+
+The Facebook Ad account ID to use when pulling data from the Facebook Marketing API. The Ad account ID number is in the account dropdown menu or in your browser's address bar of your [Meta Ads Manager](https://adsmanager.facebook.com/adsmanager/).
+
+---
+
+`start_date`
+
+UTC date in the format 2021-09-29T12:13:14Z. Any data before this date will not be replicated.
+
+---
+
+`end_date`
+
+UTC date in the format 2021-09-29T12:13:14Z. Any data after this date will not be replicated.
+
+---
+
+`include_deleted`
+
+The Facebook Marketing API does not delete records outright; Campaigns, Ads, and Ad Sets that are archived or deleted on the Facebook platform are retained. Enabling this setting includes those archived and deleted objects in the extracted data.
+
+---
+
+`fetch_thumbnail_images`
+
+When extracting Ad Creatives, retrieve the thumbnail_url and store it as thumbnail_data_url in each record.
+
+---
+
+`custom_insights`
+
+Custom insights allow you to define ad statistic entries representing the performance of your campaigns against specific metrics. For more information about how to configure custom insights, please refer to the [Facebook Marketing documentation](https://docs.airbyte.com/integrations/sources/facebook-marketing#set-up-facebook-marketing-as-a-source-in-airbyte).
+
+---
+
+`page_size`
+
+The number of records per page for paginated responses. The default is 100, but most users should not need to set this field except for unique use cases that require tuning the settings.
+
+---
+
+`insights_lookback_window`
+
+The window in days to revisit data during syncing to capture updated conversion data from the API. Facebook allows for attribution windows of up to 28 days, during which time a conversion can be attributed to an ad. If you have set a custom attribution window in your Facebook account, please set the same value here.
+
+---
+
+`Output Database`
+
+The database where the records will be saved. Snowflake's database [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here.
+
+---
+
+`Output Schema`
+
+The schema where the tables will be saved. Snowflake's schema [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here.
+
+---
+
+`Connection name`
+
+How the connection will be referred to in the Streamlit app.
+
+---
+
+`Replication Frequency`
+
+How often records are fetched.
+
+---
+
+## Enabling Logging and Event Sharing for an Application
+Sharing the logging and telemetry data of your installed application helps us improve the application and allows us to better triage problems that you run into. To configure your application for logging and telemetry data, please refer to the documentation for [Enabling Logging and Event Sharing](event-sharing.md).
+
+## Run a sync
+Once a connection is configured, go to `Connections List` and click `Sync Now` for the connection you want to sync. Once the sync is complete, you should be able to validate that the records have been stored in `<output_database>.<output_schema>`.
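+
+For example, assuming a hypothetical output database `AIRBYTE_OUTPUT` and schema `FACEBOOK`, you could spot-check the synced records from a worksheet with a query like the following (table names generally mirror the stream names listed below, but verify them in your own account):
+```
+-- Hypothetical database, schema, and table names
+SELECT COUNT(*) FROM AIRBYTE_OUTPUT.FACEBOOK.CAMPAIGNS;
+```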
+
+### Supported Streams
+As of now, all supported streams perform a full refresh. Incremental syncs are not yet supported. Here is the list of supported streams:
+* Activities
+* Ad Account
+* Ad Creatives
+* Ad Insights
+* Ad Sets
+* Ads
+* Campaigns
+* Custom Audiences
+* Custom Conversions
+
+# Contact Us
+snowflake-native-apps@airbyte.io
diff --git a/docs/snowflake-native-apps/linkedin-ads.md b/docs/snowflake-native-apps/linkedin-ads.md
index d9c0b33a0f7a..7cc1cf3a1ae0 100644
--- a/docs/snowflake-native-apps/linkedin-ads.md
+++ b/docs/snowflake-native-apps/linkedin-ads.md
@@ -3,7 +3,7 @@
The LinkedIn Ads Connector by Airbyte is a Snowflake Native Application that allows you to extract data from your LinkedIn Ads account and load records into a Snowflake database of your choice.
:::info
-The Snowflake Native Apps platform is new and rapidly evolving. The LinkedIn Ads Connector by Airbyte is in _private preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a full table refresh without dedupe is supported.
+The Snowflake Native Apps platform is new and rapidly evolving. The LinkedIn Ads Connector by Airbyte is in _private preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a [full table refresh](../understanding-airbyte/connections/full-refresh-overwrite.md) without deduplication is supported.
:::
# Getting started
@@ -96,6 +96,12 @@ GRANT USAGE ON SCHEMA TO APPLICATION AIRBYTE_LINKEDIN_ADS;
GRANT READ ON SECRET integration_linkedin_ads_oauth TO APPLICATION AIRBYTE_LINKEDIN_ADS;
```
+8. Grant permissions for the app to create a warehouse on which to execute sync tasks, and to execute tasks.
+```
+GRANT CREATE WAREHOUSE ON ACCOUNT TO APPLICATION AIRBYTE_LINKEDIN_ADS;
+GRANT EXECUTE TASK ON ACCOUNT TO APPLICATION AIRBYTE_LINKEDIN_ADS;
+```
+
## Configure a connection
Once this is all set up, you can now configure a connection. To do so, use the Streamlit app by going in the `Apps` section and selecting `AIRBYTE_LINKEDIN_ADS`. You will have to accept the Anaconda terms in order to use Streamlit.
@@ -132,13 +138,13 @@ Leave empty, if you want to pull the data from all associated accounts. To speci
`Output Database`
-The database where the records will be saved. Snowflake's database naming restriction applies here.
+The database where the records will be saved. Snowflake's database [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here.
---
`Output Schema`
-The table where the schema will be saved. Snowflake's table naming restriction applies here.
+The schema where the tables will be saved. Snowflake's schema [naming convention](https://docs.snowflake.com/en/sql-reference/identifiers-syntax) applies here.
---
@@ -154,6 +160,9 @@ How often records are fetched.
---
+## Enabling Logging and Event Sharing for an Application
+Sharing the logging and telemetry data of your installed application helps us improve the application and allows us to better triage problems that you run into. To configure your application for logging and telemetry data, please refer to the documentation for [Enabling Logging and Event Sharing](event-sharing.md).
+
## Run a sync
Once a connection is configured, go in `Connections List` and click on `Sync Now` for the connection you want to sync. Once the sync is complete, you should be able to validate that the records have been stored in `.`
diff --git a/docusaurus/src/theme/TOCItems/index.js b/docusaurus/src/theme/TOCItems/index.js
index 19c94f110e74..db6c6d1733cd 100644
--- a/docusaurus/src/theme/TOCItems/index.js
+++ b/docusaurus/src/theme/TOCItems/index.js
@@ -5,6 +5,23 @@ import { faThumbsDown, faThumbsUp } from '@fortawesome/free-regular-svg-icons';
import styles from "./TOCItemsWrapper.module.css";
+function showDocSurvey() {
+ const showSurvey = () => {
+ chmln.show("6525a7ef3f4f150011627c9f");
+ }
+
+ if (!window.chmln) {
+ // Initialize Chameleon if it's not loaded already
+ !function(d,w){var t="SaG54hxuMI4CDIZa2yBv4lX1NHVB0jQBNTORqyAN2p2tE4-1OtIxC-DS9ywbXXIr2TPyYr",c="chmln",i=d.createElement("script");if(w[c]||(w[c]={}),!w[c].root){w[c].accountToken=t,w[c].location=w.location.href.toString(),w[c].now=new Date,w[c].fastUrl='https://fast.chameleon.io/';var m="identify alias track clear set show on off custom help _data".split(" ");for(var s=0;s {
@@ -15,6 +32,9 @@ export default function TOCItemsWrapper(props) {
vote,
});
setVoted(true);
+ if (vote === "down") {
+ showDocSurvey();
+ }
};
return (
<>