destination-sqs: Replace AirbyteLogger with logging.Logger #38239

Closed · wants to merge 4 commits
38 changes: 0 additions & 38 deletions airbyte-integrations/connectors/destination-amazon-sqs/Dockerfile

This file was deleted.

62 changes: 55 additions & 7 deletions airbyte-integrations/connectors/destination-amazon-sqs/README.md
@@ -55,22 +55,70 @@ python main.py read --config secrets/config.json --catalog integration_tests/con

### Locally running the connector docker image

#### Use `airbyte-ci` to build your connector
The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then running the following command will build your connector:

```bash
airbyte-ci connectors --name destination-amazon-sqs build
```
Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-amazon-sqs:dev`.

##### Customizing our build process
When contributing to this connector you might need to customize the build process, for example to add a system dependency or set an environment variable.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container, respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```

#### Build your own connector image
This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the `metadata.yaml` file under `connectorBuildOptions`.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.
```Dockerfile
FROM airbyte/destination-amazon-sqs:latest

COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code

# The entrypoint and default env vars are already set in the base image
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```
Please treat this Dockerfile as an example only; it is not optimized.

2. Build your image:
```bash
docker build -t airbyte/destination-amazon-sqs:dev .
# Running the spec command against your patched connector
docker run airbyte/destination-amazon-sqs:dev spec
```

#### Run

Then run any of the connector commands as follows:
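The diff collapses the actual command list here. For reference, the standard invocations look like the following sketch, assuming a valid config at `secrets/config.json` and the catalog shipped in `integration_tests/` (check the connector README on master for the exact commands):

```bash
# Print the connector's specification
docker run --rm airbyte/destination-amazon-sqs:dev spec

# Validate the provided config against the spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-amazon-sqs:dev check --config /secrets/config.json

# Write records from stdin to the configured SQS queue
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests \
  airbyte/destination-amazon-sqs:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```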
@@ -113,4 +161,4 @@ You've checked out the repo, implemented a million dollar feature, and you're re
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/amazon-sqs.md`).
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-integrations/connectors/destination-amazon-sqs/destination_amazon_sqs/destination.py

@@ -4,11 +4,11 @@


 import json
+import logging
 from typing import Any, Iterable, Mapping
 from uuid import uuid4
 
 import boto3
-from airbyte_cdk import AirbyteLogger
 from airbyte_cdk.destinations import Destination
 from airbyte_cdk.models import AirbyteConnectionStatus, AirbyteMessage, ConfiguredAirbyteCatalog, Status, Type
 from botocore.exceptions import ClientError
@@ -124,7 +124,7 @@ def write(
             if message.type == Type.STATE:
                 yield message
 
-    def check(self, logger: AirbyteLogger, config: Mapping[str, Any]) -> AirbyteConnectionStatus:
+    def check(self, logger: logging.Logger, config: Mapping[str, Any]) -> AirbyteConnectionStatus:
         try:
             # Required properties
             queue_url = config["queue_url"]
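The net effect of this change is that `check` now accepts any standard library logger instead of the deprecated `AirbyteLogger`. A minimal sketch of how a caller might exercise the new signature — the config keys other than `queue_url` are illustrative placeholders, not the connector's full spec:

```python
import logging

from destination_amazon_sqs import DestinationAmazonSqs

# Any stdlib logger now satisfies the check() signature; AirbyteLogger is no longer needed.
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("airbyte")

# Hypothetical config values, for illustration only.
config = {
    "queue_url": "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue",
    "region": "eu-west-1",
    "access_key": "...",
    "secret_key": "...",
}

# Returns an AirbyteConnectionStatus with status SUCCEEDED or FAILED.
status = DestinationAmazonSqs().check(logger, config)
print(status.status)
```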
airbyte-integrations/connectors/destination-amazon-sqs/unit_tests/unit_test.py

@@ -3,10 +3,10 @@
 #
 
 import json
+import logging
 from typing import Any, Mapping
 
 import pytest
-from airbyte_cdk import AirbyteLogger
 from airbyte_cdk.models import AirbyteStream, ConfiguredAirbyteCatalog, ConfiguredAirbyteStream, DestinationSyncMode, Status, SyncMode
 from destination_amazon_sqs import DestinationAmazonSqs

@@ -37,10 +37,10 @@ def configured_catalog_fixture() -> ConfiguredAirbyteCatalog:
 
 
 def test_check_valid_config(config: Mapping):
-    outcome = DestinationAmazonSqs().check(AirbyteLogger(), config)
+    outcome = DestinationAmazonSqs().check(logging.getLogger("airbyte"), config)
     assert outcome.status == Status.SUCCEEDED
 
 
 def test_check_invalid_config():
-    outcome = DestinationAmazonSqs().check(AirbyteLogger(), {"secret_key": "not_a_real_secret"})
+    outcome = DestinationAmazonSqs().check(logging.getLogger("airbyte"), {"secret_key": "not_a_real_secret"})
     assert outcome.status == Status.FAILED
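To run these updated unit tests locally, something like the following should work — a sketch assuming the connector uses the standard poetry-based setup; older setups may use `pip install '.[tests]'` and plain `pytest` instead:

```bash
cd airbyte-integrations/connectors/destination-amazon-sqs
# Assumes a pyproject.toml with a dev dependency group; adjust if the connector still uses setup.py.
poetry install --with dev
poetry run pytest unit_tests
```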
airbyte-integrations/connectors/destination-amazon-sqs/metadata.yaml

@@ -1,9 +1,15 @@
 data:
+  ab_internal:
+    ql: 200
+    sl: 100
   connectorBuildOptions:
     baseImage: docker.io/airbyte/python-connector-base:1.2.0@sha256:c22a9d97464b69d6ef01898edf3f8612dc11614f05a84984451dde195f337db9
   connectorSubtype: api
   connectorType: destination
   definitionId: 0eeee7fb-518f-4045-bacc-9619e31c43ea
-  dockerImageTag: 0.1.2
+  dockerImageTag: 0.1.3
+  dockerRepository: airbyte/destination-amazon-sqs
+  documentationUrl: https://docs.airbyte.com/integrations/destinations/amazon-sqs
+  githubIssueLabel: destination-amazon-sqs
   icon: awssqs.svg
   license: MIT
@@ -14,14 +20,10 @@ data:
     oss:
       enabled: false
   releaseStage: alpha
-  documentationUrl: https://docs.airbyte.com/integrations/destinations/amazon-sqs
-  tags:
-    - language:python
-    - cdk:python
-  ab_internal:
-    sl: 100
-    ql: 200
   supportLevel: community
+  tags:
+    - language:python
+    - cdk:python
   connectorTestSuitesOptions:
     - suite: unitTests
     - suite: integrationTests
airbyte-integrations/connectors/destination-amazon-sqs/integration_tests/integration_test.py

@@ -3,11 +3,11 @@
 #
 
 import json
+import logging
 import time
 from typing import Any, Mapping
 
 import boto3
-from airbyte_cdk.logger import AirbyteLogger
 from airbyte_cdk.models import AirbyteMessage, ConfiguredAirbyteCatalog, Status
 from destination_amazon_sqs import DestinationAmazonSqs

@@ -88,7 +88,7 @@ def test_check():
     # Create config
     config = create_config(queue_url, queue_region, user["AccessKeyId"], user["SecretAccessKey"], 10)
     # Create logger
-    logger = AirbyteLogger()
+    logger = logging.getLogger("airbyte")
     # Create Destination
     destination = DestinationAmazonSqs()
     # Run check
11 changes: 6 additions & 5 deletions docs/integrations/destinations/amazon-sqs.md
@@ -119,8 +119,9 @@ The output SQS message would contain:

## CHANGELOG

| Version | Date       | Pull Request                                              | Subject                                   |
|:--------|:-----------|:----------------------------------------------------------|:------------------------------------------|
| 0.1.3   | 2024-05-15 | [#38239](https://github.com/airbytehq/airbyte/pull/38239) | Replace AirbyteLogger with logging.Logger |
| 0.1.2   | 2024-03-05 | [#35838](https://github.com/airbytehq/airbyte/pull/35838) | Un-archive connector                      |
| 0.1.1   | 2024-01-03 | [#33924](https://github.com/airbytehq/airbyte/pull/33924) | Add new ap-southeast-3 AWS region         |
| 0.1.0   | 2021-10-27 | [#0000](https://github.com/airbytehq/airbyte/pull/0000)   | Initial version                           |