diff --git a/airbyte-integrations/connectors/source-kintone/README.md b/airbyte-integrations/connectors/source-kintone/README.md new file mode 100644 index 000000000000..cab11197632a --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/README.md @@ -0,0 +1,166 @@ +# Kintone Source + +This is the repository for the Kintone source connector, written in Python. +For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/kintone). + +## Local development + +### Prerequisites +**To iterate on this connector, make sure to complete this prerequisites section.** + +#### Minimum Python version required `= 3.9.0` + +#### Activate Virtual Environment and install dependencies +From this connector directory, create a virtual environment: +``` +python -m venv .venv +``` + +This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your +development environment of choice. To activate it from the terminal, run: +``` +source .venv/bin/activate +pip install -r requirements.txt +pip install '.[tests]' +``` +If you are in an IDE, follow your IDE's instructions to activate the virtualenv. + +Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is +used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. +If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything +should work as you expect. + +#### Create credentials +**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/kintone) +to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_kintone/spec.yaml` file. 
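Based on the required fields declared in `source_kintone/spec.yaml` (`subdomain`, `api_token`, `app_ids`), a minimal `secrets/config.json` might look like the following — all values here are placeholders:

```json
{
  "subdomain": "example",
  "api_token": "YOUR_KINTONE_API_TOKEN",
  "app_ids": ["27"]
}
```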
+Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. +See `integration_tests/sample_config.json` for a sample config file. + +**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source kintone test creds` +and place them into `secrets/config.json`. + +### Locally running the connector +``` +python main.py spec +python main.py check --config secrets/config.json +python main.py discover --config secrets/config.json +python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json +``` + +### Locally running the connector docker image + +#### Use `airbyte-ci` to build your connector +The Airbyte way of building this connector is to use our `airbyte-ci` tool. +You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1). +Then running the following command will build your connector: + +```bash +airbyte-ci connectors --name source-kintone build +``` +Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-kintone:dev`. + +##### Customizing our build process +When contributing on our connector you might need to customize the build process to add a system dependency or set an env var. +You can customize our build process by adding a `build_customization.py` module to your connector. +This module should contain a `pre_connector_install` and `post_connector_install` async function that will mutate the base image and the connector container respectively. +It will be imported at runtime by our build process and the functions will be called if they exist. 
+ +Here is an example of a `build_customization.py` module: +```python +from __future__ import annotations + +from typing import TYPE_CHECKING + +if TYPE_CHECKING: + # Feel free to check the dagger documentation for more information on the Container object and its methods. + # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/ + from dagger import Container + + +async def pre_connector_install(base_image_container: Container) -> Container: + return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value") + +async def post_connector_install(connector_container: Container) -> Container: + return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value") +``` + +#### Build your own connector image +This connector is built using our dynamic build process in `airbyte-ci`. +The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`. +The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py). +It does not rely on a Dockerfile. + +If you would like to patch our connector and build your own, a simple approach would be to: + +1. Create your own Dockerfile based on the latest version of the connector image. +```Dockerfile +FROM airbyte/source-kintone:latest + +COPY . ./airbyte/integration_code +RUN pip install ./airbyte/integration_code + +# The entrypoint and default env vars are already set in the base image +# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py" +# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"] +``` +Please use this as an example. This is not optimized. + +2. Build your image: +```bash +docker build -t airbyte/source-kintone:dev . 
# Running the spec command against your patched connector +docker run airbyte/source-kintone:dev spec +``` + +#### Run +Then run any of the connector commands as follows: +``` +docker run --rm airbyte/source-kintone:dev spec +docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kintone:dev check --config /secrets/config.json +docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kintone:dev discover --config /secrets/config.json +docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-kintone:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json +``` +## Testing +Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named. +First install test dependencies into your virtual environment: +``` +pip install .[tests] +``` +### Unit Tests +To run unit tests locally, from the connector directory run: +``` +python -m pytest unit_tests +``` + +### Integration Tests +There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector). +#### Custom Integration tests +Place custom tests inside the `integration_tests/` folder, then, from the connector root, run +``` +python -m pytest integration_tests +``` + +### Acceptance Tests +Customize the `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information. +If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`. 
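As a sketch of what such a fixture could look like — the resource names below are hypothetical placeholders, and in the real `acceptance.py` this generator would be decorated with `@pytest.fixture(scope="session", autouse=True)`:

```python
def connector_setup():
    # Setup: provision whatever test data the acceptance suite needs
    # (e.g. seeding records in a dedicated Kintone test app -- hypothetical here).
    resources = {"record_ids": ["1", "2"]}
    yield resources
    # Teardown: clean the test data up again once the session ends.
    resources["record_ids"].clear()
```

pytest runs the code before `yield` once at session start and the code after it during teardown.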
+Please run acceptance tests via [airbyte-ci](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#connectors-test-command): +```bash +airbyte-ci connectors --name source-kintone test +``` + +## Dependency Management +All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. +We split dependencies between two groups, dependencies that are: +* required for your connector to work, which go in the `MAIN_REQUIREMENTS` list. +* required for testing, which go in the `TEST_REQUIREMENTS` list. + +### Publishing a new version of the connector +You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? +1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-kintone test` +2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors). +3. Make sure the `metadata.yaml` content is up to date. +4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/kintone.md`). +5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention). +6. Pat yourself on the back for being an awesome contributor. +7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. 
+ diff --git a/airbyte-integrations/connectors/source-kintone/acceptance-test-config.yml b/airbyte-integrations/connectors/source-kintone/acceptance-test-config.yml new file mode 100644 index 000000000000..e68ca58194e3 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/acceptance-test-config.yml @@ -0,0 +1,39 @@ +# See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) +# for more information about how to configure these tests +connector_image: airbyte/source-kintone:dev +acceptance_tests: + spec: + tests: + - spec_path: "source_kintone/spec.yaml" + connection: + tests: + - config_path: "secrets/config.json" + status: "succeed" + - config_path: "integration_tests/invalid_config.json" + status: "failed" + discovery: + tests: + - config_path: "secrets/config.json" + basic_read: + tests: + - config_path: "secrets/config.json" + configured_catalog_path: "integration_tests/configured_catalog.json" + empty_streams: [] +# TODO uncomment this block to specify that the tests should assert the connector outputs the records provided in the input file +# expect_records: +# path: "integration_tests/expected_records.jsonl" +# extra_fields: no +# exact_order: no +# extra_records: yes + incremental: + bypass_reason: "This connector does not implement incremental sync" +# TODO uncomment this block if your connector implements incremental sync: +# tests: +# - config_path: "secrets/config.json" +# configured_catalog_path: "integration_tests/configured_catalog.json" +# future_state: +# future_state_path: "integration_tests/abnormal_state.json" + full_refresh: + tests: + - config_path: "secrets/config.json" + configured_catalog_path: "integration_tests/configured_catalog.json" diff --git a/airbyte-integrations/connectors/source-kintone/icon.png b/airbyte-integrations/connectors/source-kintone/icon.png new file mode 100644 index 000000000000..f591d7d54609 Binary files 
/dev/null and b/airbyte-integrations/connectors/source-kintone/icon.png differ diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/__init__.py b/airbyte-integrations/connectors/source-kintone/integration_tests/__init__.py new file mode 100644 index 000000000000..c941b3045795 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/__init__.py @@ -0,0 +1,3 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. +# diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/abnormal_state.json b/airbyte-integrations/connectors/source-kintone/integration_tests/abnormal_state.json new file mode 100644 index 000000000000..52b0f2c2118f --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/abnormal_state.json @@ -0,0 +1,5 @@ +{ + "todo-stream-name": { + "todo-field-name": "todo-abnormal-value" + } +} diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/acceptance.py b/airbyte-integrations/connectors/source-kintone/integration_tests/acceptance.py new file mode 100644 index 000000000000..43ce950d77ca --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/acceptance.py @@ -0,0 +1,16 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. 
+# + + +import pytest + +pytest_plugins = ("connector_acceptance_test.plugin",) + + +@pytest.fixture(scope="session", autouse=True) +def connector_setup(): + """This fixture is a placeholder for external resources that acceptance test might require.""" + # TODO: setup test dependencies + yield + # TODO: clean up test dependencies diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json b/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json new file mode 100644 index 000000000000..8903e8375a42 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json @@ -0,0 +1,13 @@ +{ + "streams": [ + { + "stream": { + "name": "AirbyteDemo", + "json_schema": {}, + "supported_sync_modes": ["full_refresh"] + }, + "sync_mode": "full_refresh", + "destination_sync_mode": "overwrite" + } + ] +} diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json.origin b/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json.origin new file mode 100644 index 000000000000..b999c2ba3abf --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/configured_catalog.json.origin @@ -0,0 +1,15 @@ +{ + "streams": [ + { + "stream": { + "name": "table_name", + "json_schema": {}, + "supported_sync_modes": ["full_refresh"], + "source_defined_cursor": false, + "default_cursor_field": ["column_name"] + }, + "sync_mode": "full_refresh", + "destination_sync_mode": "overwrite" + } + ] +} diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/invalid_config.json b/airbyte-integrations/connectors/source-kintone/integration_tests/invalid_config.json new file mode 100644 index 000000000000..f3732995784f --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/invalid_config.json @@ -0,0 +1,3 @@ +{ + "todo-wrong-field": "this should be 
an incomplete config file, used in standard tests" +} diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/sample_config.json b/airbyte-integrations/connectors/source-kintone/integration_tests/sample_config.json new file mode 100644 index 000000000000..ecc4913b84c7 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/sample_config.json @@ -0,0 +1,3 @@ +{ + "fix-me": "TODO" +} diff --git a/airbyte-integrations/connectors/source-kintone/integration_tests/sample_state.json b/airbyte-integrations/connectors/source-kintone/integration_tests/sample_state.json new file mode 100644 index 000000000000..3587e579822d --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/integration_tests/sample_state.json @@ -0,0 +1,5 @@ +{ + "todo-stream-name": { + "todo-field-name": "value" + } +} diff --git a/airbyte-integrations/connectors/source-kintone/main.py b/airbyte-integrations/connectors/source-kintone/main.py new file mode 100644 index 000000000000..3a787fa07f1b --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/main.py @@ -0,0 +1,8 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. +# + +from source_kintone.run import run + +if __name__ == "__main__": + run() diff --git a/airbyte-integrations/connectors/source-kintone/metadata.yaml b/airbyte-integrations/connectors/source-kintone/metadata.yaml new file mode 100644 index 000000000000..4b2954521afd --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/metadata.yaml @@ -0,0 +1,30 @@ +data: + allowedHosts: + hosts: + - TODO # Please change to the hostname of the source. + registries: + oss: + enabled: false + cloud: + enabled: false + connectorBuildOptions: + # Please update to the latest version of the connector base image. + # https://hub.docker.com/r/airbyte/python-connector-base + # Please use the full address with sha256 hash to guarantee build reproducibility. 
+ baseImage: docker.io/airbyte/python-connector-base:1.0.0@sha256:dd17e347fbda94f7c3abff539be298a65af2d7fc27a307d89297df1081a45c27 + connectorSubtype: api + connectorType: source + definitionId: 2573e588-c875-4119-898f-60302f4d4ce4 + dockerImageTag: 0.1.0 + dockerRepository: airbyte/source-kintone + githubIssueLabel: source-kintone + icon: icon.png + license: MIT + name: Kintone + releaseDate: TODO + supportLevel: community + releaseStage: alpha + documentationUrl: https://docs.airbyte.com/integrations/sources/kintone + tags: + - language:python +metadataSpecVersion: "1.0" diff --git a/airbyte-integrations/connectors/source-kintone/requirements.txt b/airbyte-integrations/connectors/source-kintone/requirements.txt new file mode 100644 index 000000000000..7b9114ed5867 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/requirements.txt @@ -0,0 +1,2 @@ +# This file is autogenerated -- only edit if you know what you are doing. Use setup.py for declaring dependencies. +-e . diff --git a/airbyte-integrations/connectors/source-kintone/setup.py b/airbyte-integrations/connectors/source-kintone/setup.py new file mode 100644 index 000000000000..cd03d131be69 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/setup.py @@ -0,0 +1,35 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. 
+# + + +from setuptools import find_packages, setup + +MAIN_REQUIREMENTS = [ + "airbyte-cdk~=0.2", +] + +TEST_REQUIREMENTS = [ + "requests-mock~=1.9.3", + "pytest-mock~=3.6.1", + "pytest~=6.2", + "connector-acceptance-test", +] + +setup( + name="source_kintone", + description="Source implementation for Kintone.", + author="Airbyte", + author_email="contact@airbyte.io", + packages=find_packages(), + install_requires=MAIN_REQUIREMENTS, + package_data={"": ["*.json", "*.yaml"]}, + extras_require={ + "tests": TEST_REQUIREMENTS, + }, + entry_points={ + "console_scripts": [ + "source-kintone=source_kintone.run:run", + ], + }, +) diff --git a/airbyte-integrations/connectors/source-kintone/source_kintone/__init__.py b/airbyte-integrations/connectors/source-kintone/source_kintone/__init__.py new file mode 100644 index 000000000000..0dc2aa2a8499 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/source_kintone/__init__.py @@ -0,0 +1,8 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. 
+# + + +from .source import SourceKintone + +__all__ = ["SourceKintone"] diff --git a/airbyte-integrations/connectors/source-kintone/source_kintone/client.py b/airbyte-integrations/connectors/source-kintone/source_kintone/client.py new file mode 100644 index 000000000000..89dcea3a3e83 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/source_kintone/client.py @@ -0,0 +1,161 @@ +import urllib.request +import urllib.parse +import json + +from airbyte_cdk.logger import AirbyteLogger +from airbyte_cdk.models import ( + AirbyteStream, +) + + +class Client: + def __init__(self, config: json, logger: AirbyteLogger): + self.subdomain = config.get("subdomain") + self.api_token = config.get("api_token") + self.logger = logger + + def _get_base_url(self): + return "https://" + self.subdomain + ".cybozu.com" + + def _get_base_header(self): + return { + "X-Cybozu-API-Token": self.api_token, + } + + def _fetch_data(self, url: str, params: dict, use_json: bool = False): + headers = self._get_base_header() + if use_json: + headers["Content-Type"] = "application/json" + req = urllib.request.Request( + url=url, + headers=headers, + data=json.dumps(params).encode("utf-8"), + method="GET", + ) + else: + url = update_url(url, params) + req = urllib.request.Request( + url=url, + headers=headers, + method="GET", + ) + + return json.load(urllib.request.urlopen(req)) + + def _post_data(self, url: str, params: dict): + headers = self._get_base_header() + headers["Content-Type"] = "application/json" + req = urllib.request.Request( + url=url, + headers=headers, + data=json.dumps(params).encode("utf-8"), + method="POST", + ) + return json.load(urllib.request.urlopen(req)) + + def _delete_data(self, url: str, params: dict): + headers = self._get_base_header() + headers["Content-Type"] = "application/json" + req = urllib.request.Request( + url=url, + headers=headers, + data=json.dumps(params).encode("utf-8"), + method="DELETE", + ) + return json.load(urllib.request.urlopen(req)) + + # ref. 
https://kintone.dev/en/docs/kintone/rest-api/apps/get-app/ + def get_app(self, app_id: str): + url = self._get_base_url() + "/k/v1/app.json" + params = {"id": app_id} + return self._fetch_data(url, params) + + # ref. https://kintone.dev/en/docs/kintone/rest-api/apps/get-form-fields/ + def get_app_fields(self, app_id: str): + url = self._get_base_url() + "/k/v1/app/form/fields.json" + params = {"app": app_id} + res = self._fetch_data(url, params) + fields = res["properties"] + # Skip kintone's built-in process management fields: Category, Assignee, Status. + without_fields = ["カテゴリー", "作業者", "ステータス"] + return {x: fields[x] for x in fields if x not in without_fields} + + # ref. https://kintone.dev/en/docs/kintone/rest-api/records/add-cursor/ + def create_cursor(self, app_id: str, cursor_field: str = None, cursor_value: str = None): + params = { + "app": app_id, + "size": 500, + } + # https://cybozu.dev/ja/kintone/docs/overview/query/ + if cursor_field and cursor_value: + params["query"] = f"{cursor_field} > \"{cursor_value}\"" + url = self._get_base_url() + "/k/v1/records/cursor.json" + return self._post_data(url, params) + + def get_app_records(self, app_id: str, cursor_field: str = None, cursor_value: str = None): + cursor = self.create_cursor(app_id, cursor_field, cursor_value) + while True: + records, next_cursor = self.get_records_by_cursor(cursor["id"]) + for record in records: + yield record + if not next_cursor: + self.delete_cursor(cursor["id"]) + break + + # ref. https://kintone.dev/en/docs/kintone/rest-api/records/get-cursor/ + def get_records_by_cursor(self, cursor_id: str): + params = { + "id": cursor_id, + } + url = self._get_base_url() + "/k/v1/records/cursor.json" + res = self._fetch_data(url, params, True) + records = [] + for record in res["records"]: + item = {} + for k, v in record.items(): + item[k] = v["value"] + + records.append(item) + + return records, res["next"] + + # ref. 
https://kintone.dev/en/docs/kintone/rest-api/records/delete-cursor/ + def delete_cursor(self, cursor_id: str): + params = { + "id": cursor_id, + } + url = self._get_base_url() + "/k/v1/records/cursor.json" + return self._delete_data(url, params) + + def get_stream(self, app_id: str): + app = self.get_app(app_id) + # https://jp.cybozu.help/k/en/user/app_settings/app_othersettings/appcode.html + if not app.get('code'): + return None + + json_schema = { + "$schema": "http://json-schema.org/draft-07/schema#", + "type": "object", + "properties": {}, + } + + fields = self.get_app_fields(app_id) + for k, _ in fields.items(): + # TODO: Implement more accurate type casting + # https://kintone.dev/en/docs/kintone/overview/field-types/ + json_schema["properties"][k] = {"type": "string"} + + return AirbyteStream( + name=app["code"], + json_schema=json_schema, + supported_sync_modes=["full_refresh", "incremental"], + default_cursor_field=["updated_at"], + ) + +def update_url(url, params): + url_parts = urllib.parse.urlparse(url) + query = dict(urllib.parse.parse_qsl(url_parts.query)) + query.update(params) + return url_parts._replace(query=urllib.parse.urlencode(query)).geturl() diff --git a/airbyte-integrations/connectors/source-kintone/source_kintone/run.py b/airbyte-integrations/connectors/source-kintone/source_kintone/run.py new file mode 100644 index 000000000000..fc8252d8d88f --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/source_kintone/run.py @@ -0,0 +1,13 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. 
+# + + +import sys + +from airbyte_cdk.entrypoint import launch +from .source import SourceKintone + +def run(): + source = SourceKintone() + launch(source, sys.argv[1:]) diff --git a/airbyte-integrations/connectors/source-kintone/source_kintone/source.py b/airbyte-integrations/connectors/source-kintone/source_kintone/source.py new file mode 100644 index 000000000000..a8751598ddd2 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/source_kintone/source.py @@ -0,0 +1,157 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. +# + + +import json +from datetime import datetime +from typing import Dict, Generator + +from airbyte_cdk.logger import AirbyteLogger +from airbyte_cdk.models import ( + AirbyteCatalog, + AirbyteConnectionStatus, + AirbyteMessage, + AirbyteRecordMessage, + AirbyteStateMessage, + AirbyteStream, + ConfiguredAirbyteCatalog, + Status, + SyncMode, + Type, +) +from airbyte_cdk.sources import Source +from .client import Client + + +class SourceKintone(Source): + def check(self, logger: AirbyteLogger, config: json) -> AirbyteConnectionStatus: + """ + Tests if the input configuration can be used to successfully connect to the integration + e.g: if a provided Stripe API token can be used to connect to the Stripe API. 
+ + :param logger: Logging object to display debug/info/error to the logs + (logs will not be accessible via airbyte UI if they are not passed to this logger) + :param config: Json object containing the configuration of this source, content of this json is as specified in + the properties of the spec.yaml file + + :return: AirbyteConnectionStatus indicating a Success or Failure + """ + client = Client(config, logger) + for app_id in config.get("app_ids"): + app = client.get_app(app_id) + if not app.get('code'): + return AirbyteConnectionStatus( + status=Status.FAILED, + message=f"App code not found for app_id={app_id}.\n" + f"https://jp.cybozu.help/k/en/user/app_settings/app_othersettings/appcode.html" + ) + + return AirbyteConnectionStatus(status=Status.SUCCEEDED) + + def discover(self, logger: AirbyteLogger, config: json) -> AirbyteCatalog: + """ + Returns an AirbyteCatalog representing the available streams and fields in this integration. + For example, given valid credentials to a Postgres database, + returns an Airbyte catalog where each Postgres table is a stream, and each table column is a field. + + :param logger: Logging object to display debug/info/error to the logs + (logs will not be accessible via airbyte UI if they are not passed to this logger) + :param config: Json object containing the configuration of this source, content of this json is as specified in + the properties of the spec.yaml file + + :return: AirbyteCatalog is an object describing a list of all available streams in this source. 
+ A stream is an AirbyteStream object that includes: + - its stream name (or table name in the case of Postgres) + - json_schema providing the specifications of expected schema for this stream (a list of columns described + by their names and types) + """ + streams = [] + + client = Client(config, logger) + for app_id in config.get("app_ids"): + stream = client.get_stream(app_id) + if stream: + streams.append(stream) + + return AirbyteCatalog(streams=streams) + + def read( + self, logger: AirbyteLogger, config: json, catalog: ConfiguredAirbyteCatalog, state: Dict[str, any] + ) -> Generator[AirbyteMessage, None, None]: + """ + Returns a generator of the AirbyteMessages generated by reading the source with the given configuration, + catalog, and state. + + :param logger: Logging object to display debug/info/error to the logs + (logs will not be accessible via airbyte UI if they are not passed to this logger) + :param config: Json object containing the configuration of this source, content of this json is as specified in + the properties of the spec.yaml file + :param catalog: The input catalog is a ConfiguredAirbyteCatalog which is almost the same as the AirbyteCatalog + returned by discover(), but + in addition, it's been configured in the UI! For each particular stream and field, extra modifications + may have been applied, such as: filtering streams and/or columns out, renaming some entities, etc. + :param state: When Airbyte reads data from a source, it might need to keep a checkpoint cursor to resume + replication in the future from that saved checkpoint. + This is the object that carries state from previous runs and avoids replicating the entire set of + data every time. + + :return: A generator that produces a stream of AirbyteRecordMessage contained in AirbyteMessage objects. 
+ """ + logger.info(f"state: {state}") + + client = Client(config, logger) + apps = {} + for app_id in config.get("app_ids"): + app = client.get_app(app_id) + apps[app["code"]] = app + + cursor_field, cursor_value = None, None + for configured_stream in catalog.streams: + stream_name = configured_stream.stream.name + app = apps.get(stream_name) + if not app: + logger.info(f"stream '{stream_name}' app code is not found") + continue + + if configured_stream.sync_mode == SyncMode.full_refresh: + logger.info( + f"syncing stream '{stream_name}' with full_refresh") + + elif configured_stream.sync_mode == SyncMode.incremental: + logger.info(f"syncing stream '{stream_name}' with incremental") + if stream_name not in state: + state[stream_name] = {} + + cursor_field = configured_stream.cursor_field[0] + cursor_value = state[stream_name].get(cursor_field, None) + + else: + logger.error( + f"could not sync stream '{configured_stream.stream.name}', invalid sync_mode: {configured_stream.sync_mode}") + + records = client.get_app_records(app["appId"], cursor_field, cursor_value) + for record in records: + cursor_value = record.get(cursor_field, cursor_value) + yield AirbyteMessage( + type=Type.RECORD, + record=AirbyteRecordMessage( + stream=app["code"], + data=record, + emitted_at=self._find_emitted_at(), + )) + state[stream_name][cursor_field] = cursor_value + + yield AirbyteMessage( + type=Type.STATE, + state=AirbyteStateMessage( + data=state, + emitted_at=self._find_emitted_at(), + ), + ) + + logger.info(f"Finished syncing {self.__class__.__name__}") + + + def _find_emitted_at(self) -> int: + # Returns now, in microseconds. This is a seperate function, so that it is easy + # to replace in unit tests. 
+ return int(datetime.now().timestamp() * 1000) diff --git a/airbyte-integrations/connectors/source-kintone/source_kintone/spec.yaml b/airbyte-integrations/connectors/source-kintone/source_kintone/spec.yaml new file mode 100644 index 000000000000..b71f84ec2f86 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/source_kintone/spec.yaml @@ -0,0 +1,32 @@ +documentationUrl: https://docs.airbyte.com/integrations/sources/kintone +connectionSpecification: + $schema: http://json-schema.org/draft-07/schema# + title: Kintone Spec + type: object + required: + - subdomain + - api_token + - app_ids + properties: + subdomain: + type: string + description: >- + A subdomain is represented by the part of the URL that is used to access kintone on a Web browser. + For example, in the access URL "https://example.cybozu.com/", the subdomain is "example". + https://jp.cybozu.help/k/en/glossary/subdomain.html + api_token: + type: string + description: >- + API tokens are used for authenticating kintone REST API requests sent from external programs. + Specifying an API token in the "X-Cybozu-API-Token" header of your REST API request authenticates the request, and the API is executed. You can also specify multiple API tokens with commas in between. + https://jp.cybozu.help/k/en/user/app_settings/api_token.html + airbyte_secret: true + app_ids: + type: array + items: + type: string + description: >- + Open the Kintone application screen and check the URL. + The trailing number displayed after "https://[subdomain].cybozu.com/k/" is the app ID. + e.g. 
https://[subdomain].cybozu.com/k/27 + https://support.kincone.com/hc/ja/articles/4411880749081-kintone%E9%80%A3%E6%90%BA-%E3%82%A2%E3%83%97%E3%83%AAID%E3%81%AE%E7%A2%BA%E8%AA%8D diff --git a/airbyte-integrations/connectors/source-kintone/unit_tests/unit_test.py b/airbyte-integrations/connectors/source-kintone/unit_tests/unit_test.py new file mode 100644 index 000000000000..219ae0142c72 --- /dev/null +++ b/airbyte-integrations/connectors/source-kintone/unit_tests/unit_test.py @@ -0,0 +1,7 @@ +# +# Copyright (c) 2023 Airbyte, Inc., all rights reserved. +# + + +def test_example_method(): + assert True
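To tie the spec back to the client code above: `subdomain` determines the request base URL and `api_token` is sent as the `X-Cybozu-API-Token` header. A minimal sketch of that mapping, with placeholder values and no network call:

```python
import urllib.parse

config = {
    "subdomain": "example",     # -> https://example.cybozu.com
    "api_token": "YOUR_TOKEN",  # sent as the X-Cybozu-API-Token header
    "app_ids": ["27"],
}

base_url = "https://" + config["subdomain"] + ".cybozu.com"
headers = {"X-Cybozu-API-Token": config["api_token"]}
# Equivalent of Client.get_app's URL for the first configured app id:
url = base_url + "/k/v1/app.json?" + urllib.parse.urlencode({"id": config["app_ids"][0]})
print(url)  # https://example.cybozu.com/k/v1/app.json?id=27
```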