airbyte-integrations/connectors/destination-aws-datalake/README.md
This is the repository for the Aws Datalake destination connector, written in Python. For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/destinations/aws-datalake).
To iterate on this connector, make sure to complete this prerequisites section.

The minimum Python version required is `3.7.0`.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```
This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal and install dependencies, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py`, install using `pip install -r requirements.txt`, and everything
should work as you expect.
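
In practice, an editable install just means pip builds and installs this package from source via `setup.py`. As a rough illustration (not the exact contents of this connector's `requirements.txt`):

```bash
# Roughly what the editable-install entries in requirements.txt boil down to:
pip install -e .   # installs this connector from source, invoking setup.py
```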
If you are a community contributor, follow the instructions in [the documentation](https://docs.airbyte.com/integrations/destinations/aws-datalake)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_aws_datalake/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `destination aws-datalake test creds`
and place them into `secrets/config.json`.
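
As a quick sanity check, you can validate your config against the connector spec before running anything. This is an optional sketch, assuming the standard Airbyte spec layout (a top-level `connectionSpecification` key) and the `jsonschema` package:

```python
# validate_config.py -- optional helper; run from this connector's directory
import json

from jsonschema import validate  # pip install jsonschema

with open("destination_aws_datalake/spec.json") as f:
    spec = json.load(f)

with open("secrets/config.json") as f:
    config = json.load(f)

# Raises jsonschema.ValidationError if the config does not conform to the spec
validate(instance=config, schema=spec["connectionSpecification"])
print("secrets/config.json conforms to the connector spec")
```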
To run the connector locally from this directory:

```bash
python main.py spec
python main.py check --config secrets/config.json
# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
cat messages.jsonl | python main.py write --config secrets/config.json --catalog integration_tests/configured_catalog.json
```
The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow the install instructions here.
Then running the following command will build your connector:
```bash
airbyte-ci connectors --name destination-aws-datalake build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/destination-aws-datalake:dev`.
When contributing to our connector you might need to customize the build process, for example to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain a `pre_connector_install` and a `post_connector_install` async function that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```
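
If you need a system dependency rather than an env var, a `pre_connector_install` hook could look like the following sketch. It assumes the base image is Debian-based and uses Dagger's `Container.with_exec`; the `jq` package is just an illustrative choice:

```python
async def pre_connector_install(base_image_container: Container) -> Container:
    # Illustrative only: install an extra OS package before the connector itself is installed.
    return await base_image_container.with_exec(
        ["sh", "-c", "apt-get update && apt-get install -y --no-install-recommends jq"]
    )
```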
This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the `metadata.yaml` file under the `connectorBuildOptions`.
The build logic is defined using Dagger here.
It does not rely on a Dockerfile.
If you would like to patch our connector and build your own, a simple approach would be to create your own Dockerfile based on the latest connector image:

```Dockerfile
FROM airbyte/destination-aws-datalake:latest

COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code

# The entrypoint and default env vars are already set in the base image
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```
Please use this as an example. This is not optimized.
Then build the image:

```bash
docker build -t airbyte/destination-aws-datalake:dev .

# Running the spec command against your patched connector
docker run airbyte/destination-aws-datalake:dev spec
```
Then run any of the connector commands as follows:
```bash
docker run --rm airbyte/destination-aws-datalake:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-aws-datalake:dev check --config /secrets/config.json

# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-aws-datalake:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
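
If you need a `messages.jsonl` to exercise the `write` command with, the sketch below writes a single Airbyte `RECORD` message; the stream name and record fields are hypothetical and must match a stream in your configured catalog:

```python
# make_messages.py -- illustrative only; adjust the stream name and data to your catalog
import json
import time

message = {
    "type": "RECORD",
    "record": {
        "stream": "example_stream",        # hypothetical stream name
        "data": {"id": 1, "name": "foo"},  # hypothetical record payload
        "emitted_at": int(time.time() * 1000),
    },
}

with open("messages.jsonl", "w") as f:
    f.write(json.dumps(message) + "\n")
```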
You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=destination-aws-datalake test
```
Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
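
For example, a session-scoped fixture along these lines could provision and tear down external resources around the test run (the fixture name and the resources it manages are hypothetical):

```python
# integration_tests/acceptance.py -- illustrative sketch of a setup/teardown fixture
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Hypothetical: create whatever external resources the acceptance tests need,
    # e.g. a test-only S3 bucket or Glue database.
    yield
    # Hypothetical: clean those resources up after the test session finishes.
```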

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.
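
For illustration, the split typically looks like the following `setup.py` sketch; the packages listed here are placeholders, not this connector's real dependency pins:

```python
# setup.py -- illustrative sketch of the two dependency groups
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # placeholder: runtime dependencies go here
    "boto3",        # placeholder
]

TEST_REQUIREMENTS = [
    "pytest",       # placeholder: test-only dependencies go here
]

setup(
    name="destination_aws_datalake",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```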

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? To publish a new version of the connector:

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-aws-datalake test`.
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/aws-datalake.md`).