airbyte-integrations/connectors/destination-databricks/README.md
This is the repository for the Databricks destination connector in Java. For information about how to use this connector within Airbyte, see the User Documentation.
This connector requires a JDBC driver to connect to a Databricks cluster. Before using this connector, you must agree to the JDBC ODBC driver license. This means that you can only use this driver to connect third-party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols.
From the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:destination-databricks:build
```
**If you are a community contributor**, you will need access to AWS S3, Azure Blob Storage, and a Databricks cluster to run the integration tests:

- Create `sample_secrets/config.json`, which conforms to the spec file in `src/main/resources/spec.json`.
- Create `sample_secrets/azure_config.json`, which also conforms to the spec file in `src/main/resources/spec.json`.
- Rename the directory from `sample_secrets` to `secrets`. The `secrets` directory is git-ignored by default, so there is no danger of accidentally checking in sensitive information.

**If you are an Airbyte core member**:

- Get the `destination databricks creds` secrets on Last Pass, and put it in `sample_secrets/config.json`.
- Rename the directory from `sample_secrets` to `secrets`.

Build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:destination-databricks:buildConnectorImage
```
Once built, the docker image name and tag on your host will be `airbyte/destination-databricks:dev`.
Then run any of the connector commands as follows:
```
docker run --rm airbyte/destination-databricks:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databricks:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-databricks:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-databricks:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
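The `/secrets/config.json` mounted above must conform to `src/main/resources/spec.json`. As a rough illustration only, an S3-backed config might look something like the sketch below; the field names and values here are assumptions for illustration, and the authoritative list of properties is whatever the spec file declares:

```json
{
  "databricks_server_hostname": "abc-12345678-wxyz.cloud.databricks.com",
  "databricks_http_path": "sql/protocolvx/o/1234567890/0000-000000-abcd123",
  "databricks_personal_access_token": "dapi...",
  "database_schema": "public",
  "data_source": {
    "data_source_type": "S3",
    "s3_bucket_name": "my-staging-bucket",
    "s3_bucket_path": "data_sync",
    "s3_bucket_region": "us-east-1",
    "s3_access_key_id": "...",
    "s3_secret_access_key": "..."
  }
}
```

Check your sketch against the spec before running `check`, since a mismatched config will fail validation.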
We use JUnit for Java tests.
Place unit tests under `src/test/io/airbyte/integrations/destinations/databricks`.
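For illustration, a unit test in that directory might exercise a small pure function such as a stream-name normalizer. The helper and class below are hypothetical, not part of the actual connector, and the example is written as a plain `main` with explicit checks so it runs standalone; in the repo you would write a JUnit class with `@Test` methods and `Assertions` instead:

```java
// Hypothetical sketch of logic a unit test might cover. The helper below is
// illustrative only; real tests should target the connector's actual classes.
public class DatabricksIdentifierTest {

    // Spark SQL identifiers are safest as lowercase alphanumerics and
    // underscores, so a naming helper might normalize stream names like this.
    static String toSafeIdentifier(final String streamName) {
        return streamName.toLowerCase().replaceAll("[^a-z0-9_]", "_");
    }

    public static void main(final String[] args) {
        // Equivalent to Assertions.assertEquals in a JUnit test.
        check(toSafeIdentifier("Users.Public-Data"), "users_public_data");
        check(toSafeIdentifier("orders_2021"), "orders_2021");
        System.out.println("ok");
    }

    private static void check(final String actual, final String expected) {
        if (!actual.equals(expected)) {
            throw new AssertionError("expected " + expected + " but got " + actual);
        }
    }
}
```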
Airbyte has a standard test suite that all destination connectors must pass. Implement the `TODO`s in `src/test-integration/java/io/airbyte/integrations/destinations/databricks/DatabricksDestinationAcceptanceTest.java`.
All commands should be run from the Airbyte project root. To run unit tests:

```
./gradlew :airbyte-integrations:connectors:destination-databricks:unitTest
```
To run acceptance and custom integration tests:
```
./gradlew :airbyte-integrations:connectors:destination-databricks:integrationTest
```
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing the tests: `airbyte-ci connectors --name=destination-databricks test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/databricks.md`).