plugin/trino-bigquery/README.md
The BigQuery connector module has both unit tests and integration tests. The integration tests require access to a BigQuery instance in Google Cloud. Follow the steps below to run the integration tests locally.
Build the project by following the instructions here.
Create a Google Cloud Storage bucket using `gsutil mb gs://DESTINATION_BUCKET_NAME`.

Run `gsutil cp plugin/trino-bigquery/src/test/resources/region.csv gs://DESTINATION_BUCKET_NAME/tpch/tiny/region.csv`
(replace `DESTINATION_BUCKET_NAME` with the target bucket name in both commands).
Create a service account in Google Cloud with the BigQuery Admin role assigned.
Get the base64-encoded text of the service account credentials file using `base64 /path/to/service_account_credentials.json`.
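As a sketch of this step, the encoded key can be captured into a variable so it can later be passed as a VM option. The file under `/tmp` is a stand-in for illustration only; use your real service account JSON:

```shell
# Stand-in credentials file for illustration; replace with your real
# service account JSON downloaded from Google Cloud.
printf '{"type": "service_account"}' > /tmp/service_account_credentials.json

# Encode and strip newlines so the value is a single token suitable for
# passing as a -D VM option.
CREDENTIALS_KEY=$(base64 < /tmp/service_account_credentials.json | tr -d '\n')
echo "$CREDENTIALS_KEY"
```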
Create a new BigQuery `CLOUD_RESOURCE` connection and grant the connection's service account GCS permissions (see the BigQuery documentation):

```shell
bq mk --connection --location=us --project_id=$PROJECT_ID --connection_type=CLOUD_RESOURCE $CONNECTION_ID
bq show --connection $PROJECT_ID.us.$CONNECTION_ID  # displays the connection's service account ID
gsutil iam ch serviceAccount:[email protected]:objectViewer gs://DESTINATION_BUCKET_NAME
```

The `TestBigQueryWithDifferentProjectIdConnectorSmokeTest` requires an alternate project ID that differs from the project ID attached to the service account, but to which the service account still has access.
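The connection-setup commands above can be chained so the service account ID never has to be copied by hand. A sketch, assuming `jq` is installed and that the JSON output of `bq show --format=json` exposes the account under `cloudResource.serviceAccountId` (verify the field name against your CLI version):

```shell
# Extract the connection's service account ID from the JSON description.
CONNECTION_SA=$(bq show --format=json --connection "$PROJECT_ID.us.$CONNECTION_ID" \
    | jq -r '.cloudResource.serviceAccountId')

# Grant it read access to the test bucket.
gsutil iam ch "serviceAccount:$CONNECTION_SA:objectViewer" "gs://DESTINATION_BUCKET_NAME"
```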
Set the VM options `testing.bigquery.credentials-key`, `testing.gcp-storage-bucket`, `testing.alternate-bq-project-id`, and `testing.bigquery-connection-id` in the IntelliJ "Run Configuration"
(or on the CLI if using Maven directly). It should look something like:

```
-Dtesting.bigquery.credentials-key=base64-text -Dtesting.gcp-storage-bucket=DESTINATION_BUCKET_NAME -Dtesting.alternate-bq-project-id=bigquery-cicd-alternate -Dtesting.bigquery-connection-id=my_project.us.connection-id
```
Run any test of your choice.
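For example, running a single smoke test from the repository root with the Maven wrapper might look like the following (a sketch; the property values are placeholders to substitute with your own):

```
./mvnw test -pl plugin/trino-bigquery \
    -Dtest=TestBigQueryWithDifferentProjectIdConnectorSmokeTest \
    -Dtesting.bigquery.credentials-key=base64-text \
    -Dtesting.gcp-storage-bucket=DESTINATION_BUCKET_NAME \
    -Dtesting.alternate-bq-project-id=bigquery-cicd-alternate \
    -Dtesting.bigquery-connection-id=my_project.us.connection-id
```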