To write data to Google BigQuery with Flux:

1. Import the `sql` package.
2. Pipe-forward data into `sql.to()` and provide the following parameters:

   - **driverName**: bigquery
   - **dataSourceName**: the BigQuery connection string
   - **table**: the table to write to
   - **batchSize**: the number of parameters or columns that can be queued within each call to `Exec` (default is 10000)

```js
import "sql"

data
    |> sql.to(
        driverName: "bigquery",
        dataSourceName: "bigquery://projectid/?apiKey=mySuP3r5ecR3tAP1K3y",
        table: "exampleTable",
    )
```
The `bigquery` driver uses the following DSN syntax (also known as a connection string):

```txt
bigquery://projectid/?param1=value&param2=value
bigquery://projectid/location?param1=value&param2=value
```
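The DSN is an ordinary URL, so query parameters must be URL-encoded. As a rough sketch (in Python, not Flux; the project ID, location, and parameter names below are placeholders), a helper that assembles a well-formed DSN might look like:

```python
from urllib.parse import urlencode

def bigquery_dsn(project_id, location=None, **params):
    """Assemble a DSN of the form
    bigquery://projectid/location?param1=value&param2=value.
    The location segment is optional, matching the two syntaxes above."""
    base = f"bigquery://{project_id}/{location or ''}"
    query = urlencode(params)  # URL-encodes each parameter value
    return f"{base}?{query}" if query else base

# Hypothetical project ID and parameter names:
print(bigquery_dsn("projectid", param1="value"))
print(bigquery_dsn("projectid", "us-east1", param1="value"))
```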
The Flux BigQuery implementation uses the Google Cloud Go SDK. Provide your authentication credentials using one of the following methods:

- Set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to identify the location of your credential JSON file.
- Provide your base64-encoded service account, refresh token, or JSON credentials using the `credentials` URL parameter in your BigQuery DSN:

  ```txt
  bigquery://projectid/?credentials=eyJ0eXBlIjoiYXV0...
  ```
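To produce a value for the `credentials` parameter, base64-encode the contents of your credentials JSON. A minimal sketch (in Python; the credential fields shown are a truncated placeholder, and URL-safe base64 is an assumption made here so the value stays valid inside a query string):

```python
import base64
import json

# Hypothetical, truncated service account document; a real credentials
# file contains more fields (private_key, client_email, and so on).
creds = {"type": "service_account", "project_id": "my-project"}

# URL-safe base64 keeps the encoded value usable in a URL query parameter.
encoded = base64.urlsafe_b64encode(json.dumps(creds).encode()).decode()

dsn = f"bigquery://my-project/?credentials={encoded}"
print(dsn)
```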
`sql.to()` converts Flux data types to BigQuery data types:
| Flux data type | BigQuery data type |
|---|---|
| int | INT64 |
| float | FLOAT64 |
| string | STRING |
| bool | BOOL |
| time | TIMESTAMP |
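The conversion table above can be captured as a simple lookup, useful for sanity-checking a schema before writing (a sketch in Python; this mapping is not part of the Flux API):

```python
# Flux data type -> BigQuery data type, per the conversion table above.
FLUX_TO_BIGQUERY = {
    "int": "INT64",
    "float": "FLOAT64",
    "string": "STRING",
    "bool": "BOOL",
    "time": "TIMESTAMP",
}

# Example: look up the BigQuery type for a Flux time column.
print(FLUX_TO_BIGQUERY["time"])  # TIMESTAMP
```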