# Snowflake connector
The Snowflake connector allows querying and creating tables in an external Snowflake account. This can be used to join data between different systems like Snowflake and Hive, or between two different Snowflake accounts.
To configure the Snowflake connector, create a catalog properties file in
`etc/catalog` named, for example, `example.properties`, to mount the Snowflake
connector as the `example` catalog. Create the file with the following
contents, replacing the connection properties as appropriate for your setup:
```text
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.account=account
snowflake.database=database
snowflake.role=role
snowflake.warehouse=warehouse
```
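Once the catalog is mounted, you can inspect and query the Snowflake database
through Trino. This is a minimal sketch; the `public` schema and `orders` table
are placeholders, so substitute objects that exist in your Snowflake database:

```sql
-- List the schemas exposed by the mounted catalog
SHOW SCHEMAS FROM example;

-- Query a table through the connector (hypothetical schema and table names)
SELECT *
FROM example.public.orders
LIMIT 10;
```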
The Snowflake connector uses Apache Arrow as the serialization format when
reading from Snowflake. Add the following required, additional JVM arguments
to the JVM config:

```text
--add-opens=java.base/java.nio=ALL-UNNAMED
--sun-misc-unsafe-memory-access=allow
```
The Snowflake connector can only access a single database within a Snowflake account. Thus, if you have multiple Snowflake databases, or want to connect to multiple Snowflake accounts, you must configure multiple instances of the Snowflake connector.
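For example, to expose two Snowflake databases as separate catalogs, create one
properties file per catalog in `etc/catalog`. The catalog file names and
database names below are illustrative, not prescribed:

```text
# etc/catalog/sales.properties
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.database=sales

# etc/catalog/analytics.properties
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.database=analytics
```

Each properties file results in a separate catalog, named after the file, that
you can reference independently in queries.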
(snowflake-type-mapping)=
## Type mapping

Because Trino and Snowflake each support types that the other does not, this
connector {ref}`modifies some types <type-mapping-overview>` when reading or
writing data. Data types may not map the same way in both directions between
Trino and the data source. Refer to the following sections for type mapping in
each direction.
### Snowflake type to Trino type mapping
The connector maps Snowflake types to the corresponding Trino types following this table:
:::{list-table} Snowflake type to Trino type mapping
:widths: 30, 30, 40
:header-rows: 1

* - Snowflake type
  - Trino type
  - Notes
* - `BOOLEAN`
  - `BOOLEAN`
  -
* - `INT`, `INTEGER`, `BIGINT`, `SMALLINT`, `TINYINT`, `BYTEINT`
  - `DECIMAL(38,0)`
  - Synonymous with `NUMBER(38,0)`. See Snowflake data types for fixed point
    numbers for more information.
* - `FLOAT`, `FLOAT4`, `FLOAT8`
  - `DOUBLE`
  - `FLOAT`, `FLOAT4`, and `FLOAT8` are for compatibility with other systems;
    Snowflake treats all three as 64-bit floating-point numbers. See Snowflake
    data types for floating point numbers for more information.
* - `DOUBLE`, `DOUBLE PRECISION`, `REAL`
  - `DOUBLE`
  - Synonymous with `FLOAT`. See Snowflake data types for floating point
    numbers for more information.
* - `NUMBER`
  - `DECIMAL`
  -
* - `DECIMAL`, `NUMERIC`
  - `DECIMAL`
  - Synonymous with `NUMBER`. See Snowflake data types for fixed point numbers
    for more information.
* - `VARCHAR`
  - `VARCHAR`
  -
* - `CHAR`, `CHARACTER`
  - `VARCHAR`
  - Synonymous with `VARCHAR` except default length is `VARCHAR(1)`. See
    Snowflake String & Binary Data Types for more information.
* - `STRING`, `TEXT`
  - `VARCHAR`
  - Synonymous with `VARCHAR`. See Snowflake String & Binary Data Types for
    more information.
* - `BINARY`
  - `VARBINARY`
  -
* - `VARBINARY`
  - `VARBINARY`
  - Synonymous with `BINARY`. See Snowflake String & Binary Data Types for
    more information.
* - `DATE`
  - `DATE`
  -
* - `TIME`
  - `TIME`
  -
* - `TIMESTAMP_NTZ`
  - `TIMESTAMP`
  -
* - `DATETIME`
  - `TIMESTAMP`
  - Alias for `TIMESTAMP_NTZ`. See Snowflake Date & Time Data Types for more
    information.
* - `TIMESTAMP`
  - `TIMESTAMP`
  - Alias for one of the `TIMESTAMP` variations (`TIMESTAMP_NTZ` by default).
    This connector always sets `TIMESTAMP_NTZ` as the variant.
* - `TIMESTAMP_TZ`
  - `TIMESTAMP WITH TIME ZONE`
  -
:::

No other types are supported.

### Trino type to Snowflake type mapping
The connector maps Trino types to the corresponding Snowflake types following this table:
:::{list-table} Trino type to Snowflake type mapping
:widths: 30, 30, 40
:header-rows: 1

* - Trino type
  - Snowflake type
  - Notes
* - `BOOLEAN`
  - `BOOLEAN`
  -
* - `TINYINT`
  - `NUMBER(3, 0)`
  -
* - `SMALLINT`
  - `NUMBER(5, 0)`
  -
* - `INTEGER`
  - `NUMBER(10, 0)`
  -
* - `BIGINT`
  - `NUMBER(19, 0)`
  -
* - `REAL`
  - `DOUBLE`
  -
* - `DOUBLE`
  - `DOUBLE`
  -
* - `DECIMAL`
  - `NUMBER`
  -
* - `VARCHAR`
  - `VARCHAR`
  -
* - `CHAR`
  - `VARCHAR`
  -
* - `VARBINARY`
  - `BINARY`
  -
* - `VARBINARY`
  - `VARBINARY`
  -
* - `DATE`
  - `DATE`
  -
* - `TIME`
  - `TIME`
  -
* - `TIMESTAMP`
  - `TIMESTAMP_NTZ`
  -
* - `TIMESTAMP WITH TIME ZONE`
  - `TIMESTAMP_TZ`
  -
:::
No other types are supported.
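As a sketch of the mapping in practice, consider creating a table through the
catalog. The table and column names here are hypothetical; the comments follow
the mapping table above:

```sql
-- Per the mapping above, BIGINT is stored as NUMBER(19, 0),
-- VARBINARY as BINARY, and TIMESTAMP as TIMESTAMP_NTZ in Snowflake.
CREATE TABLE example.public.events (
  id BIGINT,
  payload VARBINARY,
  created_at TIMESTAMP
);
```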
(snowflake-sql-support)=
## SQL support

The connector provides read access and write access to data and metadata in a
Snowflake database. In addition to the globally available and read operation
statements, the connector supports the features described in the following
sections.
(snowflake-insert)=
(snowflake-procedures)=
(snowflake-table-functions)=
### Table functions

The connector provides specific table functions to access Snowflake.
(snowflake-query-function)=
#### `query(varchar) -> table`

The `query` function allows you to query the underlying database directly. It
requires syntax native to Snowflake, because the full query is pushed down and
processed in Snowflake. This can be useful for accessing native features which
are not available in Trino or for improving query performance in situations
where running a query natively may be faster.
Find details about the SQL support of Snowflake that you can use in the query
in the Snowflake SQL Command Reference, including `PIVOT`, lateral joins, and
other statements and functions.
As a simple example, query the example catalog and select an entire table:
```sql
SELECT
  *
FROM
  TABLE(
    example.system.query(
      query => 'SELECT
        *
      FROM
        tpch.nation'
    )
  );
```
As a practical example, you can use the Snowflake SQL support for `PIVOT` to
pivot on all distinct column values automatically with a dynamic pivot:
```sql
SELECT
  *
FROM
  TABLE(
    example.system.query(
      query => '
        SELECT *
        FROM quarterly_sales
        PIVOT(SUM(amount) FOR quarter IN (ANY ORDER BY quarter))
        ORDER BY empid;
      '
    )
  );
```
## Performance

The connector includes a number of performance improvements, detailed in the
following sections.
(snowflake-pushdown)=
### Pushdown

The connector supports pushdown for a number of operations:

- {ref}`Aggregate pushdown <aggregation-pushdown>` for the following functions:

  - `avg()`
  - `count()`
  - `max()`
  - `min()`
  - `sum()`
  - `stddev()`
  - `stddev_pop()`
  - `stddev_samp()`
  - `variance()`
  - `var_pop()`
  - `var_samp()`
  - `covar_pop()`
  - `covar_samp()`
  - `corr()`
  - `regr_intercept()`
  - `regr_slope()`
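As an illustrative sketch, an aggregation like the following is a candidate for
execution entirely inside Snowflake instead of in Trino; running `EXPLAIN` on
the statement shows whether the aggregation appears inside the pushed-down
table scan. The `orders` table and its columns are hypothetical:

```sql
-- The sum() and GROUP BY are candidates for pushdown into Snowflake
SELECT regionkey, sum(totalprice) AS total
FROM example.public.orders
GROUP BY regionkey;

-- Inspect the query plan to confirm whether the aggregation was pushed down
EXPLAIN
SELECT regionkey, sum(totalprice) AS total
FROM example.public.orders
GROUP BY regionkey;
```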