Snowflake Intelligence provides powerful text-to-SQL capabilities, but it only sees raw table and column names. Without business context, your Snowflake Intelligence agents cannot, for example, distinguish `customer_revenue` from `customer_revenue_archive`.

DataHub's Snowflake Context Connector solves this by giving Snowflake Intelligence access to your DataHub metadata through User-Defined Functions (UDFs). This enables:
The agent can now answer questions like:

- "Find all tables with customer data in the marketing domain"
- "What datasets are tagged as PII?"
- "Show me the business definition from the glossary for MRR"
- "Which tables does the finance team own?"
DataHub's Snowflake Context Connector lets you create Snowflake Intelligence agents with UDFs that access DataHub context to power text-to-SQL generation. The connector creates an integration with DataHub that can search DataHub for documents and assets, read Snowflake data tables, and generate queries to help answer questions.

This guide, together with the agent creation tool, provides an end-to-end experience for setting up an agent for use in Snowflake Intelligence.
```shell
pip install 'datahub-agent-context[snowflake]'
```
To interact with DataHub, you will need the following from your DataHub account:

- Your DataHub server URL (for example, `https://<tenant>.acryl.io`)
- A DataHub access token

To execute the SQL in Snowflake, you will need a user with the SNOWFLAKE_ADMIN role to configure the rules and UDFs; this role is necessary to set up the secrets and networking options. Once the initial setup is complete, the SNOWFLAKE_INTELLIGENCE_ADMIN role is required for edits and further configuration.
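The role setup described above can be sketched as Snowflake SQL. This is a hedged sketch, not part of the generated scripts: `SETUP_USER` is a placeholder, and it assumes an account administrator grants the roles named in this guide.

```sql
-- Hypothetical sketch: grant the roles this guide requires to the user
-- who will run the setup (SETUP_USER is a placeholder, not a real name).
USE ROLE ACCOUNTADMIN;
GRANT ROLE SNOWFLAKE_ADMIN TO USER SETUP_USER;
GRANT ROLE SNOWFLAKE_INTELLIGENCE_ADMIN TO USER SETUP_USER;
```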
To use DataHub tools in Snowflake Intelligence, we'll create a new agent with access to UDFs that use the context inside DataHub to answer business questions. You can either run the SQL to register the agent and tools yourself, or let the DataHub Agents CLI execute it for you.
Add --execute to automatically execute the SQL as your Snowflake user.
Authentication:

- Pass `--sf-password` to run the generated scripts with your password or PAT.
- Alternatively, pass `--sf-authenticator externalbrowser` to authenticate through your browser.

```shell
datahub agent create snowflake \
  --sf-account YOUR_ACCOUNT \
  --sf-user YOUR_USER \
  --sf-password YOUR_PASSWORD \
  --sf-role YOUR_ROLE \
  --sf-warehouse YOUR_WAREHOUSE \
  --sf-database YOUR_DATABASE \
  --sf-schema YOUR_SCHEMA \
  --datahub-url https://your-datahub.acryl.io \
  --datahub-token YOUR_TOKEN \
  --enable-mutations \
  --execute
```
This will automatically execute the commands in your Snowflake environment and output the results. This workflow is recommended for a hands-off default experience.
```shell
datahub agent create snowflake \
  --sf-account YOUR_ACCOUNT \
  --sf-user YOUR_USER \
  --sf-role YOUR_ROLE \
  --sf-warehouse YOUR_WAREHOUSE \
  --sf-database YOUR_DATABASE \
  --sf-schema YOUR_SCHEMA \
  --datahub-url https://your-datahub.acryl.io \
  --datahub-token YOUR_TOKEN \
  --enable-mutations
```
With this version, you will need to execute five SQL files in the Snowflake UI as a notebook. This is recommended for advanced workflows, or if you want to review or change the configuration before publishing. The configuration files can also be pushed to version control to maintain history.
```sql
-- 1. Set up configuration and secrets
@00_configuration.sql;
-- 2. Create network rules
@01_network_rules.sql;
-- 3. Create DataHub UDFs
@02_datahub_udfs.sql;
-- 4. Create stored procedure
@03_stored_procedure.sql;
-- 5. Create Cortex Agent
@04_cortex_agent.sql;
```
Once your agent is created, you can further customize its prompt settings, models, tools, and more inside the Snowflake user interface. It is recommended to tune the prompt and model for your specific use case and requirements.
Open Snowflake Intelligence and select the DataHub Agent.
Periodically we expect to add new tools and update existing ones. To update the tools for your DataHub Agent, simply run the following SQL snippets to update the tool definitions and SDK:
```sql
-- 1. Create updated DataHub UDFs
@02_datahub_udfs.sql;
-- 2. Create updated Cortex Agent
@04_cortex_agent.sql;
```
After running these updates, refresh your Snowflake Intelligence UI to see the new tools.
For more info on the tools exposed and DataHub Context, see the DataHub Agent Context Documentation.
Symptoms: `Insufficient privileges to operate on database/schema` errors during setup
Solutions:

- Use the `ACCOUNTADMIN` or `SECURITYADMIN` role for initial setup
- Ensure your role has `CREATE DATABASE` and `CREATE INTEGRATION` privileges
- Ensure your role has `USAGE` on the warehouse
- Run `SHOW GRANTS TO USER <your_user>;` to verify permissions

Symptoms: Cannot create network rules for DataHub connection
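The privilege checks above can be expressed directly in SQL. This is a hedged sketch with placeholder names (`MY_SETUP_ROLE`, `MY_WH`, and `MY_USER` are assumptions, not names from this guide):

```sql
-- Hypothetical sketch: grant the privileges listed above
-- (role, warehouse, and user names are placeholders).
GRANT CREATE DATABASE ON ACCOUNT TO ROLE MY_SETUP_ROLE;
GRANT CREATE INTEGRATION ON ACCOUNT TO ROLE MY_SETUP_ROLE;
GRANT USAGE ON WAREHOUSE MY_WH TO ROLE MY_SETUP_ROLE;
-- Verify the result
SHOW GRANTS TO USER MY_USER;
```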
Solutions:

- Ensure the network rule allows your DataHub host (for example, `https://your-instance.acryl.io`)

Symptoms: Cannot create secret for DataHub token
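If the network rule itself is the problem, it can help to recreate it by hand. A minimal sketch, assuming a standard Snowflake egress rule; the rule name and host below are placeholders, and the real rule is created by `@01_network_rules.sql`:

```sql
-- Hypothetical sketch: an egress network rule allowing HTTPS traffic
-- to your DataHub host (rule name and host are placeholders).
CREATE OR REPLACE NETWORK RULE DATAHUB_NETWORK_RULE
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('your-instance.acryl.io:443');
```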
Solutions:

- Verify the token is a valid DataHub access token (it should start with `eyJ`)
- Run `SHOW SECRETS IN SCHEMA <schema>;` to check existing secrets

Symptoms: Agent doesn't use DataHub UDFs or says "function not found"
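Recreating the secret by hand follows the same pattern. A hedged sketch, assuming a generic-string secret; the secret name is a placeholder, and the real secret is created by `@00_configuration.sql`:

```sql
-- Hypothetical sketch: store the DataHub token as a generic-string
-- secret (secret name is a placeholder; paste your real token).
CREATE OR REPLACE SECRET DATAHUB_TOKEN_SECRET
  TYPE = GENERIC_STRING
  SECRET_STRING = 'eyJ...';
```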
Solutions:

- Verify the UDFs exist with `SHOW USER FUNCTIONS IN SCHEMA <schema>;`
- Re-run `@04_cortex_agent.sql` so the agent's tool definitions reference the UDFs

Symptoms: UDF calls fail with timeout or connection errors
Solutions:
Symptoms: Agent says "no data found" even though data exists in DataHub
Solutions:
Symptoms: Agent generates SQL without consulting DataHub metadata
Solutions:
Symptoms: Agent suggests tables that don't exist
Solutions:
Symptoms: `datahub agent create snowflake` command not found

Solutions:

```shell
# Ensure package is installed with CLI
pip install --upgrade 'acryl-datahub[cli]'
pip install 'datahub-agent-context[snowflake]'

# Verify installation
datahub version
datahub agent --help
```
Symptoms: Authentication fails when running with `--execute`

Solutions:

- Try `--sf-authenticator externalbrowser` and ensure a browser can open
- Try `--sf-private-key-path` with your key file for key-pair authentication
- Test connectivity directly with `snowsql -a <account> -u <user>`

Solutions:

```shell
# When using --execute, add verbose flag
datahub agent create snowflake \
  --execute \
  --verbose \
  ...other flags...
```
```sql
-- Test search UDF
SELECT datahub_search('customer', {'entity_type': ['dataset']}, 10);

-- Test document search
SELECT datahub_search_documents('data retention policy', 5);

-- Check UDF definitions
SHOW USER FUNCTIONS LIKE 'datahub%';
```
In Snowflake Intelligence UI:
| Error | Cause | Solution |
|---|---|---|
| Network rule violation | DataHub URL not allowed | Update network rules to include the DataHub domain |
| Secret not found | Secret name mismatch | Verify the secret name matches between the UDF and the creation script |
| Insufficient privileges | Missing permissions | Grant the required roles and privileges |
| Invalid access token | Token expired or invalid | Regenerate the DataHub token and update the secret |
| Function does not exist | UDF not created | Run `@02_datahub_udfs.sql` to create the UDFs |
After setup, verify everything works:
```sql
-- Confirm the UDFs exist
SHOW USER FUNCTIONS LIKE 'datahub%';
-- Confirm the network rules
SHOW NETWORK RULES;
-- Confirm the secrets
SHOW SECRETS IN SCHEMA <schema>;
-- Run a quick end-to-end search
SELECT datahub_search('test', {}, 5);
```