# dc-agents/sqlite/README.md
This directory contains an SQLite implementation of a data connector agent. It can use local SQLite database files, as referenced by the `db` config field.
The SQLite agent currently supports a number of capabilities.

Note: you can get detailed metadata about the agent's capabilities by `GET`ting the `/capabilities` endpoint of the running agent.
Requires SQLite `>= 3.38.0`, or a build with compiled-in JSON support.
```sh
npm install
npm run build
npm run start
```
Or a simple dev-loop via `entr`:

```sh
echo src/**/*.ts | xargs -n1 echo | DB_READONLY=y entr -r npm run start
```
```sh
> docker build . -t dc-sqlite-agent:latest
> docker run -it --rm -p 8100:8100 dc-sqlite-agent:latest
```
You will want to mount a volume with your database(s) so that they can be referenced in configuration.
Note: Boolean flags `{FLAG}` can be provided as `1`, `true`, `t`, `yes`, or `y`; when omitted they default to `false`.
| ENV Variable Name | Format | Default | Info |
|---|---|---|---|
| `PORT` | `INT` | `8100` | Port for the agent to listen on. |
| `PERMISSIVE_CORS` | `{FLAG}` | `false` | Allows all requests. Useful for testing with SwaggerUI; turn off in production. |
| `DB_CREATE` | `{FLAG}` | `false` | Allows new databases to be created. |
| `DB_READONLY` | `{FLAG}` | `false` | Makes databases read-only. |
| `DB_ALLOW_LIST` | `DB1[,DB2]*` | Any allowed | Restricts which databases can be connected to. |
| `DB_PRIVATECACHE` | `{FLAG}` | Shared | Keeps caches private between connections. |
| `DEBUGGING_TAGS` | `{FLAG}` | `false` | Outputs XML-style tags in query comments for debugging purposes. |
| `PRETTY_PRINT_LOGS` | `{FLAG}` | `false` | Uses `pino-pretty` to pretty-print request logs. |
| `LOG_LEVEL` | `fatal` \| `error` \| `info` \| `debug` \| `trace` \| `silent` | `info` | The minimum log level to output. |
| `METRICS` | `{FLAG}` | `false` | Enables a Prometheus metrics endpoint at `/metrics`. |
| `QUERY_LENGTH_LIMIT` | `INT` | `Infinity` | Puts a limit on the length of generated SQL before execution. |
| `DATASETS` | `{FLAG}` | `false` | Enables dataset operations. |
| `DATASET_DELETE` | `{FLAG}` | `false` | Enables `DELETE /datasets/:name`. |
| `DATASET_TEMPLATES` | `DIRECTORY` | `./dataset_templates` | Directory to clone datasets from. |
| `DATASET_CLONES` | `DIRECTORY` | `./dataset_clones` | Directory to clone datasets to. |
| `MUTATIONS` | `{FLAG}` | `false` | Enables mutation support. |
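As a rough illustration of the boolean-flag convention above, a truthy flag could be parsed like this. This is a hypothetical sketch, not the agent's actual parsing code, and the case-insensitive matching is an assumption:

```typescript
// Hypothetical helper illustrating the {FLAG} convention described above:
// "1", "true", "t", "yes", "y" are truthy; unset or anything else is false.
// Case-insensitive matching is an assumption for this sketch.
function parseFlag(value: string | undefined): boolean {
  if (value === undefined) return false;
  return ["1", "true", "t", "yes", "y"].includes(value.trim().toLowerCase());
}

// e.g. honouring DB_READONLY as the table above describes:
const readonly = parseFlag(process.env.DB_READONLY);
```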
The agent is configured as per the configuration schema. The valid configuration properties are:
| Property | Type | Default |
|---|---|---|
| `db` | string | |
| `tables` | string[] | null |
| `include_sqlite_meta_tables` | boolean | false |
| `explicit_main_schema` | boolean | false |

The only required property is `db`, which specifies a local SQLite database to use.
The schema is exposed via introspection, but you can limit which tables are referenced by:

* explicitly enumerating them via the `tables` property, or
* toggling `include_sqlite_meta_tables` to include or exclude SQLite meta tables.

The `explicit_main_schema` field can be set to opt into exposing tables by their fully qualified names (i.e. `["main", "MyTable"]` instead of just `["MyTable"]`).
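To make the shape concrete, a configuration matching the properties above might look like the following sketch. The file path and table names are made up for the example:

```typescript
// Sketch of the agent configuration shape described above.
// The database path and table names are illustrative, not real.
interface SqliteAgentConfig {
  db: string;                            // required: path to a local SQLite database file
  tables?: string[] | null;              // optional: restrict which tables are exposed
  include_sqlite_meta_tables?: boolean;  // default: false
  explicit_main_schema?: boolean;        // default: false
}

const config: SqliteAgentConfig = {
  db: "/db/example.sqlite",
  tables: ["Artist", "Album"],
  explicit_main_schema: false,
};

console.log(JSON.stringify(config));
```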
The dataset used for testing the reference agent is sourced from:
Dataset support is enabled via the ENV variables:

* `DATASETS`
* `DATASET_DELETE`
* `DATASET_TEMPLATES`
* `DATASET_CLONES`

Templates will be looked up at `${DATASET_TEMPLATES}/${template_name}.sqlite` or `${DATASET_TEMPLATES}/${template_name}.sql`. The `.sqlite` templates are just SQLite database files that will be copied as a clone. The `.sql` templates are SQL script files that will be run against a blank SQLite database in order to create a clone.
Clones will be copied to `${DATASET_CLONES}/${clone_name}.sqlite`.
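The lookup rules above could be sketched as follows. This is a hypothetical helper mirroring the described behaviour, not the agent's actual code:

```typescript
import * as fs from "fs";
import * as path from "path";

// Directories as described in the ENV variable table (defaults assumed here).
const DATASET_TEMPLATES = process.env.DATASET_TEMPLATES ?? "./dataset_templates";
const DATASET_CLONES = process.env.DATASET_CLONES ?? "./dataset_clones";

// Hypothetical sketch: resolve a template name to a file, preferring the
// .sqlite database file over the .sql script, as described above.
function resolveTemplate(templateName: string): string | null {
  for (const ext of [".sqlite", ".sql"]) {
    const candidate = path.join(DATASET_TEMPLATES, `${templateName}${ext}`);
    if (fs.existsSync(candidate)) return candidate;
  }
  return null;
}

// Clones land at ${DATASET_CLONES}/${clone_name}.sqlite.
function clonePath(cloneName: string): string {
  return path.join(DATASET_CLONES, `${cloneName}.sqlite`);
}
```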
Ensure you run the agent with `DATASETS=1 DATASET_DELETE=1 MUTATIONS=1` in order to enable testing of mutations.

Then, from the HGE repo, run:

```sh
cabal run dc-api:test:tests-dc-api -- test --agent-base-url http://localhost:8100 sandwich --tui
```
Remaining TODO items include:

* Prometheus metrics hosted at `/metrics`
* Clean up `resultTT` and other badly named types in the `schema.ts` module
* Use `find_table_relationship` in more scenarios
* `NOT EXISTS IS NULL` != `EXISTS IS NOT NULL`