The DuckDuckAPI connector lets you build a high-performing connector that exposes existing API services, where reads happen against DuckDB and writes go directly to the upstream API service. This is ideal for making API data accessible to LLMs via PromptQL. To use it, you:
- Create a DuckDB schema and write a loader script to load data from an API into DuckDB
- Implement functions that wrap the upstream API endpoints, particularly for write operations
This allows a GraphQL or PromptQL query to run against API data in a highly flexible way without performance or rate-limiting issues. Of course, the trade-off is that the data is only eventually consistent: writes are reflected in subsequent reads only after the API data gets updated in DuckDB (via the loader script).
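As a rough sketch of how this split looks in practice (hypothetical table, column, and endpoint names; the plain `duckdb` Node API is used here rather than whatever database handle your `index.ts` sets up), a read function serves data from DuckDB while a write function calls the upstream API:

```typescript
import * as duckdb from "duckdb";

// Hypothetical names for illustration only; the real tables and endpoints come
// from your own schema.sql and upstream API.
const db = new duckdb.Database("./data.db");

type Order = { id: string; customer_id: string; amount: number };

/**
 * Reads are served from DuckDB, so they are fast and never hit API rate limits.
 * @readonly
 */
export async function getOrders(customerId: string): Promise<Order[]> {
  return new Promise((resolve, reject) => {
    db.all(
      "SELECT id, customer_id, amount FROM orders WHERE customer_id = ?",
      customerId,
      (err, rows) => (err ? reject(err) : resolve(rows as Order[]))
    );
  });
}

/**
 * Writes go straight to the upstream API; they show up in reads only after the
 * loader script next refreshes the orders table in DuckDB.
 */
export async function createOrder(customerId: string, amount: number): Promise<string> {
  const response = await fetch("https://api.example.com/orders", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ customerId, amount }),
  });
  if (!response.ok) throw new Error(`Upstream API returned ${response.status}`);
  const created = (await response.json()) as { id: string };
  return created.id;
}
```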
This repo is both a connector and an npm SDK. This makes the dev lifecycle a little interesting to set up.
- Clone this repo and build the SDK:

```sh
cd ndc-duckduckapi
npm i
npm run build
```
- From the root folder of this project:

```sh
cd connector-definition/template
```

- Make sure your `package.json` is using the `ndc-duckduckapi` SDK through a file URI for local dev: `"@hasura/ndc-duckduckapi": "file:///../../ndc-duckduckapi"`
- Now run:

```sh
npm install
```

- And now, run the connector:

```sh
HASURA_CONNECTOR_PORT=9094 npm run start
```
- Verify that everything is running by hitting `localhost:9094/schema`; you should see a google-calendar NDC schema.
To test this connector, you'll want to run a supergraph project that uses this connector as an HTTP connector:
- Outside of this repo:

```sh
ddn supergraph init test-proj
ddn connector-link add dda --configure-host=http://local.hasura.dev:9094
```

- Make sure to remove the Authorization headers from the `dda.hml`
- Make sure to add argumentPresets to `dda.hml`:

```yaml
- argument: headers
  value:
    httpHeaders:
      forward:
        - X-Hasura-Oauth-Services
      additional: {}
```

- Then:

```sh
ddn connector-link update dda
ddn connector-link add-resources dda
ddn supergraph build local
ddn run docker-start
```
- Submit a PR, and once it's merged to main, tag it with a version and everything else is magic:

```sh
git tag v0.1.6
```

Then update NDC Hub:

- TODO: Action coming soon
```sh
ddn supergraph init myproject
ddn connector init -i
>>> choose hasura/duckduckapi
>>> set name to myconnector
ddn connector introspect myconnector
ddn models add myconnector '*'
ddn commands add myconnector '*'

# For local dev
ddn supergraph build local
ddn run docker-start
ddn console --local

# For deploying to cloud
ddn supergraph build create
ddn console
```
Head to the OAuth Playground on the console.
- Log in to your SaaS service (or add a new OAuth2 provider)
- Start the loader job by hitting Run
- Set up a `schema.sql`. This will be run on startup and will initialize the DuckDB schema. Refer to the example in `index.ts` for details.
- Add loader functions in `functions.ts` and follow the examples to build your own (see the sketch below).
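For orientation, here is a minimal sketch of a loader function. It again uses hypothetical table, column, and endpoint names and the plain `duckdb` Node API; the real schema, database handle, and sync cadence come from your own `schema.sql`, `index.ts`, and upstream API:

```typescript
import * as duckdb from "duckdb";

// Hypothetical database file and table; your schema.sql defines the real schema.
const db = new duckdb.Database("./data.db");

type Event = { id: string; title: string; starts_at: string };

/** Fetch events from the upstream API (placeholder URL for illustration). */
async function fetchEvents(): Promise<Event[]> {
  const response = await fetch("https://api.example.com/events");
  if (!response.ok) throw new Error(`Upstream API returned ${response.status}`);
  return (await response.json()) as Event[];
}

/**
 * Loader: copy upstream data into DuckDB so that reads can be served locally.
 * In practice this runs on a schedule or via the console's loader job.
 */
export async function loadEvents(): Promise<string> {
  const events = await fetchEvents();
  await new Promise<void>((resolve, reject) => {
    const stmt = db.prepare(
      "INSERT OR REPLACE INTO events (id, title, starts_at) VALUES (?, ?, ?)"
    );
    for (const e of events) {
      stmt.run(e.id, e.title, e.starts_at);
    }
    stmt.finalize((err) => (err ? reject(err) : resolve()));
  });
  return `Loaded ${events.length} events`;
}
```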
To test, run the ts connector and refresh the supergraph project (step 3 onwards in the Get Started above).
TODO:
Below, you'll find a matrix of all supported features for this connector:
| Feature                         | Supported | Notes |
| ------------------------------- | --------- | ----- |
| Native Queries + Logical Models | ❌        |       |
| Simple Object Query             | ✅        |       |
| Filter / Search                 | ✅        |       |
| Simple Aggregation              | ❌        |       |
| Sort                            | ✅        |       |
| Paginate                        | ✅        |       |
| Table Relationships             | ✅        |       |
| Views                           | ❌        |       |
| Distinct                        | ❌        |       |
| Remote Relationships            | ✅        |       |
| Custom Fields                   | ❌        |       |
| Mutations                       | ❌        |       |
Any functions exported from `functions.ts` are made available as NDC functions/procedures to use in your Hasura metadata and expose as GraphQL fields in queries or mutations.

If you write a function that performs a read-only operation, you should mark it with the `@readonly` JSDoc tag; it will be exposed as an NDC function, which will ultimately show up as a GraphQL query field in Hasura.
```typescript
/** @readonly */
export function add(x: number, y: number): number {
  return x + y;
}
```
Functions without the `@readonly` JSDoc tag are exposed as NDC procedures, which will ultimately show up as GraphQL mutation fields in Hasura.

Arguments to the function end up being field arguments in GraphQL, and the return value is what the field returns when queried. Every function must return a value; returning `void`, `null`, or `undefined` is not supported.
```typescript
/** @readonly */
export function hello(name: string, year: number): string {
  return `Hello ${name}, welcome to ${year}`;
}
```
Async functions are supported:
```typescript
type HttpStatusResponse = {
  code: number;
  description: string;
};

export async function test(): Promise<string> {
  const result = await fetch("http://httpstat.us/200");
  const responseBody = (await result.json()) as HttpStatusResponse;
  return responseBody.description;
}
```
If you'd like to split your functions across multiple files, do so, then simply re-export them from `functions.ts` like so:

```typescript
export * from "./another-file-1";
export * from "./another-file-2";
```
The basic scalar types supported are:

- `string` (NDC scalar type: `String`)
- `number` (NDC scalar type: `Float`)
- `boolean` (NDC scalar type: `Boolean`)
- `bigint` (NDC scalar type: `BigInt`, represented as a string in JSON)
- `Date` (NDC scalar type: `DateTime`, represented as an ISO formatted string in JSON)
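For example, a function mixing these scalar types might look like the following (an illustrative sketch with made-up names):

```typescript
/** @readonly */
export function describeInvoice(
  id: bigint,      // exposed as BigInt, passed as a string in JSON
  issuedAt: Date,  // exposed as DateTime, passed as an ISO string in JSON
  total: number,   // exposed as Float
  paid: boolean    // exposed as Boolean
): string {
  const status = paid ? "paid" : "outstanding";
  return `Invoice ${id} issued ${issuedAt.toISOString()} for ${total} is ${status}`;
}
```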
You can also import `JSONValue` from the SDK and use it to accept and return arbitrary JSON. Note that the value must be serializable to JSON.

```typescript
import * as sdk from "@hasura/ndc-lambda-sdk";

export function myFunc(json: sdk.JSONValue): sdk.JSONValue {
  const propValue =
    json.value instanceof Object &&
    "prop" in json.value &&
    typeof json.value.prop === "string"
      ? json.value.prop
      : "default value";
  return new sdk.JSONValue({ prop: propValue });
}
```
`null`, `undefined`, and optional arguments/properties are supported:

```typescript
export function myFunc(name: string | null, age?: number): string {
  const greeting = name != null ? `hello ${name}` : "hello stranger";
  const ageStatement =
    age !== undefined ? `you are ${age}` : "I don't know your age";
  return `${greeting}, ${ageStatement}`;
}
```
However, any `undefined`s in the return type will be converted to nulls, as GraphQL does not have the concept of undefined.
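For instance (a hypothetical function), an omitted optional property comes back as `null` in the response:

```typescript
type Profile = { name: string; nickname?: string };

export function makeProfile(name: string, nickname?: string): Profile {
  // If nickname is omitted here, the GraphQL response will contain "nickname": null.
  return nickname !== undefined ? { name, nickname } : { name };
}
```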
Object types and interfaces are supported. The types of the properties defined on these must be supported types.
```typescript
type FullName = {
  title: string;
  firstName: string;
  surname: string;
};

interface Greeting {
  polite: string;
  casual: string;
}

export function greet(name: FullName): Greeting {
  return {
    polite: `Hello ${name.title} ${name.surname}`,
    casual: `G'day ${name.firstName}`,
  };
}
```
Arrays are also supported, but can only contain a single type (tuple types are not supported):
```typescript
export function sum(nums: number[]): number {
  return nums.reduce((prev, curr) => prev + curr, 0);
}
```
Anonymous types are supported, but will be automatically named after the first place they are used. It is recommended that you avoid using anonymous types. Instead, prefer to name all your types to ensure the type name does not change unexpectedly as you rename usage sites and re-order usages of the anonymous type.
```typescript
export function greet(
  name: { firstName: string; surname: string }, // This type will be automatically named greet_name
): string {
  return `Hello ${name.firstName} ${name.surname}`;
}
```
For more docs, refer to the underlying TypeScript Lambda functions connector README.