fix local pydoc generation
fix relative pydoc links
sh-rp committed Mar 26, 2024
1 parent df1ba17 commit 7e59186
Showing 3 changed files with 4 additions and 4 deletions.
@@ -150,7 +150,7 @@ The second option is running dbt using data load tool (dlt).

I work at dlthub and often create dlt pipelines. These often need dbt for modeling the data, making
the dlt-dbt combination highly effective. For using this combination on cloud functions, we used
-[dlt-dbt runner](https://dlthub.com/docs/api_reference/helpers/dbt/runner#create_runner) developed
+[dlt-dbt runner](../../api_reference/helpers/dbt/runner#create_runner) developed
at dlthub.

The main reason I use this runner is because I load data with dlt and can re-use dlt’s connection to
@@ -331,7 +331,7 @@ event-driven pipelines with small to medium workloads. For larger data loads nea
consider separating dlt and dbt into different cloud functions.

> For more info on using `dlt-dbt runner`, please refer to the
-> [official documentation by clicking here.](https://dlthub.com/docs/api_reference/helpers/dbt/runner#dbtpackagerunner-objects)
+> [official documentation by clicking here.](../../api_reference/helpers/dbt/runner#dbtpackagerunner-objects)

### Deployment considerations: How does cloud functions compare to Git Actions?

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/bigquery.md
@@ -263,7 +263,7 @@ bigquery_adapter(my_resource, partition="partition_column_name")
my_resource = bigquery_adapter(my_resource, partition="partition_column_name")
```

-Refer to the [full API specification](https://dlthub.com/docs/api_reference/destinations/impl/bigquery/bigquery_adapter) for more details.
+Refer to the [full API specification](../../api_reference/destinations/impl/bigquery/bigquery_adapter) for more details.

<!--@@@DLT_TUBA bigquery-->
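All three link edits in this commit follow the same pattern: an absolute `https://dlthub.com/docs/...` URL is replaced with a path relative to the current page, so the links also resolve in locally generated builds. A minimal Python sketch of that rewrite (illustrative only, not code from this commit; the `to_relative` name, the prefix constant, and the fixed `../../` depth are assumptions for this example):

```python
# Illustrative sketch: rewrite an absolute docs URL into a relative link.
# The prefix and the "../../" depth are assumptions, not part of the commit.
PREFIX = "https://dlthub.com/docs/"

def to_relative(url: str, depth: int = 2) -> str:
    """Turn an absolute docs URL into a path relative to a page `depth` levels deep."""
    if not url.startswith(PREFIX):
        return url  # leave non-docs links untouched
    return "../" * depth + url[len(PREFIX):]

print(to_relative("https://dlthub.com/docs/api_reference/helpers/dbt/runner#create_runner"))
# -> ../../api_reference/helpers/dbt/runner#create_runner
```

In practice the correct number of `../` segments depends on how deep the referencing page sits in the docs tree, which is why the commit edits each link by hand rather than applying a single global rewrite.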

2 changes: 1 addition & 1 deletion docs/website/package.json
@@ -4,7 +4,7 @@
"private": true,
"scripts": {
"docusaurus": "docusaurus",
-"start": "node tools/update_version_env.js && node tools/preprocess_docs.js && concurrently --kill-others \"node tools/preprocess_docs.js --watch\" \"docusaurus start\"",
+"start": "PYTHONPATH=. poetry run pydoc-markdown && node tools/update_version_env.js && node tools/preprocess_docs.js && concurrently --kill-others \"node tools/preprocess_docs.js --watch\" \"docusaurus start\"",
"build": "node tools/preprocess_docs.js && PYTHONPATH=. poetry run pydoc-markdown && node tools/update_version_env.js && docusaurus build",
"build:netlify": "node tools/preprocess_docs.js && PYTHONPATH=. pydoc-markdown && node tools/update_version_env.js && docusaurus build --out-dir build/docs",
"swizzle": "docusaurus swizzle",
