
Commit

updated installation command in destination docs and a few others (#1410)
dat-a-man authored May 25, 2024
1 parent 993ac37 commit 5d2d1ec
Showing 18 changed files with 25 additions and 25 deletions.
4 changes: 2 additions & 2 deletions docs/website/docs/dlt-ecosystem/destinations/athena.md
@@ -11,7 +11,7 @@ The Athena destination stores data as Parquet files in S3 buckets and creates [e
## Install dlt with Athena
**To install the dlt library with Athena dependencies:**
```sh
-pip install dlt[athena]
+pip install "dlt[athena]"
```
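
The quotes are the whole point of this commit, and the likely motivation (inferred from standard shell behavior, not stated in the commit itself) is shell globbing: in zsh, the default shell on macOS, unquoted square brackets are parsed as a filename pattern, so the old form aborts before pip even runs. A minimal sketch of the difference:

```sh
# In zsh, [athena] is treated as a glob; with no matching file the command fails:
#   zsh: no matches found: dlt[athena]
pip install dlt[athena]

# Quoting (or escaping) passes the extras specifier to pip verbatim:
pip install "dlt[athena]"
pip install dlt\[athena\]
```

In bash, an unmatched glob is passed through unchanged by default, so the unquoted form usually works there; the quoted form is simply the spelling that is portable across shells.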

## Setup Guide
@@ -30,7 +30,7 @@ First, install dependencies by running:
```sh
pip install -r requirements.txt
```
-or with `pip install dlt[athena]`, which will install `s3fs`, `pyarrow`, `pyathena`, and `botocore` packages.
+or with `pip install "dlt[athena]"`, which will install `s3fs`, `pyarrow`, `pyathena`, and `botocore` packages.

:::caution

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/bigquery.md
@@ -11,7 +11,7 @@ keywords: [bigquery, destination, data warehouse]
**To install the dlt library with BigQuery dependencies:**

```sh
-pip install dlt[bigquery]
+pip install "dlt[bigquery]"
```

## Setup Guide
4 changes: 2 additions & 2 deletions docs/website/docs/dlt-ecosystem/destinations/clickhouse.md
@@ -11,7 +11,7 @@ keywords: [ clickhouse, destination, data warehouse ]
**To install the DLT library with ClickHouse dependencies:**

```sh
-pip install dlt[clickhouse]
+pip install "dlt[clickhouse]"
```

## Setup Guide
@@ -33,7 +33,7 @@ requirements file by executing it as follows:
pip install -r requirements.txt
```

-or with `pip install dlt[clickhouse]`, which installs the `dlt` library and the necessary dependencies for working with ClickHouse as a destination.
+or with `pip install "dlt[clickhouse]"`, which installs the `dlt` library and the necessary dependencies for working with ClickHouse as a destination.

### 2. Setup ClickHouse database

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/databricks.md
@@ -12,7 +12,7 @@ keywords: [Databricks, destination, data warehouse]
## Install dlt with Databricks
**To install the dlt library with Databricks dependencies:**
```sh
-pip install dlt[databricks]
+pip install "dlt[databricks]"
```

## Set up your Databricks workspace
4 changes: 2 additions & 2 deletions docs/website/docs/dlt-ecosystem/destinations/dremio.md
@@ -9,7 +9,7 @@ keywords: [dremio, iceberg, aws, glue catalog]
## Install dlt with Dremio
**To install the dlt library with Dremio and s3 dependencies:**
```sh
-pip install dlt[dremio,s3]
+pip install "dlt[dremio,s3]"
```

## Setup Guide
@@ -28,7 +28,7 @@ First install dependencies by running:
```sh
pip install -r requirements.txt
```
-or with `pip install dlt[dremio,s3]` which will install `s3fs`, `pyarrow`, and `botocore` packages.
+or with `pip install "dlt[dremio,s3]"` which will install `s3fs`, `pyarrow`, and `botocore` packages.

To edit the `dlt` credentials file with your secret info, open `.dlt/secrets.toml`. You will need to provide a `bucket_url` which holds the uploaded parquet files.

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/duckdb.md
@@ -9,7 +9,7 @@ keywords: [duckdb, destination, data warehouse]
## Install dlt with DuckDB
**To install the dlt library with DuckDB dependencies, run:**
```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
```

## Setup Guide
6 changes: 3 additions & 3 deletions docs/website/docs/dlt-ecosystem/destinations/filesystem.md
@@ -6,7 +6,7 @@ The Filesystem destination stores data in remote file systems and bucket storage
## Install dlt with filesystem
**To install the dlt library with filesystem dependencies:**
```sh
-pip install dlt[filesystem]
+pip install "dlt[filesystem]"
```

This installs `s3fs` and `botocore` packages.
@@ -125,7 +125,7 @@ client_kwargs = '{"verify": "public.crt"}'
```

#### Google Storage
-Run `pip install dlt[gs]` which will install the `gcfs` package.
+Run `pip install "dlt[gs]"` which will install the `gcfs` package.

To edit the `dlt` credentials file with your secret info, open `.dlt/secrets.toml`.
You'll see AWS credentials by default.
@@ -148,7 +148,7 @@ if you have default google cloud credentials in your environment (i.e. on cloud
Use **Cloud Storage** admin to create a new bucket. Then assign the **Storage Object Admin** role to your service account.

#### Azure Blob Storage
-Run `pip install dlt[az]` which will install the `adlfs` package to interface with Azure Blob Storage.
+Run `pip install "dlt[az]"` which will install the `adlfs` package to interface with Azure Blob Storage.

Edit the credentials in `.dlt/secrets.toml`. You'll see AWS credentials by default; replace them with your Azure credentials:

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/motherduck.md
@@ -10,7 +10,7 @@ keywords: [MotherDuck, duckdb, destination, data warehouse]
## Install dlt with MotherDuck
**To install the dlt library with MotherDuck dependencies:**
```sh
-pip install dlt[motherduck]
+pip install "dlt[motherduck]"
```

:::tip
4 changes: 2 additions & 2 deletions docs/website/docs/dlt-ecosystem/destinations/mssql.md
@@ -9,7 +9,7 @@ keywords: [mssql, sqlserver, destination, data warehouse]
## Install dlt with MS SQL
**To install the dlt library with MS SQL dependencies, use:**
```sh
-pip install dlt[mssql]
+pip install "dlt[mssql]"
```

## Setup guide
@@ -38,7 +38,7 @@ pip install -r requirements.txt
```
or run:
```sh
-pip install dlt[mssql]
+pip install "dlt[mssql]"
```
This will install `dlt` with the `mssql` extra, which contains all the dependencies required by the SQL server client.
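
As a side note, you can list every optional extra a dlt release declares straight from the installed package metadata. This is a generic Python stdlib check, assuming `dlt` is already installed in the active environment, not a command the dlt docs themselves prescribe:

```sh
# Print the extra names (athena, bigquery, mssql, ...) declared by the installed dlt distribution
python -c "from importlib.metadata import metadata; print(metadata('dlt').get_all('Provides-Extra'))"
```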

2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/postgres.md
@@ -9,7 +9,7 @@ keywords: [postgres, destination, data warehouse]
## Install dlt with PostgreSQL
**To install the dlt library with PostgreSQL dependencies, run:**
```sh
-pip install dlt[postgres]
+pip install "dlt[postgres]"
```

## Setup Guide
2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/qdrant.md
@@ -14,7 +14,7 @@ This destination helps you load data into Qdrant from [dlt resources](../../gene
1. To use Qdrant as a destination, make sure `dlt` is installed with the `qdrant` extra:

```sh
-pip install dlt[qdrant]
+pip install "dlt[qdrant]"
```

2. Next, configure the destination in the dlt secrets file. The file is located at `~/.dlt/secrets.toml` by default. Add the following section to the secrets file:
4 changes: 2 additions & 2 deletions docs/website/docs/dlt-ecosystem/destinations/redshift.md
@@ -9,7 +9,7 @@ keywords: [redshift, destination, data warehouse]
## Install dlt with Redshift
**To install the dlt library with Redshift dependencies:**
```sh
-pip install dlt[redshift]
+pip install "dlt[redshift]"
```

## Setup Guide
@@ -26,7 +26,7 @@ The above command generates several files and directories, including `.dlt/secre
```sh
pip install -r requirements.txt
```
-or with `pip install dlt[redshift]`, which installs the `dlt` library and the necessary dependencies for working with Amazon Redshift as a destination.
+or with `pip install "dlt[redshift]"`, which installs the `dlt` library and the necessary dependencies for working with Amazon Redshift as a destination.

### 2. Setup Redshift cluster
To load data into Redshift, you need to create a Redshift cluster and enable access to your IP address through the VPC inbound rules associated with the cluster. While we recommend asking our GPT-4 assistant for details, we have provided a general outline of the process below:
2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/synapse.md
@@ -9,7 +9,7 @@ keywords: [synapse, destination, data warehouse]
## Install dlt with Synapse
**To install the dlt library with Synapse dependencies:**
```sh
-pip install dlt[synapse]
+pip install "dlt[synapse]"
```

## Setup guide
2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/destinations/weaviate.md
@@ -14,7 +14,7 @@ This destination helps you load data into Weaviate from [dlt resources](../../ge
1. To use Weaviate as a destination, make sure dlt is installed with the 'weaviate' extra:

```sh
-pip install dlt[weaviate]
+pip install "dlt[weaviate]"
```

2. Next, configure the destination in the dlt secrets file. The file is located at `~/.dlt/secrets.toml` by default. Add the following section to the secrets file:
2 changes: 1 addition & 1 deletion docs/website/docs/dlt-ecosystem/file-formats/parquet.md
@@ -11,7 +11,7 @@ keywords: [parquet, file formats]
To use this format, you need the `pyarrow` package. You can get this package as a `dlt` extra as well:

```sh
-pip install dlt[parquet]
+pip install "dlt[parquet]"
```

## Supported Destinations
@@ -247,7 +247,7 @@ API token.
[destination](../../dlt-ecosystem/destinations/). For example, duckdb:

```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
```

1. Run the pipeline with the following command:
@@ -231,7 +231,7 @@ need to register to use this service neither get an API key.
[destination](https://dlthub.com/docs/dlt-ecosystem/destinations/). For example, duckdb:

```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
```

1. Run the pipeline with the following command:
@@ -284,7 +284,7 @@ The first step is to register on [SerpAPI](https://serpapi.com/) and obtain the
[destination](https://dlthub.com/docs/dlt-ecosystem/destinations/). For example, duckdb:

```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
```

1. Run the pipeline with the following command:
