docs(filesystem): include Google Drive into filesystem tutorial (#962)
* docs(filesystem): include Google Drive into filesystem tutorial

* fix typos

* fix typo

* fix API name
IlyaFaer authored Feb 26, 2024
1 parent ca7d630 commit e035e38
Showing 1 changed file with 11 additions and 8 deletions.
docs/website/docs/dlt-ecosystem/verified-sources/filesystem.md
@@ -11,8 +11,7 @@ keywords: [readers source and filesystem, filesystem, readers source]
or [book a call](https://calendar.app.google/kiLhuMsWKpZUpfho6) with our support engineer Adrian.
:::

-This verified source easily streams files from AWS s3, GCS, Azure, or local filesystem using the reader
-source.
+This verified source easily streams files from AWS S3, Google Cloud Storage, Google Drive, Azure, or local filesystem using the reader source.

Sources and resources that can be used with this verified source are:

@@ -32,6 +31,7 @@ This source can access various bucket types, including:

- AWS S3.
- Google Cloud Storage.
+- Google Drive.
- Azure Blob Storage.
- Local Storage

@@ -48,13 +48,13 @@ To get AWS keys for S3 access:
For more info, see
[AWS official documentation.](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html)

-#### Google Cloud Storage credentials
+#### Google Cloud Storage / Google Drive credentials

-To get GCS access:
+To get GCS/GDrive access:

1. Log in to [console.cloud.google.com](http://console.cloud.google.com/).
1. Create a [service account](https://cloud.google.com/iam/docs/service-accounts-create#creating).
1. Enable "Google Analytics API"; see
1. Enable "Cloud Storage API" / "Google Drive API"; see
[Google's guide](https://support.google.com/googleapi/answer/6158841?hl=en).
1. In IAM & Admin > Service Accounts, find your account, click the three-dot menu > "Manage Keys" >
"ADD KEY" > "CREATE" to get a JSON credential file.
@@ -111,7 +111,7 @@ For more information, read the
aws_access_key_id="Please set me up!"
aws_secret_access_key="Please set me up!"

-# For GCS storage bucket access:
+# For GCS bucket / Google Drive access:
client_email="Please set me up!"
private_key="Please set me up!"
project_id="Please set me up!"
@@ -133,11 +133,14 @@ For more information, read the
```

For remote file systems you need to add the schema, it will be used to get the protocol being
-used, for example:
+used:

```toml
[sources.filesystem] # use [sources.readers.credentials] for the "readers" source
bucket_url="s3://my-bucket/csv_files/"
# bucket_url="az://my-bucket/csv_files/" - for Azure Blob Storage
# bucket_url="gdrive://my-bucket/csv_files/" - for Google Drive folder
# bucket_url="gs://my-bucket/csv_files/" - for Google Storage
bucket_url="s3://my-bucket/csv_files/" # for AWS S3
```
:::caution
For Azure, use adlfs>=2023.9.0. Older versions mishandle globs.
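
Once `bucket_url` and credentials are in place, a minimal pipeline sketch could look like the one below. This is an illustrative example, not a fixed API: it assumes the "readers" source scaffolded locally by `dlt init filesystem duckdb`, a `read_csv` resource, and that `bucket_url` plus credentials are resolved from `.dlt/config.toml` and `.dlt/secrets.toml` as configured above; adjust names to your setup.

```python
import dlt

# "filesystem" here is the locally scaffolded verified-source module,
# e.g. created by `dlt init filesystem duckdb` (assumed layout).
from filesystem import readers

# bucket_url and credentials are resolved from .dlt/config.toml / .dlt/secrets.toml,
# e.g. bucket_url="gdrive://my-bucket/csv_files/" for a Google Drive folder.
csv_source = readers(file_glob="*.csv").with_resources("read_csv")

pipeline = dlt.pipeline(
    pipeline_name="filesystem_example",  # hypothetical pipeline/dataset names
    destination="duckdb",
    dataset_name="csv_data",
)

load_info = pipeline.run(csv_source)
print(load_info)
```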
