Add readme
damcou committed Jun 2, 2021
1 parent 5f3d9af commit 1a7d5f8
Showing 3 changed files with 55 additions and 31 deletions.
56 changes: 55 additions & 1 deletion README.md
@@ -1 +1,55 @@
# algoliasearch-crawler-github-actions
# Algolia Crawler Github action

## How to use

- On your repository, create a `.github/workflows/main.yml` file
- Add the following code where needed in your workflow:

```yaml
on: [push]

jobs:
  algolia_reindex:
    name: Algolia Reindex
    runs-on: ubuntu-latest
    steps:
      # checkout this repo
      - name: Checkout Repo
        uses: actions/checkout@v2
      # checkout the private repo containing the action to run
      - name: Checkout GitHub Action Repo
        uses: actions/checkout@v2
        with:
          repository: algolia/algoliasearch-crawler-github-actions
          ref: v0.4 # version of the release you want to use
          token: ${{ secrets.GIT_HUB_TOKEN }}
      - name: Algolia crawler creation and reindex
        uses: ./
        id: crawler
        with:
          crawler-user-id: ${{ secrets.CRAWLER_USER_ID }}
          crawler-api-key: ${{ secrets.CRAWLER_API_KEY }}
          crawler-api-base-url: 'https://crawler-dev.algolia.com/api/1/'
          crawler-name: ${{ github.repository }}-${{ github.ref }}
          algolia-app-id: ${{ secrets.ALGOLIA_APP_ID }}
          algolia-api-key: ${{ secrets.ALGOLIA_API_KEY }}
          site-url: 'https://crawler.algolia.com/test-website/'
```

- More information about GitHub Actions is available [here](https://docs.github.com/en/actions)
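
For example, if you only want to trigger a reindex on pushes to a single branch rather than on every push, the `on: [push]` trigger above can be narrowed using standard GitHub Actions syntax (the `main` branch name below is an assumption; use your own default branch):

```yaml
on:
  push:
    branches:
      - main
```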

## Variables to provide

These are the action's inputs (a sketch of a matching `action.yml` follows this list):
- `crawler-user-id`: User ID of your crawler account
- `crawler-api-key`: API key of your crawler account
- `crawler-api-base-url`: Base URL of the Crawler API (by default it's [https://crawler.algolia.com/api/1/](https://crawler.algolia.com/api/1/))
- `crawler-name`: Name of the created crawler
- `algolia-app-id`: Algolia Application ID
- `algolia-api-key`: Algolia API key
- `site-url`: URL of the website to crawl
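
The source diff below shows these inputs being read with `core.getInput`. As a rough, hypothetical sketch, they could be declared in the action's `action.yml` along these lines; only the input names come from the list above, while the descriptions, `required` flags, default value, and `runs` section are assumptions:

```yaml
# Hypothetical sketch: only the input names are taken from the list above.
name: 'Algolia Crawler Reindex'
description: 'Create or update an Algolia crawler and trigger a reindex'
inputs:
  crawler-user-id:
    description: 'User ID of your crawler account'
    required: true
  crawler-api-key:
    description: 'API key of your crawler account'
    required: true
  crawler-api-base-url:
    description: 'Base URL of the Crawler API'
    required: false
    default: 'https://crawler.algolia.com/api/1/'
  crawler-name:
    description: 'Name of the created crawler'
    required: true
  algolia-app-id:
    description: 'Algolia Application ID'
    required: true
  algolia-api-key:
    description: 'Algolia API key'
    required: true
  site-url:
    description: 'URL of the website to crawl'
    required: true
runs:
  using: 'node12'
  main: 'dist/index.js'
```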

## Create the following GitHub secrets on your repository (in Settings > Secrets)
- `ALGOLIA_API_KEY`: Algolia API key
- `ALGOLIA_APP_ID`: Algolia Application ID
- `CRAWLER_API_KEY`: API key of your crawler account
- `CRAWLER_USER_ID`: User ID of your crawler account
- `GIT_HUB_TOKEN`: GitHub token used to check out the action repository, needed as long as that repository is private (note that custom secret names cannot start with `GITHUB_`)
14 changes: 0 additions & 14 deletions dist/index.js
@@ -59,20 +59,6 @@ var __spreadArray = (this && this.__spreadArray) || function (to, from) {
Object.defineProperty(exports, "__esModule", { value: true });
var core = require('@actions/core');
var crawler_api_client_1 = require("./crawler-api-client");
// // CREDENTIALS
// const CRAWLER_USER_ID = "00000000-0000-4000-a000-000000000001";
// const CRAWLER_API_KEY = "14mn074r34l4p1k3yd0n7u53m31npr0d";
// const CRAWLER_API_BASE_URL = "http://localhost:7900/api/1/";
// // CREDENTIALS
// const CRAWLER_USER_ID = "478e6f93-7550-4850-9f3e-91ea853fa13d";
// const CRAWLER_API_KEY = "d4bc8f523e6f126c88ec07bb7da3611d";
// const CRAWLER_API_BASE_URL = "https://crawler-dev.algolia.com/api/1/";
//
// // CRAWLER CONFIGURATION
// const CRAWLER_NAME = 'damcou/simple-page/master'.replace(/\//g, '-');
// const ALGOLIA_APP_ID = 'Y34K42BB0X';
// const ALGOLIA_API_KEY = '64b0f8f5892b676971d5e1da39f0604a';
// const SITE_URL = 'https://crawler.algolia.com/test-website/';
// CREDENTIALS
var CRAWLER_USER_ID = core.getInput('crawler-user-id');
var CRAWLER_API_KEY = core.getInput('crawler-api-key');
16 changes: 0 additions & 16 deletions src/index.ts
@@ -1,22 +1,6 @@
const core = require('@actions/core');
import { CrawlerApiClient } from './crawler-api-client';

// // CREDENTIALS
// const CRAWLER_USER_ID = "00000000-0000-4000-a000-000000000001";
// const CRAWLER_API_KEY = "14mn074r34l4p1k3yd0n7u53m31npr0d";
// const CRAWLER_API_BASE_URL = "http://localhost:7900/api/1/";

// // CREDENTIALS
// const CRAWLER_USER_ID = "478e6f93-7550-4850-9f3e-91ea853fa13d";
// const CRAWLER_API_KEY = "d4bc8f523e6f126c88ec07bb7da3611d";
// const CRAWLER_API_BASE_URL = "https://crawler-dev.algolia.com/api/1/";
//
// // CRAWLER CONFIGURATION
// const CRAWLER_NAME = 'damcou/simple-page/master'.replace(/\//g, '-');
// const ALGOLIA_APP_ID = 'Y34K42BB0X';
// const ALGOLIA_API_KEY = '64b0f8f5892b676971d5e1da39f0604a';
// const SITE_URL = 'https://crawler.algolia.com/test-website/';

// CREDENTIALS
const CRAWLER_USER_ID = core.getInput('crawler-user-id');
const CRAWLER_API_KEY = core.getInput('crawler-api-key');
