Follow the CloudGoat Quick Start guide.
Create three sample scenarios, for example:

```sh
./cloudgoat.py create iam_privesc_by_key_rotation
./cloudgoat.py create rce_web_app
./cloudgoat.py create cicd
```
⚠️ Remember to destroy all the resources you create after the demo.⚠️
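A minimal teardown sketch using CloudGoat's built-in `destroy` command (the scenario names match the ones created above):

```sh
# Destroy a single scenario's resources
./cloudgoat.py destroy rce_web_app

# Or destroy every deployed scenario at once
./cloudgoat.py destroy all
```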
Run the instructions below to sync data using either CloudQuery or Cartography.
For CloudQuery, you will need:
- AWS Account
- AWS authentication credentials (see CloudQuery -> Plugins -> AWS -> Authentication)
- Docker compose
- CloudQuery account
Create a `.env` file in the cloudquery directory with the following content:

```
CLOUDQUERY_API_KEY=your_api_key
```
Set up AWS credentials either in `.env` or in `~/.aws/credentials`:

```
AWS_ACCESS_KEY_ID={Your AWS Access Key ID}
AWS_SECRET_ACCESS_KEY={Your AWS secret access key}
```
Sample `.env`:

```
CLOUDQUERY_API_KEY=your_api_key
AWS_ACCESS_KEY_ID={Your AWS Access Key ID}
AWS_SECRET_ACCESS_KEY={Your AWS secret access key}
```
Go to `./cloudquery` and run docker compose:

```sh
cd ./cloudquery
docker-compose up -d
docker-compose run cloudquery
```
It can take around 15-25 minutes to complete, so be patient.
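Once the run finishes, a quick sanity check confirms data landed in postgres; the container name matches the one used later in this guide, and `aws_ec2_instances` is assumed to be among the tables the CloudQuery AWS plugin created:

```sh
# List the AWS tables CloudQuery created
docker exec -it cloudquery-postgres-1 psql -U postgres -c '\dt aws_*'

# Smoke test: count synced EC2 instances
docker exec -it cloudquery-postgres-1 psql -U postgres -c 'SELECT count(*) FROM aws_ec2_instances;'
```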
Follow the instructions from cloudquery/plugins/source/aws/dashboards#aws-asset-inventory
Open Grafana for the first time at http://localhost:3000 and sign in with the admin/admin credentials.
Open a shell into the postgres DB and run the resources.sql script:

```sh
docker exec -it cloudquery-postgres-1 bash
psql -U postgres < /var/lib/postgresql/scripts/resources.sql
```
Add the CloudQuery postgres database as a data source in Grafana (Connections -> Data Sources -> Add new data source):
- Search for the PostgreSQL data source and select it
- Set the host URL to `postgres:5432`
- Set the credentials to postgres/pass
- Disable TLS/SSL Mode
- Save and test the connection
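If you would rather configure the data source from a file than through the UI, Grafana also supports provisioning; this sketch mirrors the manual values above (the file path and database name are assumptions):

```yaml
# grafana/provisioning/datasources/cloudquery.yaml (hypothetical path)
apiVersion: 1
datasources:
  - name: CloudQuery
    type: postgres
    url: postgres:5432
    user: postgres
    jsonData:
      database: postgres  # assumed database name
      sslmode: disable    # matches "Disable TLS/SSL Mode" above
    secureJsonData:
      password: pass
```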
Import the dashboards from cloudquery/grafana
Play around with more queries from the CloudQuery query examples or the former Cloud Security Posture Management (CSPM) policy queries.
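As one hedged example of a CSPM-style query, the check below looks for unencrypted EBS volumes; the table and column names are assumptions based on the CloudQuery AWS plugin schema, so adjust them to match the tables in your database:

```sql
-- Find EBS volumes that are not encrypted at rest
SELECT account_id, region, volume_id
FROM aws_ec2_ebs_volumes
WHERE encrypted = false;
```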
Follow the instructions from the OpenSource CSPM how-to guide.
Install dbt from their Docker image.
Add the dbt container to the docker-compose.yml file, as sketched below.
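A minimal sketch of that service, assuming the official `ghcr.io/dbt-labs/dbt-postgres` image and local `dbt/` project and profiles directories (all names and mounts are assumptions to adapt):

```yaml
  # Hypothetical addition to docker-compose.yml
  dbt:
    image: ghcr.io/dbt-labs/dbt-postgres:1.7.4  # pick a current tag
    volumes:
      - ./dbt:/usr/app             # your dbt project
      - ./dbt/profiles:/root/.dbt  # profiles.yml with the postgres connection
    depends_on:
      - postgres
```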
For Cartography, you will need:
- AWS Account
- AWS authentication credentials (follow the single AWS account setup, at least attaching the SecurityAudit policy to your connected AWS credentials; see the command sketch after this list)
- Docker compose
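A hedged one-liner for the SecurityAudit prerequisite, assuming your credentials belong to an IAM user (replace `your-user-name`; use `attach-role-policy` instead if you authenticate through a role):

```sh
# Attach the AWS-managed SecurityAudit policy to the IAM user Cartography will use
aws iam attach-user-policy \
  --user-name your-user-name \
  --policy-arn arn:aws:iam::aws:policy/SecurityAudit
```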
This setup uses a slightly different Dockerfile from the official one, so you have to build the image locally.
Clone the Cartography GitHub repository locally and build the image:

```sh
git clone https://github.com/lyft/cartography.git
cd cartography
docker build -t lyft/cartography .
```
Update the docker-compose.yml file to point to heryxpc/cartography:

```yaml
  cartography:
    image: heryxpc/cartography
    user: cartography
    init: true
```
Start the containers and run the sync:

```sh
docker-compose up -d
docker-compose run cartography cartography --neo4j-uri bolt://neo4j:7687
```
The sync takes around 45 minutes to complete.
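Once it completes, a quick hedged check confirms Cartography wrote data; `AWSAccount` is a node label from Cartography's schema, and the example assumes neo4j authentication is disabled in this compose setup (add `-u`/`-p` flags otherwise):

```sh
# List the AWS accounts Cartography ingested
docker exec -it cartography-neo4j-1 cypher-shell \
  "MATCH (a:AWSAccount) RETURN a.id, a.name;"
```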
- Open NeoDash at http://localhost:5005/
- Click on New Dashboard
- Use the defaults and set `localhost` as `Hostname`
- Click the load button from the left panel (small cloud with up arrow)
- Choose Select From File
- Open the file neodash/dashboard.json
- Click on Load Dashboard
- Play around with the information presented
You will need to comment out the `restart` and `healthcheck` options in docker-compose.yaml to keep Docker from restarting neo4j when you shut it down:
```yaml
    # restart: unless-stopped
    # healthcheck:
    #   test: ["CMD", "curl", "-f", "http://localhost:7474"]
    #   interval: 10s
    #   timeout: 10s
    #   retries: 10
```
Open a terminal in the neo4j container and stop neo4j:

```sh
docker exec -it cartography-neo4j-1 bash
neo4j stop
```
Copy the DB dump from the sample_data folder to the container:

```sh
docker cp ./cartography/sample_data/neo4j.dump cartography-neo4j-1:/import/neo4j.dump
```
From the container, run the backup restore command and restart neo4j (note that `--from` points at the dump file copied above):

```sh
neo4j-admin load --from=/import/neo4j.dump --database=neo4j --force
neo4j start
```
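As a final hedged check from the same shell inside the container, count the restored nodes (again assuming authentication is disabled):

```sh
# A non-zero count means the dump was restored
cypher-shell "MATCH (n) RETURN count(n);"
```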