This is a slightly modified version of Nussknacker's quickstart.
It shows how to use Apache Ignite in two cases of Nussknacker deployment:
- aggregation - persist large aggregates collected while processing input records
- enrichment - enrich input records with detailed data using Ignite as super-fast cache
Prerequisites:
- docker-compose
- jq
To run the end-to-end scenarios, just run:
./testAggregates.sh
for the aggregation scenario, or
./testEnrichment.sh
for the enrichment scenario.
To clean up Docker resources after running the end-to-end tests, run ./cleanup.sh.
You can just run ./start.sh
to pull and start the required Docker images.
After that, you can access the following components:
- Nussknacker - user/password: admin/admin
- Apache Flink UI
- Apache NiFi
- Grafana
- AKHQ
Ignite doesn't expose a UI; you can connect to it using JDBC or the built-in sqlline
tool:
#> docker exec -it nussknacker_ignite bash
bash-4.4# /opt/ignite/apache-ignite/bin/sqlline.sh -u 'jdbc:ignite:thin://ignite' -n ignite -p ignite
sqlline version 1.9.0
0: jdbc:ignite:thin://ignite> select * from customer;
+----------+---------------+----------+
| clientId | name | category |
+----------+---------------+----------+
| client1 | John Doe | STANDARD |
| client2 | Robert Wright | GOLD |
| client3 | Юрий Шевчук | PLATINUM |
| client4 | Иосиф Кобзон | STANDARD |
+----------+---------------+----------+
4 rows selected (0.036 seconds)
0: jdbc:ignite:thin://ignite>
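The customer table above backs the enrichment scenario: input records are joined with customer details by clientId. In the running demo this lookup is performed by Nussknacker against Ignite; the snippet below is only a minimal Python sketch of the lookup logic, with an in-memory dict standing in for the Ignite cache (data mirrors the table above).

```python
# Sketch of the enrichment step: look up customer details by clientId.
# An in-memory dict stands in for the Ignite cache here; in the real
# scenario Nussknacker queries Ignite over JDBC instead.
customers = {
    "client1": {"name": "John Doe", "category": "STANDARD"},
    "client2": {"name": "Robert Wright", "category": "GOLD"},
}

def enrich(record: dict) -> dict:
    """Return the input record extended with customer details, if found."""
    details = customers.get(record["clientId"], {})
    return {**record, **details}

print(enrich({"clientId": "client2", "amount": 175}))
# {'clientId': 'client2', 'amount': 175, 'name': 'Robert Wright', 'category': 'GOLD'}
```

Records with an unknown clientId simply pass through unenriched, which mirrors the optional nature of an enrichment lookup.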
There is also a helper script to query the aggregates table:
#> ./testData/aggregates/queryIgniteAggregates.sh
1/1 SELECT * FROM AGGREGATES ORDER BY "eventDate" DESC;
+----------+-----------------------+--------+
| clientId | eventDate | amount |
+----------+-----------------------+--------+
| client1 | 2022-01-10 00:00:00.0 | 158 |
| client2 | 2022-01-10 00:00:00.0 | 175 |
| client3 | 2022-01-10 00:00:00.0 | 190 |
| client4 | 2022-01-10 00:00:00.0 | 192 |
| client5 | 2022-01-10 00:00:00.0 | 188 |
+----------+-----------------------+--------+
5 rows selected (0.061 seconds)
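The AGGREGATES table holds one amount per client per day. The actual aggregation runs inside the Nussknacker/Flink scenario; the following is a minimal Python sketch of the idea, assuming per-client daily sums of transaction amounts (the field names clientId, eventDate, and amount match the table above, but the sum semantics are an assumption for illustration).

```python
from collections import defaultdict

# Sketch of a daily aggregation: sum transaction amounts per (clientId, day).
# The real aggregation is defined in the Nussknacker scenario itself.
def daily_aggregates(transactions):
    totals = defaultdict(int)
    for tx in transactions:
        day = tx["eventDate"][:10]  # keep only the date part of the timestamp
        totals[(tx["clientId"], day)] += tx["amount"]
    return dict(totals)

txs = [
    {"clientId": "client1", "eventDate": "2022-01-10 08:15:00", "amount": 100},
    {"clientId": "client1", "eventDate": "2022-01-10 17:40:00", "amount": 58},
    {"clientId": "client2", "eventDate": "2022-01-10 09:05:00", "amount": 175},
]
print(daily_aggregates(txs))
# {('client1', '2022-01-10'): 158, ('client2', '2022-01-10'): 175}
```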
In the aggregation scenario, updated aggregate records are published to the dailyAggregates
topic. To populate this data
into Ignite, a simple NiFi flow is provided.
The PutIgniteRecord
processor is part of our nifi-extensions project.
Here's a brief summary of the scripts located in the testData
directory, which are used in the end-to-end scenarios:
- createIgniteTables.sh - creates Ignite tables and populates them with basic data
- importAndDeploy.sh <scenario_file> - imports a Nussknacker scenario from scenario_file and deploys it to Flink
- sendTestTransactions.sh <count> - sends count random transactions to the input topic
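sendTestTransactions.sh itself defines the actual message format. Purely as an illustration, assuming JSON messages with the clientId and amount fields seen in the tables above (a hypothetical schema, not taken from the script), generating count random transactions could look like:

```python
import json
import random

# Hypothetical sketch of what sendTestTransactions.sh might generate; the
# real message schema is defined by the script and the Nussknacker scenario.
def make_transactions(count, clients=("client1", "client2", "client3")):
    return [
        {"clientId": random.choice(clients), "amount": random.randint(1, 100)}
        for _ in range(count)
    ]

for tx in make_transactions(5):
    # one JSON message per line, as a Kafka console producer would send them
    print(json.dumps(tx))
```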
You can find more advanced usages of the Nussknacker image (available properties and so on) in our Installation guide.
Please send your feedback to our mailing list. Issues and pull requests can be reported on our project page.