---
sidebar_position: 3
---
# Extending the UniswapV2 Dashboard

This documentation provides a step-by-step guide for developers looking to extend the functionality of the UniswapV2 Dashboard, specifically focusing on implementing new data points. The goal is to empower developers to enhance the dashboard with custom features, making it a valuable tool in hackathons and blockchain analytics.

## Extending with New Data Points

:::tip
Prerequisites: Before diving into the implementation of new data points, take a look at how Pooler functions and how it retrieves and processes data.

[Closer look into the Snapshot Datasets](/docs/Build-with-Powerloom/UniswapV2%20Dashboard/Tour%20of%20the%20existing%20implementation/closer-inspection-of-the-snapshot-datasets)
:::

### Scenario: 2-Hour Aggregate of Swap Events

We'll use the example of creating a new data point that aggregates only Swap events over a 2-hour period. This involves capturing snapshots of Swap event logs and trade volumes within this timeframe.

### Steps to Implement the New Data Point

1. **Fork the Pooler Repository**:
Begin by forking the [Pooler repository](https://github.com/powerloom/pooler). This will be your workspace for implementing the new feature.

2. **Setup Development Environment**:
Follow the setup instructions in the [`deploy` repository](https://github.com/powerloom/deploy) to prepare your development environment. This step ensures you have all the necessary tools and dependencies.

3. **Configure Aggregation Worker**:
In the `config/aggregator.json` file of your forked repository, add a new entry for your aggregation worker class. This class will be responsible for handling the new data aggregation task.
- Define the `project_type` as something like `"aggregate_swap_events_2h"`.
- Set `"aggregate_on"` to `"SingleProject"` or `"MultiProject"` depending on your aggregation logic.
- Under `"processor"`, specify the module and class name of your new processor.
```json
{
  "config": [
    // ... existing configurations ...
    {
      "project_type": "aggregate_swap_events_2h",
      "aggregate_on": "SingleProject",
      "processor": {
        "module": "snapshotter.modules.computes.aggregate.swap_event_2h",
        "class_name": "AggregateSwapEventProcessor"
      }
    }
    // ... additional configurations ...
  ]
}
```

4. **Create a New Data Model**:
Develop a new data model in [`utils/models/message_models.py`](https://github.com/PowerLoom/snapshotter-computes/blob/eth_uniswapv2/utils/models/message_models.py). Use existing models such as `UniswapTradesAggregateSnapshot` and `UniswapTradesSnapshot` as references. Your model should capture and represent the data specific to the 2-hour Swap event aggregation.
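As a starting point, here is a minimal sketch of what such a model might look like. It assumes `pydantic` (which the existing message models already use); the field names are illustrative, modelled loosely on `UniswapTradesAggregateSnapshot`, and should be adjusted to whatever your processor actually emits.
```python
# Illustrative sketch only -- field names are modelled loosely on
# UniswapTradesAggregateSnapshot and are not the exact Pooler definitions.
from pydantic import BaseModel


class UniswapSwapEvents2hAggregateSnapshot(BaseModel):
    epochId: int                      # epoch at which this 2-hour aggregate is anchored
    totalTrade: float = 0             # total swap volume (USD) across the 2-hour window
    totalFee: float = 0               # total fees generated by swaps in the window
    token0TradeVolume: float = 0      # swap volume denominated in token0
    token1TradeVolume: float = 0      # swap volume denominated in token1
    token0TradeVolumeUSD: float = 0   # token0 swap volume in USD
    token1TradeVolumeUSD: float = 0   # token1 swap volume in USD
    complete: bool = True             # False if some snapshots in the window were missing
```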

5. **Focus on 2-Hour Time Span and Swap Events**:
Modify the data collection logic to cover a 2-hour time span (a range of epoch IDs rather than 24 hours' worth). Ensure that your implementation extracts only Swap event logs and their associated trade volumes. Refer to the existing 24-hour aggregation example for guidance on structuring your logic.
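To make the intent of this step concrete, here is a rough sketch of the two pieces of logic involved: working out which epochs fall inside a 2-hour window, and keeping only the Swap events out of a pair contract's decoded logs. The parameter names, log shape, and helpers are hypothetical stand-ins rather than the actual Pooler APIs; the 24-hour aggregate module is the reference for how this is really wired up.
```python
# Hypothetical sketch -- the epoch parameters, the decoded-log shape, and the
# helper names are illustrative stand-ins, not the actual Pooler APIs.
TWO_HOURS_IN_SECONDS = 2 * 60 * 60


def tail_epoch_id_for_window(
    current_epoch_id: int,
    epoch_size_in_blocks: int,
    block_time_in_seconds: int,
) -> int:
    """Oldest epoch ID that still falls inside the 2-hour window."""
    epochs_in_window = TWO_HOURS_IN_SECONDS // (epoch_size_in_blocks * block_time_in_seconds)
    return max(current_epoch_id - epochs_in_window, 0)


def filter_swap_logs(decoded_logs: list[dict]) -> list[dict]:
    """Keep only Swap events out of a mixed list of decoded pair-contract logs."""
    return [log for log in decoded_logs if log.get("event") == "Swap"]
```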

6. **Testing and Validation**:
After implementation, rigorously test your new feature to ensure accuracy and efficiency. Validate that the data collected aligns with your intended 2-hour aggregation of Swap events.

7. **Commit and Share Your Work**:
Once your implementation is complete and tested, commit your changes to your implementation branch. Share your work with the community by creating a pull request to the main Pooler repository, if desired.

## Why Extend the UniswapV2 Dashboard?

Extending the UniswapV2 Dashboard is an excellent opportunity for developers to:

- **Contribute to Open Source**: Enhance a widely-used tool and give back to the community.
- **Learn and Experiment**: Gain hands-on experience with blockchain data and smart contract interactions.
- **Create Custom Analytics**: Tailor the dashboard to specific analytical needs, making it more versatile and useful.
- **Showcase Skills**: Use the extended dashboard as a portfolio piece in hackathons and professional settings.


Extending the UniswapV2 Dashboard with new data points, such as a 2-hour aggregate of Swap events, is an exciting project to work on. It requires a good understanding of blockchain data, smart contract events, and data modeling. By following the steps outlined in this guide, developers can enhance the dashboard and make it a more powerful tool for blockchain analytics and research.

If you have any questions while building or integrating, you can reach out to us on our [Discord](https://discord.com/powerloom).

{
  "label": "Tour of the Existing implementation",
  "position": 2,
  "link": {
    "type": "generated-index",
    "description": "5 minutes to learn the most important Docusaurus concepts."
  }
}
---
sidebar_position: 1
---
# Closer Inspection of the Snapshot Datasets

Pooler is a Uniswap V2 specific implementation in the PowerLoom Protocol, designed to capture, process, and aggregate blockchain data. This documentation provides an in-depth look at how Pooler operates and how developers can utilize it for building data-rich applications.


To access and utilize the ABI of the protocol state contract from the Powerloom project, follow these steps:

1. Visit the Powerloom 'pooler' repository on GitHub at this URL: [Powerloom pooler repository - ProtocolContract.json](https://github.com/PowerLoom/pooler/blob/main/snapshotter/static/abis/ProtocolContract.json).
2. Locate the `ProtocolContract.json` file.
3. Copy the contents of the file.
4. Open the Remix IDE.
:::tip
Before you dive into this section, please make sure you take a look at the [Project Configuration section](./fetching-higher-order-datapoints.md#project-configuration).
:::

In the last section, we explored how to get data from the protocol state contract and view it in JSON format through the IPFS gateway. Next, we'll look at how trade data is processed in basic snapshots.

Our Pooler system has several classes that handle the hard work of processing and ensuring the data is correct. One of these classes is `TradeVolumeProcessor`, located in the **snapshotter-computes** repository (`eth_uniswapv2` branch) at [`aggregate/single_uniswap_trade_volume_24h.py`](https://github.com/PowerLoom/snapshotter-computes/blob/eth_uniswapv2/aggregate/single_uniswap_trade_volume_24h.py). This class uses the `GenericProcessorSnapshot` structure found in [`snapshotter/utils/callback_helpers.py`](https://github.com/PowerLoom/pooler/blob/main/snapshotter/utils/callback_helpers.py).


If you are planning to write your own extraction logic, there are a few quick concepts that are crucial to understand. In particular, `transformation_lambdas` are used for extra calculations on top of the generated snapshot.


```python reference
https://github.com/PowerLoom/snapshotter-computes/blob/74b2eaa452bfac8c0e4e0a7ed74a4d2748e9c224/aggregate/single_uniswap_trade_volume_24h.py#L110-L120
```
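To make the shape of these processor classes easier to picture, here is a stripped-down skeleton of a snapshotter processor built on `GenericProcessorSnapshot`. The `compute()` signature and the steps inside it are simplified for illustration rather than a faithful copy of the interface; treat `callback_helpers.py` and the 24-hour aggregate module linked above as the source of truth.
```python
# Simplified skeleton for illustration -- the real base class and its exact
# compute() signature live in snapshotter/utils/callback_helpers.py.
from snapshotter.utils.callback_helpers import GenericProcessorSnapshot


class SwapEventsProcessor(GenericProcessorSnapshot):
    def __init__(self):
        # optional post-processing hooks applied to the computed snapshot
        self.transformation_lambdas = []

    async def compute(self, epoch, redis_conn, rpc_helper):
        # 1. fetch and decode the pair contract's logs between the epoch's
        #    minimum and maximum block heights
        # 2. keep only Swap event logs and derive the trade volume from them
        # 3. return a pydantic message model built from that data
        ...
```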
The format of the output data can vary based on what you need it for. However, it's a good idea to use [`pydantic`](https://pypi.org/project/pydantic/) models, as they help organize and define the data structure clearly.

:::tip
Pydantic is a Python library that helps with data validation and parsing by using Python type annotations.
:::


In this Uniswap V2 example, the output is a data model named `UniswapTradesSnapshot`. This model is defined in the snapshotter-computes repository (`eth_uniswapv2` branch) in [`utils/models/message_models.py`](https://github.com/PowerLoom/snapshotter-computes/blob/eth_uniswapv2/utils/models/message_models.py).

```python reference
https://github.com/PowerLoom/snapshotter-computes/blob/74b2eaa452bfac8c0e4e0a7ed74a4d2748e9c224/utils/models/message_models.py#L47-L55
```

The `TradeVolumeProcessor` collects and stores information about trades that happen within a specific range of blocks in the blockchain, known as the epoch. This range is defined by the lowest block number (`min_chain_height`) and the highest block number (`max_chain_height`) in that epoch.

- The epoch size as described in the prior section on [epoch generation](../../../Protocol/Specifications/Epoch.md) can be considered to be constant for this specific implementation of the Uniswap v2 use case on PowerLoom Protocol, and by extension, the time duration captured within the epoch.

- The finalized state and data CID corresponding to each epoch can be accessed on the smart contract on the anchor chain that holds the protocol state. The corresponding helpers can be found in `get_project_epoch_snapshot()` in [`snapshotter/utils/data_utils.py`](https://github.com/PowerLoom/pooler/blob/main/snapshotter/utils/data_utils.py).

```python reference
https://github.com/PowerLoom/pooler/blob/fc08cdd951166ab0cea669d233cd28d0639f628d/snapshotter/utils/data_utils.py#L273-L295
```

For time-bound aggregates such as the 24-hour trade volume, the tail epoch ID marking the start of the window is derived from the current epoch ID by converting the window's duration (`time_in_seconds`) into an equivalent number of epochs, based on the source chain's epoch size and block time:

```python reference
https://github.com/PowerLoom/pooler/blob/fc08cdd951166ab0cea669d233cd28d0639f628d/snapshotter/utils/data_utils.py#L507-L546
```
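For a rough sense of the numbers: assuming, purely for illustration, a 12-second block time and an epoch size of 10 blocks, a 24-hour window of 86,400 seconds corresponds to 86400 / (10 × 12) = 720 epochs, so the tail epoch sits 720 epoch IDs behind the current one; a 2-hour window would span 60 epochs under the same assumptions.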

The worker class for such aggregation is defined in `config/aggregator.json` in the following manner:

```json reference
https://github.com/PowerLoom/snapshotter-configs/blob/ae77941311155a9126205af08735c3dfa5d72ac2/aggregator.example.json#L3-L10
```

Project IDs are unique identifiers in Pooler that correspond to specific pair contracts.
- `projects` - the smart contracts to extract data from; Pooler can generate different snapshots from multiple sources as long as the contract ABI is the same
- `processor` - the actual computation logic reference; while you can write the logic anywhere, it is recommended to keep your implementation in the `pooler/modules` folder
```json reference
https://github.com/PowerLoom/snapshotter-configs/blob/6e34c5b68fa3fba7cad3b140f8676dcbdab687c5/projects.example.json#L1-L35
```

There's currently no limitation on the number or type of use cases you can build.

## Core APIs

This component is one of the most important and allows you to access the finalized protocol state on the smart contract running on the anchor chain. Find it in [`core_api.py`](https://github.com/PowerLoom/pooler/blob/main/snapshotter/core_api.py).

The [pooler-frontend](https://github.com/powerloom/pooler-frontend) that serves the Uniswap v2 dashboards hosted by the PowerLoom foundation at locations like [https://uniswapv2.powerloom.io/](https://uniswapv2.powerloom.io/) is a great example of a frontend-specific web application that makes use of this API service.

Among many things, the core API allows you to **access the finalized CID as well as its contents at a given epoch ID for a project**.

The main endpoint implementations can be found as follows:

```python reference
https://github.com/PowerLoom/pooler/blob/fc08cdd951166ab0cea669d233cd28d0639f628d/snapshotter/core_api.py#L247-L339
```

```python reference
https://github.com/PowerLoom/pooler/blob/fc08cdd951166ab0cea669d233cd28d0639f628d/snapshotter/core_api.py#L344-L404
```

The first endpoint, `GET /last_finalized_epoch/{project_id}`, returns the last finalized epoch ID for a given project ID, and the second, `GET /data/{epoch_id}/{project_id}/`, returns the actual snapshot data for a given epoch ID and project ID.
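As a quick sketch of how these endpoints might be consumed, the snippet below calls both over HTTP with Python's `requests`. The base URL, project ID, and response fields are placeholders for whatever your own deployment exposes, not guaranteed defaults.
```python
# Hypothetical usage sketch -- base URL, project ID, and response fields are
# placeholders; inspect your own core API deployment for the exact values.
import requests

BASE_URL = "http://localhost:8002"     # wherever your core API instance is served
PROJECT_ID = "your_project_id_here"    # e.g. a pair contract's trade volume project

# 1. fetch the last finalized epoch for the project
resp = requests.get(f"{BASE_URL}/last_finalized_epoch/{PROJECT_ID}")
resp.raise_for_status()
last_finalized = resp.json()
print("last finalized epoch payload:", last_finalized)

# 2. fetch the finalized snapshot data for a given epoch ID
epoch_id = last_finalized.get("epochId", 0)  # adjust to the actual payload shape
snapshot = requests.get(f"{BASE_URL}/data/{epoch_id}/{PROJECT_ID}/")
snapshot.raise_for_status()
print("snapshot data:", snapshot.json())
```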
Access the [Pooler API Documentation](../Pooler-API-Docs/).

Pooler’s design enables extensions and custom use case implementations. It offers a detailed guide for extending its capabilities, particularly with Uniswap v2 data points. Developers can add new configurations and data models as needed, ensuring Pooler’s adaptability to various requirements. We have a dedicated section in the documentation that walks through the details of further implementations and use cases.

Check out our guide on [Extending the UniswapV2 Dashboard](/docs/Build-with-Powerloom/UniswapV2%20Dashboard/Extending-Uniswapv2-Dashboard.md).
---
sidebar_position: 3
---

# Building applications using Powerloom


Powerloom Protocol offers a versatile platform for developers to build various blockchain-based applications.
Data Dashboards can leverage Powerloom to display blockchain data in an interactive way.

### Extending Pooler to implement new data points

The possibilities are endless with the diverse range of datasets you can implement. To illustrate this, we have a guide that can help you get started:

[Extending the UniswapV2 Dashboard](/docs/Build-with-Powerloom/UniswapV2%20Dashboard/Extending-Uniswapv2-Dashboard.md)

---
