fix: links and references
xadahiya committed Dec 1, 2023
1 parent e36c6b5 commit cbdd76d
Showing 12 changed files with 34 additions and 34 deletions.
2 changes: 1 addition & 1 deletion docs/Protocol/Specifications/Snapshotter/components.md
@@ -92,7 +92,7 @@ https://github.com/PowerLoom/pooler/blob/634610801a7fcbd8d863f2e72a04aa8204d27d0
Upon receiving a message from the processor distributor after preloading is complete, the workers do most of the heavy lifting along with some sanity checks and then call the `compute()` callback function on the project's configured snapshot worker class to transform the dependent data points as cached by the preloaders to finally generate the base snapshots.
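The `compute()` callback flow above can be pictured with a minimal sketch; the class name, method signature and preloaded-data shape here are assumptions for illustration, not pooler's exact interface:

```python
# Hypothetical snapshot worker sketch: compute() turns preloaded datapoints
# for an epoch's block range into a base snapshot. Names are illustrative.
class ExampleTradeCountProcessor:
    project_type = "example_trade_count"  # assumed project type

    def compute(self, epoch: dict, preloaded: dict) -> dict:
        """Count preloaded transactions falling inside the epoch's block range."""
        lo, hi = epoch["begin"], epoch["end"]
        txs = [t for t in preloaded.get("block_transactions", [])
               if lo <= t["block"] <= hi]
        return {"epochId": epoch["epochId"], "tradeCount": len(txs)}
```

In pooler itself the worker class is resolved from the project configuration's `processor` key and invoked after preloading completes.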

:::info
- [Snapshot generation specification](/docs/protocol/specifications/snapshotter/snapshot-build.md)
+ [Snapshot generation specification](/docs/protocol/specifications/snapshotter/snapshot-build)
:::

## RPC Helper
8 changes: 4 additions & 4 deletions docs/Protocol/Specifications/Snapshotter/implementations.md
@@ -10,7 +10,7 @@ The Snapshotter Peer is designed with a modular and highly configurable architec

## Snapshotter Core

- This foundational component defines all the essential interfaces and handles a wide range of tasks, from listening to epoch release events to distributing tasks and managing snapshot submissions. Read more about it in the detailed section on its [components](/docs/protocol/specifications/snapshotter/components.md).
+ This foundational component defines all the essential interfaces and handles a wide range of tasks, from listening to epoch release events to distributing tasks and managing snapshot submissions. Read more about it in the detailed section on its [components](/docs/protocol/specifications/snapshotter/components).


## Data market specific
@@ -40,6 +40,6 @@ The heart of the system resides in the `snapshotter/modules` directory that's li

# Useful links

- * [Snapshot generation specifications](/docs/protocol/specifications/snapshotter/snapshot-build.md)
- * [Data markets and sources](/docs/protocol/data-sources.md)
- * [Composition of snapshots and higher order datapoints](/docs/protocol/data-composition.md)
+ * [Snapshot generation specifications](/docs/protocol/specifications/snapshotter/snapshot-build)
+ * [Data markets and sources](/docs/protocol/data-sources)
+ * [Composition of snapshots and higher order datapoints](/docs/protocol/data-composition)
16 changes: 8 additions & 8 deletions docs/Protocol/Specifications/Snapshotter/snapshot-build.md
@@ -7,7 +7,7 @@ sidebar_position: 2
## Snapshot computation modules
---

- As briefly introduced in the section on snapshotter implementations that [leverage Git Submodules for specific computation logic](/docs/protocol/specifications/snapshotter/implementations.md), the modules are specified in the configuration for project types under the key `processor`.
+ As briefly introduced in the section on snapshotter implementations that [leverage Git Submodules for specific computation logic](/docs/protocol/specifications/snapshotter/implementations), the modules are specified in the configuration for project types under the key `processor`.

```json reference
https://github.com/PowerLoom/snapshotter-configs/blob/39e4713cdd96fff99d100f1dea7fb7332df9e491/projects.example.json#L15-L28
@@ -51,7 +51,7 @@ https://github.com/PowerLoom/pooler/blob/634610801a7fcbd8d863f2e72a04aa8204d27d0
```


- 3. Else, we can have a [static list of contracts](/docs/protocol/data-sources.md#static-data-sources)
+ 3. Else, we can have a [static list of contracts](/docs/protocol/data-sources#static-data-sources)

### Data source specification: Bulk mode

@@ -71,7 +71,7 @@ https://github.com/PowerLoom/snapshotter-configs/blob/39e4713cdd96fff99d100f1dea

This allows for the flexibility to filter through all transactions and blocks without the need for predefined data sources.

- The `Processor Distributor` generates a `SnapshotProcessMessage` with bulk mode enabled for each project type. When snapshot workers receive this message, they leverage [common preloaders](/docs/protocol/specifications/snapshotter/preloading.md#shipped-preloaders) to filter out relevant data.
+ The `Processor Distributor` generates a `SnapshotProcessMessage` with bulk mode enabled for each project type. When snapshot workers receive this message, they leverage [common preloaders](/docs/protocol/specifications/snapshotter/preloading#shipped-preloaders) to filter out relevant data.

```python reference
https://github.com/PowerLoom/pooler/blob/634610801a7fcbd8d863f2e72a04aa8204d27d03/snapshotter/processor_distributor.py#L717-L730
@@ -81,7 +81,7 @@ https://github.com/PowerLoom/pooler/blob/634610801a7fcbd8d863f2e72a04aa8204d27d0
Since common datapoints like block details, transaction receipts, etc. are preloaded, this approach can efficiently scale to accommodate a large number of project types with little to no increase in RPC (Remote Procedure Call) calls.
:::

- Whenever a data source is added or removed by the [signaling ecosystem](/docs/protocol/data-sources.md#data-source-signaling), the protocol state smart contract emits a `ProjectUpdated` event with the following data model.
+ Whenever a data source is added or removed by the [signaling ecosystem](/docs/protocol/data-sources#data-source-signaling), the protocol state smart contract emits a `ProjectUpdated` event with the following data model.

```python reference
https://github.com/PowerLoom/pooler/blob/5892eeb9433d8f4b8aa677006d98a1dde0458cb7/snapshotter/utils/models/data_models.py#L102-L105
@@ -111,12 +111,12 @@ https://github.com/PowerLoom/snapshotter-computes/blob/6fb98b1bbc22be8b5aba8bdc8
## Aggregate snapshots
---

- Aggregate and higher order snapshots that build on base snapshots are configured in their specific repos like the following in our [Uniswap V2 Dashboard use case](/docs/category/uniswapv2-dashboard). This is where you see the [dependency graph of snapshot composition](/docs/protocol/data-composition.md#dependency-graph) in action.
+ Aggregate and higher order snapshots that build on base snapshots are configured in their specific repos like the following in our [Uniswap V2 Dashboard use case](/docs/category/uniswapv2-dashboard). This is where you see the [dependency graph of snapshot composition](/docs/protocol/data-composition#dependency-graph) in action.

:::info

- * [Single project composition](/docs/protocol/data-composition.md#single-project-composition)
- * [Multi project composition](/docs/protocol/data-composition.md#multiple-projects-composition)
+ * [Single project composition](/docs/protocol/data-composition#single-project-composition)
+ * [Multi project composition](/docs/protocol/data-composition#multiple-projects-composition)
* [Walkthrough of the snapshotter implementation for Uniswap V2 dashboard](/docs/category/tour-of-the-existing-implementation)
:::

@@ -133,7 +133,7 @@ https://github.com/PowerLoom/snapshotter-configs/blob/fcf9b852bac9694258d7afcd8b
* For example, a base snapshot built on a project ID `pairContract_trade_volume:0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc:UNISWAPV2` triggers the worker `AggreagateTradeVolumeProcessor` as defined in the `processor` config, against the pair contract `0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc`.
* The span of epochs over which the corresponding base snapshots will be aggregated is determined by the logic contained in the module specified in the `processor` key.
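The project ID layout in the bullet above (a `project_type` prefix, a data source contract address and a namespace, joined by colons) can be sketched as follows; `build_project_id` and `parse_project_id` are illustrative helpers, not part of pooler:

```python
# Sketch of the colon-delimited project ID layout described above.
def build_project_id(project_type: str, data_source: str, namespace: str) -> str:
    """Join the configured project type, data source contract and namespace."""
    return f"{project_type}:{data_source.lower()}:{namespace}"

def parse_project_id(project_id: str) -> dict:
    """Split a project ID back into its three components."""
    project_type, data_source, namespace = project_id.split(":")
    return {"project_type": project_type,
            "data_source": data_source,
            "namespace": namespace}
```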

- The following implementation aggregates [trade volume snapshots](/docs/build-with-powerloom/uniswapv2-dashboard/tour-of-existing-implementation/closer-inspection-of-the-snapshot-datasets.md#extracting-base-snapshots-trade-data-logic) across a span of 24 hours' worth of epochs, if available. Otherwise, it aggregates the entire span of epochs available on the protocol against the data market and reports it back.
+ The following implementation aggregates [trade volume snapshots](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/closer-inspection-of-the-snapshot-datasets#extracting-base-snapshots-trade-data-logic) across a span of 24 hours' worth of epochs, if available. Otherwise, it aggregates the entire span of epochs available on the protocol against the data market and reports it back.

```python reference
https://github.com/PowerLoom/snapshotter-computes/blob/6fb98b1bbc22be8b5aba8bdc860004d35786f4df/aggregate/single_uniswap_trade_volume_24h.py#L110-L121
4 changes: 2 additions & 2 deletions docs/Protocol/Specifications/Snapshotter/state-machine.md
@@ -6,6 +6,6 @@ sidebar_position: 4

Refer to the following sections for detailed information on the state transitions that a snapshotter participates in while processing data sources per epoch.

- * [Epoch processing state transitions](Protocol/Specifications/epoch.md#state-transitions)
- * [Snapshotter internal APIs](/docs/Snapshotters/health-tracking.md)
+ * [Epoch processing state transitions](protocol/specifications/epoch#state-transitions)
+ * [Snapshotter internal APIs](/docs/build-with-powerloom/snapshotter-node/health-tracking)

2 changes: 1 addition & 1 deletion docs/Protocol/data-composition.md
@@ -24,7 +24,7 @@ As defined by the data sources configuration, the protocol state collects snapsh

## Higher order aggregations

- An example of this can be found in the [Uniswap V2 dashboard implementation](/docs/build-with-powerloom/uniswapv2-dashboard/index.md) where the trade activity aggregation dataset is generated by
+ An example of this can be found in the [Uniswap V2 dashboard implementation](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/) where the trade activity aggregation dataset is generated by

* combining individual snapshots of trade volume and fees across multiple pair contracts
* spanning a specific set of epochs that satisfy a time duration, e.g., 24 hours
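The two bullets above can be sketched as a small aggregation routine; the data shapes and names here are illustrative, not the pooler implementation:

```python
# Illustrative sketch: sum trade volume and fees across per-pair snapshots
# for a chosen span of epochs (e.g. the epochs covering the last 24 hours).
from dataclasses import dataclass

@dataclass
class PairSnapshot:
    epoch_id: int
    pair: str                # pair contract address
    trade_volume_usd: float
    fees_usd: float

def aggregate_window(snapshots: list, epoch_ids: set) -> dict:
    """Combine snapshots whose epoch falls inside the requested window."""
    window = [s for s in snapshots if s.epoch_id in epoch_ids]
    return {
        "trade_volume_usd": sum(s.trade_volume_usd for s in window),
        "fees_usd": sum(s.fees_usd for s in window),
    }
```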
14 changes: 7 additions & 7 deletions docs/Protocol/data-sources.md
@@ -5,15 +5,15 @@ title: Data markets and sources

# Data markets

- Snapshotters generate base as well as higher-order, composable snapshots according to the data sources _defined by the data markets they participate in_. Every data market maintains its [specific protocol state](/docs/protocol/specifications/protocol-state.md) regarding the submission, calculation and finalization of such snapshots.
+ Snapshotters generate base as well as higher-order, composable snapshots according to the data sources _defined by the data markets they participate in_. Every data market maintains its [specific protocol state](/docs/protocol/specifications/protocol-state) regarding the submission, calculation and finalization of such snapshots.

The data sources defined by a market can be static as well as dynamic, depending on the use case at hand.

## Static data sources

This is utilized by our own implementation of a data market that serves datasets to render a live Uniswap V2 dashboard. You can find further details of the data source configuration and snapshot schema in the following section within our documentation:

- * [Building with Powerloom -- Uniswap V2 Dashboard](/docs/build-with-powerloom/uniswapv2-dashboard/index.md)
+ * [Building with Powerloom -- Uniswap V2 Dashboard](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/)

We will continue with this example and take a look at the data sources list defined as part of the Uniswap v2 specific configuration to be found in the [`snapshotter-configs`](https://github.com/PowerLoom/snapshotter-configs/blob/fcf9b852bac9694258d7afcd8beeaa4cf961c65f/projects.example.json#L1-L11) repo.

@@ -27,7 +27,7 @@ The `projects` field in the above configuration snippet are nothing but the Unis

For situations where data sources are constantly changing or numerous, making it impractical to maintain an extensive list of them, the data sources need not be defined explicitly in the configuration.

- Instead, it is left to the snapshotter implementation to operate in ['bulk mode'](/docs/protocol/specifications/snapshotter/snapshot-build.md#bulk-mode). The data source configuration merely specifies the computation modules, which will utilize general-purpose, [preloaded](/docs/protocol/specifications/snapshotter/preloading.md) datasets to filter out transactions, event logs, etc. on contract addresses of interest. This is where the signaling of data sources comes into the picture.
+ Instead, it is left to the snapshotter implementation to operate in ['bulk mode'](/docs/protocol/specifications/snapshotter/snapshot-build#bulk-mode). The data source configuration merely specifies the computation modules, which will utilize general-purpose, [preloaded](/docs/protocol/specifications/snapshotter/preloading) datasets to filter out transactions, event logs, etc. on contract addresses of interest. This is where the signaling of data sources comes into the picture.

```json reference
https://github.com/PowerLoom/snapshotter-configs/blob/39e4713cdd96fff99d100f1dea7fb7332df9e491/projects.example.json#L1-L28
@@ -39,18 +39,18 @@ Data sources can be dynamically added to the contract according to the role of c

In the present implementation of the use case that tracks wallet activity for Quests on Polygon zkEVM, such wallets are added from a data feed supplied by Mercle, consisting of wallets that sign up on their platform. Only these wallet addresses are of interest to the Quest platform on Mercle for their activities to be tracked across DEXs and asset bridges.

- Read more about it in the [snapshotter specs of bulk mode](/docs/protocol/specifications/snapshotter/snapshot-build.md#bulk-mode).
+ Read more about it in the [snapshotter specs of bulk mode](/docs/protocol/specifications/snapshotter/snapshot-build#bulk-mode).

![Mercle data source signaling](/images/data_source_signaling_example.png)


## Project types and IDs

- All data sources are tracked with a project ID on the protocol. Think of it as a stream of datasets, finalized by consensus against [each epoch released](/docs/protocol/specifications/epoch.md#1-epoch_released) on the protocol.
+ All data sources are tracked with a project ID on the protocol. Think of it as a stream of datasets, finalized by consensus against [each epoch released](/docs/protocol/specifications/epoch#1-epoch_released) on the protocol.

- Find more details on this in the [specifications of snapshot generation](/docs/protocol/specifications/snapshotter/snapshot-build.md).
+ Find more details on this in the [specifications of snapshot generation](/docs/protocol/specifications/snapshotter/snapshot-build).


## Useful links and concepts

- * [Modular architecture of use-case-specific snapshotter implementations](/docs/protocol/specifications/snapshotter/implementations.md)
+ * [Modular architecture of use-case-specific snapshotter implementations](/docs/protocol/specifications/snapshotter/implementations)
4 changes: 2 additions & 2 deletions docs/build-with-powerloom/accessing-data.md
@@ -17,7 +17,7 @@ Data Dashboards can leverage Powerloom to display blockchain data in an interact

The possibilities are endless with the diverse range of datasets you can implement. To illustrate this, we have a guide that can help you!

- [Extending UniswapV2 Dashboard](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/extending-uniswapv2-dashboard.md)
+ [Extending UniswapV2 Dashboard](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/extending-uniswapv2-dashboard)


---
@@ -97,7 +97,7 @@ In Pooler, `config/aggregator.json` is a file that outlines different types of d
```


- The following configuration generates a collection of data sets of 24-hour trade volume as calculated by the worker above across multiple pair contracts. This can be seen by the `aggregate_on` key being set to `MultiProject`. `projects_to_wait_for` specifies the exact project IDs on which this collection will be generated once a snapshot build has been achieved for an [`epochId`](/docs/protocol/specifications/epoch.md).
+ The following configuration generates a collection of data sets of 24-hour trade volume as calculated by the worker above across multiple pair contracts. This can be seen by the `aggregate_on` key being set to `MultiProject`. `projects_to_wait_for` specifies the exact project IDs on which this collection will be generated once a snapshot build has been achieved for an [`epochId`](/docs/protocol/specifications/epoch).

```json

@@ -10,7 +10,7 @@ You can tunnel into port `8002` of an instance running the snapshotter and right

## `GET /internal/snapshotter/epochProcessingStatus`

- As detailed in the section on [epoch processing state transitions](Protocol/Specifications/epoch.md#state-transitions), this internal API endpoint offers the most detailed insight into each epoch's processing status as it passes through the snapshot builders and is sent out for consensus.
+ As detailed in the section on [epoch processing state transitions](protocol/specifications/epoch.md#state-transitions), this internal API endpoint offers the most detailed insight into each epoch's processing status as it passes through the snapshot builders and is sent out for consensus.

> NOTE: The endpoint, though paginated and cached, serves a raw dump of insights into an epoch's state transitions, and the payloads are large enough for requests to time out or to clog the internal API's limited resources. Hence, it is advisable to query between 1 and 5 epochs; the span can be specified as the `size` query parameter.
@@ -84,7 +84,7 @@ This process will allow you to review the aggregated data for the top pairs on U
## Extracting Base Snapshots: Trade Data Logic

:::info
- Before you dive into this section, please make sure you take a look at the [Project Configuration Section](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/fetching-higher-order-datapoints.md#project-configuration)
+ Before you dive into this section, please make sure you take a look at the [Project Configuration Section](/docs/build-with-powerloom/use-cases/existing-implementations/uniswapv2-dashboard/fetching-higher-order-datapoints#project-configuration)
:::

In the last section, we explored how to get data from the protocol state contract and view it in JSON format through the IPFS Gateway. Next, we're going to explore how trade data is processed in base snapshots.
@@ -124,7 +124,7 @@ The `TradeVolumeProcessor` collects and stores information about trades that hap

- As we explored in the previous section, the `TradeVolumeProcessor` logic takes care of capturing a snapshot of information regarding Uniswap v2 trades between the block heights of `min_chain_height` and `max_chain_height`.

- - The epoch size as described in the prior section on [epoch generation](/docs/protocol/specifications/epoch.md) can be considered to be constant for this specific implementation of the Uniswap v2 use case on PowerLoom Protocol, and by extension, the time duration captured within the epoch.
+ - The epoch size as described in the prior section on [epoch generation](/docs/protocol/specifications/epoch) can be considered to be constant for this specific implementation of the Uniswap v2 use case on PowerLoom Protocol, and by extension, the time duration captured within the epoch.

- The finalized state and data CID corresponding to each epoch can be accessed on the smart contract on the anchor chain that holds the protocol state. The corresponding helpers for that can be found in `get_project_epoch_snapshot()` in [`pooler/snapshotter/utils/data_utils.py`](https://github.com/PowerLoom/pooler/blob/main/snapshotter/utils/data_utils.py)

@@ -19,7 +19,7 @@ Project IDs are unique identifiers in Pooler that correspond to specific pair co

#### Config File: `projects.json` defines project types and associated smart contract addresses.
- **Structure:**
- - `project_type` - unique identifier prefix for the use case, [used to generate the project ID](protocol/specifications/snapshotter/snapshot-build.md)
+ - `project_type` - unique identifier prefix for the use case, [used to generate the project ID](/docs/protocol/specifications/snapshotter/snapshot-build)
- `projects` - smart contracts to extract data from; pooler can generate different snapshots from multiple sources as long as the contract ABI is the same
- `processor` - the actual computation logic reference; while you can write the logic anywhere, it is recommended to write your implementation in the `pooler/modules` folder
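For illustration, here is a hedged sketch of a `projects.json`-style entry carrying the three keys described above, with a minimal validity check. The top-level `config` key, the module path and the checker itself are assumptions for this example, not the canonical snapshotter-configs schema:

```python
# Hypothetical projects.json-style entry plus a minimal key check.
import json

REQUIRED_KEYS = {"project_type", "projects", "processor"}

def validate_entries(raw: str) -> list:
    """Ensure every configured project entry carries the expected keys."""
    entries = json.loads(raw)["config"]
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry missing keys: {sorted(missing)}")
    return entries

example = json.dumps({
    "config": [{
        "project_type": "pairContract_trade_volume",
        "projects": ["0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc"],
        "processor": {
            "module": "pooler.modules.uniswapv2.trade_volume",  # assumed path
            "class_name": "TradeVolumeProcessor",
        },
    }]
})
```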
```json reference
Expand Down
