This repository has been archived by the owner on Mar 5, 2023. It is now read-only.

Round 20 Tech Earmarks

Alex Coseru edited this page Jul 27, 2022 · 10 revisions

(Parent page: OceanDAO Tech Earmarks)

Decentralized Backend Storage; First-class integration with Filecoin or Arweave (Web3 storage). Here are the WIP specs:

First-class integration with Chainlink or TheGraph (Web3 streaming data). What follows is a placeholder guideline; after R20, once the spec is written, Ocean Shipyard will be the vehicle to fund this.

  • Integration includes both publishing and consuming. “Publishing” means a data stream is wrapped as an Ocean data NFT. It does not need to include storing the asset itself; that can be done from the network’s existing GUIs. “Consuming” means that the consumer can access the data feed, or compute against a data feed.
  • There are three parts: backend, frontend, populate.
    • Backend. Implementation will include a PR or fork of oceanprotocol/provider (side-by-side with existing data services of uri and C2D), a PR of oceanprotocol/aquarius, a PR of oceanprotocol/ocean.py including a README of how to use, and maybe more. Backend users should not have to provide any new tokens to use the network; if needed those tokens should be converted on the fly.
    • Frontend. Implementing this includes a PR to oceanprotocol/market, oceanprotocol/ocean.js, and maybe more. It should be as low-friction to use the new network as existing data services of uri and C2D. Frontend users should not have to provide any new tokens to use the network; if needed those tokens should be converted on the fly.
    • Populate Ocean Market. The outcome should be: every asset of the network (Chainlink, TheGraph) should be available as a data NFT plus datatoken at the level of Ocean smart contracts, in Aquarius, and in Ocean Market.
  • Ideally there would be no additional centralization. Practically, using some centralized components that will be decentralized over time is OK too, e.g. a gateway from TheGraph. That is, it's fine to support trustless operation (run it yourself) or convenience (use the one run by OPF or another party), without forcing users to have both up front (e.g. via proxy re-encryption). Just like Ocean Provider.
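To make the "publishing" half of the spec above concrete, here is a minimal sketch of wrapping an existing Chainlink or TheGraph feed as Ocean data NFT metadata. It wraps the feed reference only, not the data itself, per the spec. The field names beyond generic DDO concepts (`StreamService`, `service_type="stream"`) and the endpoint URL are illustrative assumptions, not the final spec.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: wrap a Chainlink/TheGraph feed as Ocean data NFT
# metadata. Names like StreamService and "stream" are assumptions; a real
# spec would extend the existing "access"/"compute" service types.

@dataclass
class StreamService:
    service_type: str   # assumed new type "stream", next to "access"/"compute"
    endpoint: str       # feed endpoint, e.g. a subgraph URL or feed address
    network: str        # "thegraph" or "chainlink"

@dataclass
class DataNFTMetadata:
    name: str
    services: list = field(default_factory=list)

def wrap_stream_as_asset(name: str, network: str, endpoint: str) -> DataNFTMetadata:
    """Wrap an existing feed (not its stored data) as publishable metadata."""
    asset = DataNFTMetadata(name=name)
    asset.services.append(StreamService("stream", endpoint, network))
    return asset

asset = wrap_stream_as_asset(
    "ETH/USD price feed", "chainlink",
    "https://example.invalid/feeds/eth-usd",  # placeholder endpoint
)
```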

First-class integration with a WebSockets Web2 data streaming service. The spec is akin to the Web3 streaming data above.

First-class integration plus community engagement with data science tech communities. Any of the following: HuggingFace, eLeuther, fastAI, OpenMined, TensorFlow and variants, PapersWithCode, Scikit-learn, Anaconda, OpenML, Kaggle, or any other similar community (please suggest one to us!). Example for scikit-learn: imagine “import sklearn.ocean”. USPs: provenance of data & compute for scientific reproducibility and GDPR, private compute on open datasets, sell algorithms and keep them private (on open data), and ultimately an “Ocean” of all datasets & algorithms (possible with incentives).
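The “import sklearn.ocean” idea above could look something like the sketch below: a loader that resolves an Ocean asset DID and returns data the way `sklearn.datasets` loaders do. This is purely hypothetical; the function name `fetch_ocean`, the DID, and the stubbed registry are all invented for illustration (a real version would consume via Ocean Provider).

```python
# Hypothetical sketch of "import sklearn.ocean" ergonomics: fetch a dataset
# by Ocean asset DID, return (X, y)-style arrays like sklearn.datasets
# loaders do. The DID resolution is stubbed with an in-memory table; a real
# implementation would download/consume via Ocean Provider.

def fetch_ocean(did: str):
    """Resolve a DID and return (X, y)-style data. Stubbed for illustration."""
    _FAKE_REGISTRY = {
        "did:op:demo": ([[1.0, 2.0], [3.0, 4.0]], [0, 1]),
    }
    if did not in _FAKE_REGISTRY:
        raise KeyError(f"unknown DID {did!r}")
    return _FAKE_REGISTRY[did]

X, y = fetch_ocean("did:op:demo")
```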

For v4 Data NFTs, code that fills in a more interesting image behind the tokenUri. As an extra incentive beyond the grant, the people who work on the ideas may get a spotlight in the marketplace. Note that people can always update tokenUri. Bonus: dynamically update the image when rendered, e.g. from an oracle or an API. There may be some emerging standards on this. Maybe even a new app / widget: a dashboard based on dynamic NFTs, where each NFT is e.g. a graph of stats. Note that there seem to be no broadly agreed-upon standards; e.g. Uniswap uses base64 SVGs, OpenSea something different, etc.
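As a concrete illustration of the Uniswap-style approach mentioned above, here is a small sketch that renders a stat into an SVG and packs it as a base64 JSON metadata data-URI (SVG inside, also base64-encoded). The DID, stat value, and metadata keys (`name`, `image`) are illustrative; only the data-URI packing pattern is the point.

```python
import base64
import json

def dynamic_token_uri(did: str, stat_value: float) -> str:
    """Render a stat into an on-the-fly SVG and pack it Uniswap-style:
    a base64 SVG image inside a base64 JSON metadata data-URI."""
    svg = (
        '<svg xmlns="http://www.w3.org/2000/svg" width="300" height="120">'
        f'<text x="10" y="40">{did}</text>'
        f'<text x="10" y="90">value: {stat_value}</text>'
        "</svg>"
    )
    image = "data:image/svg+xml;base64," + base64.b64encode(svg.encode()).decode()
    meta = {"name": did, "image": image}
    return "data:application/json;base64," + base64.b64encode(
        json.dumps(meta).encode()
    ).decode()

uri = dynamic_token_uri("did:op:demo", 42.0)
```

A renderer that refreshes `stat_value` from an oracle or API before calling this function would yield the "dynamic image" behavior described above.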

If I use C2D to train an algorithm, there should be an easy flow to re-publish it right away. Example use case: an AI model is trained with C2D, and the model is republished with its inference as C2D in Ocean Market. This perhaps includes having its own Ocean Provider. The status quo is that the user must perform many intermediate steps. Note that V4 will make this easier: the V4 C2D backend will publish the asset, then do a transfer. One open question is where the inference code resides (Web2-hosted? Web3-hosted?).
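The desired one-call flow might look like the sketch below. The helper names (`run_compute_job`, `publish_as_algorithm`, `train_and_republish`) are assumptions about a future convenience API, not existing ocean.py calls, and both steps are stubbed.

```python
# Hedged sketch of a "train with C2D, then republish" convenience flow.
# All function names are hypothetical; real code would start a C2D job,
# poll for results, then publish a data NFT + datatoken with a C2D service.

def run_compute_job(dataset_did: str, algo_did: str) -> bytes:
    # In reality: start the C2D job, poll until done, fetch the result.
    return b"trained-model-bytes"

def publish_as_algorithm(model_bytes: bytes, name: str) -> str:
    # In reality: host the model + inference entrypoint, publish the asset.
    # Stubbed to return a deterministic-looking fake DID.
    return f"did:op:{abs(hash((model_bytes, name))) % 10**8:08d}"

def train_and_republish(dataset_did: str, algo_did: str, name: str) -> str:
    """Collapse the many intermediate steps into one call."""
    model = run_compute_job(dataset_did, algo_did)
    return publish_as_algorithm(model, name)

new_did = train_and_republish("did:op:data", "did:op:trainer", "my-model")
```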

Allow approval of C2D algorithms for compute-to-data assets in Ocean Market. Currently, allow-listing algorithms for a compute-to-data asset is only possible by the publisher adding them manually, so an algorithm developer has to get in contact with the publisher. The feature proposed here enables a more automated process, where the algorithm creator can propose allow-listing via Ocean Market. The algorithm creator must share access to the algorithm (as a Docker container, or via GitHub or another code repository) with the compute-to-data asset publisher so they can review it. If the publisher accepts the proposal, the algorithm gets allow-listed via a change to the compute-to-data asset's metadata. If they reject it, they have to give a reason for the rejection.
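The propose/accept/reject flow above can be sketched as a small state machine. Class and field names are illustrative; `trusted_algorithms` stands in for the asset metadata list that the acceptance would actually update on-chain.

```python
# Illustrative state machine for the proposed allow-listing flow.
# Everything here is an assumption made for illustration; in Ocean itself
# the acceptance would be a metadata update on the compute-to-data asset.

class AllowlistProposal:
    def __init__(self, algo_did: str, code_ref: str):
        self.algo_did = algo_did
        self.code_ref = code_ref   # Docker image or repo URL shared for review
        self.status = "pending"
        self.rejection_reason = None

class ComputeAsset:
    def __init__(self):
        self.trusted_algorithms = []  # mirrors the asset's allow-list metadata

    def accept(self, proposal: AllowlistProposal):
        proposal.status = "accepted"
        self.trusted_algorithms.append(proposal.algo_did)

    def reject(self, proposal: AllowlistProposal, reason: str):
        if not reason:
            raise ValueError("a rejection must state a reason")
        proposal.status = "rejected"
        proposal.rejection_reason = reason

asset = ComputeAsset()
prop = AllowlistProposal("did:op:algo", "https://example.invalid/repo")
asset.accept(prop)
```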

Mobile data DAO. Install the app, auto-join a DAO, give permission to share location & other data on the phone, get tokens (a big incentive to join). The DAO sells the aggregated data of its members on Ocean Market, preserving privacy via Ocean Compute-to-Data.

Build comments & ratings into Ocean Market, as a smart contract extension. It should be integratable by any Ocean-powered market. Uses decentralized storage. Includes affordances to prevent spam.
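One of the "affordances to prevent spam" could be as simple as a per-address cooldown. The sketch below is an off-chain mirror of such an extension; the cooldown length, star range, and storage shape are illustrative assumptions, not a spec.

```python
import time

# Sketch of a comments/ratings store with one anti-spam affordance:
# a per-address cooldown. All parameters are illustrative assumptions.

class Ratings:
    COOLDOWN_S = 60  # assumed policy: one comment per address per minute

    def __init__(self):
        self._last = {}      # address -> timestamp of last comment
        self._entries = []   # (address, asset_did, stars, text)

    def comment(self, address, asset_did, stars, text, now=None):
        now = time.time() if now is None else now
        if now - self._last.get(address, float("-inf")) < self.COOLDOWN_S:
            raise PermissionError("cooldown active: try again later")
        if not 1 <= stars <= 5:
            raise ValueError("stars must be between 1 and 5")
        self._last[address] = now
        self._entries.append((address, asset_did, stars, text))

r = Ratings()
r.comment("0xabc", "did:op:demo", 5, "great dataset", now=0)
```

An on-chain version could enforce the same rule in the contract, or combine it with a small fee per comment.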

Signed Claims on Data Assets in Ocean Market. What specifically: a party with a given eth address can make a digitally-signed claim about a given data asset, in Ocean Market. Examples: "stamp of approval" on a dataset; or "this algorithm did what it said". Ocean V4 backend supports this: any claim or comment is its own Data NFT, which points to another Data NFT. Then there can be a frontend for signed claims (this earmark), or for comments/ratings (work elsewhere).
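The claim structure above can be sketched as a signed record that points at another data NFT. Note the hedge: a real claim would be signed with the address's ECDSA key (e.g. EIP-191 `personal_sign`); HMAC-SHA256 stands in here only so the example is self-contained with the standard library.

```python
import hashlib
import hmac
import json

# Sketch of a signed claim as its own record pointing at another data NFT.
# HMAC is a stand-in for ECDSA signing by the claimant's Ethereum key.

def make_claim(signer_key: bytes, signer_address: str, target_did: str, text: str):
    payload = json.dumps(
        {"about": target_did, "claim": text, "signer": signer_address},
        sort_keys=True,
    ).encode()
    sig = hmac.new(signer_key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "signature": sig}

def verify_claim(signer_key: bytes, claim: dict) -> bool:
    expected = hmac.new(
        signer_key, claim["payload"].encode(), hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, claim["signature"])

claim = make_claim(b"secret", "0xabc", "did:op:dataset", "stamp of approval")
```

Per the V4 note above, such a claim would itself be published as a data NFT whose metadata points at the target data NFT.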

Enhance Ocean’s Fine-Grained Permissions to support Verifiable Credentials (vs. just Ethereum addresses) in allowlist and denylist. The schema already allows it. This includes at the level of consuming data, and at Ocean Market level for publishing, buying, etc.
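A check supporting both identity kinds might look like the sketch below: the allowlist/denylist can contain either plain Ethereum addresses or credential types drawn from a verifiable credential. The credential shape (`{"type": [...], "subject": ...}`) is a simplification of the W3C VC data model, not Ocean's actual schema.

```python
# Sketch of an access check over allowlist/denylist entries that may be
# either Ethereum addresses or verifiable-credential types. The credential
# shape is a simplified assumption, not Ocean's schema.

def is_allowed(requester_address, credential, allowlist, denylist):
    cred_types = set(credential.get("type", [])) if credential else set()
    identities = {requester_address} | cred_types
    if identities & set(denylist):
        return False          # deny wins over allow
    if not allowlist:
        return True           # empty allowlist: anyone not denied may pass
    return bool(identities & set(allowlist))

ok = is_allowed(
    "0xabc",
    {"type": ["VerifiedResearcher"], "subject": "0xabc"},
    allowlist=["VerifiedResearcher"],
    denylist=[],
)
```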

Documentation. Work with Ocean core team to improve documentation of the core tech.

Decentralized social media. E.g. email, messaging, twitter-like, FB-like. Email web/mobile app powered by Ocean. Each "email" = a different ERC20 data asset, "send email" = "send access token", etc. Plays well with existing email protocols. More inspiration: Messari 2022 crypto report, "DeSo", page 111: "What happens when you combine PFPs, permanent .eth identifiers, data composability, and data marketplaces that price data packets with AI? You get a Decentralized Social Network and potential lotterylike rewards for early and viral user-generated content.".

Bootstrap & maintain a community market. How would this be bootstrapped? Who is going to maintain it? We need a proposal.

Note: if you would like to influence lists for future rounds, please come join the Core Tech WG. Here's the Discord.
