This repository has been archived by the owner on Mar 5, 2023. It is now read-only.

Round 15 Tech Earmarks

Trent McConaghy edited this page Feb 17, 2022 · 11 revisions

Enhance Ocean’s Fine-Grained Permissions to support Verifiable Credentials (not just Ethereum addresses) in the allowlist and denylist; the schema already allows this. This applies both at the level of consuming data and at the Ocean Market level for publishing, buying, etc.
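A minimal sketch of what such a check could look like. The entry shapes ("type", "allow", "deny") and the claim lookup inside "credentialSubject" are illustrative assumptions, not the actual Ocean permissions schema; the only fixed points taken from the source are that entries may be Ethereum addresses or Verifiable Credentials, and that both an allowlist and a denylist exist.

```python
# Hypothetical allow/deny evaluation supporting both plain Ethereum
# addresses and Verifiable Credential (VC) claims. Field names are
# illustrative, not the exact Ocean schema.

def entry_matches(entry: dict, consumer: dict) -> bool:
    """Return True if a single allow/deny entry matches the consumer."""
    if entry["type"] == "address":
        return entry["value"].lower() == consumer["address"].lower()
    if entry["type"] == "credential":
        # Match on a claim inside the consumer's presented VC.
        claims = consumer.get("credential", {}).get("credentialSubject", {})
        return claims.get(entry["claim"]) == entry["value"]
    return False

def is_permitted(policy: dict, consumer: dict) -> bool:
    """Deny wins over allow; an empty allowlist means 'everyone'."""
    if any(entry_matches(e, consumer) for e in policy.get("deny", [])):
        return False
    allow = policy.get("allow", [])
    if not allow:
        return True
    return any(entry_matches(e, consumer) for e in allow)
```

The deny-wins ordering is a common policy-engine convention; the real precedence rules would come from Ocean's schema.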

First-class integration of WebSockets as a data service. It would include a fork of Ocean Provider, plus affordances elsewhere in the stack. First-class = it should touch all the required components, e.g. the Ocean Market frontend if appropriate, Provider, Aquarius, etc., and may include updates to those components. Ultimately it should be as low-friction as possible to use, without additional centralization.

First-class integration with Arweave, Filecoin, etc. as a data service. Other possible storage networks that fall under this earmark include IPFS-pinned data, Arcana, Ethereum Swarm, Storj, and Sia, as well as aggregators like Filebase and Pinata. First-class = it should touch all the required components, e.g. the Ocean Market frontend if appropriate, Provider, Aquarius, etc., and may include updates to those components. Ultimately it should be as low-friction as possible to use, without additional centralization. Arweave may already have some support; this needs to be validated, after which the aim is to remove friction and leverage Arweave more fully (tools from kyve.network or redstone.finance may help). Ocean already supports Filecoin to some extent via IPFS; the aim here is to remove friction and leverage Filecoin more fully.
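One small piece of such an integration is resolving an asset's storage reference to something a Provider can fetch. A hedged sketch, assuming scheme-prefixed asset URLs and public HTTP gateways; the gateway URLs and the registry shape are illustrative, not Ocean's actual resolution logic.

```python
from urllib.parse import urlparse

# Illustrative scheme -> public gateway mapping. A real Provider
# integration would plug in actual download clients per network.
GATEWAYS = {
    "ipfs":    "https://ipfs.io/ipfs/{ref}",
    "arweave": "https://arweave.net/{ref}",
}

def resolve_to_http(asset_url: str) -> str:
    """Map ipfs://... or arweave://... references to an HTTP gateway URL;
    pass plain http(s) URLs through unchanged."""
    parsed = urlparse(asset_url)
    if parsed.scheme in ("http", "https"):
        return asset_url
    template = GATEWAYS.get(parsed.scheme)
    if template is None:
        raise ValueError(f"unsupported storage scheme: {parsed.scheme}")
    ref = (parsed.netloc + parsed.path).rstrip("/")
    return template.format(ref=ref)
```

Adding a new storage network then means registering one more scheme rather than touching every component, which is in the spirit of the "low friction, no additional centralization" goal.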

For V4 data NFTs, code that fills in a more interesting image behind the tokenUri. As an extra incentive beyond the grant, the people who work on these ideas may get a spotlight in the marketplace. Note that holders can always update the tokenUri. Bonus: dynamically update the image when rendered, e.g. from an oracle or an API. There may be some emerging standards for this. Maybe even a new app / widget: a dashboard based on dynamic NFTs, where each NFT is e.g. a graph of stats. Note that there seem to be no broadly agreed-upon standards; e.g. Uniswap uses base64 SVGs, OpenSea something different, etc.
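The Uniswap-style base64-SVG approach mentioned above can be sketched as follows: render an SVG on the fly, then wrap metadata and image into a data URI suitable for returning from tokenURI(). The name/description/image fields follow the common ERC-721 metadata convention; the specific SVG layout is just an example.

```python
import base64
import json

def render_svg(label: str, value: int) -> str:
    """Render a tiny stats card as an SVG string (illustrative layout)."""
    return (
        '<svg xmlns="http://www.w3.org/2000/svg" width="300" height="100">'
        f'<text x="10" y="40">{label}</text>'
        f'<text x="10" y="80">{value}</text>'
        "</svg>"
    )

def token_uri(label: str, value: int) -> str:
    """Build a fully self-contained data URI: JSON metadata whose
    'image' field is itself a base64-encoded SVG data URI."""
    svg = render_svg(label, value)
    image = "data:image/svg+xml;base64," + base64.b64encode(svg.encode()).decode()
    meta = {"name": label, "description": "dynamic data NFT stats", "image": image}
    return ("data:application/json;base64,"
            + base64.b64encode(json.dumps(meta).encode()).decode())
```

Because everything is inlined, no off-chain image host is needed; the trade-off is on-chain gas cost when this logic lives in the contract itself.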

If I use C2D to train an algorithm, there should be an easy flow to re-publish it right away. Example use case: an AI model is trained with C2D, then republished with its inference exposed as C2D in Ocean Market, perhaps with its own Ocean Provider. The status quo is that the user must perform many intermediate steps manually. Note that V4 will make this easier, since the V4 C2D backend will publish the asset and then do a transfer. One open question is where the inference code resides (Web2 hosted? Web3 hosted?).
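The intermediate steps could be collapsed into one orchestrated flow along these lines. All client methods here (start_compute_job, wait_for_result, publish_asset, transfer_nft) are hypothetical placeholders standing in for the manual steps a user performs today, not the actual ocean.py API.

```python
# Hypothetical train-then-republish orchestration. The `client` object
# is an assumed wrapper; its method names are placeholders.

def train_and_republish(client, dataset_did: str, algo_did: str, publisher: str) -> str:
    job_id = client.start_compute_job(dataset_did, algo_did)
    model_url = client.wait_for_result(job_id)     # trained model artifact
    new_did = client.publish_asset(
        url=model_url,
        service_type="compute",                    # expose inference as C2D
        owner=publisher,
    )
    client.transfer_nft(new_did, publisher)        # V4 flow: publish, then transfer
    return new_did
```

This mirrors the V4 note above: the backend publishes the new asset first, then transfers ownership to the user.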

Allow approval of C2D algorithms for compute-to-data assets in Ocean Market. Currently, algorithms can only be allow-listed for a compute-to-data asset by the publisher adding them manually, so an algorithm developer has to contact the publisher directly. The feature proposed here enables a more automated process: the algorithm creator proposes the allow-listing of their algorithm via Ocean Market. The creator must share access to the algorithm, either as a Docker container or via GitHub or another code repository, so the compute-to-data asset publisher can review it. If the publisher accepts the proposal, the algorithm is allow-listed via a change to the metadata of the compute-to-data asset; if they reject it, they must give a reason.
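The proposal lifecycle above can be sketched as a small state machine. The "trustedAlgorithms" key echoes Ocean's C2D trusted-algorithm convention; the rest of the structure (status values, the mandatory rejection reason) is an illustrative reading of the earmark, not an existing API.

```python
# Hypothetical proposal workflow: proposed -> approved | rejected.

class AlgorithmProposal:
    def __init__(self, algo_did: str, source_url: str):
        self.algo_did = algo_did
        self.source_url = source_url   # Docker image or code repository for review
        self.status = "proposed"
        self.reason = None

    def approve(self, asset_metadata: dict) -> None:
        """Publisher accepts: allow-list the algorithm by updating
        the compute-to-data asset's metadata."""
        asset_metadata.setdefault("trustedAlgorithms", []).append(self.algo_did)
        self.status = "approved"

    def reject(self, reason: str) -> None:
        """Publisher rejects: a reason is mandatory, per the earmark."""
        if not reason:
            raise ValueError("a rejection must include a reason")
        self.status = "rejected"
        self.reason = reason
```
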

First-class integration plus community engagement with data science tech communities. Any of the following: HuggingFace, EleutherAI, fast.ai, OpenMined, TensorFlow and variants, Papers with Code, scikit-learn, Anaconda, OpenML, Kaggle, or any other similar community (please suggest to us!). Example for scikit-learn: imagine “import sklearn.ocean”. USPs: provenance of data & compute for scientific reproducibility and GDPR, private compute on open datasets, selling algorithms while keeping them private (on open data), and ultimately an “Ocean” of all datasets & algorithms (possible with incentives).

Build comments & ratings into Ocean Market, as a smart contract extension. It should be integrable by any Ocean-powered market, use decentralized storage, and include affordances to prevent spam.
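A sketch of the core contract logic, written in Python for readability: one rating per address is a simple spam affordance (re-rating replaces rather than duplicates), with a running average over unique raters. This is an assumed design, not a specification; a Solidity version would keep the same mapping-based shape.

```python
# Illustrative ratings store with a one-rating-per-address spam guard.

class Ratings:
    def __init__(self):
        self._scores = {}                 # address -> score (1..5)

    def rate(self, address: str, score: int) -> None:
        if not 1 <= score <= 5:
            raise ValueError("score must be in 1..5")
        self._scores[address] = score     # re-rating replaces, not duplicates

    def average(self) -> float:
        if not self._scores:
            return 0.0
        return sum(self._scores.values()) / len(self._scores)
```

On-chain, "one per address" alone is a weak guard against Sybil attacks; gating ratings on having purchased the asset would be a natural strengthening.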

Mobile data DAO. Install the app, automatically join a DAO, grant permission to share location and other phone data, and earn tokens (a big incentive to join). The DAO sells the aggregated data of its members on Ocean Market, preserving privacy via Ocean Compute-to-Data.

First-class integration of the market with Google Dataset Search. Dataset Search is a search engine for datasets. Adding a little structured data (JSON-LD or RDFa) to the market HTML will allow users outside the Ocean ecosystem to more easily find and purchase datasets. Certain properties already exist for a DID, such as name, description, creator, license, etc. Others, such as temporalCoverage and spatialCoverage for tabular datasets, may be added when datasets are published. Complete list of properties.
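A minimal sketch of the JSON-LD snippet the market page could emit per asset. The output properties (name, description, license, creator) follow the schema.org Dataset vocabulary that Dataset Search crawls; the input DDO field names are illustrative assumptions.

```python
import json

def dataset_jsonld(ddo: dict) -> str:
    """Build a schema.org Dataset JSON-LD <script> tag from asset
    metadata (assumed input shape: ddo["metadata"][name/description/...])."""
    meta = ddo["metadata"]
    doc = {
        "@context": "https://schema.org/",
        "@type": "Dataset",
        "name": meta["name"],
        "description": meta["description"],
    }
    # Optional properties: emit only when present, never as nulls.
    if meta.get("license"):
        doc["license"] = meta["license"]
    if meta.get("author"):
        doc["creator"] = {"@type": "Person", "name": meta["author"]}
    return '<script type="application/ld+json">' + json.dumps(doc) + "</script>"
```

The same function would be the natural place to add temporalCoverage and spatialCoverage once publishers can supply them.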

Decentralized social media, e.g. email, messaging, Twitter-like, or FB-like apps. Example: an email web/mobile app powered by Ocean, where each "email" = a different ERC20 data asset, "send email" = "send access token", etc., playing well with existing email protocols. More inspiration: Messari 2022 crypto report, "DeSo", page 111: "What happens when you combine PFPs, permanent .eth identifiers, data composability, and data marketplaces that price data packets with AI? You get a Decentralized Social Network and potential lotterylike rewards for early and viral user-generated content."


Note: if you would like to influence the lists for future rounds, please come join the Core Tech WG. Here's the Discord.

Parent page: OceanDAO Tech Earmarks
