Backfill DB with legacy permits #109
Comments
I was able to collect about 500 confirmed … Off the top of my head the CLI pulls about double that in permits. Some permits in the DB right now don't have a tx hash, and not all have a … A permit being generated and it being claimed can have an arbitrary length of time between them, so trying to match … is tricky. There's still an angle or two I haven't tried yet, but for the sake of reference: for legacy permits where I cannot find a …
Permits have been generated by different signers at different moments in time, so using only on-chain data might not be too accurate. It makes sense to utilize the GitHub API for fetching permits.
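For reference, a minimal sketch of what fetching permits via the GitHub API could look like, assuming Octokit and that permit comments can be recognized by a pay.ubq.fi claim link; the filter and repo name are illustrative, not the project's actual code:

```ts
// Hypothetical sketch only: pull bot comments via the GitHub API so permits can be
// recovered independently of on-chain data. The pay.ubq.fi filter is an assumption.
import { Octokit } from "@octokit/rest";

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function fetchPermitComments(owner: string, repo: string) {
  // GET /repos/{owner}/{repo}/issues/comments, paginated 100 at a time
  const comments = await octokit.paginate(octokit.rest.issues.listCommentsForRepo, {
    owner,
    repo,
    per_page: 100,
  });
  // Keep only comments that appear to contain a permit claim link
  return comments.filter((c) => c.body?.includes("pay.ubq.fi"));
}
```

This would need to be run per repo across both orgs, but it sidesteps the signer-address question entirely.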
I don't think claimed permits should be missing a tx hash.
You may leave …
The GitHub API might have this info.
I think most of them should have a …
As far as I can tell from the CLI output it has been four addresses, and I used all four when parsing on-chain.
I assume you mean the comment metadata here, yeah? The CLI collects permits via the … Thanks for the clarification on the rest.
By the CLI you mean https://github.com/ubiquity/.github/tree/main/airdrop-cli, right? How does it work exactly? Does it parse only on-chain transactions from the permit signer, or does it use the GitHub API to fetch all permit info from the …
It parses comments from the bot and pav, extracting the permit data using the … It uses four different regexes to capture the various formats that have existed, and likely needs updating again to match the current format. I collected data from Dune manually using the four payment addresses. The tally from Dune alone is far lower than what it is expected to be. You can only trace back 50k blocks at a time (might be 5k, off the top of my head) using a provider, and scripting that for each hunter wasn't really feasible, so Dune was the better option. I can't think how you'd get the … Any suggestions? It's doable anyway, I just needed those points clarified.
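For illustration, a rough sketch of the kind of comment-parsing described above, assuming the permit travels as a base64-encoded claim parameter in a pay.ubq.fi link; the regex and the decoded shape are assumptions, and the real CLI uses four different regexes for the historical formats:

```ts
// Illustrative only; not the CLI's actual regexes or permit shape.
const CLAIM_URL_REGEX = /https:\/\/pay\.ubq\.fi\/?\?claim=([A-Za-z0-9+\/=_-]+)/g;

interface DecodedClaim {
  // Assumed shape of the decoded payload
  transferDetails?: { to?: string; requestedAmount?: string };
  [key: string]: unknown;
}

function extractClaims(commentBody: string): DecodedClaim[] {
  const claims: DecodedClaim[] = [];
  for (const match of commentBody.matchAll(CLAIM_URL_REGEX)) {
    try {
      // The claim parameter is assumed to be URL-encoded base64 JSON
      const json = Buffer.from(decodeURIComponent(match[1]), "base64").toString("utf8");
      claims.push(JSON.parse(json));
    } catch {
      // Older comment formats won't decode this way; they need their own regexes
    }
  }
  return claims;
}
```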
We had a similar task of matching GitHub issues with on-chain transactions at https://github.com/ubiquity/audit.ubq.fi and used the Etherscan API, although I don't remember the particular API method that was used (this one, or this one, or some other).
I wasn't aware of that issue, or that Etherscan had those endpoints actually. I checked and Gnosisscan has them also. They don't seem to be constrained by the same block limit either, and they use the same method, awesome.
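As a concrete example of the endpoint being discussed, a small sketch against Gnosisscan's account/txlist method (Etherscan's is identical apart from the host); the signer address and API key are placeholders:

```ts
// Fetch all transactions for a permit-signer address via the Gnosisscan API.
// Unlike scanning with an RPC provider, there is no block-window limit to chunk over.
async function fetchSignerTxs(address: string, apiKey: string) {
  const url =
    "https://api.gnosisscan.io/api?module=account&action=txlist" +
    `&address=${address}&startblock=0&endblock=99999999&sort=asc&apikey=${apiKey}`;
  const res = await fetch(url);
  const data = await res.json();
  if (data.status !== "1" && data.message !== "No transactions found") {
    throw new Error(`Explorer API error: ${data.message}`);
  }
  // Each entry includes hash, from, to, input, timeStamp, etc.
  return data.result as Array<{ hash: string; from: string; to: string; input: string; timeStamp: string }>;
}
```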
I've been working on this and will have an open PR ready tonight or tomorrow.
I've had to pretty much refactor the CLI entirely, for the following reasons: …
I spent far more hours than estimated. I wanted to be confident that the data was as verifiable as I could make it, and not push a half-arsed approach.
Required by ubiquity/work.ubq.fi#46
Using the CLI as-is or with required changes:
a) Collect permit data across ubiquity and ubiquibot orgs
b) Insert this data into the permits table (idempotently); see the sketch below
Original comment: ubiquity/work.ubq.fi#46 (comment)
Time estimate: < 4 hrs
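A minimal sketch of what step (b) could look like, assuming the permits table lives in Supabase and has a unique column to conflict on; the table and column names here are placeholders for whatever the real schema uses:

```ts
// Idempotent backfill: re-running the script should not create duplicate rows.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

interface PermitRow {
  signature: string; // assumed unique key
  beneficiary: string;
  amount: string;
  tx_hash?: string | null; // may be unknown for some legacy permits
}

async function insertPermits(rows: PermitRow[]) {
  // ignoreDuplicates makes the upsert behave like INSERT ... ON CONFLICT DO NOTHING
  const { error } = await supabase
    .from("permits")
    .upsert(rows, { onConflict: "signature", ignoreDuplicates: true });
  if (error) throw error;
}
```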