Backfill database from existing ElasticSearch cluster #36
Include votes and flags.
I'm still searching for a way to export Elasticsearch data to MySQL. Do you know of any way to do that?
I was going to build a script from this, but I've gotten a new computer since then and don't have your backup file anymore. Would you prefer a backfill from an export, or a backfill from scrolling through Elasticsearch itself?
No worries :) I don't mind, whatever is faster I guess.
I could provide a smaller backup file from the old scraper if you like?
That would help. Thanks.
Sorry, got caught up in other stuff. I'll get you a backup tonight.
Awesome. Thanks
Was just thinking: if I used the backup I have of 20 million torrents and imported them into the new database, wouldn't I end up with the same issue, with tons of torrents not updated with seeder/leecher info? Also, just putting an idea out there: what about feeding the scraper a backup file of just infohashes? Could this be done? I'd pay for this :D @Prefinem
That would be really amazing! /me tosses in some BTC.
That should be doable. If you can get me a backup file, I can create an import script. I lost the last one you sent me when I got a new computer. So, just to summarize: you send me a backup file of infohashes, I write an import script that feeds them through the scraper, and the scraper pulls the metadata and seeder/leecher info for each one. Would that work?
That would be perfect :) I'll do the database for you today. And yes, if we could feed it just a ton of infohashes and have it run through them and get all the meta info, etc. Send me your PayPal or Bitcoin and I'll donate some.
@Prefinem here is a small database.
Is that OK for you, mate?
Yes, that should work. I'll try to make it efficient for large files.
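A minimal sketch of what that import script could look like, assuming a Node.js setup where the backup file holds one infohash per line; `addToQueue` is a hypothetical stand-in for however the scraper accepts new infohashes. Streaming the file line by line keeps memory usage flat even for very large backups:

```js
const fs = require('fs');
const readline = require('readline');

// Stream the backup file one line at a time instead of loading it all
// into memory, so even a multi-GB file stays cheap to process.
async function importInfohashes(filePath, addToQueue) {
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  let queued = 0;
  for await (const line of rl) {
    const infohash = line.trim().toLowerCase();
    // Skip blanks and anything that isn't a 40-char hex infohash
    if (!/^[0-9a-f]{40}$/.test(infohash)) continue;
    await addToQueue(infohash); // hypothetical scraper entry point
    queued += 1;
  }
  return queued;
}
```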
The large backup file I have is around 25 GB, I believe. Would this be too large? If so, maybe it would be better to export to MySQL by querying Elasticsearch directly? Kind regards
Not sure if Node will be able to read a file that large. Is there a way to get a connection to your Elasticsearch server? I could write a command to scroll through all the documents and put them into the database. That, or a backup of the full dataset to import into a local Elasticsearch server.
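For reference, a rough sketch of what that scroll-and-insert command might look like, assuming the official `@elastic/elasticsearch` client (v7, where responses are wrapped in a `body` property) and the `mysql2` driver; the index name, table name, and column names here are placeholders, not the project's actual schema:

```js
const { Client } = require('@elastic/elasticsearch');
const mysql = require('mysql2/promise');

// Scroll through every document in the Elasticsearch index and upsert
// each one into MySQL, 1000 documents per page.
async function backfillFromElasticsearch() {
  const es = new Client({ node: 'http://localhost:9200' });
  const db = mysql.createPool({ host: 'localhost', user: 'root', database: 'magnet' });

  let response = await es.search({
    index: 'torrents', // placeholder index name
    scroll: '1m',      // keep the scroll context alive between pages
    size: 1000,
    body: { query: { match_all: {} } },
  });

  while (response.body.hits.hits.length > 0) {
    for (const hit of response.body.hits.hits) {
      const doc = hit._source;
      // Upsert so the backfill can be re-run safely; columns are placeholders
      await db.execute(
        'INSERT INTO torrents (infohash, name) VALUES (?, ?) ' +
          'ON DUPLICATE KEY UPDATE name = VALUES(name)',
        [doc.infohash, doc.name]
      );
    }
    response = await es.scroll({
      scroll_id: response.body._scroll_id,
      scroll: '1m',
    });
  }

  await db.end();
  await es.close();
}
```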
Sure, I'll send you the IP.
Thanks!