
Daily cron to do mysql dump #829

Open
aaronminnick opened this issue Oct 22, 2024 · 7 comments

Comments

aaronminnick (Collaborator) commented Oct 22, 2024

@fool @onewheelskyward

As a developer / user of the Shift website, I would like automated daily(?) backups to occur so that if something goes wrong with the database, my events aren't 'lost'.

Discussing in the meetup, we were talking about the process of restoring the event data if the database got wiped. It is possible to run a mysql dump to get the database as a text file, but this is a manual process. It would be good if we could have a cron running every day to create a dump file, to minimize data loss if something goes wrong with the database (and if AWS backups are not available / accessible).
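[A minimal sketch of what such a nightly job could look like, for discussion. The path, schedule, and use of `shift mysqldump` as the wrapper are assumptions, not anything already running:

```shell
# Hypothetical crontab entry: nightly dump, gzipped, with a dated filename.
# Note the escaped % -- cron treats an unescaped % as a newline.
17 3 * * * /opt/shift-docs/shift mysqldump | gzip > /tmp/mysql-backup-$(date +\%F).gz
```
]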

fool (Member) commented Oct 22, 2024

My default one-liner of shift mysqldump | tail -n +2 | mailx -s "shift db dump" [email protected] doesn't work because mailx doesn't work on prod. Not sure what's up with that exactly. I can see in root's crontab that @onewheelskyward tried something similar before, but it's commented out.

Note that you want the tail -n +2 (or something like it) because otherwise the first line of the dump is a warning about command-line passwords, which you need to strip out before you can pipe the dump back into mysql (gzip -cd file.gz | shift mysqlpipe to restore).
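[A small illustration of the stripping step. The warning text below is stand-in sample data, not the exact mysqldump output; the point is that `tail -n +2` emits everything from line 2 onward, dropping the leading warning:

```shell
# Simulate a dump whose first line is the password warning, then strip it.
printf 'mysqldump: [Warning] Using a password on the command line...\nCREATE TABLE events (id INT);\n' \
  | tail -n +2
# Restore path from the thread, for reference:
#   gzip -cd file.gz | shift mysqlpipe
```
]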

fool (Member) commented Oct 22, 2024 via email

fool (Member) commented Nov 5, 2024

After some struggle, I have a dump running at 02:43 UTC (it was convenient to test with; we can/should adjust based on when the backups run) in the root crontab. The output file is /tmp/mysql-backup-YYYY-MM-DD.gz (with the placeholders filled in at runtime).

(Edit: hmm, still struggling; it's not actually running yet.)
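[For reference, the runtime placeholder expansion is just `date +%F`, which prints the ISO YYYY-MM-DD date; a sketch:

```shell
# date +%F prints YYYY-MM-DD; embedding it yields the dated backup path.
echo "/tmp/mysql-backup-$(date +%F).gz"
```
]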

fool (Member) commented Nov 5, 2024

Well, this didn't end up working tonight; I'll try again next week.

fool (Member) commented Nov 5, 2024

The backup runs around midnight, so let's dump at 11 pm.

The crontab is funky, so I'll get back to it next week.

fool (Member) commented Nov 12, 2024

I have given up on this under the current config. Will take another shot after we do "something" about #630. @onewheelskyward suggests moving to RDS rather than moving to non-docker host storage; I think either will work.

ionous (Contributor) commented Dec 1, 2024

From Slack:

Okay, the ubuntu user (crontab -e) has a nightly script:

    0 0 * * * /opt/shift-docs/shift-simon backup &>> /tmp/mysql-backup.log

shift-simon is a copy of the shift script with a special "backup" command added:

    sub_backup() {
      cd "${ROOT}"
      docker-compose exec -T db mysqldump -u ${MYSQL_USER} -h db -P 3306 -p"${MYSQL_PASSWORD}" --no-tablespaces ${MYSQL_DATABASE} > /tmp/mysql-$(date +%F).out
    }

It differs from the regular mysqldump invocation in two ways: -T for docker, and --no-tablespaces for mysql.

-T stops docker-compose from allocating a tty, which otherwise breaks under cron with:

    the input device is not a TTY

--no-tablespaces works around a permissions error:

    mysqldump: Access denied; you need (at least one of) the PROCESS privilege(s) for this operation

The mysql one might warrant some investigation ( https://dev.mysql.com/doc/relnotes/mysql/8.0/en/news-8-0-21.html#mysqld-8-0-21-security ).

TODO:

  • We should add the "backup" command to the real ./shift script and commit it to the repo, then update the crontab.
  • We might want to anonymize the database if the intention is for devs to use it. The node code has a tool that takes a dump and generates an anonymized sqlite db for testing: npm run -w tools import-mysql --in="/Users/ionous/Downloads/prod.mysql" --out="./bin/test.db". Not sure if we want to incorporate that into this process somehow.
  • If we want to mail a ( compressed, encrypted? ) version somewhere, we will have to set up the mailer to talk to aws ( or we could write a node tool to do so, since node is already configured to talk to the aws mailer ).
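[For the first TODO item, a hedged sketch of what a committed "backup" subcommand might look like. The gzip step, the warning-line stripping, and the 14-day retention are assumptions for discussion, not decisions from this thread; the variable names follow the shift-simon snippet above:

```shell
#!/bin/sh
# Where a dated dump would land; kept as a helper so it is easy to test.
backup_path() {
  printf '/tmp/mysql-backup-%s.sql.gz' "$(date +%F)"
}

# Sketch of the subcommand: dump via the db container, drop the password
# warning on line 1, compress, and prune dumps older than 14 days.
sub_backup() {
  cd "${ROOT}" || return 1
  docker-compose exec -T db \
    mysqldump -u "${MYSQL_USER}" -h db -P 3306 -p"${MYSQL_PASSWORD}" \
      --no-tablespaces "${MYSQL_DATABASE}" \
    | tail -n +2 | gzip > "$(backup_path)"
  find /tmp -name 'mysql-backup-*.sql.gz' -mtime +14 -delete
}
```
]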
