
I did a first pass at creating a backup process for gerrit.ovirt.org:

http://ovirt.org/wiki/Gerrit_backup

Right now it uses rsync on /usr/local/src/git to a remote directory at ovirt.org:/home/gerrit-backup/gerrit.ovirt.org-src-backup/src/ .

So that's my hacky process and bash script to start. Very open to other ideas.

One thing I'm not clear on is what we really need backed up from gerrit.ovirt.org. I'd like us to get /etc in a private git repo; maybe we just do that as a single repo that we distribute across all hosts' /root directories? Should we also be using git for backing up the git repos - that is, have gerrit-backup on the remote host pull changes instead of pushing a directory structure via rsync?

Itamar mentions that we need to back up the Gerrit DB, which I'll work on next. I reckon that's a combination of a dump of the database and rsyncing the results to the remote host?

- Karsten

--
name:  Karsten 'quaid' Wade, Sr. Community Architect
team:  Red Hat Community Architecture & Leadership
uri:   http://communityleadershipteam.org
       http://TheOpenSourceWay.org
gpg:   AD0E0C41

On 12/07/2011 08:02 AM, Karsten 'quaid' Wade wrote:
> I did a first pass at creating a backup process for gerrit.ovirt.org:
This is now:

http://ovirt.org/wiki/Gerrit_server_backup

It includes the addition of a PostgreSQL database backup script. All the backups for this are found on the remote host (ovirt.org) in /home/gerrit-backup.

In testing I've run initial backups and copies to the remote host of the git repo and the Gerrit database; they'll also run tonight in a few hours (10:00 and 10:20 pm East coast time).

- Karsten
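The wiki page has the actual script; a minimal sketch of the dump-then-copy idea might look like this. The database name `reviewdb`, the directories, and the remote path are all assumptions for illustration.

```shell
#!/bin/bash
# Hypothetical sketch of a nightly Gerrit DB backup: dump, compress,
# copy off-host. Names and paths here are placeholders, not the
# actual script from http://ovirt.org/wiki/Gerrit_server_backup.
dump_gerrit_db() {
    local db="${1:-reviewdb}"
    local outdir="${2:-/var/backups/gerrit-db}"
    local stamp
    stamp=$(date +%Y%m%d)
    mkdir -p "$outdir"
    # pg_dump gives a consistent snapshot of the database itself,
    # though not of the git repos being backed up alongside it.
    pg_dump "$db" | gzip > "$outdir/$db-$stamp.sql.gz"
    # Ship the dumps to the remote backup host.
    rsync -az "$outdir/" ovirt.org:/home/gerrit-backup/gerrit-db/
}
```

Keeping a dated filename per night means the remote host accumulates a history of dumps instead of overwriting a single copy.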

On 12/08/2011 01:55 AM, Karsten 'quaid' Wade wrote:
> On 12/07/2011 08:02 AM, Karsten 'quaid' Wade wrote:
>> I did a first pass at creating a backup process for gerrit.ovirt.org:
> This is now:
> http://ovirt.org/wiki/Gerrit_server_backup
> It includes the addition of a PostgreSQL database backup script. All the backups for this are found on the remote host (ovirt.org) /home/gerrit-backup.
> In testing I've run initial backups and copies to the remote host of the git repo and gerrit database; they'll also run tonight in a few hours (10 and 10:20 pm East coast time.)
Seems like backing up the gerrit2 user home folder would help as well:

http://groups.google.com/group/repo-discuss/browse_thread/thread/9d7793201d8...

thanks,
Itamar

On 12/08/2011 05:31 AM, Itamar Heim wrote:
> seems like backing up the gerrit2 user home folder would help as well: http://groups.google.com/group/repo-discuss/browse_thread/thread/9d7793201d8...
Thanks!
There was a valuable hidden folder (~/.gerritcodereview) that I didn't know was there, so definitely worth grabbing.
I added a third cronjob to back up /home/gerrit2 every day at 10:40 pm Eastern time. That script and details are now in:

http://ovirt.org/wiki/Gerrit_server_backup

BTW, backup of www.ovirt.org is just a snapshot on Linode, so it's useful but not of the same value as having a remote host on another network. Does it make sense for now to do the same as I did here? That is, set up scripts on linode01.ovirt.org to back up there and copy to gerrit.ovirt.org as a backup?

- Karsten
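Taken together, the three nightly jobs described in this thread could be expressed as crontab entries along these lines. The script names and paths are hypothetical; only the times (10:00, 10:20, 10:40 pm US Eastern) come from the messages.

```shell
# Hypothetical crontab sketch of the staggered nightly backups;
# script paths are placeholders, times are from this thread.
0  22 * * * /usr/local/sbin/backup-git.sh           # git repos -> ovirt.org
20 22 * * * /usr/local/sbin/backup-gerrit-db.sh     # PostgreSQL dump + copy
40 22 * * * /usr/local/sbin/backup-gerrit2-home.sh  # /home/gerrit2
```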

On Thu, Dec 08, 2011 at 12:45:37PM -0800, Karsten 'quaid' Wade wrote:
> On 12/08/2011 05:31 AM, Itamar Heim wrote:
>> seems like backing up the gerrit2 user home folder would help as well: http://groups.google.com/group/repo-discuss/browse_thread/thread/9d7793201d8...
> Thanks!
> There was a valuable hidden folder (~/.gerritcodereview) that I didn't know was there, so definitely worth grabbing.
> I added a third cronjob to backup /home/gerrit2 every day at 10:40 pm Eastern time. That script and details are now in:
The backup is never going to be a consistent snapshot, but why add an intentional 20-minute window between the different backup jobs? Why not serialize them in a single script?

Dan.

On 12/10/2011 12:10 AM, Dan Kenigsberg wrote:
> The backup is never going to be a consistent snapshot,
I couldn't see a way to make a consistent snapshot; is there a better or at least more consistent way to do the snapshot?
> but why add an intentional 20 minute window between the different backup jobs? why not serialize them in a single script?
It was more the way things got built than a specific plan. I wanted a buffer so that each backup could complete before the next one starts - I didn't want to risk the backups by abusing resources all at the same time.

The backup scripts were written so we could reuse them easily on other hosts, but it wouldn't be hard to do it all in one script, serially.

- Karsten

On Mon, Dec 12, 2011 at 07:41:00AM -0800, Karsten 'quaid' Wade wrote:
> On 12/10/2011 12:10 AM, Dan Kenigsberg wrote:
>> The backup is never going to be a consistent snapshot,
> I couldn't see a way to make a consistent snapshot; is there a better or at least more consistent way to do the snapshot?
No, I do not think it is possible without shutting down Gerrit while the backup is taking place (and I do not think we should).
>> but why add an intentional 20 minute window between the different backup jobs? why not serialize them in a single script?
> It was more the way things got built than a specific plan. I wanted a buffer so that backups could be complete before the next one starts - didn't want to risk the backup by abusing resources all at the same time.
> The backup scripts were written so we could reuse them easily on other hosts, but it wouldn't be hard to do it all in one script, serially.
Yeah, I think running the scripts serially would leave fewer open questions if and when we need to restore.
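A serialized wrapper along the lines Dan suggests could be as small as the sketch below; the backup script names are hypothetical.

```shell
#!/bin/bash
# Run the backup steps one after another instead of on staggered cron
# times; stop at the first failure so a broken step is noticed rather
# than letting later steps run against a half-finished state.
run_backups() {
    local step
    for step in "$@"; do
        "$step" || { echo "backup step failed: $step" >&2; return 1; }
    done
}

# Usage with hypothetical script names:
# run_backups /usr/local/sbin/backup-git.sh \
#             /usr/local/sbin/backup-gerrit-db.sh \
#             /usr/local/sbin/backup-gerrit2-home.sh
```

With this shape, a single cron entry replaces the three staggered ones, and the ordering of the steps is explicit in one place.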
participants (3)

- Dan Kenigsberg
- Itamar Heim
- Karsten 'quaid' Wade