Forum OpenACS Q&A: GZIP backup not working
I've got a problem with my backup.
I am running PostgreSQL 8.2.5 and take a daily database dump with pg_dump, which is then compressed with gzip. These files are transferred automatically to another server, where a restore is performed, so we have two servers kept almost in sync with each other for backup and failover purposes.
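For reference, the nightly job looks roughly like this (a minimal sketch only; the database name, paths, and standby hostname are placeholders, not our real ones):

pg_dump mydb > /backup/mydb.dmp              # plain-text dump
gzip /backup/mydb.dmp                        # produces /backup/mydb.dmp.gz
scp /backup/mydb.dmp.gz standby:/backup/     # ship to the second server

and on the second server:

gunzip -c /backup/mydb.dmp.gz | psql mydb    # restore into the standby's mydb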
This was working fine until last month. The dump file has now reached 2.5GB, and when I try to gzip the .dmp file I get a "Permission denied" message. I've tried changing the permissions on the .dmp file without success. Is there some sort of maximum file size? I am on Red Hat Enterprise Linux 4.
Any ideas would be greatly appreciated. I can also provide more info if it's needed.
Further investigation makes me think the problem is with the .dmp file itself rather than with gzip: I can gzip an arbitrary other file without trouble.
I can't copy the file anywhere, and when I open it with Emacs there is no content.
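A few quick checks can help narrow down whether it's the file, the file system, or a limit (the path below is a placeholder for wherever the dump lives):

ls -l /backup/mydb.dmp   # does the reported size look sane?
df -h /backup            # enough free space on that file system?
ulimit -f                # per-process file size limit; "unlimited" means none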
Use split. It lets you break the dump output into pieces whose size the underlying file system can handle. For example, to make chunks of 1 megabyte:
pg_dump dbname | split -b 1m - filename

Reload with:

cat filename* | psql dbname
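If you still want the transfer compressed, you can pipe through gzip before splitting, so no single file on disk ever exceeds the limit (dbname, filename, and the 1m chunk size are placeholders to adjust):

pg_dump dbname | gzip | split -b 1m - filename.gz.

Reload with:

cat filename.gz.* | gunzip | psql dbname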
Perhaps this should go in one of the backup docs on this site somewhere?
Cheers for the help, Dave.