Hello all, I am trying to write a backup bash script that will just zip folders on my Red Hat server.
Here is the script:
zip -r "$myfilename" /opt/websites/thesite
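The rest of the script is basically just that one call plus a dated filename. Roughly like this (the variable name and paths below are placeholders, not my real ones):

#!/bin/bash
# placeholder names -- the real script builds the filename from the date
myfilename="/opt/backups/thesite-$(date +%F).zip"
zip -r "$myfilename" /opt/websites/thesite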
The problem is that sometimes a file doesn't exist, and zip exits with an I/O error ("File Not Found"), perhaps because of a broken symlink. Of course, when zip dies midway through a 1 GB backup, the rest of the backup doesn't finish.
Is there any way to make Zip ignore errors and just move on to the next file? Thanks for any help.
Other ideas: (Feel free to shoot them down)
Perhaps tar can do this?
Maybe copy the folder to another folder before zipping? That way only existing files would be copied. Rough sketches of both ideas are below.
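For concreteness, this is roughly what I mean by each. Untested, and the tar flag and staging path are just my reading of the man page, not something I've verified:

# Idea 1: tar instead of zip. GNU tar stores symlinks as links, and
# --ignore-failed-read should keep it from exiting nonzero on files it
# can't read (as far as I can tell from the man page).
tar -czf /opt/backups/thesite-backup.tar.gz --ignore-failed-read /opt/websites/thesite

# Idea 2: stage a copy first, then zip the copy.
cp -a /opt/websites/thesite /tmp/thesite-staging   # staging path is just an example
zip -r "$myfilename" /tmp/thesite-staging
rm -rf /tmp/thesite-staging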