Objective: provide the ability to remotely update the system or add new features.
What I need to do: back up the current environment on the target machine, and if the update fails at any stage, roll back to that original environment.
Say my directory structure looks something like this:
/home/user/project1/...
project1 contains symbolic links, hard links, executables for the software and firmware, and so on.
My dilemma
Should I use strategy 1 or 2?
Strategy 1: copy the entire current environment, and restore the copy if the update fails.
Example:

cp -a /home/user/project1 /home/user/project1_backup

and if the update fails:

rm -rf /home/user/project1
mv /home/user/project1_backup /home/user/project1
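
To make strategy 1 concrete, here is a rough sketch of the flow I have in mind (run_update is a placeholder for my actual update step, not a real command):

#!/usr/bin/env bash
set -euo pipefail

PROJECT=/home/user/project1
BACKUP="${PROJECT}_backup"

# Snapshot the current environment; cp -a preserves symlinks,
# hard links, permissions, and timestamps.
rm -rf "$BACKUP"
cp -a "$PROJECT" "$BACKUP"

# Try the update; if it fails, roll back to the snapshot.
if run_update "$PROJECT"; then
    rm -rf "$BACKUP"
else
    rm -rf "$PROJECT"
    mv "$BACKUP" "$PROJECT"
fi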
Strategy 2: archive the entire environment into a tarball, and extract it if the update fails. My worry with a tarball is whether symbolic links and hard links are preserved when the archive is created, and likewise when it is extracted.
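
For reference, this is the kind of tarball workflow I mean (the archive name is just an example):

# Create the backup; by default tar stores symlinks as symlinks,
# keeps hard links between files inside the archive, and records
# permissions and ownership.
tar -czf /home/user/project1_backup.tar.gz -C /home/user project1

# Restore on failure: remove the broken tree and unpack the archive;
# -p applies the recorded permissions on extraction.
rm -rf /home/user/project1
tar -xzpf /home/user/project1_backup.tar.gz -C /home/user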
Can someone give a specific answer on which method I should follow, and if I go with the tarball approach, what would the exact bash commands be?
As far as I know, tar -czvf for creating a gzipped tarball will not save links and permissions, and the same applies when the tarball is extracted. Please shed some light on this.
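
In case it helps, this is how I was planning to verify whether links and permissions survive a round trip (GNU find/stat assumed; /tmp/restore_test is just a scratch directory):

# Extract into a scratch directory, then compare name, permission bits,
# hard-link count, and file type for every entry in both trees.
mkdir -p /tmp/restore_test
tar -xzpf /home/user/project1_backup.tar.gz -C /tmp/restore_test
diff <(cd /home/user/project1 && find . -exec stat -c '%n %a %h %F' {} + | sort) \
     <(cd /tmp/restore_test/project1 && find . -exec stat -c '%n %a %h %F' {} + | sort)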