

Re: [LUG] Using mv command

 

On 28/10/2018 18:52, M. J. Everitt wrote:
> ^ Last para especially .. you can do some surprisingly silly things with
> shell/parameter/bash expansions, and mis-chosen command options!
> 
> Equally applicable to sed/awk scripts .. no harm whatsoever in making an
> exact copy with a '.orig' or '.bak' extension, and if you make a mistake,
> you can always get back to a 'known state'!!


ZFS is genuinely changing the way I use computers now that I'm fully immersed 
in it - it has an answer for everything once you really start to understand 
it and integrate it into your workflow. For example, as with Rich's 
original problem, it's pretty common for all of us to need to do similar 
stuff, often frequently and sometimes on a _lot_ of data - usually very 
important data, and frequently not at rest. No more lengthy recursive copies 
or tedious testing on laboriously spun-up and snapshotted test VMs.

Suppose there is a massive directory of stuff at /export/private/pics 
that needs extensive, potentially destructive transformations done to it 
- in my case /export/private/pics is a zfs dataset rather than a normal 
ext4 folder.

zfs snapshot export/private/pics@reorganizationtesting
zfs clone export/private/pics@reorganizationtesting export/TESTING
# do anything you want to the data here
zfs promote export/TESTING
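
If you want to check where the clone ended up and what it was cloned 
from, zfs will tell you (dataset name as above, output will vary):

zfs list -o name,origin,mountpoint export/TESTING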

ZFS will take care of everything else including mountpoints, endpoint 
re/creation, etc. The data is safe and atomically checkpointed all the way 
through, and since it's CoW of course it's basically instant and nearly 
"free". Once you've checked your work is ok, the zfs promote command 
transparently moves the new data versions over the originals (and then 
you can either keep or wipe any of the multiple snapshots and original 
datasets you went through). You can even do this with your root 
filesystem (although not the promote whilst it's live - yet).
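
The tidy-up after a successful promote is just ordinary dataset 
management - roughly this, keeping the dataset names from above (the 
.old name and the final destroy are my own preference, not required):

# after the promote, the old dataset is just a clone of export/TESTING@reorganizationtesting
zfs rename export/private/pics export/private/pics.old
zfs rename export/TESTING export/private/pics
# keep export/private/pics.old around as a fallback, or wipe it once you're happy:
zfs destroy export/private/pics.old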

The coolest thing is that I zfs send/receive my main workstation and 
laptop continuously and incrementally to a backup server: I can SSH to 
the server, zfs clone my latest entire workstation filesystem to a test 
dataset and then systemd-nspawn it up like a supercharged chroot. You 
can test potentially catastrophic experiments, dist-upgrades and 
anything else effectively instantly and for free :]
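
For anyone wanting to try that, the shape of it is roughly the 
following - the pool/dataset names, snapshot dates and hostname here 
are made up for illustration, so substitute your own:

# on the workstation: snapshot, then send only the delta since the previous snapshot
zfs snapshot -r tank/ROOT@2018-10-28
zfs send -R -I tank/ROOT@2018-10-27 tank/ROOT@2018-10-28 | \
    ssh backupserver zfs receive -F backup/workstation

# on the backup server: clone the latest snapshot and boot it as a container
# (assumes the backup pool is mounted at /backup)
zfs clone backup/workstation@2018-10-28 backup/TESTDRIVE
systemd-nspawn -b -D /backup/TESTDRIVE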

Cheers


-- 
The Mailing List for the Devon & Cornwall LUG
https://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/listfaq