Re: [LUG] stupid data deletion...

 

On 25/08/10 17:03, Simon Robert -Cottage wrote:
> 
> My command line skills are minimal. If I am in a directory called
> "recover" with 3000 sub directories what command can I use to
>     a delete all files called *.xml

find /start/directory/here -name "*.xml" -print0 | xargs -0 rm

Find files under directory "here" whose names match the pattern, print
the names in a format suitable for processing with "xargs -0", and pipe
the list to xargs so it runs the "rm" command on them.

The use of "xargs -0" and "-print0" is rather elaborate, but it means
the file names are batched into as few "rm" commands as possible (and so
it is fast), and I believe it also avoids problems with the kernel's
limit on command-line length, as well as spaces in file names and other
"gotchas". So it is one of those mantras you learn and use all the time.
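
If you want to see exactly what would be removed before committing to
it, one trick (a sketch, untested here) is to put "echo" in front of
"rm", so xargs prints the commands instead of running them:

# same pipeline, but xargs runs "echo rm ..." so nothing is deleted
find /start/directory/here -name "*.xml" -print0 | xargs -0 echo rm

Once the output looks right, drop the "echo" and run it for real.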

>     b delete empty sub directories

find /start/directory/here -type d -empty -print0 | xargs -0 rmdir

I haven't tested the latter. Presumably if a directory contains nothing
but another empty directory you would have to run it several times to
remove mindless chains of emptiness. There is probably a way to do that
directly, but life is too short.
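
For what it is worth, GNU find has a "-delete" action which implies
depth-first processing (a directory's contents are visited before the
directory itself), so nested empty directories should go in a single
pass. A sketch, equally untested:

# each directory is tested with -empty only after its contents have been
# processed, so a chain of empty directories is removed in one run
find /start/directory/here -type d -empty -delete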

However, be cautious, and make sure you understand "find".

Run the "find" without the "xargs" and the "print0" first to be sure it
is doing the right thing.

e.g.

find /start/from/here -type d -empty

This will produce a readable list of (supposedly) empty directories that
would be deleted by the fuller command above.

Can you not find the data you actually want, copy that somewhere safe,
and then worry about what you don't want? Or back up everything first?
It is easy to mess up at the command line.
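
For example (just a sketch, assuming the recovered files live under
"recover" and there is room on the disk), keep an untouched copy or a
compressed archive of the whole tree before deleting anything:

# straight copy of the tree, preserving permissions and timestamps
cp -a recover recover.backup

# or a compressed tarball
tar czf recover-backup.tar.gz recover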

 Simon

-- 
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/listfaq