
Re: [LUG] Mass editing text files?

 

Hi,

On Fri, Jul 27, 2007 at 07:53:05AM +0100, Neil Williams wrote:
> On Thu, 26 Jul 2007 20:43:15 +0100
> "Jonathan Roberts" <jonathan.roberts.uk@xxxxxxxxxxxxxx> wrote:
> 
> > I have a load of HTML files and I want to add the same text to the
> > <head> section of all of them: is there any way I can add it
> > automatically?! sed perhaps?
> 
> I'd use perl with something like: 
> opendir(FH, "dir") or die ("message: $!");
> @files=grep(!/^\.\.?$/ && /\.html$/, readdir(FH));
> closedir(FH);
> 
> foreach $file (@files)
> {
>       open (FILE, "dir/$file") or die ("message2: $!");
>       @contents=<FILE>;
>       close (FILE);
> # now parse each file, one line at a time via the @contents array
> 
> }

The above can be made much simpler by using the -p and -i flags, which
allow in-place editing of files with a backup copy kept.  e.g.:

perl -pi.bak -e 's#<head>#<head>\nYour extra stuff here#i;' *.html

would turn:

<head>

into:

<head>
Your extra stuff here

in every file ending in .html and preserve the original version as
file.html.bak.
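
For anyone curious, that one-liner amounts to roughly the following
script - a rough, untested sketch of what the -p and -i.bak switches
do for each file on the command line, not exactly what perl does
internally:

#!/usr/bin/perl
# For each filename given on the command line: keep the original
# around as a .bak backup, then rewrite the file with the
# substitution applied to every line (which is what -p and -i.bak
# arrange for you).
foreach my $file (@ARGV) {
    rename($file, "$file.bak") or die "rename $file: $!";
    open(my $in,  '<', "$file.bak") or die "read $file.bak: $!";
    open(my $out, '>', $file)       or die "write $file: $!";
    while (my $line = <$in>) {
        $line =~ s#<head>#<head>\nYour extra stuff here#i;
        print $out $line;
    }
    close($in);
    close($out);
}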

I believe the same can be done with sed/awk but I'm from the Perl
generation so never really bothered to learn anything other than the
absolute basics of those.
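
For anyone who does want the sed version, I think something like this
would do the same job, though it's untested and assumes GNU sed - the
-i.bak option, the \n in the replacement and the trailing I flag for
case-insensitive matching are GNU extensions that other seds may lack:

sed -i.bak 's#<head>#<head>\nYour extra stuff here#I' *.html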

Cheers,
Andy

-- 
http://bitfolk.com/ -- No-nonsense VPS hosting
Encrypted mail welcome - keyid 0x604DE5DB

-- 
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/linux_adm/list-faq.html