

Re: [LUG] Download script

 

alan wrote:

On 2004.12.13 10:52 Rob Beard wrote:

Hi folks,

I have a question that isn't entirely Linux related, but I thought maybe someone might know of something that would do the job...

Basically, after getting my broadband connection, I've been trying to download a couple of ISOs (Fedora Core 3).  At the moment my connection is running at 2Mbit (thanks to Eclipse Internet's 1 month free flexing), and I'm finding ISOs download really quickly.


Hi Rob,

Further to the other replies: if you use the wget method, wget has a flag for backgrounding (-b, to be exact), so you can get away without the nohup or the &.
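A quick shell sketch of the two approaches; the URL below is a placeholder, not a real Fedora mirror:

```shell
# Placeholder URL for illustration only
URL="http://example.com/FC3-i386-disc1.iso"

# Without -b you would keep the download alive past logout with nohup and &:
#   nohup wget "$URL" &
# With -b, wget detaches itself and appends its progress to wget-log:
CMD="wget -b $URL"
echo "$CMD"
```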

You could call wget from Perl via the system function to do this, i.e.:

#!/usr/bin/perl

require 'cgi-lib.pl';               # provides &ReadParse for parsing the query string
$mydirectory = "/home/whatever/";   # change this to the correct path

if (&ReadParse(*in)) {
    chdir $mydirectory;
    print "Content-type: text/html\n\n";
    system "wget -b $in{'url'}";    # ReadParse stores parameters in the %in hash
    print "Getting your file now";
} else {
    print "Content-type: text/html\n\n";
    print "Sorry, you didn't enter a URL";
}


Just point a web form at this script, making sure the text field on the form is named 'url',
or append ?url=http://www.whatever.com/ to the address of your Perl script.
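One caveat with the ?url=... form: the target URL itself contains characters like ':' and '/', which strictly ought to be percent-encoded inside a query string. A minimal shell sketch of the encoding (the script path download.cgi and the server name are hypothetical):

```shell
TARGET="http://www.whatever.com/fc3.iso"   # file name is illustrative

# Percent-encode the characters most likely to confuse the query string
# (escape '%' first so later substitutions aren't re-encoded)
ENCODED=$(printf '%s' "$TARGET" | sed -e 's/%/%25/g' -e 's/:/%3A/g' -e 's,/,%2F,g' -e 's/&/%26/g')
echo "http://myserver/cgi-bin/download.cgi?url=$ENCODED"
```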


You do need the cgi-lib.pl library though for this to run.

alan

Thanks Alan, I'll give this a try in a minute (just got back from work). I'm sure it'll be just what I need: something simple that downloads a file.

I suppose I could add other switches to the wget command line too, to limit the bandwidth used (for when my free flexing runs out!).
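For the throttling idea, wget's manual documents a --limit-rate option (and -c for resuming a partial file). A hedged sketch of the combined command line, with a placeholder URL:

```shell
# Placeholder URL for illustration only
URL="http://example.com/FC3-i386-disc2.iso"

# -b: background, -c: resume a partial download, --limit-rate: cap bandwidth
CMD="wget -b -c --limit-rate=100k $URL"
echo "$CMD"
```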

Rob




-- The Mailing List for the Devon & Cornwall LUG Mail majordomo@xxxxxxxxxxxx with "unsubscribe list" in the message body to unsubscribe.