#!/usr/bin/perl
use CGI ':standard';
use CGI::Carp qw(fatalsToBrowser);
my $files_location;
my $ID;
my @fileholder;
$files_location = "C:\Users\user\Documents\hello\icon.png";
open(DLFILE, "<$files_location") ;
@fileholder = <DLFILE>;
close (DLFILE) ;
print "Content-Type:application/x-download\n";
print "Content-Disposition:attachment;filename=$ID\n\n";
print @fileholder;
When I run this script, instead of returning the icon.png file it returns download.pl (the script itself, named above) with no content inside it. What is the issue?
Here is the script I am currently using:
#!C:\Perl64\bin\perl.exe -w
use CGI qw(:standard);
use File::Copy qw( copy );
use File::Spec::Functions qw( catfile );
use constant IMG_DIR => catfile(qw( D:\ ));
serve_logo(IMG_DIR);
sub serve_logo {
    my ($dir) = @_;

    my $cgi  = CGI->new;
    my $file = "icon.png";
    #print $file;
    defined ($file) or die "Invalid image name in CGI request\n";

    send_file($cgi, $dir, $file);
    return;
}

sub send_file {
    my ($cgi, $dir, $file) = @_;
    my $path = catfile($dir, $file);

    open my $fh, '<:raw', $path or die "Cannot open '$path': $!";

    print $cgi->header( -type => 'application/octet-stream', -attachment => $file, );

    binmode STDOUT, ':raw';
    copy $fh => \*STDOUT, 8_192;

    close $fh or die "Cannot close '$path': $!";
    return;
}
There are quite a few issues. The first one is that you are using @fileholder = <DLFILE>; to slurp a binary file. On Windows, automatic conversion of line endings will wreak havoc on the contents of that file.
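A minimal sketch of the safer pattern, reading in raw mode and streaming fixed-size chunks instead of slurping the whole file (the path is the one from the question, written with forward slashes):

use strict;
use warnings;

my $path = 'C:/Users/user/Documents/hello/icon.png';

# Raw mode disables CRLF translation on Windows; chunked reads keep memory flat.
open my $fh, '<:raw', $path or die "Cannot open '$path': $!";
binmode STDOUT, ':raw';

my $buf;
while ( read $fh, $buf, 8_192 ) {
    print $buf;
}

close $fh or die "Cannot close '$path': $!";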
Other issues are:
You are not checking the return value of open. We don't even know if open succeeded.
You never assign a value to $ID, meaning you're sending "filename=\n\n" in your response.
You are slurping a binary file, making the memory footprint of your program proportional to the size of the binary file. Robust programs don't do that.
You're loading CGI.pm with use, but you are neither using it nor have you read the docs (see the header sketch after this list).
You're using a bareword (i.e. package global) filehandle.
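For instance, CGI.pm can build the download headers for you instead of hand-printing Content-Type and Content-Disposition lines; a short sketch (the download name is hypothetical):

use strict;
use warnings;
use CGI;

my $cgi = CGI->new;

# header() emits Content-Type and, via -attachment,
# Content-Disposition: attachment; filename="icon.png"
print $cgi->header(
    -type       => 'application/octet-stream',
    -attachment => 'icon.png',
);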
The fundamental reason, however, is that open fails. Why does open fail? Simple:
C:\temp> cat uu.pl
#!/usr/bin/env perl
use strict; use warnings;
my $files_location = "C:\Users\user\Documents\hello\icon.png";
print "$files_location\n";
Let's try running that, shall we?
C:\temp> uu
Unrecognized escape \D passed through at C:\temp\uu.pl line 5.
Unrecognized escape \h passed through at C:\temp\uu.pl line 5.
Unrecognized escape \i passed through at C:\temp\uu.pl line 5.
C:SERSSERDOCUMENTSHELLOICON.PNG
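The fix is to keep those backslash escapes out of a double-quoted string: use single quotes, which do not interpolate escapes, or forward slashes, which Perl on Windows handles fine. Either of these produces the intended path:

use strict;
use warnings;

my $files_location = 'C:\Users\user\Documents\hello\icon.png';   # single quotes: backslashes stay literal
my $alternative    = "C:/Users/user/Documents/hello/icon.png";   # forward slashes: nothing to escape
print "$files_location\n$alternative\n";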
Here is a short script illustrating a better way:
use CGI qw(:standard);
use File::Copy qw( copy );
use File::Spec::Functions qw( catfile );
use constant IMG_DIR => catfile(qw(
    E:\ srv localhost images
));

serve_logo(IMG_DIR);

sub serve_logo {
    my ($dir) = @_;

    my %mapping = (
        'big'    => 'logo-1600x1200px.png',
        'medium' => 'logo-800x600.png',
        'small'  => 'logo-400x300.png',
        'thumb'  => 'logo-200x150.jpg',
        'icon'   => 'logo-32x32.gif',
    );

    my $cgi = CGI->new;

    my $file = $mapping{ $cgi->param('which') };
    defined ($file)
        or die "Invalid image name in CGI request\n";

    send_file($cgi, $dir, $file);
    return;
}

sub send_file {
    my ($cgi, $dir, $file) = @_;
    my $path = catfile($dir, $file);

    open my $fh, '<:raw', $path
        or die "Cannot open '$path': $!";

    print $cgi->header(
        -type       => 'application/octet-stream',
        -attachment => $file,
    );

    binmode STDOUT, ':raw';
    copy $fh => \*STDOUT, 8_192;

    close $fh
        or die "Cannot close '$path': $!";

    return;
}
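With this version, the image is selected by the which query parameter, so a request such as http://localhost/cgi-bin/serve_logo.pl?which=thumb (a hypothetical URL; the script name is whatever you deploy it as) would download logo-200x150.jpg.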
I also posted a detailed explanation on my blog.
It took me a while to figure out what was wrong, so for those who end up here (as I did) having random issues serving large files, here's my advice:
Avoid File::Copy, as it's buggy for this purpose.
When serving data through CGI, syswrite can return undef ($! being 'Resource temporarily unavailable') for some time.
File::Copy stops in that case (returns 0, sets $!), failing to transfer the entire file (or stream).
There are many ways to work around that, such as retrying the syswrite or using blocking sockets; I'm not sure which one is best, though. (A rough sketch of the retry approach follows.)
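Not part of the original answer, but here is one way the retry idea could look, assuming STDOUT is the (possibly non-blocking) CGI output stream and stream_file is a hypothetical helper:

use strict;
use warnings;
use Errno qw( EAGAIN );

sub stream_file {
    my ($path) = @_;

    open my $fh, '<:raw', $path or die "Cannot open '$path': $!";
    binmode STDOUT, ':raw';
    $| = 1;    # flush anything already printed (e.g. headers) before switching to syswrite

    my $buf;
    while ( my $len = read $fh, $buf, 8_192 ) {
        my $offset = 0;
        while ( $offset < $len ) {
            my $written = syswrite STDOUT, $buf, $len - $offset, $offset;
            if ( !defined $written ) {
                # Transient "Resource temporarily unavailable": retry the same chunk.
                # A real program might select() or sleep briefly instead of spinning.
                next if $! == EAGAIN;
                die "Write failed: $!";
            }
            $offset += $written;
        }
    }

    close $fh or die "Cannot close '$path': $!";
    return;
}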