I am trying to find a way to download the file directly, the way wget does, instead of reading from the stream and writing it out to another file, but I am not sure if this is possible.
Any suggestions?
I also found copy, which lets you copy a file from a URL directly to your disk. It is a one-liner without the complexity of curl or the need to create a local file to receive the output of file_get_contents.
copy($file_url, $localpath);
file_put_contents($local_path, file_get_contents($file_url));
is a one-liner too ;-)
The only issue with the above code could be very large files: copy could be better in that case, but see also http://www.php.net/manual/en/function.copy.php#88520
Some testing needed...
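If large files are the concern, a stream-based copy keeps memory flat, because the body is moved in chunks rather than buffered whole. A minimal sketch, assuming allow_url_fopen is enabled and reusing the $file_url / $localpath names from above:

$remote = fopen($file_url, 'rb');
$local  = fopen($localpath, 'wb');
if (false !== $remote && false !== $local) {
    // copies the stream in chunks without loading it all into memory
    stream_copy_to_stream($remote, $local);
}
if (false !== $remote) {
    fclose($remote);
}
if (false !== $local) {
    fclose($local);
}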
With CURLOPT_FILE you can write the response stream directly into an open file handle (see curl_setopt).
/**
 * @param string $url
 * @param string $destinationFilePath
 * @throws Exception
 * @return string
 */
protected function _downloadFile($url, $destinationFilePath)
{
    $fileHandle = fopen($destinationFilePath, 'w');
    if (false === $fileHandle) {
        throw new Exception('Could not open filehandle');
    }

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FILE, $fileHandle);
    $result = curl_exec($ch);
    curl_close($ch);
    fclose($fileHandle);

    if (false === $result) {
        throw new Exception('Could not download file');
    }

    return $destinationFilePath;
}
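Calling it from inside the same class would look roughly like this (the URL and target path are just placeholders):

// hypothetical call site within the same class
$path = $this->_downloadFile('http://example.com/archive.zip', '/tmp/archive.zip');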
Edit based on your comments:
If you want a one-liner or want to use wget, call it through exec() or system() like this:
exec('wget http://google.de/ -O google.html -q')
Edit for later reference:
<?php

function downloadCurl($url, $destinationFilePath)
{
    $fileHandle = fopen($destinationFilePath, 'w');
    if (false === $fileHandle) {
        throw new Exception('Could not open filehandle');
    }

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FILE, $fileHandle);
    $result = curl_exec($ch);
    curl_close($ch);
    fclose($fileHandle);

    if (false === $result) {
        throw new Exception('Could not download file');
    }
}

function downloadCopy($url, $destinationFilePath)
{
    if (false === copy($url, $destinationFilePath)) {
        throw new Exception('Could not download file');
    }
}

function downloadExecWget($url, $destinationFilePath)
{
    $output = array();
    $return = null;
    exec(sprintf('wget %s -O %s -q', escapeshellarg($url), escapeshellarg($destinationFilePath)), $output, $return);
    // wget signals failure with any non-zero exit code, not just 1
    if (0 !== $return) {
        throw new Exception('Could not download file');
    }
}
All three methods have roughly the same runtime and memory usage.
Use whichever fits your environment best.
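If you want to check that claim in your own environment, a rough timing loop is enough. The URL and output paths below are placeholders; memory would need to be compared in separate runs, since memory_get_peak_usage() reports a per-process high-water mark:

$url = 'http://example.com/file.tar.gz'; // placeholder test URL
foreach (array('downloadCurl', 'downloadCopy', 'downloadExecWget') as $method) {
    $start = microtime(true);
    $method($url, '/tmp/' . $method . '.out');
    printf("%s: %.3f s\n", $method, microtime(true) - $start);
}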
$c = file_get_contents('http://www.example.com/my_file.tar.gz');
And now write $c to a local file ...
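Writing it back out is then a single call, with the caveat that file_get_contents has already buffered the whole download in memory (the local path below is just a placeholder):

// $c holds the complete response body at this point
file_put_contents('/tmp/my_file.tar.gz', $c);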