OK, I'm looking for the fastest possible way to read all of the contents of a file via PHP, given a file path on the server; these files can also be huge. So it's very important that it does a READ ONLY pass, as fast as possible.
Is reading it line by line faster than reading the entire contents? Though I remember reading up on this some: reading the entire contents can produce errors for huge files. Is this true?
You could use file_get_contents(). Example:
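A minimal sketch of that call (the path and file contents here are made up for illustration):

```php
<?php
// Hypothetical sample file for illustration.
$path = '/tmp/example.txt';
file_put_contents($path, "hello world");

// Read the entire file into a string in a single call.
$contents = file_get_contents($path);

echo $contents; // prints "hello world"
```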
Reading the whole file in one go is faster, but huge files may eat up all your memory and cause problems. In that case your safest bet is to read line by line.
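A minimal line-by-line sketch of that safer approach (the sample file is hypothetical; only one line is held in memory at a time):

```php
<?php
// Hypothetical sample file for illustration.
$path = '/tmp/lines.txt';
file_put_contents($path, "first\nsecond\nthird\n");

$handle = fopen($path, 'r');   // open read-only
$count = 0;
while (($line = fgets($handle)) !== false) {
    // Process one line at a time here instead of buffering the file.
    $count++;
}
fclose($handle);

echo $count; // prints 3
```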
If you want to load the full contents of a file into a PHP variable, the easiest (and probably fastest) way is file_get_contents().
But if you are working with big files, loading the whole file into memory might not be such a good idea: you'll probably end up with a memory_limit error, as PHP will not allow your script to use more than (usually) a couple of megabytes of memory.
So, even if it's not the fastest solution, reading the file line by line (fopen + fgets + fclose), and working with those lines on the fly, without loading the whole file into memory, might be necessary, using $file_handle as a reference to the file itself.
file_get_contents() is the most optimized way to read files in PHP; however, since you're reading files into memory, you're always limited by the amount of memory available.
You can issue ini_set('memory_limit', -1) if you have the right permissions, but you'll still be limited by the amount of memory available on your system; this is common to all programming languages.
The only solution is to read the file in chunks. For that, you can use file_get_contents() with the fourth and fifth arguments ($offset and $maxlen, specified in bytes). Here is an example where I use this technique to serve large download files:
Another option is to use the less optimized fopen(), feof(), fgets() and fclose() functions, especially if you care about getting whole lines at once. Here is another example I provided in another StackOverflow question for importing large SQL queries into the database:
Which technique you use will really depend on what you're trying to do (as you can see with the SQL import function and the download function), but you'll always have to read the data in chunks.
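That SQL-import example isn't reproduced here either; a rough sketch of such a line-by-line import loop might look like this (the function name, the ';' statement-terminator convention, and the $runQuery callback are assumptions for illustration, standing in for a real database call such as PDO::exec):

```php
<?php
// Sketch of a line-by-line SQL import: accumulate lines until a
// statement terminator (';') is seen, then execute the statement.
// Only the current statement is ever held in memory.
function importSqlFile(string $path, callable $runQuery): int
{
    $handle = fopen($path, 'r');
    $statement = '';
    $executed = 0;
    while (($line = fgets($handle)) !== false) {
        $trimmed = trim($line);
        // Skip blank lines and SQL comments.
        if ($trimmed === '' || substr($trimmed, 0, 2) === '--') {
            continue;
        }
        $statement .= $line;
        if (substr($trimmed, -1) === ';') {
            $runQuery($statement); // hypothetical callback, e.g. PDO::exec
            $statement = '';
            $executed++;
        }
    }
    fclose($handle);
    return $executed;
}
```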
If you're not worried about memory and file size, you can read the whole file into an array of lines with file(); $lines is then the array of the file.
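A minimal sketch of that one-liner (the sample file is made up; file() returns one array element per line, newlines included):

```php
<?php
// Hypothetical sample file for illustration.
$path = '/tmp/poem.txt';
file_put_contents($path, "one\ntwo\n");

// Read the whole file into an array, one element per line.
$lines = file($path);

echo count($lines); // prints 2
```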