I'm trying to read a 17MB Excel 2003 file with PHPExcel 1.7.3c, but it crashes while loading the file, after exceeding my 120-second time limit. Is there another library that can do this more efficiently? I have no need for styling; I only need UTF-8 support. Thanks for your help.
Maybe you could convert/export to CSV and use the built-in fgetcsv(). It depends on what kind of functionality you need.
File size isn't a good measure when using PHPExcel; it's more important to get some idea of the number of cells (rows × columns) in each worksheet.
If you have no need for styling, are you calling the reader's setReadDataOnly(true) method before loading the file?
If you don't need to access all worksheets, or only certain cells within a worksheet, look at using setLoadSheetsOnly() to load just the sheets you need, or defining a readFilter to load only a range of cells.
Are you using cell caching? If so, which method? That slows down the load time.
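A minimal sketch of the reader options described above, based on the PHPExcel 1.7.x API (the file name and sheet name are illustrative):

```php
<?php
require_once 'PHPExcel/IOFactory.php';

// 'Excel5' is the PHPExcel reader for the .xls (Excel 2003) format.
$reader = PHPExcel_IOFactory::createReader('Excel5');
$reader->setReadDataOnly(true);               // skip styling information
$reader->setLoadSheetsOnly(array('Sheet1'));  // load only the sheets you need

$workbook = $reader->load('big.xls');
$data = $workbook->getActiveSheet()->toArray();
```

With setReadDataOnly(true), the reader discards formatting as it parses, which noticeably reduces both memory use and load time on large files.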
I've heard that Excel Explorer is better at reading large files.
I'm currently using spreadsheet-reader (https://github.com/nuovo/spreadsheet-reader), which is quite fast at reading XLSX, ODS, and CSV, and has the problems mentioned only when reading the XLS format.
17MB is a hefty file.
Time how long a 1MB file takes to parse so you can work out how long a 17MB file would take. Then one option might be just to increase your 120 second limit.
Alternatively, you could export to CSV, which will be far more efficient, and import it via PHP's fgetcsv().
If the only thing you need from your Excel file is the data, here is my way to read huge Excel files:
I install gnumeric on my server, e.g. on Debian/Ubuntu: apt-get install gnumeric
Then the PHP calls to read my Excel file and store it in a two-dimensional data array (dimensions are rows and columns) are incredibly simple:
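The original code was not included above; here is a minimal sketch of the idea, assuming gnumeric's ssconvert command-line converter is installed (the file names are illustrative):

```php
<?php
$xls = 'big.xls';
$csv = tempnam(sys_get_temp_dir(), 'xls') . '.csv';

// Convert the Excel file to CSV with gnumeric's ssconvert tool,
// then parse the resulting CSV into a two-dimensional array.
exec('ssconvert ' . escapeshellarg($xls) . ' ' . escapeshellarg($csv));

$data = array();
foreach (file($csv) as $line) {
    $data[] = str_getcsv($line);   // $data[$row][$col] holds the cell values
}
unlink($csv);
```

The heavy lifting (parsing the binary XLS format) is done by gnumeric's native code, which is why this is so much faster than a pure-PHP parser.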
Then I can do what I want with my array. This takes less than 10 seconds for a 10MB XLS file, about the same time I would need to open the file in my favourite spreadsheet software!
For very huge files, you should use the fopen() and fgetcsv() functions and process each row as you read it, instead of loading the whole CSV into memory with file() and storing everything in a huge array. This will be slower, but will not eat all your server's memory!
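A self-contained sketch of that streaming approach, so only one row is held in memory at a time (the sample data and column layout are made up for illustration):

```php
<?php
// Create a small sample CSV file to stream through.
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, "name,qty\napples,3\npears,5\n");

$total = 0;
$handle = fopen($path, 'r');
fgetcsv($handle);                            // skip the header row
while (($row = fgetcsv($handle)) !== false) {
    $total += (int) $row[1];                 // process each row as it is read
}
fclose($handle);
unlink($path);

echo $total, "\n";                           // prints 8
```

Because fgetcsv() reads one line per call from the open handle, memory use stays constant no matter how large the CSV file is.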