Reading a large Excel file with PHP

Posted 2019-01-24 15:02

I'm trying to read a 17MB Excel (2003) file with PHPExcel 1.7.3c, but it crashes while loading the file, after exceeding the 120-second limit I have. Is there another library that can do it more efficiently? I have no need for styling; I only need UTF-8 support. Thanks for your help

6 Answers
戒情不戒烟
#2 · 2019-01-24 15:38

Maybe you could convert/export to CSV and use the built-in fgetcsv(). Depends on what kind of functionality you need.
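
For example, a minimal sketch, assuming the sheet has already been exported to export.csv (a placeholder name):

$handle = fopen('export.csv', 'r'); // the exported CSV
while (($row = fgetcsv($handle)) !== false) {
    // fgetcsv() handles quoting/escaping; $row holds this line's cell values
}
fclose($handle);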

手持菜刀,她持情操
#3 · 2019-01-24 15:39

File size isn't a good measure when using PHPExcel; it's more important to get some idea of the number of cells (rows × columns) in each worksheet.

If you have no need for styling, are you calling:

$objReader->setReadDataOnly(true);

before loading the file?
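
For reference, a minimal sketch of that flow, with a placeholder file name ('Excel5' is PHPExcel's reader for the 2003 .xls format):

$objReader = PHPExcel_IOFactory::createReader('Excel5');
$objReader->setReadDataOnly(true); // ignore styling, load cell values only
$objPHPExcel = $objReader->load('bigfile.xls');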

If you don't need to access all the worksheets, or need only certain cells within a worksheet, look at using

$objReader->setLoadSheetsOnly(array('Sheet1', 'Sheet2')); // sheet names, not indexes

or

$objReader->setLoadSheetsOnly('Sheet1');

or defining a readFilter
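
For example, a sketch of a read filter that loads only the first 100 rows of columns A to E (the class name and range are illustrative):

class FirstRowsReadFilter implements PHPExcel_Reader_IReadFilter
{
    public function readCell($column, $row, $worksheetName = '')
    {
        // load only cells in columns A..E of rows 1..100
        return $row <= 100 && in_array($column, range('A', 'E'));
    }
}
$objReader->setReadFilter(new FirstRowsReadFilter());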

Are you using cell caching? If so, what method? That slows down the load time.
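
For reference, cell caching is typically enabled with a line like this before loading (gzip shown as one example method); if you have such a line, try removing it or changing the method:

PHPExcel_Settings::setCacheStorageMethod(PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip);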

仙女界的扛把子
#4 · 2019-01-24 15:40

I've heard that Excel Explorer is better at reading large files.

唯我独甜
#5 · 2019-01-24 15:47

I'm currently using spreadsheet-reader (https://github.com/nuovo/spreadsheet-reader), which is quite fast at reading XLSX, ODS and CSV; it has the problems mentioned only when reading the XLS format.

The star\"
6楼-- · 2019-01-24 15:48

17MB is a hefty file.

Time how long a 1MB file takes to parse so you can work out how long a 17MB file would take. Then one option might be just to increase your 120 second limit.
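
A rough timing sketch, with a placeholder file name:

$start = microtime(true);
$objPHPExcel = PHPExcel_IOFactory::load('sample-1mb.xls'); // parse a 1MB sample
printf("Loaded in %.1f seconds\n", microtime(true) - $start);
// if ~17x that time is tolerable, raise the limit before the real load:
set_time_limit(600);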

Alternatively, you could export to CSV, which will be far more efficient to parse, and import it via PHP's fgetcsv().

混吃等死
#7 · 2019-01-24 15:52

If the only thing you need from your Excel file is the data, here is my way of reading huge Excel files:

I install gnumeric on my server, e.g. on Debian/Ubuntu: apt-get install gnumeric

Then the PHP calls to read my Excel file and store it in a two-dimensional data array (dimensions are rows and columns) are incredibly simple:

system("ssconvert \"$excel_file_name\" \"temp.csv\"");
$array = array_map("str_getcsv", file("temp.csv"));

Then I can do whatever I want with my array. This takes less than 10 seconds for a 10MB .xls file, about the same time I would need just to open the file in my favorite spreadsheet software!

For very huge files, you should use the fopen() and fgetcsv() functions and process each row as you go, instead of storing the whole CSV file in memory with the file() function. This will be slower, but it will not eat all your server's memory!
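
A minimal sketch of that streaming variant, reusing the temp.csv name from above:

$handle = fopen("temp.csv", "r");
while (($row = fgetcsv($handle)) !== false) {
    // only the current row is held in memory; process it here and move on
}
fclose($handle);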
