Break down a large number of rows in a CSV and process them

Posted 2019-09-07 03:26

Question:

I am about to edit and upload a CSV with more than 50,000 records (shopping cart products), and the upload should update a number of tables in the system. I am using Zend Framework with my shopping cart.

I am planning to break them down (the 50,000 CSV records) in memory before processing them batch-wise using PHP/MySQL.

Can anyone give me advice on this?

What I have so far is:

public function getDataFromPath($path = null) {
    if ($path === null) {
        $path = $this->_path;
    }
    // Let PHP detect Mac/Windows/Unix line endings in the CSV.
    ini_set("auto_detect_line_endings", 1);
    $fp = fopen($path, "r");
    if ($fp === false) {
        throw new RuntimeException("Unable to open CSV file: " . $path);
    }
    // Read one row at a time; 5000 is the maximum expected line length.
    while (($line = fgetcsv($fp, 5000, ",")) !== false) {
        $line = self::trimArray($line);
        $this->data[] = $line;
    }
    fclose($fp);
    return $this->data;
}
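
The batch-wise part I have in mind would be roughly the following. This is only a sketch: the products table, its (sku, name, price) columns, and the PDO connection are placeholders, not my real schema.

public function saveDataInBatches(PDO $pdo, array $rows, $batchSize = 1000) {
    // Insert in chunks so no single query gets too large.
    foreach (array_chunk($rows, $batchSize) as $batch) {
        // One multi-row INSERT per batch instead of one query per row.
        $placeholders = rtrim(str_repeat('(?, ?, ?),', count($batch)), ',');
        $stmt = $pdo->prepare(
            "INSERT INTO products (sku, name, price) VALUES $placeholders"
        );
        $values = array();
        foreach ($batch as $row) {
            // Assumes the first three CSV columns are sku, name and price.
            $values = array_merge($values, array_slice($row, 0, 3));
        }
        $stmt->execute($values);
    }
}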

Regards, Roshan

Answer 1:

No RDBMS out there should have a problem with 50,000 rows. That's nothing, so there's no need to process them batch-wise.

Just use the LOAD DATA INFILE command and you will be fine.

For an example see here: LOAD DATA INFILE easily convert YYYYMMDD to YYYY-MM-DD?
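
In its simplest form it could look like the following. This is only a sketch: the file path, table name, CSV layout, and connection details are assumptions, and LOCAL INFILE has to be enabled on both the client and the server.

// LOCAL INFILE must be allowed by the server (local_infile=1) and the client.
$pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8', 'user', 'pass', array(
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
));
// The path, table, and CSV layout below are placeholders for this example.
$pdo->exec("
    LOAD DATA LOCAL INFILE '/tmp/products.csv'
    INTO TABLE products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");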

UPDATE (because of a comment on Ion Wood's answer): To create a CSV file you can use the SELECT ... INTO OUTFILE command.

SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;

For more info, see the manual.



Answer 2:

This is a job for...

DATABASE MAN!!!

Load your CSV directly into a table using the LOAD DATA INFILE business and do whatever magic you need to after that.
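
The "magic after that" could be a single set-based statement from a staging table into the live tables. A sketch, reusing the $pdo connection from the first answer; the csv_staging table, the products table, and the column names are assumptions:

// Push the staged CSV rows into the live table in one statement;
// rows matching an existing unique key (e.g. sku) are updated, not duplicated.
$pdo->exec("
    INSERT INTO products (sku, name, price)
    SELECT sku, name, price
    FROM csv_staging
    ON DUPLICATE KEY UPDATE
        name  = VALUES(name),
        price = VALUES(price)
");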