Minimize database in order to export all products

Published 2019-06-04 18:14

Question:

I have a Magento 1.4.0.1 site and I want to export all the products in CSV format. I tried running an Import/Export profile, but the max_execution_time limit in php.ini is too low for the time the operation needs. Since my provider doesn't allow me to raise it, I have to copy the site to my local machine and run the profile from there.

My local machine runs Windows 7 and WAMP 2.2.

I copied all the files, but I'm having real trouble importing the database, as it is 300 MB. I configured my php.ini with:

max_execution_time=3600
post_max_size=999M
memory_limit=999M
upload_max_filesize=999M
max_input_time=5000

I restarted the WAMP server and imported the dump through phpMyAdmin, but I still get errors. So my plan is to reduce the size of the database so that I end up with a functional backend and all the products, and empty all the unimportant tables. The problem is I don't know which tables are vital and which aren't. Can you suggest a list of tables that I have to keep, or of the ones I can safely empty?

Please note that since I'm running WAMP on Windows, I can't use any SSH command-line tools.
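As an aside on the import step itself: uploading a 300 MB dump through phpMyAdmin often fails no matter what the php.ini limits are, because the file still goes through an HTTP upload. A sketch of importing the dump with WAMP's bundled mysql.exe from the Windows command prompt instead; the binary path, database name, and dump path below are assumptions, so adjust them to your install:

```shell
# Sketch only: importing via the MySQL client bypasses upload/POST limits.
# All paths and names here are placeholders for a typical WAMP install.
MYSQL_EXE="C:/wamp/bin/mysql/mysql5.5.24/bin/mysql.exe"
IMPORT_CMD="\"${MYSQL_EXE}\" -u root -p magento < C:/dump/magento.sql"

# Shown as an echoed string here; run the command itself in cmd.exe.
echo "$IMPORT_CMD"
```

This runs entirely locally, so no SSH is needed; only a Windows command prompt.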

Answer 1:

I'll share a simple PHP script that I usually use to get a Magento DB dump with a smaller footprint.

For example, you can create a file named tiny-dump.php in the Magento root directory and paste the script into it. Then you can run it by visiting the URL: http://mydomain.com/tiny-dump.php ... If everything works, you will find an SQL file with the DB dump in the var/ directory. The file name will contain {DB name}-{current date}.sql

For your information I used some ideas from this article: http://www.crucialwebhost.com/kb/article/log-cache-maintenance-script/

The script will work if your hosting provider has mysqldump installed.

Here is a link to the script: https://gist.github.com/4495889

Here is the script:

<?php
// Read the DB connection settings straight from Magento's local.xml.
$xml = simplexml_load_file('./app/etc/local.xml', NULL, LIBXML_NOCDATA);

$db['host'] = (string) $xml->global->resources->default_setup->connection->host;
$db['name'] = (string) $xml->global->resources->default_setup->connection->dbname;
$db['user'] = (string) $xml->global->resources->default_setup->connection->username;
$db['pass'] = (string) $xml->global->resources->default_setup->connection->password;
$db['pref'] = (string) $xml->global->resources->db->table_prefix;

function export_tiny() {
    global $db;

    // The dump lands in var/{DB name}-{current date}.sql
    $sqlFileName = 'var/' . $db['name'] . '-' . date('j-m-y-h-i-s') . '.sql';

    // Tables whose data can be left out: logs, reports, index and
    // dataflow batch tables. Their structure is still dumped below.
    $tables = array(
        'dataflow_batch_export',
        'dataflow_batch_import',
        'log_customer',
        'log_quote',
        'log_summary',
        'log_summary_type',
        'log_url',
        'log_url_info',
        'log_visitor',
        'log_visitor_info',
        'log_visitor_online',
        'index_event',
        'report_event',
        'report_compared_product_index',
        'report_viewed_product_index',
        'catalog_compare_item',
        'catalogindex_aggregation',
        'catalogindex_aggregation_tag',
        'catalogindex_aggregation_to_tag'
    );

    $ignoreTables = ' ';
    foreach ($tables as $table) {
        $ignoreTables .= '--ignore-table=' . $db['name'] . '.' . $db['pref'] . $table . ' ';
    }

    // Pass 1: dump the structure of every table, with no data.
    $dumpSchema = 'mysqldump' . ' ';
    $dumpSchema .= '--no-data' . ' ';
    $dumpSchema .= '-u ' . $db['user'] . ' ';
    $dumpSchema .= '-p' . $db['pass'] . ' ';
    $dumpSchema .= $db['name'] . ' > ' . $sqlFileName;
    exec($dumpSchema);

    // Pass 2: append the data of all tables except the ignored ones.
    $dumpData = 'mysqldump' . ' ';
    $dumpData .= $ignoreTables;
    $dumpData .= '-u ' . $db['user'] . ' ';
    $dumpData .= '-p' . $db['pass'] . ' ';
    $dumpData .= $db['name'] . ' >> ' . $sqlFileName;
    exec($dumpData);
}

export_tiny();
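To make the two-pass idea concrete, here is roughly what the generated commands look like once expanded. The credentials and the two ignored tables below are placeholders standing in for what the script reads from local.xml, not values from any real install:

```shell
# Placeholder values standing in for the data read from local.xml.
DB_NAME="magento"
DB_USER="root"
DB_PREF=""          # table prefix; empty on a default install
SQL_FILE="var/${DB_NAME}.sql"

# Pass 1: structure of every table, no rows (--no-data).
SCHEMA_CMD="mysqldump --no-data -u ${DB_USER} -p ${DB_NAME} > ${SQL_FILE}"

# Pass 2: data for everything except the ignored tables, appended (>>)
# so the rows land in the same file right after the schema.
IGNORE="--ignore-table=${DB_NAME}.${DB_PREF}log_url --ignore-table=${DB_NAME}.${DB_PREF}log_visitor"
DATA_CMD="mysqldump ${IGNORE} -u ${DB_USER} -p ${DB_NAME} >> ${SQL_FILE}"

echo "$SCHEMA_CMD"
echo "$DATA_CMD"
```

Because the ignored tables still appear in the schema pass, the restored database has every table Magento expects; only the log/report/index rows are missing.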

Known issue: the script sometimes fails to create the DB dump if the DB password contains shell-special characters, because the password is interpolated directly into the command line (wrapping the credentials with PHP's escapeshellarg() is one way around this).
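Another common workaround for the special-character problem (my suggestion, not part of the original script) is to keep the password off the command line entirely and hand mysqldump an options file; note that --defaults-extra-file must be the first option on the line:

```shell
# Write the credentials to an options file (values are placeholders).
# Quoting the password inside the file protects special characters.
cat > my-extra.cnf <<'EOF'
[client]
user=magento_user
password="p@ss&word!"
EOF

# --defaults-extra-file has to come before any other mysqldump option.
DUMP_CMD="mysqldump --defaults-extra-file=my-extra.cnf --no-data magento > var/magento.sql"
echo "$DUMP_CMD"
```

This also keeps the password out of the process list, which `-p<password>` on the command line does not.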

Hope that's helpful!