Importing JSON into MySQL

Published 2019-01-14 08:30

The problem I am facing is importing JSON into MySQL. I have searched the interwebs, I have searched Stack Overflow, and probably even more sources, but could not find a single proper solution. Full disclosure here: I am not a PHP, SQL, or JSON professional.

What I am trying to accomplish is fairly simple: import this JSON file https://developers.facebook.com/tools/explorer/method=GET&path=245226895508982%2Ffeed%3Faccess_token%3D%3Caccess_token%3E= into a MySQL database. I understand that I need to somehow allocate the JSON data into columns and then populate the rows accordingly with the content. I do not need all of the data, but most of it. The data should later be echoed back through a search query from a user.

My question seems fairly simple: how can I do it?

P.S.: I tried converting it into XML, but I do not have sufficient rights to import the XML into the SQL database. Also, the converter did not do the best job.

5 Answers
一夜七次
Answer 2 · 2019-01-14 08:51

I know this is an old post, but maybe it can help someone. I wanted to create the same database in MySQL that I have in MongoDB.

I had a valid JSON file with data (not structure), so I created a table in MySQL with the appropriate structure (according to the JSON data) and imported the file through MySQL Workbench.

The example there uses CSV, but you can load your JSON file the same way. Just follow the wizard for importing data into a table and make sure the structure is valid; otherwise nothing will be imported.
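As a rough illustration of that first step, here is a Python sketch that derives a CREATE TABLE statement from a JSON sample. The type mapping and the `feed` table name are assumptions for the example, not what the Workbench wizard actually does:

```python
import json

# Assumption: a crude mapping from Python value types to MySQL column types.
TYPE_MAP = {str: "TEXT", int: "BIGINT", float: "DOUBLE", bool: "TINYINT(1)"}

def ddl_from_json(table, json_text):
    """Build a CREATE TABLE statement from the first object in a JSON array."""
    rows = json.loads(json_text)
    first = rows[0] if isinstance(rows, list) else rows
    cols = []
    for key, val in first.items():
        col_type = TYPE_MAP.get(type(val), "TEXT")  # unknown types fall back to TEXT
        cols.append("`%s` %s" % (key, col_type))
    return "CREATE TABLE `%s` (%s);" % (table, ", ".join(cols))

sample = '[{"id": "245_1", "message": "hello", "likes": 3}]'
print(ddl_from_json("feed", sample))
# → CREATE TABLE `feed` (`id` TEXT, `message` TEXT, `likes` BIGINT);
```

You would still want to adjust the generated types by hand before importing real data.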

骚年 ilove
Answer 3 · 2019-01-14 08:55

I know this post is quite old; however, I wanted to respond.

I ran into this same need, and ended up building a custom solution.

I needed functionality above and beyond what is available with https://sqlizer.io/#/.

Basically, I needed it to build parent-child relationship tables on the fly, with foreign key ids.

I ended up using Python. I used a multi-dimensional array to store the key-value pairs, examined the type of each value, and if it was a dict type, called the whole thing again recursively to create another multi-dimensional array, linking it to the parent table.

It was not an easy task; it took over a week to build and test. But now I have a utility that I can point at most REST APIs that respond in JSON, and it automatically dumps the result directly into MySQL in well-formatted tables.
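The original utility is not shown, but the recursive idea described above can be sketched roughly like this. The table names (posts, comments) and the synthetic id / parent-id columns are illustrative assumptions, not the answerer's actual code:

```python
import json

def flatten(obj, table, rows=None, parent_id=None, parent_table=None):
    """Recursively split nested dicts into per-table row lists,
    linking children to parents with a synthetic foreign key."""
    if rows is None:
        rows = {}
    # Synthetic primary key: 1-based position within this table.
    row = {"id": len(rows.get(table, [])) + 1}
    if parent_id is not None:
        row["%s_id" % parent_table] = parent_id  # foreign key to the parent row
    for key, val in obj.items():
        if isinstance(val, dict):
            flatten(val, key, rows, row["id"], table)
        elif isinstance(val, list):
            for item in val:
                if isinstance(item, dict):
                    flatten(item, key, rows, row["id"], table)
        else:
            row[key] = val  # scalar value: stays in this table's row
    rows.setdefault(table, []).append(row)
    return rows

doc = json.loads('{"name": "post", "comments": [{"text": "hi"}, {"text": "yo"}]}')
print(flatten(doc, "posts"))
```

Each key of the returned dict maps to one table; from there, emitting CREATE TABLE and INSERT statements per table is straightforward.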

做个烂人
Answer 4 · 2019-01-14 09:00

Like others have said, you are going to have to do a little bit of conversion to do what you're asking. Fortunately, it's not too difficult to have PHP iterate through the data and do most of the work for you. Here's a quick-and-dirty attempt:

// MySQL table's name
$tableName = 'my_table';
// Get JSON file and decode contents into PHP arrays/values
$jsonFile = '/path/to/file.json';
$jsonData = json_decode(file_get_contents($jsonFile), true);
if ($jsonData === null) {
    die('Could not decode JSON: ' . json_last_error_msg());
}

// Iterate through JSON and build INSERT statements
foreach ($jsonData as $id => $row) {
    $insertPairs = array();
    foreach ($row as $key => $val) {
        // Skip nested arrays/objects; flatten or handle them separately
        if (is_array($val)) {
            continue;
        }
        $insertPairs[addslashes($key)] = addslashes($val);
    }
    $insertKeys = '`' . implode('`,`', array_keys($insertPairs)) . '`';
    $insertVals = '"' . implode('","', array_values($insertPairs)) . '"';

    echo "INSERT INTO `{$tableName}` ({$insertKeys}) VALUES ({$insertVals});" . "\n";
}

This will spit out a series of INSERT INTO my_table (...) VALUES (...); SQL statements that you can save to a .sql file and import into MySQL. With this method, each INSERT can have different columns without causing problems.

Note: addslashes() isn't a very secure way of escaping, and something like mysqli_real_escape_string() or prepared statements would be preferable, but this works without a DB connection, assuming your JSON data is trustworthy.
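For completeness, here is a rough sketch of the prepared-statement approach. It uses Python's sqlite3 purely so the example is self-contained; PyMySQL and mysqlclient expose the same DB-API pattern, with %s placeholders instead of ?. The my_table schema is illustrative:

```python
import json
import sqlite3  # stand-in for a MySQL driver; MySQL DB-API drivers work the same way

def insert_rows(conn, table, rows):
    """Insert each dict as a row, using placeholders instead of string escaping."""
    for row in rows:
        cols = ", ".join('"%s"' % k for k in row)  # identifiers still need vetting
        marks = ", ".join("?" for _ in row)        # "?" for sqlite3; MySQL drivers use "%s"
        conn.execute('INSERT INTO "%s" (%s) VALUES (%s)' % (table, cols, marks),
                     list(row.values()))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id TEXT, message TEXT)")
insert_rows(conn, "my_table", json.loads('[{"id": "1", "message": "it\'s safe"}]'))
print(conn.execute("SELECT message FROM my_table").fetchall())
# → [("it's safe",)]
```

With placeholders, the driver handles quoting of values, so the embedded apostrophe above causes no trouble.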

Emotional °昔
Answer 5 · 2019-01-14 09:00

Using MySQL Shell 8.0.13 you can import JSON documents straight into MySQL Server using the util.importJson shell API function or the command-line option --import. For example, to import from a file:

mysqlsh user@host:port/database --import /tmp/facebook.json collection_name

or pipe the JSON document to stdin:

cat /tmp/facebook.json | mysqlsh user@host:port/database --import - collection_name

where collection_name is the target collection.

More about MySQL Shell's JSON Import utility: https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-utilities-json.html

More about document store: https://dev.mysql.com/doc/refman/8.0/en/document-store.html

More about documents and collections: https://dev.mysql.com/doc/refman/8.0/en/mysql-shell-tutorial-javascript-documents-collections.html

Bombasti
Answer 6 · 2019-01-14 09:09

You can export your JSON to CSV: http://www.danmandle.com/blog/json-to-csv-conversion-utility/ or https://github.com/danmandle/JSON2CSV
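If you would rather not depend on an external utility, the conversion step can be sketched in a few lines of Python. This assumes a flat JSON array of objects; missing keys are padded with empty fields:

```python
import csv
import io
import json

def json_to_csv(json_text):
    """Convert a JSON array of flat objects into CSV text with a header row."""
    rows = json.loads(json_text)
    # Union of all keys, sorted so the column order is deterministic.
    header = sorted({k for row in rows for k in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=header)
    writer.writeheader()
    writer.writerows(rows)  # keys absent from a row become empty fields
    return buf.getvalue()

print(json_to_csv('[{"id": 1, "msg": "hi"}, {"id": 2}]'))
```

Because the output starts with a header row, the LOAD DATA statement below would also need FIELDS TERMINATED BY ',' and IGNORE 1 LINES.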

Then:

LOAD DATA INFILE 'filepath/your_csv_file.csv' INTO TABLE tablename;

It should be okay for a one-shot.

More info: see the MySQL documentation for LOAD DATA INFILE.
