How to convert a JSON file to an SQLite database

Posted 2020-06-04 11:01

If I have some sample data, how do I put it into SQLite (preferably fully automated)?

{"uri":"/","user_agent":"example1"}
{"uri":"/foobar","user_agent":"example1"}
{"uri":"/","user_agent":"example2"}
{"uri":"/foobar","user_agent":"example3"}

Tags: json sqlite
2 Answers
乱世女痞
#2 · 2020-06-04 11:12

I found that the easiest way to do this is to use jq with CSV as an intermediate format.

Edit: as pointed out (thanks @Leo), the original question shows newline-delimited JSON objects, each of which conforms to RFC 4627 on its own, though the file as a whole does not. jq can handle a single JSON array of objects in much the same way by preprocessing the file with jq '.[]' <input.json >preprocessed.json. If you happen to be dealing with JSON text sequences (RFC 7464), jq luckily has your back there too with the --seq parameter.
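For example, if data.json held a single array instead of one object per line, a preprocessing step along these lines would produce the newline-delimited form (the -c flag keeps each object on one line, which the head -1 trick below relies on; the file names are just placeholders):

% jq -c '.[]' <input.json >preprocessed.json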

Getting the CSV

First write your data to a file. I will assume data.json here.

Then construct the header using jq:

% head -1 data.json | jq -r 'keys | @csv'
"uri","user_agent"

The head -1 is there because we only want one line. jq's -r flag makes the output a plain string instead of a JSON string wrapping the CSV. We then call the built-in function keys to get the keys of the input as an array, and send that to the @csv formatter, which outputs a single string with the headers in quoted CSV format.
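One caveat: jq's keys builtin returns the keys sorted alphabetically, while the value extraction below walks the object in its original order. For the sample data the two orders happen to coincide; if yours do not, keys_unsorted preserves the original key order:

% head -1 data.json | jq -r 'keys_unsorted | @csv'
"uri","user_agent"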

We then need to construct the data.

% jq -r '[.[]] | @csv' < data.json
"/","example1"
"/foobar","example1"
"/","example2"
"/foobar","example3"

We now take the whole input and deconstruct the associative array (map) using .[], then put the results back into a simple array [...]. This basically converts each object into an array of its values. Sent to the @csv formatter, we again get CSV.
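If you would rather not depend on key order at all, you can also name the fields explicitly (column names taken from the sample data; adjust them for your own):

% jq -r '[.uri, .user_agent] | @csv' < data.json
"/","example1"
"/foobar","example1"
"/","example2"
"/foobar","example3"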

Putting it all together we get a single one-liner in the form of:

% (head -1 data.json | jq -r 'keys | @csv' && jq -r '[.[]] | @csv' < data.json) > data.csv
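For the sample data, the resulting data.csv looks like this:

"uri","user_agent"
"/","example1"
"/foobar","example1"
"/","example2"
"/foobar","example3"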

If you need to convert the data on the fly, i.e. without an intermediate file, try the following: read -r consumes the first line of the stream, the first jq call prints the header and that line's data row, and the second jq call converts the remaining lines.

% cat data.json | (read -r first && jq -r '(keys | @csv),( [.[]] | @csv)' <<<"${first}" && jq -r '[.[]] | @csv')

Loading it into SQLite

Open an SQLite database:

sqlite3 somedb.sqlite

Now, in the interactive shell, do the following (assuming you wrote the CSV to data.csv and want it in a table called my_table; if my_table does not exist yet, .import creates it and takes the column names from the CSV header row):

.mode csv
.import data.csv my_table

Now close the shell and open it again for a clean environment. You can then SELECT from the database and work with the data however you like.
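For example, a quick sanity check using the table and column names from above (the default list output mode separates columns with |):

sqlite3 somedb.sqlite 'SELECT user_agent, COUNT(*) FROM my_table GROUP BY user_agent ORDER BY user_agent;'
example1|2
example2|1
example3|1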

Putting it all together

An asciinema recording of the whole process: asciicast

贼婆χ
#3 · 2020-06-04 11:15

sqlitebiter appears to provide a Python solution (see the usage sketch at the end of this answer):

A CLI tool to convert CSV/Excel/HTML/JSON/LTSV/Markdown/SQLite/TSV/Google-Sheets to a SQLite database file. http://sqlitebiter.rtfd.io/

docs: http://sqlitebiter.readthedocs.io/en/latest/

project: https://github.com/thombashi/sqlitebiter

  • last update approximately 3 months ago
  • last issue closed approximately 1 month ago, none open
  • noted today, 2018-03-14
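A minimal usage sketch, assuming sqlitebiter is installed via pip; the file subcommand and the -o output option are as I recall them from the project documentation, and sqlitebiter may expect a regular JSON document rather than the newline-delimited form from the question, so the first step wraps the lines into a single array with jq:

% pip install sqlitebiter
% jq -s '.' data.json > data_array.json
% sqlitebiter -o somedb.sqlite file data_array.json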