I would like to import data from a CSV file into an existing database table. I do not want to save the CSV file, just take the data from it and put it into the existing table. I am using Ruby 1.9.2 and Rails 3.
This is my table:
create_table "mouldings", :force => true do |t|
  t.string   "suppliers_code"
  t.datetime "created_at"
  t.datetime "updated_at"
  t.string   "name"
  t.integer  "supplier_id"
  t.decimal  "length", :precision => 3, :scale => 2
  t.decimal  "cost", :precision => 4, :scale => 2
  t.integer  "width"
  t.integer  "depth"
end
Can you give me some code showing the best way to do this? Thanks.
The smarter_csv gem was created specifically for this use case: to read data from a CSV file and quickly create database entries. You can use the chunk_size option to read N CSV rows at a time, and then use Resque in the inner loop to generate jobs that create the new records, rather than creating them right away; this way you can spread the load of generating entries across multiple workers. See also: https://github.com/tilo/smarter_csv
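A rough sketch of that pattern, assuming a Rails app where Resque is already set up; the file name and the MouldingImportJob worker class are illustrative, not part of the gem:

require 'smarter_csv'

options = { chunk_size: 100 } # read 100 CSV rows per chunk

SmarterCSV.process('mouldings.csv', options) do |chunk|
  # chunk is an array of row hashes keyed by the CSV headers.
  # MouldingImportJob is a hypothetical Resque worker whose perform
  # method would call Moulding.create! for each row hash it receives.
  Resque.enqueue(MouldingImportJob, chunk)
end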
This can help. It has code examples too:
http://csv-mapper.rubyforge.org/
Or, for a rake task that does the same:
http://erikonrails.snowedin.net/?p=212
It is better to wrap the database-related work inside a transaction block. The sketch below shows a full process of seeding a set of languages into a Language model from a languages.csv file, along with a partial sample of that file.
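A minimal sketch of such a seeding step, written here as a rake task; the Language model (with language_code and language_str columns) and the lib/languages.csv path are illustrative assumptions:

require 'csv'

namespace :lan do
  desc 'Seed the languages table from lib/languages.csv'
  task init_data: :environment do
    # Roll everything back if any row fails to save.
    ActiveRecord::Base.transaction do
      CSV.foreach(Rails.root.join('lib', 'languages.csv')) do |row|
        code, name = row
        Language.create!(language_code: code, language_str: name)
      end
    end
  end
end

A partial languages.csv file for it could look like this:

aa,Afar
ab,Abkhazian
af,Afrikaans
am,Amharic
ar,Arabic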
It's better to use CSV::Table and String.encode(universal_newline: true), which converts CRLF and CR line endings to LF.
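For example, a sketch of that approach; the file name is illustrative:

require 'csv'

# Normalize CRLF and CR line endings to LF before parsing.
csv_text = File.read('mouldings.csv').encode(universal_newline: true)

# With headers: true, CSV.parse returns a CSV::Table.
table = CSV.parse(csv_text, headers: true)

table.each do |row|
  Moulding.create!(row.to_hash)
end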
A simpler version of yfeldblum's answer, which also works well with large files, is sketched below.
There is no need for with_indifferent_access or symbolize_keys, and no need to read the file into a string first.
It doesn't keep the whole file in memory at once, but reads it in line by line and creates a Moulding per line.
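A minimal sketch of that approach, assuming the CSV header row matches the Moulding column names (the file name is illustrative):

require 'csv'

CSV.foreach('mouldings.csv', headers: true) do |row|
  # Each row is a CSV::Row; to_hash maps header names to values.
  Moulding.create!(row.to_hash)
end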
A better way is to include it in a rake task. Create an import.rake file inside lib/tasks/ and put code like the sketch below into that file.
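A minimal sketch of such a task, assuming the CSV's header row matches the target model's column names:

require 'csv'

desc 'Import a CSV file into the model named on the command line'
task :csv_model_import, [:filename, :model] => [:environment] do |_task, args|
  model_class = args[:model].constantize # e.g. "Moulding" => Moulding
  CSV.foreach(args[:filename], headers: true) do |row|
    model_class.create!(row.to_hash)
  end
end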
After that, run this command in your terminal:
rake csv_model_import[file.csv,Name_of_the_Model]