I have a table with binary(32), blob, and UTF-8 varchar fields. To move data from one MySQL server to another I export it via CSV:
SELECT * INTO OUTFILE '$tmp_fname'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
FROM mytable
and then
LOAD DATA LOCAL INFILE '" . $mysqli->real_escape_string($glb) . "' IGNORE INTO TABLE mytable_temp
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
I tried the same approach with BULK INSERT in MS SQL, and for simple types it works (I get another table with int and char(44) columns in it). But with this table I get import errors. Some details: the export/import has to be automated (that's why I use CSV), the two servers can communicate only via HTTP (PHP scripts), and the tables have millions of rows.
So here are my questions.
How should blob field data be formatted in the CSV so that MS SQL can import it?
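One direction I've been considering, though I'm not sure it's the right one: hex-encode the blob on export so the CSV stays plain ASCII, then decode it on the MS SQL side. Table and column names here are placeholders, and the staging table is assumed to hold the hex text in a varchar column:

```sql
-- MySQL side: export the blob as a hex string (CSV-safe, ASCII only)
SELECT id, HEX(myblob) AS myblob_hex
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
FROM mytable;

-- MS SQL side: after BULK INSERT into the staging table,
-- convert the hex text back to varbinary (style 2 = hex string without '0x' prefix)
INSERT INTO mytable_final (id, myblob)
SELECT id, CONVERT(varbinary(max), myblob_hex, 2)
FROM mytable_staging;
```

This doubles the size of the blob data in transit, which matters with millions of rows, but it avoids raw binary bytes (commas, quotes, newlines) breaking the CSV framing.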
How can I export a UTF-8 string for MS SQL? I tried CONVERT(myfield USING utf16); is that what I need?
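For reference, the export variant I tried looks roughly like this (myfield stands in for the UTF-8 varchar column, and the output path is a placeholder):

```sql
-- Attempted export: re-encode the varchar column as UTF-16 on the way out
SELECT CONVERT(myfield USING utf16)
INTO OUTFILE '/tmp/mytable_utf16.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
FROM mytable;
```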
I also tried exporting the data in UTF-16 and specifying DATAFILETYPE = 'widechar' in BULK INSERT, but it throws an error on the first int value. Can it really not read widechar?
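The failing import looked roughly like this (file path and table name are placeholders):

```sql
-- Attempted import of a UTF-16 file; errors out on the first int column
BULK INSERT mytable_temp
FROM 'C:\tmp\mytable_utf16.csv'
WITH (
    DATAFILETYPE = 'widechar',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```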