I have a pandas DataFrame, loaded via read_csv, that I am trying to push to a database via to_sql. When I attempt
df.to_sql("assessmentinfo_pivot", util.ENGINE)
I get back a UnicodeEncodeError:
UnicodeEncodeError: 'ascii' codec can't encode characters in position 83-84: ordinal not in range(128)
There is no encoding option on to_sql to specify UTF-8, and the Engine was already created with encoding set to utf-8:
ENGINE = create_engine("mssql+pymssql://" +
config.get_local('CEDS_USERNAME') + ':' +
config.get_local('CEDS_PASSWORD') + '@' +
config.get_local('CEDS_SERVER') + '/' +
config.get_local('CEDS_DATABASE'),
encoding="utf-8")
Any pandas insight into getting this working properly? Most of my searches lead me to people hitting a similar error with to_csv, which is resolved by adding encoding="utf-8", but that is unfortunately not an option here.
I tried paring the file down but it still gives errors even when stripped down to just the headers: http://pastebin.com/F362xGyP
I solved the issue by changing the character set of the MySQL database to UTF-8 and adding this to the pymysql connection:
charset='utf8'
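As a minimal sketch of how that can be wired up through SQLAlchemy (user, password, host, and database name below are placeholders; connect_args simply forwards keyword arguments to pymysql.connect()):

import pandas as pd
from sqlalchemy import create_engine

# connect_args is passed through to pymysql.connect(), so the charset reaches the driver.
engine = create_engine(
    "mysql+pymysql://user:password@localhost/mydb",   # placeholder credentials
    connect_args={"charset": "utf8"},
)

df = pd.DataFrame({"name": ["Zoë", "René"]})          # sample data with non-ASCII text
df.to_sql("assessmentinfo_pivot", engine, if_exists="replace", index=False)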
I experienced a similar problem on Python 3.7: UnicodeEncodeError: 'charmap' codec can't encode character '\ufffd' in position 0: character maps to <undefined>
It was the way I defined my engine. I had the charset set to utf-8 in my engine, yet it was not picked up. This worked fine on Python 2, but on Python 3 the charmap error would occur. The only solution I found was to write the engine in a different manner and add the charset to the connection string itself:
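Roughly, the change looks like this (placeholder user, password, and database; the key difference is appending ?charset=utf8 to the URL rather than relying on a separate argument):

from sqlalchemy import create_engine

# Before: charset/encoding supplied as a separate argument (not picked up on Python 3).
# engine = create_engine("mysql+pymysql://user:password@localhost/mydb", encoding="utf-8")

# After: charset appended directly to the connection string.
engine = create_engine("mysql+pymysql://user:password@localhost/mydb?charset=utf8")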
I experienced the exact same issue with the combination of pymysql and pandas.to_sql.
Update: here is what worked for me.
Instead of passing the charset as an argument, try attaching it directly to the connection string:
connect_string = 'mysql+pymysql://{}:{}@{}:{}/{}?charset=utf8'.format(DB_USER, DB_PASS, DB_HOST, DB_PORT, DATABASE)
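And a sketch of how that string is then used end to end (the DB_* values above are placeholders, the DataFrame here is just sample data, and the table name comes from the question):

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(connect_string)        # connect_string built above, with ?charset=utf8
df = pd.DataFrame({"name": ["Zoë", "René"]})  # any frame containing non-ASCII text
df.to_sql("assessmentinfo_pivot", engine, if_exists="replace", index=False)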
The problem seems to happen in pymysql, and the cause of the error is seemingly that the encoding you define is not properly forwarded and set when the pymysql connection is established.
For the sake of debugging, I hardcoded encoding = 'utf-8' in pymysql's _do_execute_many function, and that explained it to me.