Not sure if this question is better suited for Server Fault, but I've been messing with Amazon RDS lately and am having trouble getting 'file' privileges for my web host MySQL user.
I'd assume that a simple:
grant file on *.* to 'webuser'@'%';
would work, but it does not, and I can't seem to do it with my 'root' user either. What gives? The reason we use LOAD DATA is that it is super fast for doing thousands of inserts at once.
Anyone know how to remedy this, or do I need to find a different way?
This page, http://docs.amazonwebservices.com/AmazonRDS/latest/DeveloperGuide/index.html?Concepts.DBInstance.html, seems to suggest that I need to find a different way around this.
Help?
UPDATE
I'm not trying to import a database -- I just want to use LOAD DATA to insert several hundred thousand rows at a time.
After digging around, this is what we have:
mysql> grant file on *.* to 'devuser'@'%';
ERROR 1045 (28000): Access denied for user 'root'@'%' (using password: YES)
mysql> select User, File_priv, Grant_priv, Super_priv from mysql.user;
+----------+-----------+------------+------------+
| User     | File_priv | Grant_priv | Super_priv |
+----------+-----------+------------+------------+
| rdsadmin | Y         | Y          | Y          |
| root     | N         | Y          | N          |
| devuser  | N         | N          | N          |
+----------+-----------+------------+------------+
You need to use LOAD DATA LOCAL INFILE
as the file is not on the MySQL server, but is on the machine you are running the command from.
As per the comment below, you may also need to include the flag:
--local-infile=1
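For example, connecting from the machine that holds the file might look something like this (the endpoint, user, and database names are placeholders):
mysql --local-infile=1 -h mydbinstance.xxxxxxxx.us-east-1.rds.amazonaws.com -u devuser -p mydatabase
With local-infile enabled on the client, LOAD DATA LOCAL INFILE reads the file from your machine and streams it to the RDS instance.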
For whatever it's worth... you can add the LOCAL keyword to LOAD DATA INFILE, instead of using mysqlimport, to get around this problem.
LOAD DATA LOCAL INFILE ...
This will work without granting FILE permissions.
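A rough sketch of the full statement, assuming a comma-separated file with a header row and a table named mytable (both made up for illustration):
mysql> LOAD DATA LOCAL INFILE '/path/to/rows.csv' INTO TABLE mytable FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;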
Pretty sure you can't do it yet, as you don't have the highest level MySQL privileges with RDS. We've only done a little testing, but the easiest way to import a database seems to be to pipe it from the source box, e.g.
mysqldump MYDB | mysql -h rds-amazon-blah.com --user=youruser --pass=thepass MYDB
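If the source tables are InnoDB, adding --single-transaction to mysqldump should give a consistent dump without locking the tables while it streams:
mysqldump --single-transaction MYDB | mysql -h rds-amazon-blah.com --user=youruser --pass=thepass MYDB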
I ran into similar issues. I was in fact trying to import a database, but the conditions should be the same: I needed to use LOAD DATA because of the size of some tables, a spotty connection, and the desire for modest resume functionality.
I agree with chris finne that not specifying the LOCAL option can lead to that error. After many fits and starts, I found that the mk-parallel-restore tool from Maatkit provided what I needed, with some excellent extra features. It might be a great match for your use case.
Importing bulk data into Amazon RDS for MySQL is possible in two ways. You can choose whichever of the options below is more convenient.
Using the mysqlimport utility:
mysqlimport --local --compress -u <user-name> -p<password> -h <host-address> <database-name> --fields-terminated-by=',' TEST_TABLE.csv
--Note that mysqlimport derives the table name from the file name, so here the data will be inserted into TEST_TABLE only.
Sending a bulk INSERT SQL file by piping it into the mysql command:
mysql -u <user-name> -p<password> -h <host-address> <database-name> < TEST_TABLE_INSERT.SQL
--Here the file TEST_TABLE_INSERT.SQL contains a bulk INSERT statement like the one below:
--insert into TEST_TABLE values('1','test1','2017-09-08'),('2','test2','2017-09-08'),('3','test3','2017-09-08');