I need to use an HDFS cluster from a remote desktop through the Java API. Everything works fine until it comes to write access: if I try to create any file, I receive an access permission exception. The path looks good, but the exception names my remote desktop user name, which is of course not the user I need in order to access the required HDFS directory.
The questions are:

- Is there any way to present a different user name using 'simple' authentication in the Java API?
- Could you point to a good explanation of the authentication/authorization schemes in Hadoop/HDFS, preferably with Java API examples?
Yes, I already know that `whoami` could be overridden in this case using a shell alias, but I prefer to avoid solutions like that. Another specific of my situation is that I dislike tricks such as pipes through SSH and scripts; I'd like to perform everything using just the Java API. Thank you in advance.
After some studying I came to the following solution.

Sample code that is probably useful for people both for 'fake authentication' and for remote HDFS access:
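What follows is a minimal sketch of that solution rather than a drop-in implementation: it wraps the filesystem calls in `UserGroupInformation.createRemoteUser(...).doAs(...)` so that HDFS sees the asserted user instead of the local desktop user. The user name `hbase`, the NameNode address, and the paths are placeholder assumptions; substitute your own.

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsRemoteUserTest {

    public static void main(String[] args) throws Exception {
        // Build a UGI for the user HDFS should see ("hbase" is a placeholder).
        // Under 'simple' authentication the NameNode trusts this asserted name.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hbase");

        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                Configuration conf = new Configuration();
                // Placeholder NameNode address; point this at your cluster.
                conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

                FileSystem fs = FileSystem.get(conf);

                // Everything inside run() executes as the remote user "hbase",
                // so the write succeeds if that user may write to this directory.
                fs.createNewFile(new Path("/user/hbase/test"));

                // List the directory to confirm the file landed.
                for (FileStatus status : fs.listStatus(new Path("/user/hbase"))) {
                    System.out.println(status.getPath());
                }
                return null;
            }
        });
    }
}
```

Note that this works only because 'simple' authentication does not verify the client-asserted identity; on a Kerberos-secured cluster the same code would need real credentials for that principal.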
A useful reference for those who have a similar problem is Cloudera's blog post "Authorization and Authentication in Hadoop".
UPDATE:
An alternative for those who use the command-line `hdfs` or `hadoop` utility, with no matching local user needed, is shown below.
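A sketch of such an invocation, assuming the cluster uses 'simple' authentication and honors the standard `HADOOP_USER_NAME` environment variable; the local file and target directory are placeholders:

```sh
# Run the upload as HDFS user 'hdfs' without a local 'hdfs' account or sudo;
# HADOOP_USER_NAME overrides the client-side identity under simple auth.
HADOOP_USER_NAME=hdfs hdfs dfs -put /path/to/local/file.txt /user/hdfs/
```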
What you actually do is read the local file according to your local permissions, but when placing the file on HDFS you are authenticated as the user `hdfs`. This has properties quite similar to those of the API code illustrated above: you need no `sudo` and no appropriate local user `hdfs`.