We're just getting started evaluating the datalake service at Azure. We created our lake, and via the portal we can see the two public URLs for the service (one uses an https:// scheme, the other an adl:// scheme).
The datalake documentation states that there are indeed two interfaces: the webHDFS REST API and ADL. So I am assuming the https:// scheme gets me the webHDFS interface. However, I can find no more information at Azure about using this interface.
I tried poking at the given https:// URL with a web browser and with curl. The service is responding. The replies are JSON, which is as expected, since a datalake is an instance of Hadoop. However, I cannot seem to get access to my files [which I uploaded into our lake via the portal].
If I do a GET to "/foo.txt", for example, the reply is an error, ResourceNotFound.
If I do a GET using the typical Hadoop HDFS syntax, "/webhdfs/v1/foo.txt", the reply is an error, AuthenticationFailed. Additional text indicates a missing access token. This seems more promising, but I can't find anything about generating such an access token.
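In PowerShell terms, the two attempts look roughly like this (the account host below is a placeholder, and the note about the op parameter is my guess based on standard webHDFS syntax):

```powershell
# Placeholder host; the portal shows the actual https:// URL for the account.
$base = "https://mylake.azuredatalakestore.net"

# Plain GET against a root-relative path -> ResourceNotFound
Invoke-WebRequest -Method Get -Uri "$base/foo.txt"

# Typical Hadoop webHDFS path -> AuthenticationFailed (missing access token).
# Standard webHDFS would normally also carry an op parameter, e.g. ?op=OPEN.
Invoke-WebRequest -Method Get -Uri "$base/webhdfs/v1/foo.txt"
```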
There is some documentation on using the ADL interface with .NET and Visual Studio, but that is not what I want, initially.
Any help much appreciated!
I am indebted to this forum post by Matthew Hicks, which outlined how to do this with curl. I took it and wrapped it in PowerShell. I'm sure there are many ways to accomplish this, but here's one that works.

First, set up an AAD application so that you can fill in the client_id and client_secret mentioned below. (That assumes you want to automate this rather than have an interactive login. If you want an interactive login, there's a link to that approach in the forum post above.)
Then fill in the settings in the first 5 lines and run the following PowerShell script:
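A minimal sketch along these lines follows. It assumes an AAD application configured with a client secret and granted access to the Data Lake Store account; the token resource URL, the op=OPEN operation, and all placeholder values are assumptions you will need to adapt for your environment.

```powershell
# Fill in these five settings for your tenant, AAD application, and Data Lake Store account.
$TenantId     = "<your-aad-tenant-id>"
$ClientId     = "<your-aad-application-client-id>"
$ClientSecret = "<your-aad-application-client-secret>"
$AccountName  = "<your-datalake-store-account-name>"
$Path         = "/foo.txt"   # file to read from the lake

# Request an OAuth2 access token from Azure AD using the client_credentials grant.
# The resource value is an assumption; some setups use https://datalake.azure.net/ instead.
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/token" `
    -Body @{
        grant_type    = "client_credentials"
        client_id     = $ClientId
        client_secret = $ClientSecret
        resource      = "https://management.core.windows.net/"
    }
$accessToken = $tokenResponse.access_token

# Call the webHDFS-compatible REST endpoint with the bearer token.
# op=OPEN reads the file; op=LISTSTATUS would list a directory instead.
$uri = "https://$AccountName.azuredatalakestore.net/webhdfs/v1" + $Path + "?op=OPEN"
Invoke-RestMethod -Method Get -Uri $uri `
    -Headers @{ Authorization = "Bearer $accessToken" }
```

The only moving parts are the token request and the Authorization header; once you have a valid bearer token, the rest is ordinary webHDFS-style REST calls against the https:// URL shown in the portal.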