I want to access a blob file that is generated by an Azure ML web service alongside the .ilearner and .csv files. The problem is that the file is generated automatically with a GUID as its name, and the response does not mention that the file exists. I know the file is being created because I can see it through the Azure portal. I would like to access the file programmatically, and the only possibility I can see is to use the timestamp of another file created at the same instant. Is there any API or method available to access blobs created at a particular moment using a timestamp instead of the file name?
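As far as I know, Blob Storage has no API that looks a blob up directly by creation time, but you can list the blobs in a container and filter on each blob's Last-Modified property. A minimal sketch of that idea (the filtering helper, container name, and synthetic data below are my own illustration, not part of any Azure API; with the legacy azure-storage SDK the real pairs would come from `BlockBlobService.list_blobs`):

```python
from datetime import datetime, timedelta, timezone

def blobs_near(blobs, reference, window=timedelta(seconds=30)):
    """Return names of blobs whose Last-Modified time falls within
    `window` of `reference` (e.g. the timestamp of the CSV written
    in the same run)."""
    return [name for name, modified in blobs
            if abs(modified - reference) <= window]

# With the legacy azure-storage SDK this would be driven roughly like:
#   from azure.storage.blob import BlockBlobService
#   svc = BlockBlobService(account_name='****', account_key='****')
#   pairs = [(b.name, b.properties.last_modified)
#            for b in svc.list_blobs('mycontainer')]
#   matches = blobs_near(pairs, reference_time)

# Example with synthetic data:
ref = datetime(2017, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
pairs = [
    ('a1b2c3.ilearner', ref + timedelta(seconds=5)),   # written in this run
    ('old-output.csv',  ref - timedelta(hours=2)),     # stale output
]
print(blobs_near(pairs, ref))  # → ['a1b2c3.ilearner']
```

This only narrows candidates down to a time window; two files written in the same second are still ambiguous, which is why renaming the blob yourself (as the answer below suggests) is more robust.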
Answer 1:
According to your description, I guess you used the Export Data module. For your requirements, I highly recommend replacing Export Data with the Execute Python Script module in Azure Machine Learning, which allows you to customize the blob file name.
For an introduction to Execute Python Script, you can refer to the official documentation here.
Please follow these steps to implement it:
Step 1: Use Python's virtualenv to create an isolated Python environment (for specific steps, see https://virtualenv.pypa.io/en/stable/userguide/), then use pip install to download the Azure Storage packages.
Compress all of the files in the Lib/site-packages folder into a zip package (I'll call it azure-storage-package here).
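The step above can be sketched as the following commands (the environment name, the pinned azure-storage version, and the zip file name are my own placeholders; on Linux/macOS the site-packages path is `lib/pythonX.Y/site-packages` rather than `Lib/site-packages`):

```shell
# Create and activate an isolated environment
virtualenv azureml-env
source azureml-env/bin/activate        # on Windows: azureml-env\Scripts\activate

# Install the legacy Azure Storage SDK into it
pip install azure-storage

# Zip the installed packages so Azure ML Studio can consume them
cd azureml-env/Lib/site-packages       # adjust path on Linux/macOS
zip -r azure-storage-package.zip .
```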
Step 2: Upload the zip package as a dataset in your Azure Machine Learning workspace.
For specific steps, please refer to the Technical Notes.
Once the upload succeeds, you will see the package in the dataset list; drag it onto the third input port of the Execute Python Script module.
Step 3: Customize the blob file name in the Python script to a timestamp; you could even append a GUID to the end of the file name to ensure uniqueness. Here is a simple snippet:
import time

import pandas as pd
from azure.storage.blob import BlockBlobService

def azureml_main(dataframe1=None, dataframe2=None):
    myaccount = '****'
    mykey = '****'
    block_blob_service = BlockBlobService(account_name=myaccount, account_key=mykey)
    # Name the blob after the current Unix timestamp instead of a GUID
    block_blob_service.create_blob_from_text('test', str(int(time.time())) + '.txt', 'upload image test')
    return dataframe1,
Also, you could refer to the SO thread Access Azure blob storage from within an Azure ML experiment.
Hope it helps.