Currently I can pass one parameter to a U-SQL script in an Azure Data Factory workflow, and with that parameter I can apply a pattern to generate file paths. Is there any way to pass a collection of datetime parameters to U-SQL and apply a pattern to generate the file paths?
Pass a JSON parameter, then parse it in the U-SQL script.
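One way this single-string-parameter idea can play out is sketched below. Everything here is an assumption for illustration: the parameter name `@dateList`, the file layout, and the use of a simple delimited string rather than full JSON (parsing real JSON in U-SQL requires registering the Newtonsoft-based sample assemblies).

```
// Hypothetical: ADF passes one string parameter holding all requested dates.
DECLARE @dateList string = "2018-01-01;2018-01-02";

// The file-set pattern exposes the date as a virtual column.
@rows =
    EXTRACT user string,
            clicks int,
            date DateTime
    FROM "/logs/{date:yyyy}/{date:MM}/{date:dd}/events.csv"
    USING Extractors.Csv();

// Filtering the virtual column against a declared constant is what
// should let the compiler eliminate non-matching file partitions.
@filtered =
    SELECT user, clicks
    FROM @rows
    WHERE @dateList.Contains(date.ToString("yyyy-MM-dd"));
```

The key design point is that the filter predicate on the virtual `date` column is built from a compile-time constant, which is what enables partition elimination rather than reading every file.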
You can pass multiple parameters. U-SQL also allows parameters of type `SqlArray<>`. I am not sure, though, whether ADF supports passing such typed values; I think the PowerShell APIs do allow it. I assume that passing the values as a file will not work, since you would not get compile-time partition elimination with it.