While processing a CSV file, I am getting an error about the maximum string size: "String size exceeds the maximum allowed size".
Currently the maximum allowed size for a string in U-SQL is 128 KB.
If you need to handle larger values than that for now, use the byte[] type instead when extracting from the CSV file. Later, as the rowsets are processed in the C# expressions in the body of the script, you can convert the byte[] back into a string and perform whatever string operations you need in C#.
NOTE: Rows in U-SQL also have a maximum size (currently 4 MB), and this technique is still subject to that limit.
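A minimal sketch of that approach, assuming a hypothetical input file "/input/data.csv" with an id column and a potentially oversized text column, and assuming the column is UTF-8 encoded. The large column is read as byte[] and only converted to a string inside the C# expression, which here returns just the length so no string column has to hold the oversized value:

// Extract the potentially oversized column as byte[] instead of string.
@rows =
    EXTRACT id int,
            largeText byte[]
    FROM "/input/data.csv"
    USING Extractors.Csv();

// Convert the bytes to a string inside a C# expression (UTF-8 assumed) and do the
// string work there, returning only small values (here the length) into the rowset.
@result =
    SELECT id,
           System.Text.Encoding.UTF8.GetString(largeText).Length AS textLength
    FROM @rows;

OUTPUT @result
    TO "/output/lengths.csv"
    USING Outputters.Csv();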
If you are interested in scenarios that need a string size greater than 128 KB, please vote on the feature request here; adding your scenario in the comments would be super helpful as well: https://feedback.azure.com/forums/327234-data-lake/suggestions/13416093-usql-string-data-type-has-a-size-limit-of-128kb