Azure Data Lake Analytics IOutputter E_RUNTIME_USER_ROWTOOBIG

Posted 2019-09-11 15:46

I'm trying to write the results of my custom IOutputter to an intermediate file on the local disk.

After that I want to copy the database file (~20 MB) to the ADL output store.

Sadly, the script terminates with:

    An unhandled exception of type 'Microsoft.Cosmos.ScopeStudio.BusinessObjects.Debugger.ScopeDebugException' occurred in Microsoft.Cosmos.ScopeStudio.BusinessObjects.Debugger.dll

    Additional information: {
        "diagnosticCode": 195887112,
        "severity": "Error",
        "component": "RUNTIME",
        "source": "User",
        "errorId": "E_RUNTIME_USER_ROWTOOBIG",
        "message": "The row has exceeded the maximum allowed size of 4MB",
        "description": "",
        "resolution": "",
        "helpLink": "",
        "details": "The row has exceeded the maximum allowed size of 4MB",
        "internalDiagnostics":
            7ffe97231797  ScopeEngine!?ToStringInternal@KeySampleCollection@SSLibV3@ScopeEngine@@AEAA?AV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@XZ + 11b7
            7ffe971d7261  ScopeEngine!??0ExceptionWithStack@ScopeEngine@@QEAA@W4ErrorNumber@1@AEBV?$initializer_list@VScopeErrorArg@ScopeCommon@@@std@@_N@Z + 121
            7ffe971d7f6a  ScopeEngine!??0RuntimeException@ScopeEngine@@QEAA@W4ErrorNumber@1@PEBD@Z + aa
            7ffe6de06aca  (no module)!(no name)
    }

    using System.IO;
    using Microsoft.Analytics.Interfaces;

    public class CustomOutputter : IOutputter
    {
        // Note: the declaration of myDb (the local database being built)
        // is omitted from the original post.
        private Stream stream;

        public override void Close()
        {
            base.Close();

            // Copy the finished database file from local disk into the
            // job's output stream; this copy is where the 4MB row limit
            // is exceeded.
            using (var fs = File.Open("mydb.data", FileMode.Open))
            {
                fs.CopyTo(stream);
            }
        }

        public override void Output(IRow input, IUnstructuredWriter output)
        {
            // Capture the output stream on the first row so Close() can use it.
            if (stream == null)
                stream = output.BaseStream;

            myDb.Insert("somestuff");
        }
    }

Any ideas on this problem?

1 Answer
再贱就再见
#2 · 2019-09-11 16:21

As the error message indicates, U-SQL currently limits the size of any row read or written to 4 MB. Record-oriented formats such as CSV are subject to this limit on every row, so writing a ~20 MB blob as a single row will fail.

There is an example of byte-oriented file read/write UDOs that can help you handle files as binaries at https://github.com/Azure/usql/tree/master/Examples/FileCopyUDOs/FileCopyUDOs. With this approach you can effectively chunk the data: split the file into byte[] rows that each stay under the limit, then concatenate them on output.
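
For illustration, here is a minimal sketch of that chunking pattern, assuming the Microsoft.Analytics.Interfaces UDO model. The class names, the "data" column name, and the exact chunk size are illustrative, not taken verbatim from the linked sample:

    using System;
    using System.Collections.Generic;
    using Microsoft.Analytics.Interfaces;

    // Reads a binary file as a sequence of byte[] chunks, each kept
    // small enough to stay under the 4MB row limit.
    [SqlUserDefinedExtractor(AtomicFileProcessing = true)]
    public class BinaryChunkExtractor : IExtractor
    {
        // Leave some headroom below 4MB for per-row overhead.
        private const int ChunkSize = 4 * 1024 * 1024 - 64 * 1024;

        public override IEnumerable<IRow> Extract(IUnstructuredReader input, IUpdatableRow output)
        {
            var buffer = new byte[ChunkSize];
            int bytesRead;
            while ((bytesRead = input.BaseStream.Read(buffer, 0, ChunkSize)) > 0)
            {
                var chunk = new byte[bytesRead];
                Array.Copy(buffer, chunk, bytesRead);
                output.Set("data", chunk); // one chunk per row
                yield return output.AsReadOnly();
            }
        }
    }

    // Writes the byte[] chunks back out in order, reconstructing the file.
    [SqlUserDefinedOutputter(AtomicFileProcessing = true)]
    public class BinaryChunkOutputter : IOutputter
    {
        public override void Output(IRow input, IUnstructuredWriter output)
        {
            var chunk = input.Get<byte[]>("data");
            output.BaseStream.Write(chunk, 0, chunk.Length);
        }
    }

With AtomicFileProcessing set to true, the runtime hands the whole file to a single UDO instance, so the chunks arrive in order and the concatenated output matches the original file.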
