I get an error when using
_adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite)
to upload files to Data Lake. The error only occurs with files larger than 30 MB; it works fine with smaller files.
Error:
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperations.d__16.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.Management.DataLake.Store.FileSystemOperationsExtensions.Create(IFileSystemOperations operations, String accountName, String directFilePath, Stream streamContents, Nullable`1 overwrite, Nullable`1 syncFlag)
at AzureDataFunctions.DataLakeController.CreateFileInDataLake(String destFilePath, Stream stream, Boolean overwrite) in F:\GitHub\ZutoDW\…
Has anyone else come across this, or observed similar behavior? I can work around it by splitting the file into 30 MB chunks and uploading those.
However, this is not practical in the long run: the source file is 380 MB and could grow much larger. I do not want 10-15 split files sitting in my Data Lake; I would like to upload it as a single file.
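For reference, this is roughly the chunked workaround I mean. It is a minimal sketch, assuming the same `_adlsFileSystemClient` and `_adlsAccountName` fields from my code above and the SDK's `FileSystem.Create`/`FileSystem.Append` operations; the 4 MB chunk size is an arbitrary choice. Note it appends every chunk to the *same* destination file, so it avoids the split files, but it still feels like something the SDK should handle for me:

```csharp
using System.IO;

public class DataLakeChunkedUploader
{
    // Assumed to be initialized elsewhere, as in the original code.
    // private DataLakeStoreFileSystemManagementClient _adlsFileSystemClient;
    // private string _adlsAccountName;

    private const int ChunkSize = 4 * 1024 * 1024; // 4 MB per request (assumption)

    public void UploadInChunks(string destFilePath, Stream source, bool overwrite)
    {
        var buffer = new byte[ChunkSize];
        bool firstChunk = true;
        int bytesRead;

        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            using (var chunk = new MemoryStream(buffer, 0, bytesRead))
            {
                if (firstChunk)
                {
                    // Create the file with the first chunk.
                    _adlsFileSystemClient.FileSystem.Create(
                        _adlsAccountName, destFilePath, chunk, overwrite);
                    firstChunk = false;
                }
                else
                {
                    // Append subsequent chunks to the same file.
                    _adlsFileSystemClient.FileSystem.Append(
                        _adlsAccountName, destFilePath, chunk);
                }
            }
        }
    }
}
```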
I can upload the same file to Data Lake through the portal interface without any problem.