How to upload a file to Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400 megabyte .zip file that I want to put into blob storage for later use.

How can I do this without writing code? Is there an interface for this?

+44
windows azure azure-storage azure-storage-blobs
Jul 05 '11
14 answers

Free tools:

  • Visual Studio 2010 – install the Azure Tools and you can browse blobs in Server Explorer
  • CloudBerry Lab's CloudBerry Explorer for Azure Blob Storage
  • ClumsyLeaf CloudXplorer
  • Azure Storage Explorer from CodePlex (try the version 4 beta)

There was also an old program called Azure Blob Explorer or something similar that no longer works with the new Azure SDK.

Of these, I personally like CloudBerry Explorer.

+36
Jul 06 '11 at 2:10

The easiest way is to use Azure Storage PowerShell. It provides many cmdlets to manage storage containers / blobs / tables / queues.

For the case you mention, you can use Set-AzureStorageBlobContent, which uploads a local file into Azure storage as a block blob or page blob.

Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname 

See http://msdn.microsoft.com/en-us/library/dn408487.aspx for more details.
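For completeness, here is a minimal sketch of the full flow with the classic Azure PowerShell cmdlets. The account name, key, container, and file names below are placeholders, not values from the question:

 # Placeholders: substitute your own storage account name, key, container, and file
 $ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage account key>"

 # Create the container if it does not already exist
 New-AzureStorageContainer -Name "backups" -Context $ctx -ErrorAction SilentlyContinue

 # Upload the 400 MB zip as a block blob; the cmdlet chunks the file for you
 Set-AzureStorageBlobContent -Container "backups" -File ".\archive.zip" -Blob "archive.zip" -BlobType Block -Context $ctx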

+18
Sep 03 '13 at 7:11

If you are looking for a tool to do this, may I suggest you take a look at our tool Cloud Storage Studio ( http://www.cerebrata.com/Products/CloudStorageStudio ). It is a commercial tool for managing Windows Azure Storage and Hosted Services. You can also find an exhaustive list of Windows Azure Storage management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

Hope this helps.

+11
Jul 05 '11

StorageClient has this built in. No need to write anything:

 var account = new CloudStorageAccount(creds, false);
 var client = account.CreateCloudBlobClient();
 var blob = client.GetBlobReference("/somecontainer/hugefile.zip");

 // 1 MB seems to be a pretty good all-purpose block size
 client.WriteBlockSizeInBytes = 1024 * 1024;

 // this sets the # of parallel uploads for blocks; normally set to one per CPU core
 client.ParallelOperationThreadCount = 4;

 // blobs larger than this threshold are broken up into blocks automatically
 client.SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024;

 blob.UploadFile("somehugefile.zip");
+6
Jul 05 '11 at 17:22

I use Cyberduck to manage my blob storage.

It is free and very easy to use. It also works with other cloud storage solutions.

I recently found this one too: CloudXplorer

Hope this helps.

+3
Sep 29 '11 at 17:39

There is a new open-source tool provided by Microsoft:

  • Project Deco is a cross-platform Microsoft Azure Storage Account Explorer.

Please check these links:

+3
Mar 02 '16 at 10:46

You can use Cloud Combine to reliably and quickly upload files to the Azure blob storage.

+2
Apr 11 '13 at 12:50

A simple batch file using Microsoft's AzCopy will do the trick. You can drag and drop files onto the following batch file to upload them to your blob storage container:

upload.bat

 @ECHO OFF
 SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
 SET BLOB_KEY=<<<your access key>>>
 :AGAIN
 IF "%~1" == "" GOTO DONE
 AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
 SHIFT
 GOTO AGAIN
 :DONE
 PAUSE

Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) rather than uploading an entire directory.

+2
Jun 09 '16 at 10:07

You can upload large files directly to Azure Blob storage using the HTTP PUT verb; the largest file I have tried with the code below was 4.6 GB. You can do it in C# like this:

 // write up to ChunkSize of data to the web request
 void WriteToStreamCallback(IAsyncResult asynchronousResult)
 {
     var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
     var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
     var buffer = new Byte[4096];
     int bytesRead;
     var tempTotal = 0;

     File.FileStream.Position = DataSent;

     while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
            && tempTotal + bytesRead < CHUNK_SIZE
            && !File.IsDeleted
            && File.State != Constants.FileStates.Error)
     {
         requestStream.Write(buffer, 0, bytesRead);
         requestStream.Flush();
         DataSent += bytesRead;
         tempTotal += bytesRead;
         File.UiDispatcher.BeginInvoke(OnProgressChanged);
     }

     requestStream.Close();

     if (!AbortRequested)
         webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
 }

 void StartUpload()
 {
     var uriBuilder = new UriBuilder(UploadUrl);
     if (UseBlocks)
     {
         // encode the block name and add it to the query string
         CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
         uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
     }

     // with or without using blocks, we'll make a PUT request with the data
     var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
     webRequest.Method = "PUT";
     webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
 }

UploadUrl is generated by your own service and contains a Shared Access Signature. This SAS URL specifies where the blob will be uploaded and for how long write access is granted (write access in your case). You can create a SAS URL like this:

 readonly CloudBlobClient BlobClient;
 readonly CloudBlobContainer BlobContainer;

 public UploadService()
 {
     // Setup the connection to Windows Azure Storage
     var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
     BlobClient = storageAccount.CreateCloudBlobClient();

     // Get and create the container
     BlobContainer = BlobClient.GetContainerReference("publicfiles");
 }

 string JsonSerializeData(string url)
 {
     var serializer = new DataContractJsonSerializer(url.GetType());
     var memoryStream = new MemoryStream();

     serializer.WriteObject(memoryStream, url);

     return Encoding.Default.GetString(memoryStream.ToArray());
 }

 public string GetUploadUrl()
 {
     var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
     {
         Permissions = SharedAccessPermissions.Write,
         SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
     });

     return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
 }

I also have a post on this subject with more information: How to upload huge files to the Azure blob from a web page.
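If you only need a write-only SAS URL and would rather not host the C# service above, the classic Azure PowerShell module can also generate one. A rough sketch, with placeholder account, key, and container names:

 $ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage account key>"

 # Write-only SAS on the container, valid for one hour
 $sas = New-AzureStorageContainerSASToken -Name "publicfiles" -Permission w -ExpiryTime (Get-Date).AddHours(1) -Context $ctx

 # Append the token to the blob URL the client should PUT to
 $uploadUrl = "https://mystorageaccount.blob.core.windows.net/publicfiles/$([Guid]::NewGuid())$sas"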

+1
Jul 08 '11

You can upload files to an Azure Storage blob from the command prompt.

Install the Microsoft Azure Storage tools (which include AzCopy).

Then the following command will upload a file to your blob container:

 AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob 

Hope this helps ... :)

+1
Mar 23 '17 at 15:36

I have used all the tools mentioned in this post and they all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.

By default, BlobTransferUtility only handles block blobs. However, by changing just 2 lines of code you can upload page blobs as well. If, like me, you need to upload a virtual machine image, it has to be a page blob.

(For the difference between the two, see this MSDN article.)

To upload page blobs, simply change lines 53 and 62 of BlobTransferHelper.cs from

new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob

to

new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob

The only thing you need to know about this app is to uncheck HELP when you first run the program in order to see the actual interface.
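If you only need to get a VHD into a page blob and would rather not change any code at all, the Set-AzureStorageBlobContent cmdlet from the PowerShell answer above can also write page blobs. A minimal sketch, assuming the classic Azure PowerShell module and placeholder names (page blobs require the file size to be a multiple of 512 bytes, which VHD files already are):

 $ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage account key>"

 # -BlobType Page writes a page blob instead of the default block blob
 Set-AzureStorageBlobContent -Container "vhds" -File ".\myimage.vhd" -Blob "myimage.vhd" -BlobType Page -Context $ctx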

0
Jul 16 '13 at 22:03

Check out this Azure Storage Download post, which explains how easy it is to upload any file through PowerShell to Azure Blob Storage.

0
Apr 01 '15 at 2:59

You can use the AzCopy tool to upload the required files to Azure storage (as block blobs by default). You can modify the pattern to suit your requirements.

Syntax

 AzCopy /Source:<source> /Dest:<destination> /S
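
For example, a hedged invocation (the account name, container, key, and local path below are placeholders) that recursively copies everything matching a pattern from a local folder:

 AzCopy /Source:"C:\upload" /Dest:"https://mystorageaccount.blob.core.windows.net/backups" /DestKey:"<storage account key>" /Pattern:"*.zip" /S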
0
Apr 20 '17 at 6:33

Try the Blob Service API

http://msdn.microsoft.com/en-us/library/dd135733.aspx

However, 400 MB is a large file, and I'm not sure a single API call will deal with something of this size; you may need to split it into blocks and reassemble it using custom code.

-1
Jul 05 '11


