Gzip and upload to Azure blob

I'm trying to gzip a static js file and upload it to an Azure blob, but the result is a 0-byte blob. Can someone tell me what I'm doing wrong?

    var filePath = @"C:\test.js";
    using (var compressed = new MemoryStream())
    {
        using (var gzip = new GZipStream(compressed, CompressionMode.Compress))
        {
            var bytes = File.ReadAllBytes(filePath);
            gzip.Write(bytes, 0, bytes.Length);

            var account = CloudStorageAccount.Parse("...");
            var blobClient = new CloudBlobClient(account.BlobEndpoint, account.Credentials);
            var blobContainer = blobClient.GetContainerReference("temp");
            var blob = blobContainer.GetBlockBlobReference(Path.GetFileName(filePath));
            blob.Properties.ContentEncoding = "gzip";
            blob.Properties.ContentType = "text/javascript";
            blob.Properties.CacheControl = "public, max-age=3600";
            blob.UploadFromStream(compressed);
        }
    }
1 answer

You need to reset the position of the compressed stream back to zero. You wrote data into the stream, so by the time you upload from it on the last line, the stream's position is at the end of the data you wrote. blob.UploadFromStream reads from the current position onward, and there is nothing after it, hence the 0-byte blob.

Add the following before uploading:

 compressed.Position = 0; 

This should load the full contents of the stream.
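For completeness, here is a minimal sketch of the corrected upload (same classic WindowsAzure.Storage API as in the question; the "..." connection string is left elided). Note one more subtlety besides the position: GZipStream buffers data and only writes the gzip footer when it is disposed, so dispose it (with leaveOpen: true so the MemoryStream survives) before rewinding and uploading, otherwise the blob can be truncated or invalid.

```csharp
using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

var filePath = @"C:\test.js";
using (var compressed = new MemoryStream())
{
    // leaveOpen: true so disposing the GZipStream does not also
    // dispose the underlying MemoryStream.
    using (var gzip = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
    {
        var bytes = File.ReadAllBytes(filePath);
        gzip.Write(bytes, 0, bytes.Length);
    } // disposing flushes buffered data and writes the gzip footer

    compressed.Position = 0; // rewind: UploadFromStream reads from here

    var account = CloudStorageAccount.Parse("...");
    var blobClient = new CloudBlobClient(account.BlobEndpoint, account.Credentials);
    var blobContainer = blobClient.GetContainerReference("temp");
    var blob = blobContainer.GetBlockBlobReference(Path.GetFileName(filePath));
    blob.Properties.ContentEncoding = "gzip";
    blob.Properties.ContentType = "text/javascript";
    blob.Properties.CacheControl = "public, max-age=3600";
    blob.UploadFromStream(compressed);
}
```

With ContentEncoding set to "gzip", browsers that send Accept-Encoding: gzip will decompress the file transparently.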

This is not specific to Azure; most code that works with streams operates from the stream's current position. It has burned me several times.


Source: https://habr.com/ru/post/1493978/

