Windows Azure Blob Storage from SQL SSIS

Can I upload images to Windows Azure storage from an SQL SSIS package? SSIS would read new images (daily) from one of my on-premises SQL Server tables and upload the images to storage.

+6
2 answers

What a fun question! I got to pull together a number of pieces I had never tried before.

First, I built a simple console application based on the excellent How to: Blob Storage guide. Knowing that I had working code allowed me to adapt it for SSIS.
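For reference, here is a minimal console-style sketch of what that guide walks through. The account name, account key, container name, and file path below are placeholders for illustration, not values from the original package.

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class Program
    {
        static void Main()
        {
            // Placeholder credentials; the real values come from the Azure portal.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

            CloudBlobClient client = account.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference("flags");
            container.CreateIfNotExists();

            // Upload a single local file as a block blob.
            CloudBlockBlob blob = container.GetBlockBlobReference("Germany");
            using (FileStream file = File.OpenRead(@"C:\temp\flags\de.png"))
            {
                blob.UploadFromStream(file);
            }
        }
    }

Once something like this runs end to end, the only SSIS-specific work left is wiring the same calls into a Script Component.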

I created 3 SSIS variables at the package level: AccountName, AccountKey and ContainerName, all of data type String. They supply the credentials plus the "folder" (container) where my uploaded data will land.

data flow and variables

Data flow

The general look of the data flow is quite simple: a data source feeding a Script Component that will act as the destination. You will need two columns: one that provides a unique name for the blob, and one that carries the binary bits.

My source is a trivial table. It holds country names and their flags (stored as varbinary(max)), which you can scrape from the CIA World Factbook yourself, if you are so inclined.
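If you want to stage such a table yourself, something along these lines gets an image file into a varbinary(max) column; the table name dbo.CountryFlags, the connection string, and the file path are assumptions for illustration only.

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class FlagLoader
    {
        static void Main()
        {
            // Read the image off disk and insert it as varbinary(max).
            byte[] image = File.ReadAllBytes(@"C:\temp\flags\de.png");

            using (SqlConnection conn = new SqlConnection(@"Server=.;Database=Scratch;Integrated Security=true"))
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO dbo.CountryFlags (CountryName, FlagImage) VALUES (@name, @image)", conn))
            {
                cmd.Parameters.AddWithValue("@name", "Germany");
                cmd.Parameters.Add("@image", SqlDbType.VarBinary, -1).Value = image;

                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }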

The destination will be a bit of C#. Add a Script Component of type Destination.

On the Script tab, I have 3 ReadOnly variables listed: User::AccountKey, User::AccountName, User::ContainerName.

On the Input Columns tab, select CountryName and FlagImage.

The following is the script. As noted in the How-to, you will need to add a reference to the Microsoft.WindowsAzure.Storage assembly before the last three using statements (the Microsoft.WindowsAzure.Storage namespaces) will resolve.

    using System;
    using System.Data;
    using System.IO;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
    using Microsoft.SqlServer.Dts.Runtime.Wrapper;

    // Must add reference to Microsoft.WindowsAzure.Storage for this to work
    // http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    /// <summary>
    /// Watch me load data to Azure from SSIS
    /// </summary>
    [Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
    public class ScriptMain : UserComponent
    {
        /// <summary>
        /// The storage account used
        /// </summary>
        private CloudStorageAccount storageAccount;

        /// <summary>
        /// An entity to work with the Blobs
        /// </summary>
        private CloudBlobClient blobClient;

        /// <summary>
        /// Blobs live in containers
        /// </summary>
        private CloudBlobContainer container;

        /// <summary>
        /// blockBlob instead of a pageBlob
        /// </summary>
        private CloudBlockBlob blockBlob;

        /// <summary>
        /// This method is called once, before rows begin to be processed in the data flow.
        ///
        /// You can remove this method if you don't need to do anything here.
        /// </summary>
        public override void PreExecute()
        {
            base.PreExecute();

            string accountName = Variables.AccountName;
            string accountKey = Variables.AccountKey;
            string containerName = Variables.ContainerName;

            string csTemplate = "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}";
            string cs = string.Format(csTemplate, accountName, accountKey);

            this.storageAccount = CloudStorageAccount.Parse(cs);
            this.blobClient = this.storageAccount.CreateCloudBlobClient();
            this.container = this.blobClient.GetContainerReference(containerName);
            this.container.CreateIfNotExists();
            this.container.SetPermissions(
                new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
        }

        /// <summary>
        /// For each row passing through, upload to Azure
        /// </summary>
        /// <param name="Row">The row that is currently passing through the component</param>
        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            using (MemoryStream memStream =
                new MemoryStream(Row.FlagImage.GetBlobData(0, (int)Row.FlagImage.Length)))
            {
                this.blockBlob = this.container.GetBlockBlobReference(Row.CountryName);
                this.blockBlob.UploadFromStream(memStream);
            }
        }
    }

Global Assembly Cache (GAC)

The assemblies you want to use in SSIS must be in the GAC. Assemblies cannot go into the GAC unless they are signed. Fortunately, the Azure assemblies are signed. From a Visual Studio command prompt, type gacutil -if "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.1\ref\Microsoft.WindowsAzure.Storage.dll" or the equivalent of wherever your version of that assembly lives.
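If you want to confirm the assembly actually landed in the GAC, a quick reflection check like the sketch below works. The version number and public key token shown are assumptions for the 2.1-era storage client; verify the real values with gacutil -l Microsoft.WindowsAzure.Storage first.

    using System;
    using System.Reflection;

    class GacCheck
    {
        static void Main()
        {
            // The identity string is an assumption; confirm the actual version and
            // token with: gacutil -l Microsoft.WindowsAzure.Storage
            Assembly asm = Assembly.Load(
                "Microsoft.WindowsAzure.Storage, Version=2.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35");

            Console.WriteLine(asm.Location);            // Where the assembly resolved from
            Console.WriteLine(asm.GlobalAssemblyCache); // True if it was loaded from the GAC
        }
    }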

Successful upload

And as proof, here is a shot from Azure Storage Explorer

blobs everywhere

+13

In SSIS 2012 and above, there is a Microsoft-supported task for uploading/downloading data to Azure Storage.

For example, the "SQL Server 2016 Integration Services Feature Pack for Azure": https://www.microsoft.com/en-us/download/details.aspx?id=49492

Just search for the 2012 or 2014 edition if that is what you are using.

Hope this helps!

+2

Source: https://habr.com/ru/post/957543/

