I am trying to put together a workable dev / test / QA environment setup for an application we are migrating to Azure. The constraint is that our dev / test / QA / etc. environments are actually hosted on-premises and deployed through the build server (for example CC.NET, TeamCity, Jenkins, etc.).
In such a "test environment" we need the ability to kick off the deployment of a specific snapshot of unreleased code (and data) so that a team of quality assurance and business experts can do both technical testing and acceptance testing. Obviously, all of these people are not going to compile the solution and hit F5 in Visual Studio to do this testing, so we need a deployed environment. In our SDLC we actually go through roughly four of these environments before anything moves on to staging and production. In short, we need a low-cost (automated deployment) and easily repeatable process for this.
When planning this environment, the question of how to host the Azure services locally is certainly a difficult one. So let's look at each piece of Azure in turn; in each case I'll call out the option we are leaning toward.
- Web roles: IIS can more or less handle these for us, at least well enough for dev/test situations (everything except real load testing, which we will obviously need to do in Azure anyway, and that is fine).
- Worker roles: We have two options. The first is a "wrapper application", a Windows service that hosts the DLLs containing our actual worker-role logic (after all, the real "worker role" project should be nothing more than a configuration file and ~4 lines of code that call into a DLL to do the work; a sketch of this pattern is shown below). This works, but it means the application code and the deployment code differ noticeably from what runs in Azure. The second option is to use the Azure Compute Emulator. That works as long as your worker roles do not need external ports or anything like that; in our case the worker roles only need to watch queues and then access various resources, so that is fine. The catch is the build scripts: the only way to automate a deployment to the compute emulator is to run CSPack and CSRun on the machine hosting the emulator, which is probably not your build server, so you need some kind of remote script to execute them (more on that at the end).
- VM roles: We do not care about these, so I am ignoring them entirely for testing purposes.
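To make the worker-role "wrapper" option above concrete, here is a minimal sketch, assuming the real processing logic lives in a plain class library (the QueueProcessor name and the polling loop are just placeholders for illustration). The Azure worker role and the on-premises Windows service are both thin shims over the same DLL:

```csharp
using System;
using System.ServiceProcess;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

// All of the real work lives in a plain class library that both hosts share.
public class QueueProcessor
{
    private volatile bool _stopRequested;

    public void Run()
    {
        while (!_stopRequested)
        {
            // Poll the queue, dispatch messages, etc.
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
    }

    public void Stop()
    {
        _stopRequested = true;
    }
}

// The Azure worker role: nothing but a thin shim over the shared DLL.
public class WorkerRole : RoleEntryPoint
{
    private readonly QueueProcessor _processor = new QueueProcessor();

    public override void Run()
    {
        _processor.Run();
    }

    public override void OnStop()
    {
        _processor.Stop();
        base.OnStop();
    }
}

// The on-premises wrapper: a Windows service hosting the same DLL for dev/test/QA.
public class WorkerWrapperService : ServiceBase
{
    private readonly QueueProcessor _processor = new QueueProcessor();
    private Thread _workerThread;

    protected override void OnStart(string[] args)
    {
        _workerThread = new Thread(_processor.Run) { IsBackground = true };
        _workerThread.Start();
    }

    protected override void OnStop()
    {
        _processor.Stop();
    }
}
```

The deployment difference is then only which host gets installed in each environment; the processing code itself is identical in both places.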
- Queues: Here we have three options. The first is to use MSMQ. Since that would require a completely different code base (or at least an abstraction over it), which we do not have, I am not considering this option. The second is to keep the queues in Azure itself, since they are so small/cheap; we are actually doing this temporarily until we can try the third option. The third option is to use the Azure Storage Emulator. I am not sure, but I believe the emulator only allows services running on the local machine to access the storage objects. For queues that should be fine, because our application code is what actually creates the queues, so as long as the application code runs on the server hosting the storage emulator, this should work.
- Tables: Here we have three options. The first, which I hate, is to use a database and create tables in it to stand in for the Azure tables. I am not considering this option. The second is to keep the tables in Azure. I don't like this one because there is a lot of back-and-forth for entities that can be of significant size (up to 1 MB per record); even though individual rows are incredibly light and cheap, the cost of a table can add up pretty quickly. That leaves the third option, using the Azure Storage Emulator. Again, I am not sure, but I believe this only allows services running on the local machine to access the storage objects. I still don't fully understand the pros/cons of tables in the emulator.
- Blobs: Here we have two options. The first, which is bad, is to keep them in Azure; these are most likely files of significant size, so that is unreasonable. So the second option, again, is to use the Azure Storage Emulator. I think this is what we need to do (a sketch of accessing the emulator for queues, tables, and blobs follows below).
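For the queues/tables/blobs items above, here is a minimal sketch of pointing the code at the storage emulator instead of Azure. It assumes the Microsoft.WindowsAzure.StorageClient library from the 1.x SDK (method names differ slightly in later storage client versions), and the queue/table/container names are placeholders. As far as I know, the emulator listens only on 127.0.0.1 (ports 10000-10002) by default, which is why only processes on that machine can reach it without some kind of proxy in front.

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class EmulatorStorageDemo
{
    static void Main()
    {
        // Equivalent to the connection string "UseDevelopmentStorage=true".
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;

        // Queues: the application code itself creates the queue it watches.
        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("work-items");   // placeholder name
        queue.CreateIfNotExist();
        queue.AddMessage(new CloudQueueMessage("hello from the emulator"));

        // Tables: same pattern against the local table service.
        CloudTableClient tableClient = account.CreateCloudTableClient();
        tableClient.CreateTableIfNotExist("Widgets");                     // placeholder name

        // Blobs: containers and blobs live under the local blob endpoint.
        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("uploads");  // placeholder
        container.CreateIfNotExist();
        container.GetBlobReference("smoke-test.txt").UploadText("emulator check");
    }
}
```

In the real code, the only difference between the test environments and Azure would then be which connection string the configuration hands to CloudStorageAccount.Parse(...).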
So, given that we have MVC applications (web role), WCF web services (web role), queues, tables, blobs, and worker roles that are kicked off by queues but also access the tables, blobs, and WCF services: does this sound like a reasonable way to host our internal QA (etc.) environments? And aside from some of the hassle of a remote script around CSPack and CSRun to deploy to the compute emulator, does all of this sound reasonably automatable from the build server?
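On the CSPack/CSRun point: one way to keep the remote step small is a little helper that lives on the emulator machine and that the build server triggers remotely (PsExec, a secondary build agent, PowerShell remoting, whatever fits). The sketch below only illustrates the shape of it; the SDK path, drop folder, role name, and file names are all assumptions to adjust.

```csharp
using System.Diagnostics;
using System.IO;

// Hypothetical deploy helper that runs ON the machine hosting the compute emulator.
class EmulatorDeploy
{
    // Assumed SDK location; adjust to the installed Azure SDK version.
    const string SdkBin = @"C:\Program Files\Windows Azure SDK\v1.6\bin";

    static void Main(string[] args)
    {
        // Assumed build drop folder handed over by the build server.
        string drop = args.Length > 0 ? args[0] : @"C:\drops\latest";

        // Tear down whatever the emulator is currently running.
        Exec("csrun.exe", "/removeAll");

        // Package the roles into a .csx directory; /copyOnly keeps it emulator-friendly.
        // The /role argument (name and binaries folder) is a placeholder for your real roles.
        Exec("cspack.exe", string.Format(
            @"{0}\ServiceDefinition.csdef /role:WorkerRole;{0}\WorkerRole /out:{0}\Deploy.csx /copyOnly",
            drop));

        // Start the package in the compute emulator with the test configuration.
        Exec("csrun.exe", string.Format(
            @"/run:{0}\Deploy.csx;{0}\ServiceConfiguration.cscfg", drop));
    }

    static void Exec(string tool, string arguments)
    {
        var psi = new ProcessStartInfo(Path.Combine(SdkBin, tool), arguments)
        {
            UseShellExecute = false
        };
        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
        }
    }
}
```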