Monthly Archives: June 2011
Know the rules, before you buy the tools
I met Scott over three years ago, and this is one of his quotes that stuck with me; not to mention he’s a very smart developer.
So over the last six months I turned off ReSharper and just used the refactoring tools in Visual Studio 2010.
I’ve reached the point where I’m most annoyed hand-writing code that ReSharper’s shortcuts and macros could produce more quickly.
I did have a friend one time, who shall remain nameless (you know who you are), who could type faster than Intellisense on a wickedly fast machine; ReSharper literally slowed him down. Yes, he was that fast a typist. If you’re that fast, don’t bother; if you’re one of the normal folks who can use some (non-Mavis Beacon) typing support, check it out.
As a PSA, I did try to upgrade a 4.x license I had purchased a few years ago; but the folks that were handling my order let me know that I had already purchased the upgrade and arranged for a full refund. Hopefully you buy your tools from the same type of shop.
-ofc
Wrapping My Head Around Windows Azure–Deploying
I mentioned in one of my first posts that deployments made zero sense to me. I found a few blog posts today where folks were just disappointed with the level of documentation. And things are a bit cumbersome, clunky, or just not intuitive for folks trying to make “it” happen: roles, storage, and certs.
I’m going to take a whack at making some sense of it. I’ve got many words and pictures to share, because I hate when stuff is hard, especially when I’m just trying to deploy it. I felt this way working with ClickOnce: you wouldn’t often redeploy a simple or (mostly) static solution, but if you do, the publish utilities can get a bit frustrating.
If you look at my first post, it tells you which Azure bits I’m working with. Make sure you’re up to date, because stuff will continue to change until they get it right or make it better (or both would be nice). Patience.
So I’m assuming you have something that is tied to an Azure project that you want to publish, great! Let’s go!
From the top
Just like a web project we right-click on our Azure project and pick “Publish…” from the context menu. We got that.
Which deployment path should you choose? If you have not done the following already:
- Setup Remote Desktop Connections with Windows Azure
- Created and installed certificates on your Windows Azure subscription
then choose the “Create Service Package Only” option. We’ll start there; I did, and didn’t get any new scars from it.
If it looks like this, click “OK”, we’ll talk about the other stuff later on, promise.
As soon as you click “OK”, you’ll see your solution start to build; that’s normal. It’s also going to build a deployment package and the configuration file for your build, and it will create them in the file system under your project. They don’t get added to source control unless you want to add them. I’m using Git, so the whole file system is being watched and mine were checked in. The files are added to a publish folder for the build type (Debug, Release) you have selected. So my “Debug” deployment files went here.
If you had a release build, it would show up in “bin\Release\Publish” instead. Those two files in the folder are what we’ll use when we are in the Azure admin site to deploy our app. Follow me over here, and stand right there and watch this.
My deployment has two roles and talks to one queue and one table; simple. So in the dev fabric on our local machines the magic is just spraying everywhere from the Azure colored fire hose and everything works. You probably stepped through the connection string in your configuration file and found “UseDevelopmentStorage=true” for your data connection, haven’t you? Well, now we’re going to be talking to live queues and tables, and that stuff won’t work any longer. So (as I figured out yesterday) we need to tell our deployment where our storage lives. First we need to create it if we haven’t already; if your HelloWorld app doesn’t use anything but a website, you won’t need to do this now. However, I would encourage you to follow along anyway.
To the cloud! I hate that line…
I found this yesterday, and it was quite handy indeed. From the cloud project in your solution, right-click and choose browse to portal.
This gets you where you need to go. When you get there, log in and go to your Azure account.
When the portal opens, look in the lower left-hand corner and find the “Hosted Services, Storage Accounts & CDN” button. It looks like this, click it:
Here’s my subscription with no hosted services or storage accounts.
Let’s create a storage account; a storage account is something that groups together your blob data, table data, and queue messages. We can delete it when we’re done, or leave it online if we still need it.
The main thing to get here is that it’s tied to your subscription, and you can have as many distinct ones as you need. And yes, they do cost money when you start dropping data into them – YMMV. The URLs they use for communication are based on the name you give the storage account. So if I called my storage account “junkinthetrunk”, the endpoints my applications use to get stuff would look like these:
https://junkinthetrunk.blob.core.windows.net
https://junkinthetrunk.queue.core.windows.net
https://junkinthetrunk.table.core.windows.net
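Since the endpoints are derived purely from the account name, they’re easy to compute. Here’s a quick Python sketch of the pattern (the helper name is mine, not anything from the SDK):

```python
# Sketch: derive a storage account's public endpoints from its name.
# The https://<account>.<service>.core.windows.net pattern is taken
# from the URLs listed above; "junkinthetrunk" is the example account.

def storage_endpoints(account_name):
    services = ("blob", "queue", "table")
    return {svc: f"https://{account_name}.{svc}.core.windows.net"
            for svc in services}

endpoints = storage_endpoints("junkinthetrunk")
print(endpoints["queue"])  # https://junkinthetrunk.queue.core.windows.net
```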
Before we go on, let’s talk about affinity groups. Affinity groups tell Azure what part of the country you want to store your data in. If you were born in the Midwest, you might choose that region, but if your app is serving the southeastern US, you might want to rethink that choice.
There are only a few choices, and not in all cases but in most, you’ll want to pick a data center that’s closer to your business, especially if you are running a hybrid solution – for example, a native ASP.NET site hosted by an ISP that uses Windows Azure for some type of storage.
Click “Affinity Groups”, then click on your subscription record in the grid, then click the cool looking “New Affinity Group” button – in that order. You’ll get one of the dialogs pictured below, and as far as I know they are free. Fill in your choice of group name, pick a region or anywhere if you like, and click “OK”. Here’s how I named mine.
So, let’s build the storage account; we’ll use the affinity group as we do that. Click on Storage Accounts (the green thing above), then click on this cool file cabinet button. A dialog box will show up that’s going to ask you a few more questions about what and where you want to store your “file cabinet”.
Let’s fill in some more blanks, k?
Using the affinity group we created earlier, here’s what it looks like.
I only have one subscription so it was filled in for me.
I only enter the unique part of the URL – this will be the name of the storage account, and it shows up in the management portal and in your config. So make it a good, clean name and not something like “thisappsucks”.
A variation I came up with was to divide the storage accounts into different stages so I could potentially promote code from the development fabric to staging, then finally to production. Today, “junkinthetrunk” was already taken, so I used “dev0junkinthetrunk” instead. Also, Windows Azure verifies that the name is unique as you type it out.
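If you adopt a stage-prefix convention like this, it’s easy to script. A small Python sketch – the prefix scheme is just my convention, and the 3–24 character lowercase rule is what the portal enforces for account names:

```python
import re

# Storage account names must be 3-24 characters, lowercase letters and
# digits only; the portal checks this (and uniqueness) as you type.
NAME_RE = re.compile(r"^[a-z0-9]{3,24}$")

def staged_name(stage, base):
    """Build a stage-prefixed account name, e.g. dev0junkinthetrunk."""
    name = f"{stage}0{base}"  # "0" is just my separator convention
    if not NAME_RE.fullmatch(name):
        raise ValueError(f"invalid storage account name: {name!r}")
    return name

print(staged_name("dev", "junkinthetrunk"))  # dev0junkinthetrunk
```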
If this makes sense to you, use it; just remember that the more storage you have, the more you pay when you use it. My process has been to delete everything in my subscription when I reach a milestone, then just start over. Call it deliberate practice.
I’ve clicked “OK” on the dialog above and Azure is off creating my awesome storage account.
If anything in documentation (yours or a vendor’s) asks for an account name, it’s probably referring to a storage account.
So let’s back up and recap a bit.
You opened the Azure Management Portal.
You created an Affinity Group.
You created a new Storage Account and applied the Affinity Group to it.
The point I want to make here is about not getting the cart in front of the horse: think about your storage while you’re sketching your architecture on the back of your dinner napkin. If you don’t need it, fine, you’re already ahead. But if your app needs it, it’s better to set it up ahead of time.
You can create storage after the fact like I did once – BUT you’ll have to alter the configuration file in the management portal and (depending on how busy things are) wait forever for the instances to tear down and build back up.
Any changes you make to the configuration will restart any instances you have running. It’s also probably fair to mention that this is why you cannot change the Service Definition in the management portal: you cannot add roles on the fly, only change the number of instances through the configuration.
Let’s go back to the deployment files and update them with the storage account information.
We only need to open the ServiceConfiguration.cscfg file and add our storage information and access key to it.
My application has one web site (role) and one worker role; each is named in the configuration file and initialized with one (1) instance each. If you know you want two or three instances, change it here and they’ll all spin up when Azure initializes your package.
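Because the instance count lives in plain XML, you can script the change before uploading. A hedged sketch using Python’s standard library – the role and service names here are hypothetical, and the namespace is the one the ServiceConfiguration schema declares:

```python
import xml.etree.ElementTree as ET

# ServiceConfiguration schema namespace; Role/Instances layout follows
# the generated .cscfg. "WorkerRole1" is a hypothetical role name.
NS = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"

CSCFG = f'''<ServiceConfiguration serviceName="MyService" xmlns="{NS}">
  <Role name="WorkerRole1">
    <Instances count="1" />
  </Role>
</ServiceConfiguration>'''

def set_instance_count(cscfg_xml, role_name, count):
    """Return the .cscfg XML with the named role's instance count changed."""
    root = ET.fromstring(cscfg_xml)
    for role in root.findall(f"{{{NS}}}Role"):
        if role.get("name") == role_name:
            role.find(f"{{{NS}}}Instances").set("count", str(count))
    return ET.tostring(root, encoding="unicode")

print('count="3"' in set_instance_count(CSCFG, "WorkerRole1", 3))  # True
```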
Also, and more important to note, each role has its own data connection; that’s where we’re going to plug in the junkinthetrunk storage pointers we discussed above. Here are the before and after for the connection information. I snipped the account key (it’s called the access key in the storage account descriptors) just to make it fit; you need to enter the entire key.
Before:
<Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
After:
<Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https; AccountName=dev0junkinthetrunk; AccountKey=2V6JrRdFnrST2PCQHA <snip>" />
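The connection string is just semicolon-separated key=value pairs, so it’s easy to sanity-check before pasting it into the .cscfg. A small Python sketch (the account key below is a made-up placeholder, not a real one):

```python
# Parse an Azure storage connection string into a dict. Note the
# maxsplit=1: account keys are Base64 and can end in "=" padding, so
# only the first "=" separates a key from its value.

def parse_conn_string(value):
    pairs = (p.strip() for p in value.split(";") if p.strip())
    return dict(p.split("=", 1) for p in pairs)

conn = parse_conn_string(
    "DefaultEndpointsProtocol=https; "
    "AccountName=dev0junkinthetrunk; AccountKey=Zm9vYmFy=="
)
print(conn["AccountName"])  # dev0junkinthetrunk
```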
Back in the Azure Management Portal, click “Home” or “Hosted Services, Storage Accounts & CDN”.
Now we’ll create a new Hosted Service by clicking this really cool “New Hosted Service” button.
The dialog that’s presented needs more information; I fill in my blanks like this:
Once you click “OK”, you’ll see this modal dialog; it’s harmless. It’s basically telling you that only one instance will be created; change it as needed if you want to. Click “Yes”.
I color-coded each section above so that, after the service gets created, you can compare and see what information gets stored where in the portal.
Back in the management portal Azure is busy deploying our package…
The service name, URL prefix, deployment name, and Add Certificate items from the color-coded box above are described below, so I’ll skip those for now.
The value in the dark green box is the same Affinity Group we created for our storage account. Now our roles/applications are running in the same data center as our data tables, blobs, and queues. Building the group makes this easier to get right when we deploy.
The light green blocks are the defaults, I stuck with those for getting my stuff into the staging environment.
Package location and Configuration files were the first things we discussed; remember they were created when we chose the “Create Service Package Only” option.
Checking on the deployment again…
And now things are spinning up…
Back to the comparison…
Of all of the blanks we filled in, here’s where a few entries landed, compare this to the color-coded box above.
The “Windows Azure Tools” certificate is something I created yesterday while I was trying to get the Remote Desktop Tools working for a deployment update. From where I was sitting the deployment kept timing out so I gave up for the moment.
At any rate, if you want to associate a certificate (.pfx) with your deployment, click the “Add Certificate” button, browse to the file’s location, enter your cert’s password, and click “OK”. And if you fat-finger your password, you’ll get a chance to enter it again.
All done!
To take it for a spin we just need to click on the “Development.Registration.Svc.deploy” node, and then in the right-hand pane the staging DNS name will open the web site that’s been deployed.
The highlighted URL is the one we clicked on, so now we can add some data.
I’ve been using the Cloud Storage Studio tool from Cerebrata Software for this and we can see the data from my “junkinthetrunk”. The tool allows me to see my SQL Server backed development storage and Azure storage accounts.
If you’re just here for the good news, I hope this post helps you in some way to flatten the Windows Azure learning curve.
If you want to stroll into the stuff that didn’t work for me, keep reading.
What didn’t work
Certificate Management and Installation
Once I had my initial deployment finished, I found a bug and needed to redeploy my solution. This is when I set up my certificate and associated it with my deployment. This establishes the trust between Azure and me.
Inside the management portal you can choose “Management Certificates”. This is looking for the .CER file, not a .PFX file. So I needed to open the certificate manager snap-in, export the public version, and apply it to the solution. Now I had a cert; it’s the same one I used today.
My point is that you need to create the cert before you get into a deployment scenario. Create a cert, but create it in the way step five of this blog post describes:
http://msdn.microsoft.com/en-us/library/gg443832.aspx
This will allow the cert to be used during the initial deployment, and to be added to the deployment as well in a way that satisfies Windows Azure certificate manager.
Redeployment
I kept getting timeouts from Windows Azure during a redeployment. Even once I figured out the cert management issues, I kept hitting a 90-second timeout threshold somewhere. All of the blogs I found were related to pushing a big ol’ VM up to Windows Azure; this was an app that I had already deployed six or seven times. I’m going to try again from somewhere else; maybe my network or Windows Azure was just having a bad day.
Oh, and if the certs fail for any reason or trust is an issue, the blue ring around this deployment warning turns bright red.
Documentation
There were a few complaints out there after the recent refresh: not all of the documentation had been updated (folks were using old docs on new bits), and things were just a little too cumbersome.
My only response to that (and something I’ve shared with readers who have worked with me) is this. Do you remember the 1.2 toolkit for SharePoint 2007? And what a pain in the butt it was to use? We are so far from that experience and I’m glad for it. We had already gone through the 2003 version and tried to sync (in our minds) the precautions to take for WSS and MOSS before we tried to, or thought about, upgrading.
I’m sorry, but the teams building this stuff must have listened, or are working for someone who experienced much wailing and gnashing of teeth, not to mention a lot of bitching, when it came to poor tooling back then. I’m thinking it will only get better from here, and right now it’s not bad at all from where I’m sitting.
Resources
I used a lot of links to get this far, and some of it just by thrashing about a bit as well. But here are some links that helped out. Some of the information I gathered from these posts I reposted in a different way but I still wanted to give some credit to the folks that put the first bits of guidance before me.
// UseDevelopmentStorage=false
http://simonwdixon.wordpress.com/2011/05/06/azure-usedevelopmentstoragefalse-deployment-hangs/
Channel9 – CloudCover : http://channel9.msdn.com/Shows/Cloud+Cover
Setting Up Named Auth Credentials : http://msdn.microsoft.com/en-us/library/ff683676.aspx
Basic VS2010 Deployment : http://msdn.microsoft.com/en-us/library/ff683672.aspx
*The* ASP.NET/Azure Quick Start: http://msdn.microsoft.com/en-us/library/gg651132.aspx
There’s one more gap I want to fill and that’s the code I wrote for the solution. I only got into the data (model, messages, and entities) before, and I’d like to talk more about the infrastructure code before I go dark and get really busy building out this solution.
Again, if you read this far into the post, thanks. I hope it helped you in some way to learn to use Windows Azure.