Know the rules, before you buy the tools

I met Scott over three years ago, and this is one of the quotes that stuck with me; not to mention he's a very smart developer.

So over the last six months I turned off ReSharper and just used the refactoring tools in Visual Studio 2010.

I've reached the point where I'm most annoyed at hand-writing code that (ReSharper) shortcuts and macros could produce more quickly.

I had a friend once who shall remain nameless (you know who you are) who could type faster than IntelliSense on a wickedly fast machine; ReSharper literally slowed him down. Yes, he was that fast a typist. If you're that fast, don't bother; if you're one of the normal folks who can use some (non-Mavis Beacon) typing support, check it out.

As a PSA, I did try to upgrade a 4.x license I had purchased a few years ago; but the folks that were handling my order let me know that I had already purchased the upgrade and arranged for a full refund.  Hopefully you buy your tools from the same type of shop.


Wrapping My Head Around Windows Azure–Deploying

I mentioned in one of my first posts that deployments made zero sense to me. I found a few blog posts today where folks were just disappointed with the level of documentation. And things are a bit cumbersome, clunky, or just not intuitive for folks trying to make "it" happen: roles, storage, and certs.

I'm going to take a whack at making some sense of it. I've got many words and pictures to share because I hate when stuff is hard, especially when I'm just trying to deploy. I felt this way working with ClickOnce. You wouldn't often redeploy simple or (mostly) static solutions, but when you do, the publish utilities can get a bit frustrating.

If you look at my first post, it tells you which Azure bits I'm working with. Make sure you're up to date, because stuff will continue to change until they get it right or make it better (or both would be nice). Patience.

So I’m assuming you have something that is tied to an Azure project that you want to publish, great! Let’s go!

From the top

Just like a web project we right-click on our Azure project and pick “Publish…” from the context menu.  We got that.

Which deployment path should you choose?  If you have not done the following already:

  • Setup Remote Desktop Connections with Windows Azure
  • Created and installed certificates on your Windows Azure subscription

then choose the "Create Service Package Only" option. We'll start there; I did, and didn't get any new scars from it.

If it looks like this, click “OK”, we’ll talk about the other stuff later on, promise.


As soon as you click "OK", you'll see your solution start to build; that's normal. It's also going to build a deployment package and the configuration file for your build, and will create them in the file system under your project. They don't get added to source control unless you want to add them. I'm using Git, so the whole file system is being watched and mine were checked in. The files are added to a publish folder for the build type (Debug, Release) you have selected. So my "Debug" deployment files went here.
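The screenshot of that folder is gone, but for reference, the publish step produces a tree like this. The project name below is made up; the two file extensions are the important part:

```
bin\Debug\Publish\
    JunkInTheTrunk.cspkg         <- the deployment package
    ServiceConfiguration.cscfg   <- the service configuration
```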


If you had a release build, it would show up in “bin\Release\Publish” instead.   Those two files in the folder are what we’ll use when we are in the Azure admin site to deploy our app.  Follow me over here, and stand right there and watch this.

My deployment has two roles and talks to one queue and one table; simple. So in the dev fabric on our local machines the magic is just spraying everywhere from the Azure-colored fire hose and everything works. You probably stepped through the connection string in your configuration file and found "UseDevelopmentStorage=true" for your data connection, haven't you? Well, now we're going to be talking to live queues and tables, and that stuff won't work any longer. So (as I figured out yesterday) we need to tell our deployment where our storage lives. First we need to create it if we haven't already; if your HelloWorld app doesn't use anything but a website, you won't need to do this now. However, I would encourage you to follow along anyway.

To the cloud!  I hate that line…

I found this yesterday, and it was quite handy indeed. From the cloud project in your solution, right-click and choose "Browse to Portal".

This gets you where you need to go. When you get there, log in to your Azure account.

When the portal opens, look in the lower left-hand corner and find the "Hosted Services, Storage Accounts & CDN" button. It looks like this; click it:


Here’s my subscription with no hosted services or storage accounts. 


Let's create a storage account; a storage account is something that groups together your blob data, table data, and queue messages. We can delete it when we're done if we don't need it, or leave it online.

The main thing to get here is that it's tied to your subscription and you can have as many distinct ones as you need. And yes, they do cost money once you start dropping data into them (YMMV). The URLs they use for communication are based on the name you give the storage account. So if I called my storage account "junkinthetrunk", the endpoints my applications use to get stuff would look like these:
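The screenshot with the endpoints didn't survive, but the pattern is simple: the storage account name becomes the subdomain for each storage service.

```
junkinthetrunk.blob.core.windows.net
junkinthetrunk.table.core.windows.net
junkinthetrunk.queue.core.windows.net
```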

Before we go on, let's talk about affinity groups. Affinity groups tell Azure what part of the country you want to store your data in. If you were born in the Midwest, you might choose that region; but if your app is serving the Southeastern US, you might want to rethink that choice.

There are only a few choices, and not in all cases but in most, you'll want to pick a data center that's closer to your business, especially if you are running a hybrid solution: for example, a native ASP.NET app hosted by an ISP that uses Windows Azure for some type of storage.

Click "Affinity Groups", then click on your subscription record in the grid, then click the cool-looking "New Affinity Group" button, in that order. You'll get one of the dialogs pictured below, and as far as I know they are free. Fill in your choice of group name, pick a region (or anywhere, if you like), and click "OK". Here's how I named mine.



So, let's build the storage account; we'll use the affinity group as we do that. Click on Storage Accounts (the green thing above), then click the cool file-cabinet button. A dialog box will show up asking a few more questions about what and where you want to store your "file cabinet".

Let’s fill in some more blanks, k?


Using the affinity group we created earlier, here’s what it looks like. 

I only have one subscription so it was filled in for me.

I only enter the unique part of the URL – this will be the name of the storage account, and it shows up in the management portal and in your config.  So make it a good, clean name and not something like “thisappsucks”.

A variation I came up with was to divide the storage accounts into different stages so I could potentially promote code from the development fabric to staging, then finally to production. Today "junkinthetrunk" was already taken, so I used "dev0junkinthetrunk" instead. Also, Windows Azure checks that the name is unique as you type it out.

If this makes sense to you, use it; just remember that the more storage you have, the more you pay when you use it. My process has been to delete everything in my subscription when I reach a milestone, then just start over. Call it deliberate practice.

I’ve clicked “OK” on the dialog above and Azure is off creating my awesome storage account.


If anything in documentation (yours or a vendor’s) asks for an account name, it’s probably referring to a storage account.

So let’s back up and recap a bit.

You opened the Azure Management Portal.

You created an Affinity Group.

You created a new Storage Account and applied the Affinity Group to it.

The point I want to make here is about not getting the cart in front of the horse: think about your storage while you're sketching your architecture on the back of your dinner napkin. If you don't need it, fine, you're already ahead. But if your app needs it, it's better to set it up ahead of time.

You can create storage after the fact like I did once, BUT you'll have to alter the configuration file in the management portal, and (depending on how busy things are) you'll wait forever for the instances to tear down and build back up.

Any changes you make to the configuration will restart any instances you have running. It's probably fair to mention that this is why you cannot change the Service Definition in the management portal: you cannot add roles on the fly, just change the number of instances through the configuration.

Let’s go back to the deployment files and update them with the storage account information.


We're only going to open the ServiceConfiguration.cscfg file and add our storage information and access key to it.

My application has one web site (role) and one worker role; each is named in the configuration file and initialized with one (1) instance. If you know you want two or three instances, change it here and they'll all spin up when Azure initializes your package.
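The relevant piece of the .cscfg looks roughly like this; the role names are placeholders for the ones in my solution, and the count attribute is what you would bump:

```xml
<Role name="Development.Registration.Web">
  <Instances count="1" />
</Role>
<Role name="Development.Registration.Svc">
  <Instances count="1" />
</Role>
```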


Also, and this is more important to note, each role has its own data connection; that's where we're going to plug in the junkinthetrunk storage pointers we discussed above. Here are the before and after for the connection information. I snipped the account key (it's called the access key in the storage account descriptors) just to make it fit; you need to enter the entire key.


<Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />


<Setting name="DataConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=dev0junkinthetrunk;AccountKey=2V6JrRdFnrST2PCQHA <snip>" />

Back in the Azure Management Portal, click on "Home" or "Hosted Services, Storage Accounts & CDN".

Now we’ll create a new Hosted Service by clicking this really cool “New Hosted Service” button.


The dialog that's presented needs more information; I filled in my blanks like this:


Once you click "OK", you'll see this modal dialog; it's harmless. It's basically telling you that you will only have one instance created; change it as needed if you want to. Click "Yes".


I color-coded each section above so that, after the service gets created, you can see and compare what information gets stored where in the portal.

Back in the management portal Azure is busy deploying our package…


From the color-coded box above, the service name, URL prefix, deployment name, and add-certificate items are described below, so I'll skip those for now.

The value in the dark green box is the same Affinity Group we created for our storage account.  Now our roles/applications are running in the same data center as our data tables, blobs, and queues.  Building the group makes this easier to get right when we deploy.

The light green blocks are the defaults; I stuck with those for getting my stuff into the staging environment.

Package location and Configuration files were the first things we discussed; remember they were created when we chose the “Create Service Package Only” option.

Checking on the deployment again…

And now things are spinning up…


Back to the comparison…

Of all of the blanks we filled in, here’s where a few entries landed, compare this to the color-coded box above.


The “Windows Azure Tools” certificate is something I created yesterday while I was trying to get the Remote Desktop Tools working for a deployment update.  From where I was sitting the deployment kept timing out so I gave up for the moment.

At any rate, if you want to associate a certificate (.pfx) with your deployment, click the "Add Certificate" button, browse to the file, enter your cert's password, then click "OK". And if you fat-finger your password, you'll get a chance to enter it again.


All done!


To take it for a spin we just need to click on the “Development.Registration.Svc.deploy” node, and then in the right-hand pane the staging DNS name will open the web site that’s been deployed.


The highlighted URL is the one we clicked on, so now we can add some data.


I’ve been using the Cloud Storage Studio tool from Cerebrata Software for this and we can see the data from my “junkinthetrunk”.  The tool allows me to see my SQL Server backed development storage and Azure storage accounts.



If you're just here for the good news: I hope this post helps you in some way to flatten the Windows Azure curve.

If you want to stroll into the stuff that didn’t work for me, keep reading.

What didn’t work

Certificate Management and Installation

Once I had my initial deployment finished, I found a bug and needed to redeploy my solution. This is when I set up my certificate and associated it with my deployment; this sets up the trust between Azure and me.

Inside the management portal you can choose "Management Certificates". This is looking for the .CER file, not a .PFX file. So I needed to open the certificate manager snap-in, export the public version, and apply it to the solution. So now I had a cert; it's the same one I used today.

My point is that you need to create the cert before you get into a deployment scenario.  Create a cert, but create it in the way step five of this blog post describes:

This will allow the cert to be used during the initial deployment, and to be added to the deployment as well in a way that satisfies Windows Azure certificate manager.



I kept getting timeouts from Windows Azure during a redeployment. Even once I figured out the cert management issues, I couldn't get past a 90-second timeout threshold somewhere. All of the blogs I found were related to pushing a big ol' VM up to Windows Azure, but this was an app I had already deployed six or seven times. I'm going to try again from somewhere else; maybe my network or Windows Azure was just having a bad day.

Oh, and if the certs fail for any reason or trust is an issue, the blue ring around this deployment warning turns bright red.


There were a few complaints out there after the recent refresh, where not all of the documentation had been updated (folks using old docs on new bits) and things were just a little too cumbersome.

My only response to that (and something I've shared with readers who have worked with me) is this: do you remember the 1.2 toolkit for SharePoint 2007, and what a pain in the butt it was to use? We are so far from that experience, and I'm glad for it. We had already gone through the 2003 version and tried to sync (in our minds) the precautions to take for WSS and MOSS before we tried to, or even thought about, upgrading.

I'm sorry, but the teams building this stuff must have listened, or are working for someone who experienced much wailing and gnashing of teeth, not to mention a lot of bitching about the poor tooling back then. I'm thinking it will only get better from here, and right now it's not bad at all from where I'm sitting.


I used a lot of links to get this far, and got some of it just by thrashing about a bit as well. Here are some links that helped out. Some of the information I gathered from these posts I've re-presented in a different way, but I still wanted to give some credit to the folks who put the first bits of guidance before me.

// UseDevelopmentStorage=false

Channel9 – CloudCover :

Setting Up Named Auth Credentials :

Basic VS2010 Deployment :

*The* ASP.NET/Azure Quick Start:

There’s one more gap I want to fill and that’s the code I wrote for the solution.  I only got into the data (model, messages, and entities) before, and I’d like to talk more about the infrastructure code before I go dark and get really busy building out this solution.

Again, if you read this far into the post, thanks.  I hope it helped you in some way to learn to use Windows Azure.

Wrapping my head around Windows Azure – Steel Threading Complete


I finished the Steel Thread late last week but didn’t get the blog post up over the long weekend.   What I came up with was just a few brief paragraphs on how it went (great) and what I learned (a lot).

The goal was to get used to using a worker role, a web(site) role, a table and queue for storage.

On the infrastructure side of things I realized that I needed specific types to store messages and the different formats they take on during processing.  One of the things I used to bootstrap was some guidance from the PnP team.  This post has more to do with using entities than it does with learning to use Windows Azure, but once you dive into Azure Table Storage, you’ll enjoy the ideas it presents.  There are more updated examples from the PnP team that employ some fabric stuff I’m targeting to use as well.

The guidance I used demonstrated some interesting abstractions around table storage, queues, and their messages that I experimented with while putting this together. For this example I just wanted to do the following:


I wanted to submit a new registration from the application's home page, create a Registration (model) object from the page, and create a NewRegistrationMessage, which is the actual message I put on the queue.

That completes the first line of the diagram. The second line starts with the little blue guy running in a circle every few seconds; he's the worker role. The worker role tries to dequeue a message from the registration queue. If the role finds a queue message, it turns it into a RegistryEntity, a descendant of TableServiceEntity, which I insert into the Registration table, the one piece of table storage I need for this exercise.
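Since the diagram didn't survive, here's a rough Python sketch of that flow. The real code is C#, and the in-memory queue and table below are stand-ins for the Azure services, with simplified names:

```python
import json
import queue

registration_queue = queue.Queue()   # stand-in for the Azure queue
registration_table = []              # stand-in for Azure table storage

def submit_registration(name, email):
    # Web role side: turn the Registration model into a queue message.
    registration_queue.put(json.dumps({"name": name, "email": email}))

def process_one():
    # Worker role side: dequeue, convert to a table entity, insert.
    try:
        message = registration_queue.get_nowait()
    except queue.Empty:
        return False  # nothing to do on this poll
    data = json.loads(message)
    entity = {"PartitionKey": "Registration", "RowKey": data["email"], **data}
    registration_table.append(entity)
    return True
```

The point of the shape is the hand-off: the page never touches the table, and the worker never touches the page; everything flows through the queue message.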

This exercise helped me figure out which extension methods I needed to write and which pieces make the most sense together. I could have skipped the queue implementation and written directly to an Azure table, but my first idea was to store all of the messages first, then write to the queue. For other types of messages and models (inventory inserts and updates) I probably will write directly. But for larger things that might need some moderation (most media types) I'll probably push them onto a queue before bringing them into a staging area.

The little red WP7 phone is one of the clients that I want to access the data as well. Last week I saw the WP7-Azure toolkit demo'd, so I want to make that my next spike. Yeah, a spike. I get that I can figure out all of the processing bits, but I want to make sure I get the phone bits right.

Here's the stuff I'm going to keep from this exercise. Some of it is directly copied from the PnP sample I mentioned, and some of it is just stuff I wrote because I needed it; not complicated, just code.

I have a class library called Data that has the following classes in it: a couple of base types the MVC app uses, and the rest used by Windows Azure.

Here are a few gists for the types I created, including the repository bits.


Wrapping my head around Windows Azure – Steel Threading

I saw a LEAN webcast last year and learned about creating a Steel Thread, something that can emulate what you want the application to do without any business logic. This is a "throw your code away" exercise, so there's no pressure, right? Well, sort of. At this point we know very little about the problem, and less about how the solution will look.

The idea here is to use what you do know, based on customer interviews and their ideas of success, and turn that into something you can deploy in a day or two, but that works. Again, we're not going to re-use any of this code, since when we finish it will effectively be a hot mess of coupled and brittle code we wouldn't want to keep around anyway; maybe we can call it an 'ugly spike'.

Steel Threading supports the notion that you can start at point “A” and traverse 4, 8, 9, or (n) other points arriving at the destination in the state you expected – think “Chutes and Ladders”; you may or may not make it from one point to the next, but you’re learning as you go. 

The ultimate goal is to work inside the box and not prescribe something for the solution that doesn't work or that the customer can't sustain. It is a spike of sorts, but it helps kick off the project in a direction that makes sense today; tomorrow or next week might be totally different, and we'll worry about next week, next week.

The Initial Stories

I have 15 stories with high-to-medium level detail to start with. This isn't going to be an Amazon or Zappos, but it will be something ecommerce in nature. Since most of the common scenarios have been solved by the industry, we can take some fairly intelligent hints on how we'd like things to happen. For example: don't put a product in my shopping cart, and don't let me pay for it, if it's been discontinued. That said, I can focus on executing my customer's goals for this solution.

Where Azure Comes In

All that said, I’ve got an MVC web role, and a worker role in my (disposable) steel thread solution.  I can add more projects to the solution while I’m finishing this exercise if I need to, but this is the starting point right now.

Our local Orlando user group hosted Scott Densmore from the Microsoft PnP Team who discussed taking a stock ASP.NET project and wiring it to AppFabric’s Access Control services.  That’s what I need to do to support a few of my stories I’ve collected so far.

The web role will submit stuff that needs to be picked up and persisted, so I'm thinking about using a queue to intercept these submissions, then move each submission into table storage. There's already guidance around this in a few places, so it should wire up quickly and work. The content types I'm concerned with have different rules on submission, so there may be a bit of a workflow involved that might make it a bit prickly.

I'm going to run most of this from my local DevFabric; I'm not sure there's a need to deploy it except to get better at deployments. Besides, deployment looks a bit involved to me at the moment, so getting a few deployments behind me will only help.

On a side note, I was a bit jealous this morning reading about a VS2010 plug-in for Rackspace deployments – very cool stuff.  The case study was written up by Microsoft so I would only think they are working out the same types of features to keep us out of their management portal and inside the IDE – maybe a new project property tab based on the type of project template?  That would be cool.


Here are my two development environments:

– Both work and home are running the 64-bit Windows 7 Awesome SKU
– Both work and home have at least 4 GB of RAM (work has more)
– Both work and home have at least 2 cores (work has many more)
– Both are running SQL Server 2008 R2
– Azure 1.3 Toolkit
– VS2010 SP1
– GitHub for SCM, for now anyway

That’s it for now.


Getting my head around Windows Azure, Preamble Ramble



I've become very intrigued with Azure lately, and I'm also an Azure idiot at the same time. Well, maybe not a total idiot, but it's a huge platform. I've been following some of the guidance from the MS PnP team to get my feet wet, but recently I've just wanted to dive in.

The main idea for taking this approach is to just focus on one thing.  I’ve been a scatter-brained developer for the last few years and recently after coming off a long project, I just want to focus on one (albeit large) technology stack.  There’s one or two other things I’ll poke at but this is one that’s going to get a lot of my daytime focus.

So as not to get any readers tangled up in the who or what of what I'm building, I'll be *very* generic in my descriptions of everything except the technology. Hopefully some of this will be simple enough to use on your own projects, and will help with the why; that's the plan, anyway.

Barcamp Sarasota 2011


My daughter and I went to Sarasota for Barcamp Sarasota this weekend. The weather was awesome and the event was great. This part of the community seems to be erupting in a good way; we are definitely going to keep our ears and eyes on it.

My talk was called “Learn Your Customers’ Language”.  If you attended my Barcamp talk, here are the slides.  I hope you enjoyed it and if you have any questions, please shoot an email to me:

** Update **

I put up two GitHub gists from the code samples I didn't get to during my talk at #bcsrq on Saturday: one for SpecFlow and one for StoryQ.

Take your Queues from PnP

I received an email from our Scott Densmore earlier this week pointing me to a Git project with some updates he was making to the PnP Azure guidance. Without pressing F5, I started my own code review of the solution to see how the code was laid out. Here's the post that discusses the changes, and here's the Git source he posted.

He mentioned a MultiEntity pattern he was plugging in based on another post. Both posts were interesting, and made total sense to an Azure noob like me. Hanselman is big on reading good code, and today I read some really good code. How do I know? I compared it to some Azure Boot Camp code that laid out a short talk on Azure Queues, which is where I focused my review. From there, I went back up the stack to the data services (Role, Member, Session), dove into the decoupled plumbing code, then went up to the ASP providers.

This discussion is about queue processing, not the MultiEntity pattern; Scott nails that one on his own, and if his post doesn't do it for you, check out the article he references in it.

The interesting difference from the boot camp code was how the worker role that manages the queues works. The PnP code sets up a Task to run the job at specific intervals, with an object called QueueCommandHandler that handles the processing. Here was the difference: the boot camp code used while(true) to loop through the queues' messages. I have to say, the boot camp webcast definitely helped me understand queue processing much, much better, and after it I could read through the PnP code even more easily.

The generic command queue handler which the queue command handler derives from looks like this:


The queue command handler looks like this:


The command handler looks like this:



When the worker role starts, it handles task scheduling like this for queue and non-queue command processing:



The Do() method still uses a loop to process, but the processing is pulled apart and a bit more reusable, which is more to my liking.
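The gists above don't render here, so here's a hedged sketch of the shape, in Python rather than the PnP team's C#; the class below is my own stand-in, not their QueueCommandHandler. The point is that the scheduling and looping live in a base handler, and the per-message work is all a subclass supplies:

```python
import threading

class QueueCommandHandler:
    """Rough shape of the pattern: the polling loop lives in the base
    handler; subclasses only implement handle() for the per-message work."""

    def __init__(self, get_message, interval=5.0):
        self.get_message = get_message  # callable returning a message or None
        self.interval = interval
        self._stop = threading.Event()

    def handle(self, message):
        raise NotImplementedError

    def do(self):
        # The Do() loop: poll, process, sleep on empty, repeat until stopped.
        while not self._stop.is_set():
            message = self.get_message()
            if message is not None:
                self.handle(message)
            else:
                self._stop.wait(self.interval)

    def start(self):
        # Stand-in for scheduling the job as a background Task.
        threading.Thread(target=self.do, daemon=True).start()

    def stop(self):
        self._stop.set()
```

Compared to a bare while(true), the same loop is still there, but the handler is testable on its own and reusable for any queue.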


The only thing I would do differently, at first glance at all of this code, is to incorporate some type of rheostat to slow down the processing when the queues aren't busy.

The boot camp webcast had an interesting approach to this called exponential back-off polling, which means, to us mortals: if I look at the queue every 5 seconds and it's empty, I'm going to set an interval somewhere to 10 seconds so I check less often.
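That rule fits in a few lines; here's a minimal sketch, where the function name and the 60-second cap are my own choices:

```python
def next_interval(current, queue_was_empty, minimum=5, maximum=60):
    """Double the polling interval on an empty queue, capped at a maximum;
    snap back to the minimum as soon as work shows up."""
    if queue_was_empty:
        return min(current * 2, maximum)
    return minimum
```

So 5 seconds becomes 10, then 20, then 40, then stays pinned at 60 until a message appears and the interval drops back to 5.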

I've built things like this into past projects, usually driven from configuration, but this needs to be more dynamic (hands-free). I'll probably toy with it and build something easier to read (which is why I'm not posting it here).

Take a look at Scott's code, pick out some part of Azure that interests you, and see if there are any queues (cues) you want to take from their guidance.