Squeezing The Rain Out Of A Cloud

[Image: mostly.cloudy.rain.xlarge.mod]

This week’s weather has been overcast and cloudy, and this morning the forecast looks like the image above (according to an online weather source).  So I thought I’d share a few things I worked through yesterday and this morning to get some rain out of my cloud.

Here’s what the solution is using, so far:

MVC 4 Internet project – and all the scaffolding bits it gives us. NuGet packages only include Unity.Mvc4 & Microsoft.ServiceBus right now.  I’ve got some ideas for logging that’ll use some of the PnP Enterprise Library goodness to wire up business and system logging for this solution, as well as tap into Wasabi for the scaling & throttling ideas I have for the solution – more on those last two later on, though.

The solution is pretty simple, and that’s how I’d like to keep it for now.  But I needed to start pulling some mock data into grids, and being historically a server-side coder, I started out trying to unload my collections with some of the HTML helpers Razor exposes to us.  It really just didn’t feel right, though, and I found myself hacking something together just to get data into the view.  The night before, I’d had a conversation with John Papa around UI/UX content enablers.  He makes all of this stuff look so easy, so I thought, sure, I can do this too, and set out to jam in a jQuery plug-in to spin through my data.

I settled on jqGrid after only a few minutes of looking around the jQuery home page.  Not being the client-side junkie most folks are these days, I started Binging for code samples.  After a few hours (seriously, I’m not that smart, and some things take longer than they should) I found a post on Haack’s site that talked about which conventions need to match and which ones don’t.  A few minutes after that I fixed the signature on my controller action and the code started lighting up.  Now it’s only formatting I need for a few views, and I won’t burn too many cycles on that.

Using Phil Haack’s example, I had jqGrid running in a few minutes.  But I will say this – I did learn a lot about the HTML helpers ASP.NET offers; they are powerful and very handy.

The side effect of this choice was larger than I thought, though, and required a bit of refactoring.  The view receiving this data was strongly typed with IEnumerable, and now that the data was coming from a jqGrid call to an action that returned a JSON payload, I didn’t need that.  The repository method serving the data to the controller looked funny now, too; I needed to scope the method to return just the requestor’s data, not all of the data.  I may still split this interface up because of the Interface Segregation Principle, but I’ll keep my eyes open for code smells just in case.
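To make the conventions concrete, here’s a minimal sketch of the kind of action jqGrid ends up calling.  The repository and model names are stand-ins for this solution’s types; the sidx/sord/page/rows parameter names and the total/page/records/rows response shape are the jqGrid defaults that have to line up:

public JsonResult RegistrationData(string sidx, string sord, int page, int rows)
{
    // Scoped to the requestor, per the repository refactoring above.
    var data = registrationRepository
        .GetRegistrationsFor(User.Identity.Name)
        .ToList();

    var pageOfRows = data
        .OrderBy(r => r.Name)
        .Skip((page - 1) * rows)
        .Take(rows)
        .Select(r => new { id = r.Id, cell = new[] { r.Name, r.Email } })
        .ToArray();

    return Json(new
    {
        total = (int)Math.Ceiling(data.Count / (double)rows),
        page,
        records = data.Count,
        rows = pageOfRows
    }, JsonRequestBehavior.AllowGet);
}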

So, there’s a bit of refactoring going on today before I hook up the Azure persistence piece, which is next.  I haven’t quite figured that out yet, but soon.  The easy part is that I can still target my Azure Service Bus queues and tap into the local storage emulator on my box until I figure out the data and/or document shapes for the storage thingee.
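And just to show how little code the Service Bus side takes, here’s a rough sketch of sending to a queue with the Microsoft.ServiceBus bits; the namespace, issuer, key, and queue name are all placeholders:

// All of the credential values and the queue name below are placeholders.
var tokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "<issuerKey>");
var address = ServiceBusEnvironment.CreateServiceUri("sb", "mynamespace", string.Empty);
var factory = MessagingFactory.Create(address, tokenProvider);
var queueClient = factory.CreateQueueClient("registrations");

queueClient.Send(new BrokeredMessage("new registration payload"));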

Here’s a gist with the view and controller source.

HTH /oFc


Living Inside a Cloud Should Be Easy, Right?

[Image: crescent-wrench]

It’s been a few months since I dropped the Azure SDK on my desktop, and the tooling has changed considerably, to say the least.  The portal changed a bit as well, but once you get used to it, it just works for you, and unlike before you can see everything that’s going on “up there” at a glance.

However, back in the IDE, particularly inside the code, there are more pieces that are supposed to bolt up to your Azure code.  If you’re using an MVC 4 web role, you can pull in a NuGet package called Unity.Mvc4, which comes with this handy little bootstrapper you can use to load your Unity container via the UnityConfig class that runs next to the bundler and routing configs in the App_Start folder.

This was one thing I didn’t realize was new to the MVC 4 scaffolding.  These config classes help keep things out of the Global.asax, where we’ve piled them for a long time.  And the UnityConfig class follows suit nicely.

The idea with the bootstrapper is to keep the type mappings contained, but loaded each time the app domain spins up.  All of the other pieces appear to act the same, i.e. lifetime management, aliasing, and child containers.
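To make a couple of those behaviors concrete, here’s what they look like in code; the repository types are hypothetical stand-ins:

var container = new UnityContainer();

// Lifetime management: one shared instance per container.
container.RegisterType<IRegistrationRepository, RegistrationRepository>(
    new ContainerControlledLifetimeManager());

// Child containers inherit the parent's registrations and can override them.
var child = container.CreateChildContainer();
var repository = child.Resolve<IRegistrationRepository>();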

The last thing I’ll mention about things fitting together: when I started this solution months ago I was using a previous version, and the upgrade wizard didn’t fire up, so I didn’t get a version bump on my web role, “and that’s when the fight started”.

If you’re trying to preserve your old solution and get it to act like an MVC 4 template, don’t.  If you don’t get the version bump from the IDE, stop.  Create (or add) a proper MVC 4 project from the project template dialog and go from there.  Copy your code to the new one, fix up the usings and references, and keep going.

While I was doing this refactoring and sorting out my existing unit tests, the code started to thin out, and I realized the MVC 4 bits could do what I was making the older MVC project do.  It just took a bit of frustration and brute force to recognize this and keep coding.

I had the unique pleasure of deleting a lot of code and still having everything work well.  I just had to sync up with the tooling and the way things are supposed to fit together now.  Same tools, just a different approach sometimes when the bits are out in front of the IDE.  Not a bad thing, just different, and better.

*** Update ***

So I didn’t need the UnityConfig class anyway.  The NuGet step actually plugged the bootstrapper into the root of the website and exposed a static RegisterTypes(IUnityContainer container) method that handles the mappings.  I usually don’t wrap my type registrations in code, but rather put them in the configuration file so I can easily add types on the fly.  The bootstrapper also exposes a static method that handles returning the container.  Here’s a code snippet with one registration added.

Bootstrapper
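Here’s roughly what the bootstrapper looks like with my one registration in place; the IRegistrationRepository mapping is the piece I added, and that interface is specific to this solution:

using System.Web.Mvc;
using Microsoft.Practices.Unity;
using Unity.Mvc4;

public static class Bootstrapper
{
    public static IUnityContainer Initialise()
    {
        var container = BuildUnityContainer();
        DependencyResolver.SetResolver(new UnityDependencyResolver(container));
        return container;
    }

    private static IUnityContainer BuildUnityContainer()
    {
        var container = new UnityContainer();
        RegisterTypes(container);
        return container;
    }

    public static void RegisterTypes(IUnityContainer container)
    {
        // The one registration added so far; this type is specific to my solution.
        container.RegisterType<IRegistrationRepository, RegistrationRepository>();
    }
}

Calling Bootstrapper.Initialise() from Application_Start in Global.asax is what hooks the container into MVC’s DependencyResolver.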

 

 

 


 

HTH /oFc

Clouds Need to Make Rain, right?

So I’ve been working on this cloud stuff off and on for a few months now.  And while the cloud vendors try to make it easy to work with, things aren’t always intuitive unless you clear your mind and stop doing things the way you remember; instead, do them the way you’re being told they need to work.

Then, after taking your code around the block a few times, you take something someone else coded or created and make it your own.  Most of the time it works this way, but there are times when it doesn’t and you just have to apply brute force and push that rock back up the hill.  And once you do it that first time, everything starts to click (and work).

I guess the idea here is that working with cloud technology is fun and challenging, but you have to keep your eyes on what you set out to build initially and not get bogged down in why something doesn’t work.

If it doesn’t work, start from scorched earth; as in, throw away *all of the code you just wrote* (hard to do sometimes) and start all over.  I did yesterday and tossed about 1,000 lines of source code, and in about 15 minutes I worked around a problem I’d been dealing with for a while.

Of course there were other (positive) external forces that helped me get beyond the block I was experiencing, but scorched earth was the right first step to take.

And as it worked out, my piece of the cloud started raining on the scorched earth and once all of the smoldering finished, I had something really nice to work with and continue working with.

HTH – oFc

Wrapping My Head Around Windows Azure–Deploying

I mentioned in one of my first posts that deployments made zero sense to me.  I found a few blog posts today where folks were just disappointed with the level of documentation.  And things are a bit cumbersome, clunky, or just not intuitive for folks trying to make “it” happen: roles, storage, and certs.

I’m going to take a whack at making some sense of it.  I’ve got many words and pictures to share, because I hate when stuff is hard, especially when I’m just trying to deploy something.  I felt this way working with ClickOnce: you wouldn’t often redeploy simple or (mostly) static solutions, but when you do, the publish utilities can get a bit frustrating.

If you look at my first post, it tells you which Azure bits I’m working with.  Make sure you’re up to date, because stuff will continue to change until they get it right or make it better (or both, which would be nice).  Patience.

So I’m assuming you have something that is tied to an Azure project that you want to publish, great! Let’s go!

From the top

Just like a web project we right-click on our Azure project and pick “Publish…” from the context menu.  We got that.

Which deployment path should you choose?  If you have not done the following already:

  • Set up Remote Desktop connections with Windows Azure
  • Created and installed certificates on your Windows Azure subscription

then choose the “Create Service Package Only” option.  We’ll start there; I did, and didn’t get any new scars from it.

If it looks like this, click “OK”, we’ll talk about the other stuff later on, promise.

[Screenshot: Azure.Deploy.PkgOnly.00]

As soon as you click “OK”, you’ll see your solution start to build; that’s normal.  It’s also going to build a deployment package and the configuration file for your build, and will create them in the file system under your project.  They don’t get added to source control unless you want to add them; I’m using Git, so the whole file system is being watched and mine were checked in.  The files are added to a publish folder for the build type (Debug, Release) you have selected, so my “Debug” deployment files went here.

[Screenshot: Azure.Deploy.PkgOnly.02]

If you had a Release build, it would show up in “bin\Release\Publish” instead.  Those two files in the folder are what we’ll use in the Azure admin site to deploy our app.  Follow me over here, stand right there, and watch this.

My deployment has two roles and talks to one queue and one table, simple.  In the dev fabric on our local machines the magic is just spraying everywhere from the Azure-colored fire hose and everything works.  You probably stepped through the connection string in your configuration file and found “UseDevelopmentStorage=true” for your data connection, haven’t you?  Well, now we’re going to be talking to live queues and tables, and that stuff won’t work any longer.  So (as I figured out yesterday) we need to tell our deployment where our storage lives.  First we need to create it if we haven’t already.  If your HelloWorld app doesn’t use anything but a website you won’t need to do this now, but I’d encourage you to follow along anyway.

To the cloud!  I hate that line…

[Screenshot: BrowseToPortal]

I found this yesterday, and it was quite handy indeed.  From the cloud project in your solution, right-click and choose “Browse to Portal”.

This gets you where you need to go.  When you get there, log in and go to your Azure account.

When the portal opens, look in the lower left-hand corner and find the “Hosted Services, Storage Accounts & CDN” button.  It looks like this, click it:

[screenshot]

Here’s my subscription with no hosted services or storage accounts. 

[Screenshot: Subscription.Service.Storage]

Let’s create a storage account.  A storage account is something that groups together your blob data, table data, and queue messages.  We can delete it when we’re done if we don’t need it, or leave it online.

The main thing to get here is that it’s tied to your subscription, and you can have as many distinct ones as you need.  And yes, they do cost money once you start dropping data into them – YMMV.  The URLs they use for communication are based on the name you give the storage account.  So if I called my storage account “junkinthetrunk”, the endpoints my applications use to get stuff would look like these:

https://junkinthetrunk.blob.core.windows.net

https://junkinthetrunk.queue.core.windows.net

https://junkinthetrunk.table.core.windows.net

Before we go on, let’s talk about affinity groups.  Affinity groups tell Azure what part of the country you want to store your data in.  If you were born in the Midwest you might choose that region, but if your app is serving the southeastern US, you might want to rethink that choice.

There are only a few choices, and in most cases you’ll want to pick a data center that’s close to your business, especially if you are running a hybrid solution.  For example, a native ASP.NET app hosted by an ISP that uses Windows Azure for some type of storage.

Click “Affinity Groups”, then click on your subscription record in the grid, then click the cool-looking “New Affinity Group” button, in that order.  You’ll get one of the dialogs pictured below, and as far as I know they’re free.  Fill in your choice of group name, pick a region (or Anywhere, if you like), and click “OK”.  Here’s how I named mine.

[Screenshot: AffinityGroupName]

[Screenshot: AffinityGroupName.Entered]

[screenshot]

So, let’s build the storage account; we’ll use the affinity group as we do that.  Click on Storage Accounts (the green thing above), then click on the cool file cabinet button.  A dialog box will show up asking a few more questions about what and where you want to store your “file cabinet”.

Let’s fill in some more blanks, k?

[Screenshot: CreatingStorageAccounts]

Using the affinity group we created earlier, here’s what it looks like. 

I only have one subscription so it was filled in for me.

I only enter the unique part of the URL – this will be the name of the storage account, and it shows up in the management portal and in your config.  So make it a good, clean name and not something like “thisappsucks”.

A variation I came up with was to divide the storage accounts into stages so I could potentially promote code from the development fabric, then to staging, then finally to production.  Today “junkinthetrunk” was already taken, so I used “dev0junkinthetrunk” instead.  Also, Windows Azure ensures the name is unique as you type it out.

If this makes sense to you, use it; just remember that the more storage you have, the more you pay when you use it.  My process has been to delete everything in my subscription when I reach a milestone, then just start over.  Call it deliberate practice.

I’ve clicked “OK” on the dialog above and Azure is off creating my awesome storage account.

[Screenshot: Storage.Descriptors]

If anything in documentation (yours or a vendor’s) asks for an account name, it’s probably referring to a storage account.

So let’s back up and recap a bit.

You opened the Azure Management Portal.

You created an Affinity Group.

You created a new Storage Account and applied the Affinity Group to it.

The point I want to make here is about not getting the cart in front of the horse: think about your storage while you’re sketching your architecture on the back of your dinner napkin.  If you don’t need it, fine, you’re already ahead.  But if your app needs it, it is better to set it up ahead of time.

You can create storage after the fact, like I did once, BUT you’ll have to alter the configuration file in the management portal, and (depending on how busy the data center is) you’ll wait forever for the instances to tear down and build back up.

Any changes you make to the configuration will restart any instances you have running.  It’s probably fair to mention this is also why you cannot change the Service Definition in the management portal: you cannot add roles on the fly, just change the number of instances through the configuration.

Let’s go back to the deployment files and update them with the storage account information.

[Screenshot: Azure.Deploy.PkgOnly.02]

We’re only going to open the ServiceConfiguration.cscfg file and add our storage information and access key to it.

My application has one web site (role) and one worker role; each is named in the configuration file and initialized with one (1) instance.  If you know you want two or three instances, change it here and they’ll all spin up when Azure initializes your package.

[Screenshot: ServiceConfig.Deploy.00]
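If you haven’t looked inside the .cscfg yet, its shape is roughly this; the service and role names here are stand-ins for mine, and the Instances count is the knob I just mentioned:

<ServiceConfiguration serviceName="Registration.Svc"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
  <Role name="WorkerRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>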

Also, and this is more important to note, each role has its own data connection; that’s where we’re going to plug in the junkinthetrunk storage pointers we discussed above.  Here are the before and after for the connection information.  I snipped the account key (it’s called the access key in the storage account descriptors) just to make it fit; you need to enter the entire key.

Before:

<Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />

After:

<Setting name="DataConnectionString"
value="DefaultEndpointsProtocol=https;
AccountName=dev0junkinthetrunk; AccountKey=2V6JrRdFnrST2PCQHA <snip>" />

Back in the Azure Management Portal, click on “Home” or “Hosted Services, Storage Accounts & CDN”.

Now we’ll create a new Hosted Service by clicking this really cool “New Hosted Service” button.

[screenshot]

The dialog that’s presented needs more information; I filled in my blanks like this:

[Screenshot: Create.HostedService.After]

Once you click “OK”, you’ll see this modal dialog; it’s harmless.  It’s basically telling you that only one instance will be created; change it as needed if you want to.  Click “Yes”.

[screenshot]

I color-coded each section above so we can describe and compare, after the service gets created, what information gets stored where in the portal.

Back in the management portal Azure is busy deploying our package…

[screenshot]

From the color-coded box above the service name, URL prefix, deployment name, and add certificate items are described below, so I’ll skip those for now.

The value in the dark green box is the same Affinity Group we created for our storage account.  Now our roles/applications are running in the same data center as our data tables, blobs, and queues.  Building the group makes this easier to get right when we deploy.

The light green blocks are the defaults, I stuck with those for getting my stuff into the staging environment.

Package location and Configuration files were the first things we discussed; remember they were created when we chose the “Create Service Package Only” option.

Checking on the deployment again…

And now things are spinning up…

[screenshot]

Back to the comparison…

Of all of the blanks we filled in, here’s where a few entries landed, compare this to the color-coded box above.

[Screenshot: Create.HostedService.After.Comparison]

The “Windows Azure Tools” certificate is something I created yesterday while I was trying to get the Remote Desktop Tools working for a deployment update.  From where I was sitting the deployment kept timing out so I gave up for the moment.

At any rate, if you want to associate a certificate (.pfx) with your deployment, click the “Add Certificate” button and browse to the location where it lives, enter your cert’s password, then click “OK”.  And if you fat-finger your password, you’ll get a chance to enter it again.

[screenshot]

All done!

[screenshot]

To take it for a spin, we just need to click on the “Development.Registration.Svc.deploy” node; then, in the right-hand pane, the staging DNS name will open the web site that’s been deployed.

[Screenshot: Deployment.DNS.Link.Opened]

The highlighted URL is the one we clicked on, so now we can add some data.

[screenshot]

I’ve been using the Cloud Storage Studio tool from Cerebrata Software for this, and we can see the data from my “junkinthetrunk”.  The tool lets me see both my SQL Server-backed development storage and my Azure storage accounts.

[screenshot]

 

If you’re just here for the good news, I hope this post helps you in some way to flatten the Windows Azure curve.

If you want to stroll into the stuff that didn’t work for me, keep reading.

What didn’t work

Certificate Management and Installation

Once I had my initial deployment finished, I found a bug and needed to redeploy my solution.  This is when I set up my certificate and associated it with my deployment; this establishes the trust between Azure and me.

Inside the management portal you can choose “Management Certificates”.  This is looking for the .CER file, not a .PFX file.  So I needed to open the cert manager snap-in, export the public version, and apply it to the solution.  So then I had a cert; it’s the same one I used today.

My point is that you need to create the cert before you get into a deployment scenario.  Create a cert, but create it in the way step five of this blog post describes:

http://msdn.microsoft.com/en-us/library/gg443832.aspx

This will allow the cert to be used during the initial deployment, and to be added to the deployment as well in a way that satisfies Windows Azure certificate manager.
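If memory serves, the makecert command from that walkthrough looks like this; swap in your own CN and file name:

makecert -sky exchange -r -n "CN=AzureMgmt" -pe -a sha1 -len 2048 -ss My "AzureMgmt.cer"

That gives you the .cer for the portal’s Management Certificates area while keeping the private key in your personal store for the .pfx export.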

Redeployment

[Screenshot: VS2010.Deployment.Failure]

I kept getting timeouts from Windows Azure during a redeployment.  Even once I figured out the cert management issues, I couldn’t get past a 90-second timeout threshold somewhere.  All of the blogs I found were related to pushing a big ol’ VM up to Windows Azure; this was an app I had already deployed six or seven times.  I’m going to try from somewhere else; maybe my network or Windows Azure was just having a bad day.

Oh, and if the certs fail for any reason or trust is an issue, the blue ring around this deployment warning turns bright red.

Documentation

There were a few complaints out there after the recent refresh: not all of the documentation had been updated (folks were using old docs on new bits), and things were just a little too cumbersome.

My only response to that (and something I’ve shared with readers who have worked with me) is this: do you remember the 1.2 toolkit for SharePoint 2007, and what a pain in the butt it was to use?  We are so far from that experience, and I’m glad for it.  We had already gone through the 2003 version and tried to sync (in our minds) the precautions to take for WSS and MOSS before we tried, or even thought about, upgrading.

I’m sorry, but the teams that are building this stuff must have listened, or are working for someone who experienced much wailing and gnashing of teeth, not to mention a lot of bitching, when it came to poor tooling back then.  I’m thinking it will only get better from here, and right now it’s not bad at all from where I’m sitting.

Resources

I used a lot of links to get this far, and learned some of it just by thrashing about a bit as well.  Here are some links that helped out.  Some of the information I gathered from these posts I’ve presented in a different way, but I still want to give credit to the folks who put the first bits of guidance before me.

// UseDevelopmentStorage=false

http://simonwdixon.wordpress.com/2011/05/06/azure-usedevelopmentstoragefalse-deployment-hangs/

Channel9 – CloudCover : http://channel9.msdn.com/Shows/Cloud+Cover

Setting Up Named Auth Credentials : http://msdn.microsoft.com/en-us/library/ff683676.aspx

Basic VS2010 Deployment : http://msdn.microsoft.com/en-us/library/ff683672.aspx

*The* ASP.NET/Azure Quick Start: http://msdn.microsoft.com/en-us/library/gg651132.aspx

There’s one more gap I want to fill and that’s the code I wrote for the solution.  I only got into the data (model, messages, and entities) before, and I’d like to talk more about the infrastructure code before I go dark and get really busy building out this solution.

Again, if you read this far into the post, thanks.  I hope it helped you in some way to learn to use Windows Azure.

Wrapping my head around Windows Azure – Steel Threading Complete


I finished the Steel Thread late last week but didn’t get the blog post up over the long weekend.   What I came up with was just a few brief paragraphs on how it went (great) and what I learned (a lot).

The goal was to get used to using a worker role, a web (site) role, and a table and a queue for storage.

On the infrastructure side of things, I realized I needed specific types to store messages and the different formats they take on during processing.  One of the things I used to bootstrap was some guidance from the PnP team.  That post has more to do with using entities than with learning Windows Azure, but once you dive into Azure Table Storage, you’ll enjoy the ideas it presents.  There are more updated examples from the PnP team that employ some fabric stuff I’m targeting to use as well.

The guidance demonstrated some interesting abstractions around table storage, queues, and their messages that I experimented with while putting this together.  For this example I just wanted to do the following:

[diagram]

I wanted to submit a new registration from the application’s home page, create a Registration (model) object from the page, and create a NewRegistrationMessage, which is the actual message I put on the queue.
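The enqueue side of that first line is only a few lines of StorageClient code.  This is a sketch; the queue name and the ToMessageString() helper are stand-ins for this solution’s bits:

// Build the account from config; in the dev fabric this resolves to
// UseDevelopmentStorage=true, and deployed it points at the real account.
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));

var queue = account.CreateCloudQueueClient().GetQueueReference("registrations");
queue.CreateIfNotExist();

// NewRegistrationMessage is built from the Registration model posted by the page.
queue.AddMessage(new CloudQueueMessage(newRegistrationMessage.ToMessageString()));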

That completes the first line of the diagram.  The second line starts with the little blue guy running in a circle every few seconds; he’s the worker role.  The worker role tries to dequeue a message from the registration queue.  If it finds one, it turns it into a RegistryEntity (a descendant of TableServiceEntity), which I insert into the Registration table, the one piece of table storage I need for this exercise.
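The worker side is the same pattern in reverse.  Here’s a sketch of the Run loop; the RegistryEntity constructor is a hypothetical stand-in for however you hydrate the entity from the message:

public override void Run()
{
    var account = CloudStorageAccount.Parse(
        RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
    var queue = account.CreateCloudQueueClient().GetQueueReference("registrations");
    var tables = account.CreateCloudTableClient();
    tables.CreateTableIfNotExist("Registration");

    while (true)
    {
        var message = queue.GetMessage();
        if (message == null)
        {
            Thread.Sleep(TimeSpan.FromSeconds(5));   // the little blue guy, circling
            continue;
        }

        // Hydrate the entity from the queue message and insert it into the table.
        var entity = new RegistryEntity(message.AsString);
        var context = tables.GetDataServiceContext();
        context.AddObject("Registration", entity);
        context.SaveChangesWithRetries();

        queue.DeleteMessage(message);
    }
}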

This exercise helped me figure out which extension methods I needed to write, and which pieces make the most sense working together.  I could have skipped the queue implementation and just written directly to an Azure table, but my first idea was to store all of the messages first, then write to the queue.  For other types of messages and models (inventory inserts and updates) I probably will.  But for larger things that might need some moderation (most media types), I’ll probably push them onto a queue before bringing them into a staging area.

The little red WP7 phone is one of the clients I want to have access the data as well.  Last week I saw the WP7-Azure toolkit demo’d, so I want to make that my next spike.  Yeah, spike.  I get that I can figure out all of the processing bits, but I want to make sure I get the phone bits right.

Here’s the stuff I’m going to keep from this exercise.  Some of it is directly copied from the PnP sample I mentioned; and some of it was just stuff that I wrote that I needed, not complicated, just code.

[screenshot]

I have a class library called Data that has the following classes in it: a couple of base types the MVC app uses, and the rest is used by Windows Azure.

Here are a few gists for the types I created, including the repository bits.

HTH.

Wrapping my head around Windows Azure – Steel Threading

I saw a LEAN webcast last year and learned about creating a Steel Thread: something that emulates what you want the application to do, without any business logic.  This is a “throw your code away” exercise, so there’s no pressure, right?  Well, sort of.  At this point we know very little about the problem, and even less about how the solution will look.

The idea here is to use what you do know, based on customer interviews and their ideas for success, and turn that into something that works and that you can deploy in a day or two.  Again, we’re not going to re-use any of this code, since when we finish it will effectively be a hot mess of coupled and brittle code we wouldn’t want to keep around anyway; maybe we can call it an “ugly spike”.

Steel threading supports the notion that you can start at point “A” and traverse 4, 8, 9, or (n) other points, arriving at the destination in the state you expected.  Think “Chutes and Ladders”: you may or may not make it from one point to the next, but you’re learning as you go.

The ultimate goal is to work inside the box and not prescribe something for the solution that doesn’t work or that the customer can’t sustain.  It is a spike of sorts, but it helps kick off the project in a direction that makes sense today.  Tomorrow or next week might be totally different; we’ll worry about next week, next week.

The Initial Stories

I have 15 stories with high-to-medium-level detail to start with.  This isn’t going to be an Amazon or a Zappos, but it will be something ecommerce in nature.  Since most of the common scenarios have been solved by the industry, we can take some fairly intelligent hints on how we’d like things to happen.  For example: don’t put a product in my shopping cart, and let me pay for it, if it’s been discontinued.  That said, I can focus on executing on my customer’s goals for this solution.

Where Azure Comes In

All that said, I’ve got an MVC web role, and a worker role in my (disposable) steel thread solution.  I can add more projects to the solution while I’m finishing this exercise if I need to, but this is the starting point right now.

Our local Orlando user group hosted Scott Densmore from the Microsoft PnP Team who discussed taking a stock ASP.NET project and wiring it to AppFabric’s Access Control services.  That’s what I need to do to support a few of my stories I’ve collected so far.

The web role will submit stuff that needs to be picked up and persisted, so I’m thinking about using a queue to intercept these submissions, then move each submission into table storage.  There’s already guidance around this in a few places, so it should wire up quickly and work.  The content types I’m concerned with have different rules on submission, so there may be a bit of workflow involved that makes things a bit prickly.

I’m going to run most of this from my local dev fabric; I’m not sure there’s a need to deploy it except for the sake of getting good (or better) at deployments.  Besides, deployment looks a bit involved to me at the moment, so getting a few deployments behind me will only help.

On a side note, I was a bit jealous this morning reading about a VS2010 plug-in for Rackspace deployments – very cool stuff.  The case study was written up by Microsoft, so I can only think they’re working out the same types of features to keep us out of their management portal and inside the IDE.  Maybe a new project property tab based on the type of project template?  That would be cool.

The IDE

Here are my two development environments:

–  Both work and home are running the 64-bit Windows 7 Awesome SKU

– Both work and home have at least 4Gb of RAM (work has more)

– Both work and home have at least 2 cores (work has many more)

– Both are running SQL Server 2008 R2

– Azure 1.3 Toolkit

– VS2010 SP1

– GitHub for SCM, for now anyway

That’s it for now.

-ofc

Getting my head around Windows Azure, Preamble Ramble

 

[Image: azurethumb]

I’ve become very intrigued with Azure lately, and I’ve also become an Azure idiot at the same time.  Well, maybe not a total idiot, but it’s a huge platform.  I’ve been following some of the guidance from the MS PnP team to get my feet wet, but recently I’ve just wanted to dive in.

The main idea for taking this approach is to focus on just one thing.  I’ve been a scatter-brained developer for the last few years, and after coming off a long project recently, I just want to focus on one (albeit large) technology stack.  There are one or two other things I’ll poke at, but this is the one that’s going to get a lot of my daytime focus.

So as not to get any readers tangled up in the who or what of what I’m building, I’ll be *very* generic in my descriptions of everything except the technology.  Hopefully some of this will be simple enough to use on your own projects, and will help with the why.  That’s the plan, anyway.