Windows 8 Accelerator Labs


This week the Microsoft office in Tampa hosted a three-day Windows 8 accelerator lab for anyone wanting to port or create an application for the latest version of Windows Phone or Windows 8 Metro.  I think most folks stuck close to the C# / XAML flavors for their applications, and a lot of applications were updated, ported, or created by the group.

I was able to attend for a day and a half and worked on creating a Windows 8 version of an application I’ve had on the shelf for a while.  The app I was converting was a Mango-era app, but instead of creating an updated version I decided to create a version that would run on Windows 8 instead.  I had a lot of challenges at first, but leaning on the documentation helped quite a bit.

The biggest things were moving items from Windows Phone isolated storage to Windows.Storage (that’s the namespace) on Windows 8, and then there was navigation.  Some of the built-in templates handled navigation out of the box without disrupting the workflow I had in my original Windows Phone application.  I focused on those two because they are tied together; it’s a matter of knowing (or better, trying to understand) when the application moves from one view to another.  Once it does, the application needs to understand when, and if, it needs to save what’s been entered.  The other opportunity here is understanding where to save what was added or changed.  This was the majority of my challenges for the work I completed over the last day and a half of the labs, and it was very educational to be sure.
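To make that concrete, here’s a minimal sketch (not the lab’s actual code) of what save-and-restore on navigation can look like in a Windows 8 XAML app using the Windows.Storage namespace; the page, control, and key names are all hypothetical:

```csharp
using Windows.Storage;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Navigation;

// Hypothetical page from the app; "draftTitle" and TitleTextBox are
// made-up names for illustration.
public sealed partial class EditPage : Page
{
    protected override void OnNavigatedFrom(NavigationEventArgs e)
    {
        // LocalSettings is the rough Windows 8 analog of Windows Phone's
        // IsolatedStorageSettings: small per-app key/value state.
        ApplicationData.Current.LocalSettings.Values["draftTitle"] = TitleTextBox.Text;
        base.OnNavigatedFrom(e);
    }

    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
        // Restore the draft (if any) when the view comes back.
        object title;
        if (ApplicationData.Current.LocalSettings.Values.TryGetValue("draftTitle", out title))
            TitleTextBox.Text = (string)title;
        base.OnNavigatedTo(e);
    }
}
```

For anything bigger than simple values, ApplicationData also exposes LocalFolder for writing whole files, which is closer to how isolated storage files worked on the phone.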

Hats off to Jim Blizzard for being the MC and host of the event; he did a great job (as did others, but I didn’t get their names) walking around the room, answering questions, and handing out advice for the challenges folks were having.  At different times of the day folks gave demos of the work they had completed or started, and for that they were offered a new Windows Phone, not a bad deal.  Many folks tried their hands at using XAML, including one Android developer who created a Windows Phone app on the last day and demoed it to the group.  One person hadn’t worked with XAML before and created a roving-repairman application just by using the developer documentation and the application templates that are available.  The Windows 8 templates are few in number, but they are definitely just enough code to get you started.  There are numerous samples that can help with starting out on the application type you want to build if you just want to get something quick and dirty up and running.

The review process is a bit more “picky” in that you can vet your own application before you submit it to the marketplace, so you can fix any of the obvious problems the review process might notice.  But that’s OK; it’ll save the time you spend tapping your foot waiting for the acceptance email and help you focus on the problems your solution might have.  It looks like the static analyzers (FxCop / StyleCop) built into Visual Studio are there to help you write better code, only here you get a quick pass or fail notification for how your application is built.

I really enjoyed this event and taking the time to dive into Windows 8, but as I was driving away from the MS office I couldn’t help but think that Windows 8 is the new Silverlight target for applications.  It’s deeper than just spinning up a C# / XAML application like we did for Silverlight that runs in a browser; this one has a much larger platform to run on.  The project templates target much more than just a Silverlight-style solution; there’s the HTML5 flavor that runs like a website, so if you’re a web developer and you don’t mind traipsing through some Windows namespaces for your client-side code, you might like this next version of Visual Studio (2012 RC dropped today) for your development desires.

HTH,

ofc

Wrapping My Head Around Windows Azure–Deploying

I mentioned in one of my first posts that deployments made zero sense to me.  I found a few blog posts today where a few folks were just disappointed with the level of documentation.  And things are a bit cumbersome, clunky, or just not intuitive for folks trying to make “it” happen – roles, storage, and certs.

I’m going to take a whack at trying to make some sense of it.  I’ve got many words and pictures to share, because I hate when stuff is hard, especially when I’m just trying to deploy it.  I felt this way working with ClickOnce.  You wouldn’t often redeploy simple or (mostly) static solutions, but when you do, using the publish utilities can get a bit frustrating.

If you look at my first post, it tells you which Azure bits I’m working with.  Make sure you’re up to date, because stuff will continue to change until they get it right or make it better – or both would be nice.  Patience.

So I’m assuming you have something that is tied to an Azure project that you want to publish, great! Let’s go!

From the top

Just like with a web project, we right-click on our Azure project and pick “Publish…” from the context menu.  We got that.

Which deployment path should you choose?  If you have not done the following already:

  • Set up Remote Desktop Connections with Windows Azure
  • Created and installed certificates on your Windows Azure subscription

then choose the “Create Service Package Only” option.  We’ll start there, I did and didn’t get any new scars from it.

If it looks like this, click “OK”, we’ll talk about the other stuff later on, promise.

Azure.Deploy.PkgOnly.00

As soon as you click “OK”, you’ll see your solution start to build; that’s normal.  It’s also going to build a deployment package and the configuration file for your build, and will create them in the file system under your project.  They don’t get added to source control unless you want to add them.  I’m using Git, so the whole file system is being watched and mine were checked in.  The files are added to a publish folder for the build type (Debug, Release) you have selected.  So my “Debug” deployment files went here.

Azure.Deploy.PkgOnly.02

If you had a release build, it would show up in “bin\Release\Publish” instead.   Those two files in the folder are what we’ll use when we are in the Azure admin site to deploy our app.  Follow me over here, and stand right there and watch this.

My deployment has two roles and talks to one queue and one table, simple.  So in the dev fabric on our local machines the magic is just spraying everywhere from the Azure-colored fire hose and everything works.  You probably stepped through the connection string in your configuration file and found “UseDevelopmentStorage=true” for your data connection, didn’t you?  Well, now we’re going to be talking to live queues and tables, and that stuff won’t work any longer.  So (as I figured out yesterday) we need to tell our deployment where our storage lives.  First we need to create it if we haven’t already; if your HelloWorld app doesn’t use anything but a website, you won’t need to do this now.  However, I would encourage you to follow along anyway.
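As a hedged sketch (not my actual role code) of how that data connection setting gets picked up with the 1.x StorageClient bits, a role typically wires it up in OnStart, something like this:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // FromConfigurationSetting needs a publisher wired up first so it
        // knows how to read settings from the role's configuration.
        CloudStorageAccount.SetConfigurationSettingPublisher((name, setter) =>
            setter(RoleEnvironment.GetConfigurationSettingValue(name)));

        // Locally this resolves the dev fabric (UseDevelopmentStorage=true);
        // once deployed it resolves the live storage account instead.
        var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        var queues = account.CreateCloudQueueClient();
        var tables = account.CreateCloudTableClient();

        return base.OnStart();
    }
}
```

The nice part is that nothing in the role code changes between local and live; only the value of “DataConnectionString” in the configuration does.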

To the cloud!  I hate that line…

BrowseToPortal

I found this yesterday, and it was quite handy indeed.  From the cloud project in your solution, right-click and choose “Browse to Portal”.

This gets you where you need to go.  When you get there, log in and go to your Azure account.

When the portal opens, look in the lower left-hand corner and find the “Hosted Services, Storage Accounts & CDN” button.  It looks like this, click it:

image

Here’s my subscription with no hosted services or storage accounts. 

Subscription.Service.Storage

Let’s create a storage account; a storage account is something that groups together your blob data, table data, and queue messages.  We can delete it when we’re done if we don’t need it, or leave it online if we do.

The main thing to get here is that it’s tied to your subscription, and you can have as many distinct ones as you need.  And yes, they do cost money when you start dropping data into them – YMMV.  The URLs they use for communication are based on the name you give the storage account.  So if I called my storage account “junkinthetrunk”, the endpoints my applications use to get stuff would look like these:

https://junkinthetrunk.blob.core.windows.net

https://junkinthetrunk.queue.core.windows.net

https://junkinthetrunk.table.core.windows.net
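In code, those endpoints fall out of the account name automatically.  Here’s a small sketch using the StorageClient library of the era; the account key here is a dummy value, not a real one:

```csharp
using System;
using Microsoft.WindowsAzure;

class EndpointDemo
{
    static void Main()
    {
        // Dummy base64 key for illustration only; the real key comes from
        // the storage account's descriptors in the management portal.
        string dummyKey = Convert.ToBase64String(new byte[32]);
        var credentials = new StorageCredentialsAccountAndKey("junkinthetrunk", dummyKey);
        var account = new CloudStorageAccount(credentials, true); // true = use HTTPS

        // Each endpoint is derived from the storage account's name.
        Console.WriteLine(account.BlobEndpoint);
        Console.WriteLine(account.QueueEndpoint);
        Console.WriteLine(account.TableEndpoint);
    }
}
```

That’s also why the account name has to be globally unique: it becomes the DNS prefix for all three services.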

Before we go on, let’s talk about affinity groups.  Affinity groups tell Azure what part of the country you want to store your data in.  If you were born in the Midwest, you might choose that region, but if your app is serving the southeastern US, you might want to rethink that choice.

There are only a few choices, and in most cases you’ll want to pick a data center that’s closer to your business, especially if you are running a hybrid solution – for example, a native ASP.NET site hosted by an ISP that uses Windows Azure for some type of storage.

Click “Affinity Groups”, then click on your subscription record in the grid, then click the cool-looking “New Affinity Group” button – in that order.  You’ll get one of the dialogs pictured below, and as far as I know they are free.  Fill in your choice of group name, pick a region (or Anywhere if you like), and click “OK”.  Here’s how I named mine.

AffinityGroupName

AffinityGroupName.Entered

image

So, let’s build the storage account; we’ll use the affinity group as we do that.  Click on Storage Accounts (the green thing above), then click on the cool file-cabinet button.  This will cause a dialog box to show up that’s going to ask you a few more questions about what and where you want to store your “file cabinet”.

Let’s fill in some more blanks, k?

CreatingStorageAccounts

Using the affinity group we created earlier, here’s what it looks like. 

I only have one subscription so it was filled in for me.

I only enter the unique part of the URL – this will be the name of the storage account, and it shows up in the management portal and in your config.  So make it a good, clean name and not something like “thisappsucks”.

A variation I came up with was to divide the storage accounts into different stages so I could potentially promote code from the development fabric, then to staging, then finally to production.  Today, “junkinthetrunk” was already taken, so I used “dev0junkinthetrunk” instead.  Also, Windows Azure verifies the name is unique as you type it out.

If this makes sense to you, use it; just remember that the more storage you have, the more you pay when you use it.  My process has been to delete everything in my subscription when I reach a milestone, then just start over.  Call it deliberate practice.

I’ve clicked “OK” on the dialog above and Azure is off creating my awesome storage account.

Storage.Descriptors

If anything in documentation (yours or a vendor’s) asks for an account name, it’s probably referring to a storage account.

So let’s back up and recap a bit.

You opened the Azure Management Portal.

You created an Affinity Group.

You created a new Storage Account and applied the Affinity Group to it.

The point I want to make here is about not getting the cart in front of the horse; think about your storage while you’re sketching your architecture on the back of your dinner napkin.  If you don’t need it, fine, you’re already ahead.  But if your app needs it, it is better to set it up ahead of time.

You can create storage after the fact, like I did once – BUT you’ll have to alter the configuration file in the management portal, and (depending on how busy things are) you’ll wait forever for the instances to tear down and build back up.

Any changes you make to the configuration will restart any instances you have running.  It’s also probably fair to mention this is why you cannot change the service definition in the management portal; you cannot add roles on the fly, only change the number of instances through the configuration.

Let’s go back to the deployment files and update them with the storage account information.

Azure.Deploy.PkgOnly.02

We are only going to open the ServiceConfiguration.cscfg file and add our storage information and access key to it.

My application has one web site (role) and one worker role; each is named in the configuration file and initialized with one (1) instance.  If you know you want two or three instances, change it here and they’ll all spin up when Azure tries to initialize your package.
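For reference, the relevant shape of a ServiceConfiguration.cscfg with one web role and one worker role looks roughly like this; the role names here are hypothetical stand-ins for the ones in my solution:

```xml
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- Bump this count to spin up more instances at deploy time. -->
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Note that each role carries its own copy of the settings, which is why the data connection has to be updated in more than one place.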

ServiceConfig.Deploy.00

Also, and this is more important to note, each role has its own data connection; that’s where we’re going to plug in the junkinthetrunk storage pointers we discussed above.  Here are the before and after for the connection information.  I snipped the account key (it is called the access key in the storage account descriptors) just to make it fit.  You need to enter the entire key.

Before:

<Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />

After:

<Setting name="DataConnectionString"
value="DefaultEndpointsProtocol=https;
AccountName=dev0junkinthetrunk; AccountKey=2V6JrRdFnrST2PCQHA <snip>" />

Back in the Azure Management Portal, click on “Home” or “Hosted Services, Storage Accounts & CDN”.

Now we’ll create a new Hosted Service by clicking this really cool “New Hosted Service” button.

image

The dialog that’s presented needs more information; I fill in my blanks like this:

Create.HostedService.After

Once you click “OK”, you’ll see this modal dialog; it’s harmless.  It’s basically telling you that you will only have one instance created; change it as needed if you want to.  Click “Yes”.

image

I color-coded each section above so that, after the service gets created, you can compare and see what information gets stored where in the portal.

Back in the management portal Azure is busy deploying our package…

image

From the color-coded box above, the service name, URL prefix, deployment name, and Add Certificate items are described below, so I’ll skip those for now.

The value in the dark green box is the same Affinity Group we created for our storage account.  Now our roles/applications are running in the same data center as our data tables, blobs, and queues.  Building the group makes this easier to get right when we deploy.

The light green blocks are the defaults, I stuck with those for getting my stuff into the staging environment.

Package location and Configuration files were the first things we discussed; remember they were created when we chose the “Create Service Package Only” option.

Checking on the deployment again…

And now things are spinning up…

image

Back to the comparison…

Of all of the blanks we filled in, here’s where a few entries landed, compare this to the color-coded box above.

Create.HostedService.After..Comparison

The “Windows Azure Tools” certificate is something I created yesterday while I was trying to get the Remote Desktop tools working for a deployment update.  From where I was sitting, the deployment kept timing out, so I gave up for the moment.

At any rate, if you want to associate a certificate (.pfx) with your deployment, click the “Add Certificate” button, browse to the file’s location, enter your cert’s password, then click “OK”.  And if you fat-finger your password, you will get a chance to enter it again.

image

All done!

image

To take it for a spin we just need to click on the “Development.Registration.Svc.deploy” node; then, in the right-hand pane, clicking the staging DNS name will open the web site that’s been deployed.

Deployment.DNS.Link.Opened

The highlighted URL is the one we clicked on, so now we can add some data.

image

I’ve been using the Cloud Storage Studio tool from Cerebrata Software for this, and we can see the data in my “junkinthetrunk” account.  The tool lets me see both my SQL Server-backed development storage and my Azure storage accounts.

image

 

If you’re just here for the good news, I hope this post helps you in some way to flatten the Windows Azure curve.

If you want to stroll into the stuff that didn’t work for me, keep reading.

What didn’t work

Certificate Management and Installation

Once I had my initial deployment finished, I found a bug and needed to redeploy my solution.  This is when I set up my certificate and associated it with my deployment.  This sets up the trust between Azure and me.

Inside the management portal you can choose “Management Certificates”.  This is looking for the .CER file, not a .PFX file.  So I needed to open the certificate manager snap-in, export the public version, and apply it to the solution.  So now I had a cert; it’s the same one I used today.

My point is that you need to create the cert before you get into a deployment scenario.  Create a cert, but create it the way step five of this blog post describes:

http://msdn.microsoft.com/en-us/library/gg443832.aspx

This will allow the cert to be used during the initial deployment, and to be added to the deployment as well in a way that satisfies Windows Azure certificate manager.

Redeployment

VS2010.Deployment.Failure

I kept getting timeouts from Windows Azure during a redeployment.  Even once I figured out the cert management issues, I couldn’t get past a 90-second timeout threshold somewhere.  All of the blogs I found were related to pushing a big ol’ VM up to Windows Azure; this was an app that I had already deployed six or seven times.  I’m going to try again from somewhere else; maybe my network or Windows Azure was just having a bad day.

Oh, and if the certs fail for any reason or trust is an issue, the blue ring around this deployment warning turns bright red.

Documentation

There were a few complaints out there after the recent refresh that took place, where not all of the documentation had been updated (folks using old docs on new bits), and about the fact that things were just a little too cumbersome.

My only response to that (and something I’ve shared with readers who have worked with me) is this: do you remember the 1.2 toolkit for SharePoint 2007, and what a pain in the butt it was to use?  We are so far from that experience, and I’m glad for it.  We had already gone through the 2003 version and tried to sync (in our minds) the precautions to take for WSS and MOSS before we tried to, or thought about, upgrading.

I’m sorry, but the teams that are building this stuff must have listened, or are working for someone who experienced much wailing and gnashing of teeth, not to mention a lot of bitching, when it came to poor tooling back then.  I’m thinking it will only get better from here, and right now it’s not bad at all from where I’m sitting.

Resources

I used a lot of links to get this far, and got some of it just by thrashing about a bit as well.  But here are some links that helped out.  Some of the information I gathered from these posts I have presented in a different way, but I still wanted to give some credit to the folks that put the first bits of guidance before me.

// UseDevelopmentStorage=false

http://simonwdixon.wordpress.com/2011/05/06/azure-usedevelopmentstoragefalse-deployment-hangs/

Channel9 – CloudCover : http://channel9.msdn.com/Shows/Cloud+Cover

Setting Up Named Auth Credentials : http://msdn.microsoft.com/en-us/library/ff683676.aspx

Basic VS2010 Deployment : http://msdn.microsoft.com/en-us/library/ff683672.aspx

*The* ASP.NET/Azure Quick Start: http://msdn.microsoft.com/en-us/library/gg651132.aspx

There’s one more gap I want to fill and that’s the code I wrote for the solution.  I only got into the data (model, messages, and entities) before, and I’d like to talk more about the infrastructure code before I go dark and get really busy building out this solution.

Again, if you read this far into the post, thanks.  I hope it helped you in some way to learn to use Windows Azure.

The Community Steps Up, Big Time!

Three weeks ago the leaders of my local .NET user group were having problems landing a speaker for our meeting, which took place last night.  I guess there’s some type of lull in speaker activity, because we usually don’t have a speaker drought at any time of the year – until this month.

My big campaign for the user group when I was interviewing for a board position was “putting the community back in the group”.  I put a lot of stuff on the table, and this was one of those items.  When I stepped in, we already had a huge list of speakers to choose from, and the group was getting ready for some fairly large product introductions (we also call them drops).  Our existing tools were changing and new tools were being introduced.  The speakers help sew up the gaps in our understanding of what we don’t know, since, depending on the speaker, it’s usually their forte.

Last night we had three community speakers step up and talk about their passions, and in fact they did very well.  The topics were vastly different (GPU data processing, Vim/AutoHotkey for keeping your hands off the mouse, and an authentication talk using pluggable providers – OAuth, OpenID, etc.), but all were very well thought out and presented, and I heard many compliments to the speakers on both topic and delivery.

Last night we definitely put the community back in the group, big time.