Multi-Targeting: Using an Azure Cloud WCF Service for Desktop, Silverlight and Windows Phone 7

by 18. February 2011 04:07

Note: To successfully share the same code (consuming the WCF Azure web service), on the Desktop platform you must "Add Service Reference", click Advanced and ensure "Generate asynchronous operations" is checked; asynchronous is the default for Silverlight and Phone when you Add Service Reference to these projects.
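With asynchronous operations generated, all three platforms consume the service the same way. A minimal sketch, assuming a generated proxy named GwnServiceClient with a GetData operation (both names are hypothetical placeholders, not the actual generated names):

```csharp
// Hypothetical generated proxy; GwnServiceClient / GetData are placeholder names.
var client = new GwnServiceClient();

// The event-based async pattern is identical on Desktop, Silverlight and
// Windows Phone once asynchronous operations are generated.
client.GetDataCompleted += (s, e) =>
{
    if (e.Error == null)
        Console.WriteLine(e.Result);   // consume the service response
};
client.GetDataAsync();
```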

What I am most impressed with is the performance of the Azure web service.  When I access the link it responds immediately; I'm accustomed to "spin-up" time, after which successive accesses are generally fast, but I have never seen this kind of performance accessing a web service!

Most of us are using REST services (which are very convenient), but for my project I am going the WCF road for performance reasons; I want Phone access to be extremely fast without any additional overhead.  Anyone who has created enterprise-level applications using WCF services understands the ramifications of this (you have to update your web references whenever the interface changes); however, since I'm using dependency injection and a data access layer, my enterprise application configuration carries very little overhead (as I will demonstrate below).

Setup was only hard because of the misinformation and outdated material you find when Binging the topic of adding WCF services (particularly for Windows Phone).  The following is an updated process as of 2/20/2011.

Creating the service

  1. Add new project - select Cloud
  2. Select Windows Azure Project [OK]
  3. Select WCF Service Web Role [OK]
    • It will create two projects, a Web Role and an Azure cloud application
  4. I renamed WCFServiceWebRole1 to WCFGwnServiceWebRole, and this caused me some issues: I could launch the web role service directly without a problem, but once deployed, the Azure cloud application would not run.  I had to turn on CustomErrors to discover that the rename had not renamed the GwnService.svc class definition (line 4 in the bottom-right pane of the image below).  Once I fixed this, the application warned me that I should have 2 instances, so I made that change on line 4 of the top-right pane in the image below.
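The instance-count warning is resolved in the cloud project's ServiceConfiguration.cscfg; a sketch (the service and role names are assumed to match the renamed web role):

```xml
<ServiceConfiguration serviceName="GwnService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WCFGwnServiceWebRole">
    <!-- Two instances satisfy the warning raised by the tools -->
    <Instances count="2" />
  </Role>
</ServiceConfiguration>
```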

Publishing the service

This was more challenging than it needed to be; perhaps these steps will save you a few hours.

Create Certificate

  1. Right click on the Azure project (not the Web Role) and select Deploy your Windows Azure project to Windows Azure.
  2. Click Credentials dropdown list and select Add
  3. Click dropdown list for option 1 (Create or select an existing...) and select <create> and provide a name [OK]
  4. Select the link on option 2 to Copy the full path
  5. Click on link to launch Windows Azure Portal
  6. Select Hosted Services, Storage Accounts & CDN
  7. Select Manage Certificates
  8. Click Add Certificate
  9. Click [Browse] and paste the certificate path and click [Done]
  10. Cancel the modal windows to return to Visual Studio.

Deploy your Service

  1. Right click on the Azure project (not the Web Role) and select Publish
  2. I attempted to add a credential, but this sent me into a chicken-and-egg corner, and after I configured the necessary pieces it could not connect to the allocated storage.  It is EASIEST if you click the Create Service Package Only radio button and [OK].
    • An explorer window will open that will contain the Service package, copy the path to the clipboard
    • Go to the Windows Azure portal
    • Create a Storage Account - you'll need this to publish

    • Click on Hosted Services in left menu
    • Click on New Staging Deployment in top menu
    • Provide a Deployment name
    • Click [Browse Locally] and paste the previously saved folder path from the clipboard to select both the Package location and Configuration file.  Click [OK] and have patience while it imports and deploys the package to your staging area.

At any point after publishing you can click on Hosted Services, select your service and then get the DNS name as shown below - simply add your service name to the end.  Note: this name will change every time you publish, which is why configuration needs to be simplified (using dependency injection as shown below).

After the initial publish, successive publishes are simple: right-click on the Azure project, pick the Credential configured for this solution, and click [OK].

Consuming the newly published Web Service

The following is the test data access layer I created for the Desktop, Silverlight and Windows Phone applications (they all share this code).   I added a Web Service Reference being sure to use the same name.

When I finished adding the phone's web reference, the auto-generated ServiceReferences.ClientConfig was empty (it only had a single line of configuration).  I found I had to reinstall Silverlight4_Tools.exe, and all was well after that.  I added web references to the Silverlight and Desktop applications (changeset 85264).  I plugged in a hook that executes GetData() during the ReadList() command for the Password Manager IF AppConfig.ServiceAddress is not null (AppConfig is injected in the class base via dependency injection).

I had to republish a few times because of the web role rename issue, and each time the address changed.  To simplify matters I updated the bootstrapper (a single location shared by all of the applications in the main app projects) so that it configures the address from one place (line 134 below); this is the only change I have to make because I programmatically configure the binding and address (lines 24-29 above).  With this technique configuration files are no longer required - all I'll have to do is update web references for my data access layer project(s) if the interface changes and I'll be done.
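Programmatically configuring the binding and address looks roughly like the following sketch (the proxy name and service address are illustrative placeholders, not the actual lines 24-29):

```csharp
// Sketch only: the address and proxy name are hypothetical placeholders.
var binding = new BasicHttpBinding();
var address = new EndpointAddress("http://myservice.cloudapp.net/GwnService.svc");

// Generated WCF proxies expose a (Binding, EndpointAddress) constructor,
// which bypasses ServiceReferences.ClientConfig / app.config entirely.
var client = new GwnServiceClient(binding, address);
```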

IMPORTANT: because I inject the value into an IAppConfig singleton (line 133 below) it is available throughout the entire application (ClassBase has the Unity dependency property IAppConfig, line 13 above, so it is easily available to my data access layer).

Note: I commented out line 133 in the changeset because odds are the address will change again, and I don't want to introduce a bug that will crash the app for those who might download the source code.

With the above noted configurations I was able to successfully launch and run the Desktop and Phone applications and watch the output log display that the contents were successfully returned.  Silverlight gave an "attempting to access a service in a cross-domain way" error, which I resolved by adding a ClientAccessPolicy.xml file per the following LINK.  With that change, and after redeploying the web service, all three platforms successfully ran the code.
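For reference, a permissive ClientAccessPolicy.xml placed at the root of the web role looks like the following (this is the standard Silverlight policy file format; tighten the domain list for production):

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <!-- Allow any caller; restrict this list for production -->
      <allow-from http-request-headers="SOAPAction">
        <domain uri="*" />
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true" />
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
```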


VS2010 RC - Unit Testing Azure Create, Read, Update and Delete (CRUD)

by 27. February 2010 04:26

I did some head-banging this morning while writing a unit test for Azure table storage CRUD operations.  My WebRole application uses the PersonDataSource (line 8) as an ObjectDataSource and works without issue, so I was perplexed as to why my unit test kept failing on line 40.  UpdateItem(testData) was executing successfully, but the value was not being updated to "TestOrganization" as expected.

     1 /// <summary>
     2 /// Determines whether this instance can create read update delete
     3 /// azure table records.
     4 /// </summary>
     5 [TestMethod]
     6 public void CanCreateReadUpdateDeleteAzureTableRecords()
     7 {
     8     PersonDataSource mockData = new PersonDataSource();
     9     EHRPerson testData = null;

    11     string uniqueId = Guid.NewGuid().ToString();

    13     //:::::[ CREATE ]:::::::
    14     bool isSuccessful = mockData.CreateItem(new EHRPerson
    15     {
    16         ID = uniqueId,
    17         Name = "EHRTest",
    18         Organization = "EHROrganization",
    19         PersonType = "EHRPersonType",
    20         ProfessionalTraining = "EHRProfessionalTraining"
    21     });
    22     Assert.IsTrue(isSuccessful);

    24     //:::::[ READ ]:::::::
    25     testData = new List<EHRPerson>(mockData.Read())
    26         .FirstOrDefault<EHRPerson>(p => p.ID.Equals(uniqueId));

    28     Assert.IsNotNull(testData);
    29     Assert.AreEqual("EHRTest", testData.Name);
    30     Assert.AreEqual("EHROrganization", testData.Organization);
    31     Assert.AreEqual("EHRPersonType", testData.PersonType);
    32     Assert.AreEqual("EHRProfessionalTraining", testData.ProfessionalTraining);

    34     //:::::[ UPDATE ]:::::::
    35     testData.Organization = "TestOrganization";

    37     mockData.UpdateItem(testData);
    38     testData = new List<EHRPerson>(mockData.Read())
    39         .FirstOrDefault<EHRPerson>(p => p.ID.Equals(uniqueId));
    40     Assert.AreEqual("TestOrganization", testData.Organization);

    42     //:::::[ DELETE ]:::::::
    43     mockData.DeleteItem(testData);

    45     testData = new List<EHRPerson>(mockData.Read())
    46         .FirstOrDefault<EHRPerson>(p => p.ID.Equals(uniqueId));
    47     Assert.IsNull(testData);

    49 }

What I found was that my item (TItem) parameter was being reset to default values on line 330 (image below), effectively overwriting my changes with the original values.  However, if I ran my Cloud application it successfully updated the data (PersonDataSource as ObjectDataSource).

I discovered that I was being handed the reference of the item that was read previously (line 25 above), which meant I was passing around a reference, so line 330 below effectively replaced it with the original contents (thus entry == item).  This wasn't a problem with the ObjectDataSource because ASP.NET creates a new instance and populates it with form values before sending it to the ObjectDataSource.

I resolved the issue by updating my TableServiceEntity base class to implement ICloneable, so that I don't have to manage it at higher levels.  All of my table entities (i.e., EHRPerson) will be POCOs, so this should work well.
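The base-class change can be as simple as the following sketch (the class name EntityBase is illustrative); for flat POCO entities a shallow MemberwiseClone is sufficient:

```csharp
// Sketch: the base entity implements ICloneable so callers get a copy,
// not the tracked reference that line 330 would later overwrite.
public abstract class EntityBase : TableServiceEntity, ICloneable
{
    public object Clone()
    {
        // A shallow copy is safe because all properties are value types or strings.
        return this.MemberwiseClone();
    }
}
```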



VS2010 RC - Azure table create, read, update and delete (CRUD) / ObjectDataSource delete bug

by 20. February 2010 22:19

There seem to be plenty of examples for working with Azure blobs (which is great, because that is my next step), but I found information on tables to be lacking - particularly CRUD operations beyond read and write.

Once I had the fundamentals down I found there were a number of mistakes one could make that would result in a ResourceNotFound error.    My goal is to abstract the complexities away into base classes and keep it simple in the parent classes - important because the Electronic Health Record system will have numerous tables (based on Microsoft HealthVault classes - highly normalized).

The following is the minimal code required to get an application running with an ObjectDataSource (using GWN.Contrib.Azure project) - all code and default.aspx form below:

The base classes actually handle all of the wire-up and plumbing; however, the ObjectDataSource requires strongly typed objects (not generics) to bind to, so I had to provide wrapper methods in my PersonDataSource above (lines 11-22).  If you are not using an ObjectDataSource, no extra lines of code are required.

Default.aspx and code-behind  ---  everything is handled by the ObjectDataSources

Once I dropped a ListBox and DetailsView on my .ASPX page I clicked on the data binding tab of each control (shown in the top right-hand corner of the ListBox above) and created two ObjectDataSources, both bound to the same PersonDataSource.  The odsList binds to the Read() method (in the base class) and the odsDetails binds to the CreateItem, ReadItem, UpdateItem and DeleteItem methods shown in the first image above.  Attempting to bind to the generic base-class methods generated errors noting this was not permitted, so I created the wrapper methods and all was well.

Note:  When creating the ObjectDataSources using the wizard, VS2010 RC doesn't like to show all of the available classes the first time you select the dropdown list.  I had to show the list, close everything back to the control, and then restart the process; VS will then consistently show all classes, permitting you to access the PersonDataSource, which resides in a separate project (below) - feature or bug?

ObjectDataSource DELETE BUG:   **UPDATED**
Unlike its Insert and Update counterparts, the Delete method would consistently send the PersonDataSource DeleteItem method an empty item (line 20 in the first image).  It had been a while since I'd worked with ObjectDataSources, so I figured maybe I was missing something, but after a few hours of research and trying different things I came to the conclusion it was a bug.

My work-around was to tap into the odsDetails_Deleting event and manually instantiate the PersonDataSource, grab the selected value and call the DeleteItem() method on it.  This results in the PersonDataSource being called twice, with the second result (the empty value) being ignored by the base classes.
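The work-around looks roughly like the following sketch (the event wiring and the control supplying SelectedValue are assumptions based on the description above):

```csharp
// Sketch of the work-around: perform the delete ourselves in the Deleting
// event, since the ObjectDataSource hands DeleteItem an empty item.
protected void odsDetails_Deleting(object sender, ObjectDataSourceMethodEventArgs e)
{
    var dataSource = new PersonDataSource();
    var selectedId = lstPersons.SelectedValue;   // hypothetical control name

    var item = new List<EHRPerson>(dataSource.Read())
        .FirstOrDefault(p => p.ID.Equals(selectedId));

    if (item != null)
        dataSource.DeleteItem(item);

    // The ObjectDataSource still calls DeleteItem with an empty item;
    // the base classes ignore that second (empty) call.
}
```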

Posted by Microsoft on 2/23/2010 at 1:19 PM
The behavior you are seeing is actually the way that data controls in ASP.NET work. Our controls basically build an object from the controls that are on the page. In the case of an update there are controls on the page (TextBox, CheckBox, etc) that when the page is posted back retain their state. We take the values of these and build a new object from that state. In the case of a delete operation since there are no controls that retain state between pages we do not have the data necessary to reconstruct the object.

You can run into a similar problem with update operations if you only have controls for SOME of the properties. The properties for which you do not have controls will not have their state retained.

To work around this there is a DataKeys property on the data control. You can specify a list of all the columns that you would like persisted by specifying them in this property. We keep this state all the time so it allows you to control how much state you want us to keep between postbacks.
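On a DetailsView (or GridView) the DataKeys list Microsoft describes is specified with the DataKeyNames attribute; a sketch using the columns from this sample (control IDs are illustrative):

```aspx
<%-- Persist all entity columns across postbacks so Delete receives a populated object --%>
<asp:DetailsView ID="dvDetails" runat="server"
    DataSourceID="odsDetails"
    DataKeyNames="ID,Name,Organization,PersonType,ProfessionalTraining" />
```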

Head Banger

The following property in my DataContextBase (GWN.Contrib.Azure) was the source of hours of lost work.  Instead of using it, I created an IQueryable<TItem> List<TItem>() method in the base classes to handle its functionality - this worked great and the coding went on.  When I was done I reset the development storage, which effectively wiped out all of the existing data, and my application would not recreate the table (ResourceNotFound error)!

public IQueryable<TItem> List
{
    get
    {
        // The string name must match the property name
        return this.CreateQuery<TItem>("List");
    }
}

The moral of the story is that the above property is important to the CreateTablesFromModel method (in the DataSourceBase static constructor).
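In other words, CreateTablesFromModel reflects over the context's IQueryable<T> properties to discover table names, so removing the List property removed the table from the model.  A sketch of the provisioning call in the static constructor (using PersonDataContext from this series as an illustrative context type):

```csharp
// Sketch: CreateTablesFromModel discovers tables by reflecting over the
// context's IQueryable<T> properties - hence the List property matters.
static DataSourceBase()
{
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    CloudTableClient.CreateTablesFromModel(
        typeof(PersonDataContext),
        account.TableEndpoint.AbsoluteUri,
        account.Credentials);
}
```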

Note: currently the base classes swallow errors (they won't crash the application).  I have TODO: statements where I will log the errors to Azure, since preliminary research indicates this is the only way our application can communicate issues to us.  Work in progress - I have Blobs to work on!


Azure error "Resource not found" - catch all error?

by 13. February 2010 23:05

Microsoft Bug Report  ID: 533542
Title: Linq query against TableStorageRoleProvider results in InvalidOperation if role is not found

If you ended up here looking for a reason for your ResourceNotFound error, your search may have just started.  I've encountered this after having my Cloud project settings configured for "DataConnection" while the app was looking for "DataConnectionString", and after not deriving my table class from TableServiceEntity.  Today I'm getting it because of an InvalidOperationException.

With a basic understanding under my belt of adding a table and record to Azure, it is time to build base classes for CRUD operations.  The perfect starting point for this seems to be the PersonalWebSite application from the Windows Azure Platform Training Kit, as it holds the code for all basic operations.  I loaded the solution and executed it, only to receive the following message:

Figure A. 

I find, as I step through the code, that Global.asax attempts to access the roles Administrators, Friends and admin - if they don't exist it creates them.  THE PROBLEM IS that if the role doesn't exist, the LINQ query (Figure C, line 480) throws an actual InvalidOperationException - NOT a DataServiceClientException as expected.  To verify this I clicked on the ASP.NET Configuration button (arrow in Figure B; it is configured in Web.Config to use TableStorageRoleProvider) and added the Administrators role - it worked as expected and then crashed on the Friends role.

Figure B.  
Figure C.

It appears that DataServiceClientException is derived from InvalidOperationException, but in our case the error is an actual InvalidOperationException as shown below - the code will never return false, which would allow continued operation of the program so that it could create the role.

With the above problem resolved I ran into authentication errors.  This led to other errors, which I confirmed to be incompatibilities with the Visual Studio 2010 RC.

An excerpt follows:

Note The samples included with the Windows Azure SDK will not run on the Microsoft Visual Studio 2010 CTP. New samples for the Microsoft Visual Studio 2010 CTP are available for download. These samples run only in the Microsoft Visual Studio 2010 environment.

We'll see if they run in the Visual Studio 2010 RC environment.


From: Chris Mullins []
Sent: Tuesday, March 30, 2010 5:12 PM
Subject: Suggestion:Azure error

This is related to your post at:

A workaround I've been using has been to append "&& true" to the end of the query.  This allows the following code to work as you would expect rather than throwing.

internal UnclaimedTokenEntity TryGetUnclaimedToken(string token)
{
    // Note the '&& true' hack at the end. Ick. Without this, an exception is thrown if the row
    // doesn't exist.
    var result = from c in this.UnclaimedTokensTable
                 where c.PartitionKey == token && c.RowKey == token && true
                 select c;

    return result.FirstOrDefault();
}

I hope this helps you a little bit…

Chris Mullins



Azure error "Non-static method requires a target." on SaveChanges()

by 12. February 2010 10:11

Living on the bleeding edge has more rewards (coding gets easier and easier) than injury, but today I had to do a wee bit of bleeding: numerous hours spent on my first stab at saving a Person record in the Azure cloud.  It wasn't the "ResourceNotFound" errors that caused the most blood loss; those were attributable to simple mistakes on my part, such as configuring DataConnection in the Cloud settings while using DataConnectionString in code, and not deriving my Person class from TableServiceEntity.

It was the obscure "Non-static method requires a target" error, after finally getting to the point where I could submit my changes via SaveChanges(), that had an artery going; I wasn't sure what it meant or how to stop the bleeding.  Since I had just upgraded to Visual Studio 2010 RC (from Beta 2, which caused me to lose all of my UML diagrams, as they are not compatible with RC), I couldn't be totally sure that the problem wasn't associated with the upgrade.

To get past these, I found it easiest to locate the most basic lesson I could find in the Azure training package and ensure it ran successfully; I then copied and pasted the code into my solution, but it didn't resolve this error.

Bing'ing the issue revealed enough information to learn that others have experienced it when they have a property that is based on an interface or that returns null (reflection crashes and burns without much indication why).  Since the Microsoft HealthVault SDK Person class has deep levels of derived base classes, I decided at this stage of the game to go with a new EHRPerson POCO and see if I could successfully save a record.

This solved the problem!  With "Create" nicely out of the way, it is time to move on to the other CRUD operations.  The bleeding has stopped (for now).

Default.aspx.cs (code behind) 

using System;
using System.Collections.Generic;
using System.Data.Services.Client;
using System.Linq;
using EHREntities.ServiceContext;
using GWN.EHR.Entities;
using Microsoft.WindowsAzure;

namespace GWN.EHR.CloudWebRole
{
    public partial class _Default : System.Web.UI.Page
    {
        CloudStorageAccount account = null;
        PersonDataContext context = null;

        protected void Page_Load(object sender, EventArgs e)
        {
            account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
            context = new PersonDataContext(account.TableEndpoint.ToString(), account.Credentials);

            txtID.Text = Guid.NewGuid().ToString();
        }

        protected void Button1_Click(object sender, EventArgs e)
        {
            var statusMessage = String.Empty;
            try
            {
                context.AddPerson(new EHRPerson
                {
                    Name = txtName.Text,
                    ID = txtID.Text,
                    Organization = txtOrganization.Text,
                    PersonType = txtPersonType.Text,
                    ProfessionalTraining = txtTraining.Text,
                });
                context.SaveChanges();
            }
            catch (DataServiceRequestException ex)
            {
                statusMessage = ex.Message;
            }
        }

        public void GetList()
        {
            lstPersonList.DataSource = context.PersonList.ToList<EHRPerson>();
            lstPersonList.DataBind();
        }
    }
}


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.StorageClient;

namespace GWN.EHR.Entities
{
    /// <summary>
    /// Azure table entity representing a person record.
    /// </summary>
    public class EHRPerson : TableServiceEntity
    {
        public EHRPerson()
        {
            PartitionKey = "Person";

            // Row key allows sorting, so we make sure the rows come
            // back in time order.
            RowKey = string.Format("{0:10}_{1}",
                DateTime.MaxValue.Ticks - DateTime.Now.Ticks,
                Guid.NewGuid());
        }

        public string ID { get; set; }
        public string Name { get; set; }
        public string Organization { get; set; }
        public string ProfessionalTraining { get; set; }
        public string PersonType { get; set; }

        public override string ToString()
        {
            return string.Format("{0}: {1}, {2}", Name, Organization, ID);
        }
    }
}


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.Health.ItemTypes;
using GWN.EHR.Entities;

namespace EHREntities.ServiceContext
{
    public class PersonDataContext : TableServiceContext
    {
        public IQueryable<EHRPerson> PersonList
        {
            get
            {
                // The string name must match the table name
                return this.CreateQuery<EHRPerson>("PersonList");
            }
        }

        public PersonDataContext(string baseAddress, StorageCredentials credentials)
            : base(baseAddress, credentials)
        {
        }

        public void AddPerson(EHRPerson person)
        {
            this.AddObject("PersonList", person);
        }
    }
}


using System.Linq;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using EHREntities.ServiceContext;

namespace GWN.EHR.CloudWebRole
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // For information on handling configuration changes
            // see the MSDN topic on RoleEnvironment events.
            RoleEnvironment.Changing += RoleEnvironmentChanging;

            // This code sets up a handler to update CloudStorageAccount instances when their corresponding
            // configuration settings change in the service configuration file.
            CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
            {
                // Provide the configSetter with the initial value
                configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

                RoleEnvironment.Changed += (anotherSender, arg) =>
                {
                    if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                        .Any((change) => (change.ConfigurationSettingName == configName)))
                    {
                        // The corresponding configuration setting has changed, propagate the value
                        if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                        {
                            // In this case, the change to the storage account credentials in the
                            // service configuration is significant enough that the role needs to be
                            // recycled in order to use the latest settings. (for example, the
                            // endpoint has changed)
                            RoleEnvironment.RequestRecycle();
                        }
                    }
                };
            });

            // Create data tables from the PersonDataContext model.
            // It is recommended the data tables should be only created once. It is typically done as a
            // provisioning step and rarely in application code.
            var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

            // dynamically create the tables
            CloudTableClient.CreateTablesFromModel(typeof(PersonDataContext),
                                                   account.TableEndpoint.AbsoluteUri, account.Credentials);

            return base.OnStart();
        }

        private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
        {
            // If a configuration setting is changing
            if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
            {
                // Set e.Cancel to true to restart this role instance
                e.Cancel = true;
            }
        }
    }
}



Azure - the OutputPath property is not set for this project

by 31. January 2010 12:10

I'm not sure what I did to start receiving the error "the OutputPath property is not set for this project", but I suspect it had something to do with me trying to move the EHRDataProviderRole out of the root into a peer folder with the CloudDataProvider (which I just learned is referred to as a Database - I had assumed it to be a provider).  This did not go well, so I had to recreate the EHRDataProviderRole (back in the root) and then manually attach it to the database again.

The fix actually came from the following link Jamie Laflen MSFT

The only difference was that I had both a Debug and a Release configuration.  I also had <OutputPath>bin\Debug\</OutputPath> in both locations.  I changed "bin" to a period, as shown by the two arrows below, and then all was well.

Tip: I started by right clicking on the project and selecting "Unload".  I then right clicked on the unloaded project and selected "Edit" as shown in the bottom pane. 




Blog videos and references to CodePlex projects are no longer valid