Archive for the '.NET' Category

Jun 08 2012

CRM2011 Automated Testing – Integration Testing

Published under .NET, General, Microsoft CRM

Integration testing is concerned with testing customisations within the context of CRM to ensure that the various components work together correctly. In my current work this mostly relates to custom plugins in CRM and custom integration with SharePoint. The key point is that with this type of testing I am not concerned with the user interface, but with the underlying processes.

In many ways the integration tests are very similar to the unit tests: they are code-based tests created using the Gallio/MbUnit framework. However, instead of testing code in isolation and mocking dependencies, the integration tests exercise whole systems. This is achieved by making create, update, delete, etc. calls to the web service to simulate user actions.

Test setup and tear-down code can become lengthy, as you need to set up the scenarios under test, including creating test records and enabling/disabling plugins to allow actions to be performed that may otherwise be blocked.

It is also during integration testing that I introduce testing as specific users to exercise security-related constraints. This is easily achieved in CRM, as connections to the Organization Service support impersonation.
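As a minimal sketch of that impersonation, setting CallerId on an OrganizationServiceProxy makes subsequent calls run as the specified user (the service URL, credentials and user id here are placeholders for illustration):

```csharp
using System;
using System.Net;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk.Client;

var credentials = new ClientCredentials();
credentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;

// Placeholder organization service URL and test user id
var serviceUri = new Uri("http://crmserver/MyOrg/XRMServices/2011/Organization.svc");
var testUserId = new Guid("00000000-0000-0000-0000-000000000000"); // systemuser id to impersonate

using (var proxy = new OrganizationServiceProxy(serviceUri, null, credentials, null))
{
    // All calls through this proxy now execute as the impersonated user,
    // so that user's security roles and privileges are what get evaluated.
    proxy.CallerId = testUserId;

    // ... perform create/update/delete calls for the test as that user ...
}
```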

Integration tests are also useful where a user has identified an issue with a plugin in a scenario you hadn’t thought of. I tend to use this as a trigger to create a new integration test to match that case, and then one or two more boundary cases around it.

As these tests require a connection to, and interaction with, CRM, they can be slow to run. As a result, I tend to run them in TeamCity, where they can be left to run on their own, or I will run specific tests in order to help track down an issue. Whilst setting them up may seem to take a long time relative to “productive development” time, they soon become invaluable for regression testing as new functionality is developed or existing code is refactored to improve performance, etc.


namespace Tests.Plugin.AddressTests
{
    public class AddressTests
    {
        #region Setup/Teardown

        [SetUp]
        public void BeforeEachTest()
        {
            _testEntity = createEntityForTests();
        }

        [TearDown]
        public void AfterEachTest()
        {
            if (_testEntity != null)
            {
                // delete _testEntity here to clean up the test records
            }
        }

        #endregion

        #region Instance variables

        protected CrmEntity _testEntity;
        protected static readonly string ExpectedExceptionMessageContains = "This record already has an active";

        #endregion

        [Test]
        public void TestMultipleMailingAddressesCanBeUnchecked()
        {
            // create the test addresses with the address plugin disabled
            setAddressPluginState(PluginStepState.Disabled);
            var address1 = CustomerAddressBuilder.NewMailingAddress(_testEntity.ToEntityReference(), "Mailing 1");
            var address2 = CustomerAddressBuilder.NewMailingAddress(_testEntity.ToEntityReference(), "Mailing 2");
            var address3 = CustomerAddressBuilder.NewMailingAddress(_testEntity.ToEntityReference(), "Mailing 3");
            setAddressPluginState(PluginStepState.Enabled);

            address1.C5_Mailing = false;
            // ... update the addresses and assert the expected outcome
        }

        #region Protected Methods

        protected enum PluginStepState
        {
            Enabled = 0,
            Disabled = 1
        }

        protected void setAddressPluginState(PluginStepState state)
        {
            using (var context = ServicesFactory.CrmContext())
            {
                var steps = from step in context.CreateQuery<SdkMessageProcessingStep>()
                            where step.Name.Contains("Address")
                            select new SdkMessageProcessingStep { SdkMessageProcessingStepId = step.SdkMessageProcessingStepId, Stage = step.Stage };

                foreach (var step in steps)
                {
                    EntityReference entityRef = step.ToEntityReference();

                    SetStateRequest req = new SetStateRequest();
                    req.EntityMoniker = entityRef;
                    req.State = new OptionSetValue((int)state);
                    req.Status = new OptionSetValue(-1);

                    SetStateResponse resp = (SetStateResponse)context.Execute(req);
                }
            }
        }

        #endregion
    }
}

In this example you can see that the test disables a plugin, sets up some test address records, re-enables the plugin and then executes the test. After the test is complete, it cleans up the test records.



Jun 07 2012

CRM2011 Automated Testing – Unit Testing

Published under .NET, Microsoft CRM

Ideally the code being tested should not rely on external resources; where such dependencies exist they should be mocked to isolate the code being tested from the external system. In the case of plugins, this means that testing units of code should not require a connection to an actual instance of CRM.

CRM2011 has improved the way that plugins are developed against CRM, especially with the introduction of the CrmSvcUtil.exe tool for creating early bound classes for entities in CRM that can be used alongside the CRM SDK assemblies. Additionally Microsoft has provided interfaces for accessing CRM and data provided by CRM to the plugin:

  • IOrganizationService
  • IPluginExecutionContext
  • ITracingService

This means it is simple to mock the CRM services to test the plugin in isolation.

However, mocking a whole execution context would be a lengthy exercise, especially as the IPluginExecutionContext object differs depending on which messages and which pipeline stages you are writing the plugin for.

Thankfully there is a project on Codeplex that makes this job simple: CrmPluginTestingTools. There are two parts to this tool:

  • A plugin that you register in CRM as you will register your final plugin (including configuring pre- and post-images) that serialises the plugin execution context to xml files.
  • A library of classes for de-serialising the xml files into objects that match the IPluginExecutionContext interface.

Once you’ve registered their plugin, perform the steps that your plugin will handle; grab the output xml files; head into Visual Studio and start crafting your unit tests.

I tend to use the de-serialized xml files in my unit tests as a starting point and then use the object model to update the IPluginExecutionContext objects to structure the data for the specific test.

The execution context is only half of the story; I also use Moq to create a mock IOrganizationService that the code being tested is given instead of an actual connection to CRM. By doing this I am able to isolate the code from the CRM server, plus I can determine what will be returned without having to set up records in CRM. I can even mock any type of exception if I want to test that aspect of my code.
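A short sketch of what that mocking can look like – the entity name, attribute and message are illustrative assumptions, not taken from the plugin under test:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Moq;

var service = new Mock<IOrganizationService>();

// Return a canned record instead of querying CRM
var account = new Entity("account") { Id = Guid.NewGuid() };
account["name"] = "Test Account";
service.Setup(s => s.Retrieve("account", It.IsAny<Guid>(), It.IsAny<ColumnSet>()))
       .Returns(account);

// Simulate a server-side failure to exercise the plugin's error handling
service.Setup(s => s.Update(It.IsAny<Entity>()))
       .Throws(new InvalidPluginExecutionException("Simulated CRM failure"));

// service.Object is the IOrganizationService handed to the code under test
```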

As with all my automated tests, the unit tests are built on top of the Gallio/MbUnit testing framework. This means that I don’t need to think about how to run the tests; I know they will all work in Visual Studio, in the Gallio Test Runner – for a more graphical UI on my workstation – and on the TeamCity continuous integration server.

Overall Process

Of course, the plugins I write are not single-class monsters; instead I break the code up into logical units (classes/methods) of responsibility and I test each unit on its own. If a unit of functionality doesn’t require access to CRM or the plugin execution context, then I don’t worry about mocking those for that specific test.

Solution Structure

This solution contains several projects, these being:

  • Crm2011.Common

This contains the early-bound classes generated by CrmSvcUtil.exe, which are used by several solutions.

  • Crm2011.Plugins.Address

The plugin that is being written/tested.

  • Crm2011.Plugins.DeploymentPackage

The project for deploying the plugin into CRM. The steps and images registered in this project should match how the CrmPluginTestingTools plugin was registered.

  • Plugin.Unit.Tests

This contains the unit tests created for testing the plugin, as well as the serialized execution contexts used by the unit tests. This project also contains a .gallio file that is the project used to run the tests in Gallio; and a .msbuild file that is the MsBuild project used to compile and run the tests in TeamCity.

  • SerializePluginContext & TestPlugin

The CrmPluginTestingTools projects for de-serializing the xml execution contexts in the unit tests.


public class CompanyAddressPreDelete
{
	MyServiceProvider serviceProvider;
	IPluginExecutionContext pluginContext;

	[SetUp]
	public void Setup()
	{
		var context = TestContext.CurrentContext.Test.Metadata["Context"][0];
		var service = new Mock<IOrganizationService>();
		var assemblyPath = typeof(IndividualAddressPreCreateTests).Assembly.Location;
		var contextFile = Path.Combine(Path.GetDirectoryName(assemblyPath), @"Address\Contexts\Company\Delete\" + context + ".xml");
		serviceProvider = new MyServiceProvider(service.Object, contextFile);
		pluginContext = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
	}

	[Test, Metadata("Context", "PreOperation-BothTicked-Active")]
	public void both_active_should_clear_parent()
	{
		var plugin = new Crm2011.Plugins.Address.PreAddressDelete();
		plugin.Execute(serviceProvider);

		Assert.Exists(pluginContext.SharedVariables, sv => sv.Key == "AddressChanges", "Address Changes not registered in SharedVariables");
		var changes = new SharedVariables(pluginContext.SharedVariables).AddressChanges;
		Assert.Count(2, changes);
		Assert.Exists(changes, x => x.Address == AddressType.Mailing && x.Change == ChangeType.Remove);
		Assert.Exists(changes, x => x.Address == AddressType.Invoice && x.Change == ChangeType.Remove);

		Assert.Exists(pluginContext.SharedVariables, sv => sv.Key == "MakeChanges", "Make Changes not registered in SharedVariables");
	}
}
The above code is a single test fixture (a class that contains unit tests), with a SetUp method that is run before each test and a single unit test. I have added a Metadata attribute to the test method so the setup method can determine which execution context xml file should be de-serialized for the test. The first two lines of the test set up and execute the plugin; the rest of the test ensures that the SharedVariables on the execution context have been correctly set for the given data.

The following example indicates how the organization service can be mocked to ensure that the plugin is making the correct call to CRM:

public class CompanyPostDeleteTests
{
	Mock<IOrganizationService> service;
	MyServiceProvider serviceProvider;
	IPluginExecutionContext pluginContext;

	[SetUp]
	public void Setup()
	{
		service = new Mock<IOrganizationService>();
		var assemblyPath = typeof(IndividualAddressPreCreateTests).Assembly.Location;
		var contextFile = Path.Combine(Path.GetDirectoryName(assemblyPath), @"Address\Contexts\Company\Delete\PostOperation.xml");
		serviceProvider = new MyServiceProvider(service.Object, contextFile);
		pluginContext = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
	}

	[Test]
	public void deleted_mailing_and_invoice_address_should_clear_account_twice()
	{
		var parentId = Guid.NewGuid();
		pluginContext.SharedVariables.Add("MakeChanges", true);
		pluginContext.SharedVariables.Add("AddressChanges", "Mailing:Remove|Invoice:Remove");
		pluginContext.PreEntityImages.Add("Image", CreateTestAddress(parentId));

		var plugin = new PostAddressDelete();
		plugin.Execute(serviceProvider);

		service.Verify(s => s.Update(It.Is<Entity>(e =>
			 e.Id == parentId &&
			 e.LogicalName == "account" &&
			 e.Attributes.Contains("address1_line1") && (string)e["address1_line1"] == ""
		)), Times.Once());

		service.Verify(s => s.Update(It.Is<Entity>(e =>
			e.Id == parentId &&
			e.LogicalName == "account" &&
			e.Attributes.Contains("address2_line1") && (string)e["address2_line1"] == ""
		)), Times.Once());
	}
}

By using unit testing in isolation from CRM, it is possible to create the vast majority of a plugin without having to touch the CRM server itself (with the exception of creating the serialized test execution contexts). Not only that, once you have created the unit tests, you can run them again and again without having to manually click through CRM, so if you make changes to entities in CRM you can easily re-run CrmSvcUtil.exe to update your early-bound classes and then run the unit tests to verify that they still pass.



Jun 06 2012

CRM2011 Automated Testing – Tool Stack

Published under .NET, Microsoft CRM

To mark my return to blogging, I’ve decided to write a series of blog posts around automated testing in CRM2011.  This is part 1, an overview of the types of tests I’m creating and the tools I’m using for creating and running the tests.  Future posts will look at each of the types of tests in turn and look at how I’m using TeamCity to automate these tests.

The automated testing in place for CRM2011 is broken down into the following three types of tests:

  • Unit Tests
    • These are small, fast tests that are written in code and are designed to test isolated pieces of code.
  • Integration Tests
    • These tests are for complete systems or sub-systems, e.g. a plugin running in the context of CRM.  However, they are still code based and are tested without user interface interaction.
  • User Interaction Tests
    • These are tests of the system from the perspective of the user.  They are based on a simulation of the user clicking and entering values in the browser.

Testing Components

The basic components being used for automated testing of CRM are as follows:

  • Gallio/MbUnit
    • Used for creating the test classes and the test runner used for executing the tests.
  • CrmPluginTestingTools
    • This is used to create copies of actual plugin execution contexts from CRM.  These are then used in plugin unit tests outside of CRM.
  • Moq
    • Used for creating mock copies of external systems, e.g. CRM Web Services, so that unit tests do not require access to CRM.
  • WatiN
    • This is a web browser runner that simulates a user clicking in the browser.
  • TestDriven.Net
    • This integrates testing functionality into Visual Studio, making it quicker and easier to write and run tests from within Visual Studio without having to rely on an external application.
  • TeamCity
    • This is an automated continuous integration system that is used to both compile the source code and run unit tests.  It has also been configured to create a backup of the customisation solution from CRM into Subversion.

Tool Stack

The following diagram illustrates the tool stack used for each of the types of tests, with the system under test at the top of the stack and the host environment at the bottom:

All of the tests can be run in either Visual Studio or TeamCity, and all of the tests are based on the Gallio/MbUnit test framework.  In addition, the unit tests use Moq and CrmPluginTestingTools to allow them to run in isolation; the integration tests simply call straight to the CRM Web Services for both setting up the test and confirming the results; and the user interaction tests use WatiN to control IE to run the test.



Oct 06 2011

Update SharePoint List ContentType Names

Published under .NET

I know it’s been a while since I last posted but this is going to be short and sweet.  I’ve just been working on a SharePoint 2010 site where I’ve needed to change the names of site content types and have the changes pushed down to a list that already uses those content types.

The content types are deployed using a solution but that’s where the automation ends.  However, this is where PowerShell steps up to the mark.  With PowerShell we can quickly automate just about anything in SharePoint.

Here’s the script I came up with:

$web = Get-SPWeb        # pass the url of the target site here
$list = $web.Lists["TargetList"]
$list.ContentTypes | ForEach {
    $_.Name = $_.Parent.Name
    $_.Update()
}


Nov 17 2009

WSPBuilder Project Migration Errors

Published under .NET, Programming

Today I tried migrating a Visual Studio WSPBuilder project from one machine to another.  The originating machine was 32bit Windows Server 2008; the destination machine was 32bit Windows Server 2003 Standard.

To migrate the project I simply zipped the solution folder, copied it across to the destination machine, unzipped it and opened it in Visual Studio.

However, when I came to build the SharePoint solution file using Tools -> WSPBuilder -> Build WSP, I got the following error in the output window:
“Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection”

When Googling the error message, the only result was on the WSPBuilder CodePlex site and was about 64bit versions of the CabLib.dll assembly.  As neither of my machines is 64bit, I figured this wasn’t the issue.

I’m not sure exactly what was causing the problem, but this was the solution:

  • Open the project folder in Explorer
    • Delete the bin folder
    • Delete the obj folder
    • Delete the wsp file in the project folder
  • Re-open the project in Visual Studio
    • Compile the project
    • Build the WSP (Tools -> WSPBuilder -> Build WSP)

After this everything was back to normal.


May 15 2009

VisendoSMTPExtender Management Web Service

Published under .NET, Programming, Software

For work, I’m using a Windows 2008 Server virtual machine for doing all my SharePoint and .Net development.  As it has got all of the cool stuff I’m working on, it is also the machine that I use to demo what I can do to clients.  Recently I’ve had a bit of a surge in the number of clients wanting to see Nintex Workflow 2007 (NWF2007) for SharePoint.

One of the cool features of NWF2007 is the Lazy Approval system, whereby users don’t have to go into SharePoint to approve or decline requests; they can just reply to the notification email with “approved”, “declined”, “ok”, “yes”, “no” or any other recognised word as the first line of the email.  In order to demonstrate this I needed to set up an email system on my local machine.  The SMTP (sending) side of things is easy, as it is built in to Windows 2008.  However, POP3 is a bit of a problem.  Previous versions of IIS had a simple POP3 service, but that has been dropped in IIS7.  The Microsoft way would be to install Exchange Server, but that is a little too heavyweight for what I am trying to achieve.  Luckily a company called Visendo provides a free solution to plug the gap, so now I can demo the Nintex notification features.

Another feature I also wanted to demonstrate was setting up Active Directory accounts and then using those new accounts.  Nintex has actions that allow you to interact with Active Directory, but to then do anything useful with the account required modifying xml config files and restarting the Visendo service.  Nintex can call web services, though, so I’ve created a web service that has AddAccount and DeleteAccount methods to update the Visendo configuration and restart the service.
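The overall shape of such an ASMX service can be sketched as follows – the method signatures, parameter names and namespace are illustrative assumptions; the actual implementation is in the VisendoSMTPService download linked below:

```csharp
using System.Web.Services;

[WebService(Namespace = "http://example.com/VisendoSMTPService")]
public class AccountService : WebService
{
    [WebMethod]
    public void AddAccount(string emailAddress)
    {
        // add the account to the Visendo xml configuration,
        // then restart the Visendo service so it picks up the change
    }

    [WebMethod]
    public void DeleteAccount(string emailAddress)
    {
        // remove the account from the configuration and restart the service
    }
}
```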


I’ve made the source code for this web service freely available should anyone else want this sort of functionality: VisendoSMTPService.  The code is written against .Net 3.5, is provided “as is” with no warranty of any sort, and is most definitely NOT recommended for live systems.  The code is released under a BSD License.


Mar 23 2009

ScriptManager vs ClientScript

Published under .NET, General, Programming

As part of a Microsoft Dynamics CRM customisation project, I was recently tasked with modifying an ISV add-in page so that individual sections of the page could be refreshed without having to refresh the entire page.

That in itself was nice and easy; it just required a couple of UpdatePanel and UpdateProgress controls to be dropped on the page, along with a ScriptManager control to handle the partial postback requests.

However, there were a couple of gotchas that I didn’t foresee.  Along with the partial updates, the client wanted the sections of the page to load, after the base page had first loaded, in the following order:

1. Load base page.
2. Start load of Application Status section.
3. Start load of Risk section.
4. Start load of Alerts section.

Unfortunately, the ScriptManager that comes with ASP.Net AJAX only handles one postback at a time.  Any subsequent postback takes precedence over earlier requests, which is fine if you only have one update panel on a page, but here it initially resulted in only the Alerts section being loaded – the other postbacks being cancelled.

Luckily I found a very useful post on Amit’s Blog: Asp.Net Ajax UpdatePanel Simultaneous Update.  This post provides code that queues postback requests.  Ideally I would have liked to have the requests happening at the same time, but this is the next best thing.

The other problem I found was that one of the sections on the page had previously been dynamically creating and registering script that got loaded and executed when the page loaded, using:

    Page.ClientScript.RegisterStartupScript(this.GetType(),
        "ShowNavBarItem", script.ToString());

Unfortunately that doesn’t work with the AJAX ScriptManager.  Instead I had to change the code to use the ScriptManager to load and execute the script using the following:

    ScriptManager.RegisterStartupScript(this,
        typeof(UpdatePanel), "showNavBarItems", script.ToString(), true);

This means that each time the section is updated I can re-generate the script and it will be executed again.

With this AJAX loading and some enhancements to the SQL code, users now get a much faster load and response time for the page, as they no longer have to wait for the whole page to be rendered before seeing information appear on the page.



Feb 11 2009

Microsoft Enterprise Search Roadmap

Published under .NET, Programming, Software

When I first started developing with SharePoint in June last year, the last thing on my mind was Enterprise Search.  I had considered it the sort of technology where you just plug in a magic box and it does all the work for you.

Recently, however, I have been involved in developing a customised search solution using Microsoft Office SharePoint Server (MOSS) to allow a client to search client and non-client related documentation within their organisation.  What I rapidly discovered is that enterprise search is not a plug-and-play affair.  A lot of thought needs to go into the metadata that is used to define the taxonomy of the data (in this case, documents), into how users are going to interact with search, and into how to make sure they get the information they need.

I have also recently been on a training course at FAST Search, which was acquired by Microsoft in April 2008.  The course was both an introduction to the structure of FAST ESP and an in-depth look at customising the internals, both feeding content into the indexing engine of FAST ESP and building a rich user experience for getting content from the FAST ESP search engine.

The two activities have really awakened me to how powerful enterprise search can be in empowering users to find information which previously they either may not have known how to access or, more likely, simply hadn’t known existed.  Whilst I was learning about FAST Search, it was generally anticipated that it would be included with the next generation of MOSS.  Today that was confirmed at FASTForward ’09, when Microsoft announced its roadmap for enterprise search.  The roadmap has two initial streams: firstly, FAST Search for Internet Business, which is mainly aimed at internet retail businesses – the kind of search you’d use to find products on Amazon.  The second, and more interesting for me, stream is FAST Search for SharePoint, which will integrate FAST Search more closely with SharePoint and would be used for the type of internal information discovery that I have been working on recently.

Mark Harrisson also noted on his blog that Microsoft is going to be offering ESP for SharePoint immediately, which is

a special offering that allows customers to purchase high-end search capabilities today, with a defined licensing path to FAST Search for SharePoint when it becomes available.

I haven’t been able to find out more information about ESP for SharePoint, but it certainly looks like it could be an interesting product to get hold of.  In the meantime I’m keen to continue working with MOSS Enterprise Search and have just the right project lined up to flex my new-found love of search on.


Aug 11 2008

Try…Catch For No Reason

Published under .NET, Programming

I’ve seen this time and time again, and I’m sure just about every developer out there has seen the same sort of thing:

try {
    //many lines of code
}
catch (Exception ex) {
    throw new Exception("Something went wrong dude!");
}
This is probably the single most unhelpful piece of code a developer can write.  All you are doing is making your life, and future developers’ lives, harder when it comes to debugging.  The whole point of the try…catch block is for times when you know an exception may happen, and it allows you to handle it gracefully without the whole system crashing to the ground.
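By contrast, a sketch of a catch block that adds context without destroying the evidence (the exception types and message here are illustrative):

```csharp
using System;

try
{
    // many lines of code
}
catch (InvalidOperationException ex) // catch only what you can meaningfully handle
{
    // Wrap with context, passing the original exception as the InnerException
    // so the real message and stack trace survive for debugging.
    throw new ApplicationException("Failed while processing the request", ex);
}

// ...or, if you only need to log before letting the exception bubble up,
// rethrow with a bare "throw;" – it preserves the original stack trace,
// whereas "throw ex;" resets it.
```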

So let’s have a closer look at what’s wrong with this code:

Continue Reading »


Jul 14 2008

SharePoint Event Receiver Manager

Published under .NET, Programming, Software

One of the things I’m working on is Event Receivers in SharePoint, but I’ve found registering the event receivers to be a bit on the annoying side.  Yes, you can do it with the feature.xml Receivers section, and there is the great El Blanco Event Receivers Manager.

I personally prefer desktop GUI applications and so, based on code by Liron, I’ve created my own desktop Event Receiver Manager.  It allows you to select a site & list, browse for an assembly so it can give you the full assembly name and a list of the classes inside, and pick from a list of receivers you can attach to.

The current version only allows adding new event receivers.  Before adding a new receiver it will check to make sure the same receiver doesn’t already exist.

Future plans for it include listing existing receivers to allow users to delete them, but as I don’t need to do that myself yet I haven’t implemented it.

Download a copy of EventReceiverManager.

