05 December 2006

First steps in Workflow Foundation

I started to play with Windows Workflow Foundation (WF) a few days ago. So far it seems very interesting, shipping with handy pre-built activities, persistence and tracking services, pretty designers for Visual Studio, the capability of hosting those designers anywhere else, web services interaction, etc. Promising.

I just want to take down some notes here so I don't forget what I've learned.

Some Material

Some Concepts
  • Windows Workflow Foundation (WF): Set of components, tools, and a designer that developers can use to create and implement workflows in .NET Framework applications. It is part of the Microsoft .NET Framework version 3.0.
  • Workflow: a set of activities that are stored as a model that describes a real-world process. A workflow is designed by laying out activities.
  • Activity: A step in a workflow. The unit of execution, re-use and composition for a workflow.
  • Types of workflows (Here is a post about how to decide which type to use):
    • Sequential: Consists of activities that execute in a predefined order. Has a clear direction of flow from top to bottom, although it can include loops, conditional tests, and other flow-control structures.
    • State-Machine: Consists of states and transitions that change a workflow instance from one state to another. Although there is an initial state and a final state, the states have no fixed order, and an instance can move through the workflow in one of many paths.
    • Data-Driven: Is usually a sequential workflow that contains constrained activity groups and policies. In a data-driven or rules-based workflow, rules that check external data determine the path of a workflow instance. The constrained activities check rules to determine the activities that can occur.

The framework component model

The WF framework consists of three assemblies, containing the following namespaces:

  • System.Workflow.Activities
    • System.Workflow.Activities: Defines activities that can be added to workflows to create and run an executable representation of a work process.
    • System.Workflow.Activities.Configuration: Provides classes that represent sections of the configuration file.
    • System.Workflow.Activities.Rules: Contains a set of classes that define the conditions and actions that form a rule.
    • System.Workflow.Activities.Rules.Design: Contains a set of classes that manage the Rule Set Editor and the Rule Condition Editor dialog boxes.
  • System.Workflow.ComponentModel
    • System.Workflow.ComponentModel: Provides the base classes, interfaces, and core modeling constructs that are used to create activities and workflows.
    • System.Workflow.ComponentModel.Compiler: Provides infrastructure for validating and compiling activities and workflows.
    • System.Workflow.ComponentModel.Design: Contains classes that developers can use to build custom design-time behavior for workflows and activities and user interfaces for configuring workflows and activities at design time. The design-time environment provides systems that enable developers to arrange workflows and activities and configure their properties. The classes and interfaces defined within this namespace can be used to build design-time behavior for activities and workflows, access design-time services, and implement customized design-time configuration interfaces. It includes the TypeBrowserEditor, a cool feature that can be reused.
    • System.Workflow.ComponentModel.Serialization: Provides the infrastructure for managing the serialization of activities and workflows to and from extensible Application Markup Language (XAML) and CodeDOM.

  • System.Workflow.Runtime
    • System.Workflow.Runtime: Classes and interfaces that control the workflow runtime engine and the execution of a workflow instance.
    • System.Workflow.Runtime.Configuration: Classes for configuring the workflow runtime engine.
    • System.Workflow.Runtime.DebugEngine: Classes and interfaces for use in debugging workflow instances.
    • System.Workflow.Runtime.Hosting: Classes that are related to services provided to the workflow runtime engine by the host application.
    • System.Workflow.Runtime.Tracking: Classes and an interface related to tracking services.
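To tie these pieces together, here is a minimal hosting sketch, assuming the .NET Framework 3.0 runtime components are installed. The HelloWorkflow type and its single activity are invented for illustration; a real workflow would normally be built in the designer.

```csharp
using System;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// A trivial sequential workflow with a single CodeActivity (hypothetical example).
public class HelloWorkflow : SequentialWorkflowActivity
{
    public HelloWorkflow()
    {
        CodeActivity hello = new CodeActivity("hello");
        hello.ExecuteCode += delegate { Console.WriteLine("Hello from WF"); };
        this.Activities.Add(hello);
    }
}

class Program
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += delegate { done.Set(); };
            runtime.WorkflowTerminated += delegate { done.Set(); };

            WorkflowInstance instance = runtime.CreateWorkflow(typeof(HelloWorkflow));
            instance.Start();
            done.WaitOne(); // block the console host until the instance finishes
        }
    }
}
```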

WiX install sequence

I am trying to figure out the exact behavior of the install sequence in a setup, as defined in a WiX .wxs file. The confusion comes from the presence of both the InstallUISequence and the InstallExecuteSequence tags, leaving aside the fact that there are actually four tags related to the action sequence (AdminUISequence and AdminExecuteSequence are used in administrative installs). As the WiX tutorial says, "InstallExecuteSequence is always consulted by the installer to determine the actions, InstallUISequence is only considered when the installer runs in full or reduced UI mode." So, if the installer executes without a user interface, the order is determined by the InstallExecuteSequence and no doubts remain. But if it executes in full or reduced UI mode, both sequences are consulted. So, what is the resulting order of execution? If an action is placed in both sequences, is it executed twice?

As for the execution order, what I suppose happens is that all actions in the InstallUISequence run first (since they gather the information required for the installation) and then those from the InstallExecuteSequence. But it is possible that some actions need to execute before any UI dialog, in which case we should place them in both sequences.

If an action is placed in both sequences, it will apparently be executed twice. At least one exception to this rule is the AppSearch action. The schema reference says: "AppSearch should be authored into the InstallUISequence table and InstallExecuteSequence table. The installer prevents the AppSearch action from running in the InstallExecuteSequence sequence if the action has already run in the InstallUISequence sequence."

For custom actions, the Execute attribute can be used to prevent the double execution.
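For example, something along these lines (action and property names are invented for illustration; Execute="firstSequence" tells the installer to run the action only in whichever sequence runs first):

```xml
<CustomAction Id="SetInstallDefaults" Property="INSTALLDEFAULTS"
              Value="1" Execute="firstSequence" />

<InstallUISequence>
  <Custom Action="SetInstallDefaults" After="AppSearch" />
</InstallUISequence>

<InstallExecuteSequence>
  <Custom Action="SetInstallDefaults" After="AppSearch" />
</InstallExecuteSequence>
```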

07 November 2006

Isolating Office Extensions with the COM Shim Wizard

This article explains the importance of using a shim when implementing managed extensions for Office, and it also provides a wizard to easily generate the shim from the managed extension. Highly recommended.

Problem with NAnt's xmlpoke task

There's an issue when trying to replace some part of an XML file that is under a default namespace using NAnt's xmlpoke task. It happened to me the other day when trying to replace the Version attribute in the following WiX file:

<?xml version="1.0" encoding="utf-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2003/01/wi">
<Product Name="MyProduct" Id="847A8D24-0F98-4b9f-AEA0-070ABD49F86C" Language="1033" Codepage="1252" Version="1.0.0" Manufacturer="MyCompany">

Because of the namespace definition in the Wix element, calling xmlpoke simply as
<xmlpoke file="${wix.file}" xpath="/Wix/Product/@Version" value="${build.new.version}"/>
does not work.

What does work is the following use of the task:

<xmlpoke file="${wix.file}" xpath="/wx:Wix/wx:Product/@Version" value="${build.new.version}">
<namespace prefix="wx" uri="http://schemas.microsoft.com/wix/2003/01/wi" />
</xmlpoke>


NAnt Contrib is a project that contributes many additional tasks to NAnt. Two of the most useful tasks for me are the trycatch and choose tasks.
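A sketch of what a trycatch block looks like in a build file (the messages and the property name are mine):

```xml
<trycatch>
  <try>
    <echo message="Attempting the risky step" />
    <fail message="Something went wrong" />
  </try>
  <catch property="failure">
    <echo message="Recovered from: ${failure}" />
  </catch>
  <finally>
    <echo message="This always runs" />
  </finally>
</trycatch>
```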

27 August 2006

MSDN Conference's Material

The presentation from the conference about "Testing and quality assurance with Visual Studio Team System" that we gave at Microsoft last Thursday can be downloaded from Rodolfo's blog.

20 August 2006

How to add a custom rule to VSTS Code Analysis

The code analysis feature that ships with Visual Studio Team System (VSTS), FxCop, comes with a rich API for writing custom rules. Since the API is almost undocumented, I struggled for a while before I got my custom rules to work. That's why I've decided to write this short guide on how to make your own rules. (It is required that you have the VSTS version of Visual Studio 2005 installed.)

  1. Create a new class library project.
  2. Add references to the Microsoft.cci.dll and FxCopSdk.dll assemblies (typically installed in C:\Program Files\Microsoft Visual Studio 8\Team Tools\Static Analysis Tools\FxCop).
  3. Create your custom rule class and make it derive from BaseIntrospectionRule.
    • Make the constructor of your class call its base constructor, which takes three parameters: the rule's name, the name of the rule descriptor file (see step 4), and the assembly that contains the rule. Note that if your project will consist of more than a single rule, it is convenient to write your own base class for all the rules to avoid repeating code.
    • I took the base class of the FxCop design rules as an example:
      using System;
      using Microsoft.Cci;
      using Microsoft.FxCop.Sdk;
      using Microsoft.FxCop.Sdk.Introspection;

      namespace Microsoft.FxCop.Rules.Design {

          internal abstract class DesignIntrospectionRule
              : BaseIntrospectionRule {

              protected DesignIntrospectionRule(string name)
                  // Second argument: resource name of the embedded rules xml
                  // (namespace + file name, without the extension).
                  : base(name, "Microsoft.FxCop.Rules.Design.DesignRules",
                         typeof(DesignIntrospectionRule).Assembly) {
              }

              public override void AfterAnalysis() {
                  base.AfterAnalysis();
              }
          }
      }
    • Override one of the Check methods to implement your custom rule's logic. Here's a snippet of one of the design rules that overrides the Check(TypeNode) method. Other Check overloads are available for checking members, types, modules, etc. If the target being analyzed violates the rule's condition, a new Problem is added to the Problems collection, and this collection is returned by the Check method. The GetResolution method allows you to fetch a resolution from the rule's XML file.

      internal sealed class AbstractTypesShouldNotHaveConstructors
          : DesignIntrospectionRule {

          public AbstractTypesShouldNotHaveConstructors()
              : base("AbstractTypesShouldNotHaveConstructors") {
          }

          public override ProblemCollection Check(TypeNode type) {
              if (!type.IsAbstract) {
                  return null;
              }
              for (int i = 0; i < type.Members.Length; i++) {
                  InstanceInitializer initializer = type.Members[i]
                      as InstanceInitializer;
                  if ((initializer != null) && initializer.IsPublic) {
                      Resolution resolution = base.GetResolution(new string[]
                          { type.Name.Name });
                      Problem problem = new Problem(resolution);
                      base.Problems.Add(problem);
                  }
              }
              return base.Problems;
          }
      }

    • Optionally, override the BeforeAnalysis or AfterAnalysis methods if you need to execute something before or after the analysis process, respectively.
  4. Add the rule's .xml descriptor file to the project as an embedded resource (so it is not copied to the target directory). The file's name (including its namespace, and without the extension) must match the second parameter of the BaseIntrospectionRule constructor. It seems that there's no schema available for the XML, but here's a portion of the DesignRules descriptor:

    <Rules FriendlyName="Design Rules">
      <Rule TypeName="AbstractTypesShouldNotHaveConstructors">
        <Name>Abstract types should not have constructors</Name>
        <Description>Public constructors for abstract types do
        not make sense because you cannot create instances of
        abstract types.</Description>
        <Resolution>Change the accessibility of all public constructors
        in '{0}' to protected.</Resolution>
        <MessageLevel Certainty="95">CriticalWarning</MessageLevel>
        <Owner />
      </Rule>
    </Rules>

  5. Integrate the custom rules with Visual Studio's code analysis.
    • In order to register your rules with VS, you just need to copy the assembly containing them to the rules directory (C:\Program Files\Microsoft Visual Studio 8\Team Tools\Static Analysis Tools\FxCop\Rules).
    • All the rules contained in the assembly will be enabled by default.
    • You can now simply run the code analysis over a project to evaluate it with your own rules.

01 August 2006

MSDN Conference

This month I will take a small part in an MSDN conference at Microsoft Argentina. Along with Diego Gonzalez and Rodolfo Finochietti, I will speak about testing and quality assurance with Visual Studio Team System (VSTS).

It is the first time I will be on the speakers' side of the blackboard at a Microsoft conference, so I am very enthusiastic about it. I hope to see everyone there. It is on August 24th, and here's the link to register.

04 July 2006

Starting development in .Net Framework 3.0

Here's how to start developing .NET Framework 3.0 applications.

For any doubts about OS and framework compatibility, read the article Deploying .Net Framework 3.0.

What's needed for developing .NET 3.0 applications (taken from here):

  • .Net Framework 3.0 Runtime Components (already ship with Windows Vista and Longhorn)
  • Windows SDK: includes the documentation, samples, tools and build environments to develop Windows applications - either native or .NET Framework 3.0.
  • [Optional] Visual Studio 2005
  • [Optional] "Orcas" .NET Framework 3.0 Development Tools: development tools that work with Visual Studio 2005 and provide functionality such as XAML Intellisense support through schema extensions for the editor, project templates for the Windows Presentation Foundation and Windows Communication Foundation namespaces and .NET Framework 3.0 SDK documentation integration.
  • [Optional] Visual Studio 2005 Extensions for Windows Workflow Foundation: extensions to Visual Studio 2005 that include project templates, IntelliSense support for the Workflow namespace (System.Workflow) and integrated documentation.

.Net 3.0 Glossary

This is just a summary of the new terms incorporated by the third generation of the .Net Framework.

The new managed-code programming model for Windows.
It's basically the sum of the .NET Framework 2.0 plus some new technologies.
It will ship with Windows Vista.
It is available for Windows XP with SP2 and for Windows Server 2003.
Here's the .Net Framework 3.0's diagram (stolen from MSDN):

  • Authentic, Energetic, Reflective, Open (AERO): Windows Vista's look and feel. Refers back to the UX Guidelines.
  • Card Space (ex InfoCard): Unified Digital Identity.
  • eXtensible Application Markup Language (XAML): XML-based language used to code WPF's object model.
  • Microsoft Expression: multimedia software package for graphic designers. It ships with three applications:
    • Acrylic Graphic Designer: design tool that handles both bitmap and vector graphics.
    • Sparkle Interactive Designer: XAML-based; the MS alternative to Macromedia Flash.
    • Quartz Web Designer: MS FrontPage's replacement.
  • ORCAS: codename for the next version of Visual Studio.
  • Team Foundation Server (TFS): workflow collaboration engine that enables the use of a team's customized process, as well as a centralized data warehouse that collects real-time intelligence on project history.
  • The LINQ Project: codename for a set of extensions to the .NET Framework that encompass Language-INtegrated Query, set, and transform operations. It extends C# and Visual Basic with native language syntax for queries and provides class libraries to take advantage of these capabilities.
  • User Experience (UX) Guidelines (UX Guide): Contain information on what’s new in Windows Vista, design principles, guidelines for controls, text, windows, and aesthetics.
  • Windows Communication Foundation (WCF) (codename Indigo): unified set of technologies to build service-oriented applications.
  • Windows Presentation Foundation (WPF) (codename Avalon): unified presentation subsystem for building rich Windows user interfaces.
  • Windows Vista (codename Longhorn): major Windows operating system release.
  • Windows Workflow Foundation (WF): programming model, engine and tools for quickly building workflow enabled applications.

Dependencies Metrics

In his paper about Stability (1997), Robert C. Martin describes a set of principles and metrics that can be used to measure the quality of a large object-oriented project in terms of the interdependence between its packages.

The metrics

  • Abstract types: The number of abstract types contained in the assembly.
  • Total types: The number of total types contained in the assembly.
  • Abstractness (A): The ratio of abstract types in the assembly to the total number of types. A = abstract types / total types.
  • Efferent Couplings (Ce): The number of types inside the assembly that depend upon types outside the assembly.
  • Afferent Couplings (Ca): The number of types outside the assembly that depend upon types within the assembly.
  • Instability (I): I = Ce / (Ce + Ca). A measure of how easy it is to change the assembly.
  • Distance from the Main Sequence (D): in an abstractness vs. instability space, the distance from the assembly's (I, A) position to the Main Sequence (the line of maximum balance between abstractness and instability). D = |A + I - 1| / sqrt(2).
  • Normalized Distance (D'): the above distance normalized to the range [0,1]. D' = |A + I - 1|.
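The metrics above can be sketched in code (the helper names are mine; the raw type and coupling counts would come from inspecting the assemblies with a tool):

```csharp
using System;

static class DependencyMetrics
{
    // Instability: I = Ce / (Ce + Ca)
    public static double Instability(int ce, int ca)
    {
        return (double)ce / (ce + ca);
    }

    // Abstractness: A = abstract types / total types
    public static double Abstractness(int abstractTypes, int totalTypes)
    {
        return (double)abstractTypes / totalTypes;
    }

    // Normalized distance from the Main Sequence: D' = |A + I - 1|
    public static double NormalizedDistance(double a, double i)
    {
        return Math.Abs(a + i - 1);
    }
}

class Program
{
    static void Main()
    {
        double i = DependencyMetrics.Instability(3, 1);  // 3 outgoing, 1 incoming coupling
        double a = DependencyMetrics.Abstractness(1, 4); // 1 abstract type out of 4
        // a + i = 1, so this assembly sits exactly on the Main Sequence:
        Console.WriteLine(DependencyMetrics.NormalizedDistance(a, i));
    }
}
```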
The Principles
  • Stable Abstractions Principle (SAP): the direction of a reference must be from a less abstract assembly to a more abstract one.
  • Stable Dependencies Principle (SDP): the direction of a reference must be from a less stable assembly to a more stable one.

<- The figure on the left shows a bad dependency (the red arrow): As1 depends on As2, but As2 is less abstract than As1, and more unstable too.

-> The figure on the right shows a good dependency because it goes in the direction of abstraction (As1 is more abstract than As2) and also in the direction of stability (As1 is more stable -less unstable- than As2).

Why use them

  • They are easy to calculate (tools to do the job already exist, at least for Java).
  • They are widely used and approved.
  • They help a lot in determining the project's health in terms of its dependencies, although it is possible that not ALL references judged bad under these principles are REALLY bad references.
  • They allow you to continuously monitor the quality aspects of code that can affect the long-term viability of your software architecture.
Tools for calculating them

For Java:
For .Net:

20 June 2006

Testing Data Access Layers (DALs)

It seems that testing DALs is harder than I thought. One of the problems I have run into is how to preserve DB integrity. Some of the alternatives are:

  1. Create mock objects that replace the database.
  2. Have a separate DB for testing.
  3. Restore the DB on every test run.
  4. Run the test inside transactions and then rollback.
Another concern is: how must the tests be designed so that their success or failure depends only on the code and not on the state of the data?
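Option 4 can be sketched with System.Transactions (available since .NET 2.0). The DAL calls are left as comments, since they depend on your own code:

```csharp
using System;
using System.Transactions;

class Program
{
    static void Main()
    {
        // Run the test body inside a transaction and never commit it.
        using (TransactionScope scope = new TransactionScope())
        {
            // ... exercise the DAL here (inserts, updates, deletes) ...
            // ... assertions about the resulting data state go here too ...

            // scope.Complete() is deliberately NOT called, so disposing the
            // scope rolls everything back, leaving the database untouched.
        }
        Console.WriteLine("rolled back");
    }
}
```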

Related Links

Testing Frameworks:

Unit Testing in .Net Projects
by Jay Flowers

Transaction approach:

Simplify Data Layer Unit Testing using Enterprise Services
Alternative Testing Frameworks

Mock Objects:

Mock Objects to the Rescue! Test your .NET code with NMock
Mocks Aren't Stubs by Martin Fowler

Testing sequences (useful for CRUD operations):

Advanced Unit Testing, Part III - Testing Processes by Marc Clifton

26 May 2006

DSL Tools for VS2005

This is a guide for Building a DSL Designer using the DSL Tools for VS2005.


These components must be installed:

  • Visual Studio 2005
  • Visual Studio 2005 SDK
  • DSL Tools for Visual Studio 2005

1. Create a new project in VS2005

The project template must be: Other Project Types -> Extensibility -> Domain Specific Language Designer.

Choose a template for the designer.

Note that VS has created a solution with 2 projects: Designer and DomainModel.

A DSL Designer consists of 3 components:

1. Domain elements (Domain Model)
2. Notational elements
3. Mapping between Domain and Notational elements

2. The Domain Model

The Domain Model is in the DomainModel.dsldm file in the DomainModel project. You can edit it by dragging and dropping the Domain Model Designer Tools from the Toolbox.

The Model contains classes and relationships. Classes may contain value properties.

3. The Notation

The Notation is represented in an xml file. The default description is in the file Designer.dsldd in the Designer Project.

The notation can include shapes, which appear in the toolbox and can be dragged onto the design surface, connectors for the shapes, etc.

4. Domain-Notation Mapping

This is also defined in the dsldd file. In general the mapping is from shapes and connector lines to classes and relationships, respectively. So when a user creates shapes and connectors in the designer, he also creates the classes and relationships in the domain.

5. Generate code and build

Click on the "Transform All Templates" button in the Solution Explorer Toolbar.
Build the solution.
Press Ctrl+F5 to run the designer.

6. Consuming the DSL

Templates consist of some directives followed by a mixture of literal blocks and control blocks. Literal blocks are just plain text in the template that you want to pass straight through to the output. Control blocks are things with some kind of <# #> marker around them.

The content of these blocks in your template contributes to a class which the templating system generates. This class derives from the abstract class Microsoft.VisualStudio.TextTemplating.TextTransformation and overrides the abstract method TransformText which, when executed, writes out the desired output of the transformation of the template. If you do nothing else in your template, this method will write out all of the text in literal blocks by simply writing out the raw text using a simple WriteLine()-style statement. The base class provides this WriteLine method (in various flavors) as well as Error and Warning methods that you can use in your custom template code.
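For instance, a tiny template mixing both kinds of blocks might look like this (purely illustrative):

```t4
<#@ template language="C#" #>
A literal line, copied straight to the output.
<# for (int i = 0; i < 2; i++) { #>
Control blocks can repeat literal text: item <#= i #>
<# } #>
```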

7. Deployment

Add a Setup package to build a deployable designer.

DSL - A first approach

This is a summary of Martin Fowler's article Language Workbenches: The Killer-App for Domain Specific Languages?, which I recently read and found very interesting.


Language Workbench: IDE tooling that supports Language Oriented Programming.

Language Oriented Programming: a style of development built around the idea of constructing software from a set of Domain Specific Languages.

Domain Specific Language (DSL): Limited form of computer language designed for a specific class of problems.


Imagine you build a class library to solve some particular problem, and the objects involved are parameterizable. You then need a first step to set up configuration and wire up composite objects before putting them to work.

When building this kind of class library, there is a marked distinction between the abstraction itself and the configuration. The abstraction may be reusable and change less often, while the configuration tends to be specific, simpler, and to change more often.

This leads to the idea of moving the configuration out of the code, for example into XML files or some custom-syntax file (which may be more readable). The structure of this configuration, the mapping to the objects in the abstraction, the parameters, etc., are nothing but a kind of Domain Specific Language: a very small programming language, suitable for the sole purpose of solving some specific problem.

If we leave the configuration in the code, instead of moving it to a configuration file, we still have a case of DSL, a DSL embedded in the host language. So here comes the distinction between internal and external DSL.

External DSL: DSL written in a different language than the main language of an application.

Internal DSL: DSL written in the same language as the main language of an application.

Language Oriented Programming

Language oriented programming is about describing a system through multiple DSLs. It does not have to be a black or white thing; you can represent little or a lot of functionality of your system in DSLs.

Pros and Cons of Language Oriented Programming

External DSL

Pros:

  • Parameters can be changed without recompiling. Evaluated at runtime.
  • Free choice of format / grammar.

Cons:

  • A translator needs to be built.
  • Lack of symbolic integration.
  • Refactoring will not propagate to the DSL.
  • Lack of a sophisticated editor.
  • Language cacophony: a new language must be learned.
  • Difficulty of designing DSLs.

Internal DSL

Pros:

  • Symbolic integration.
  • The host language IDE can be used.
  • Easy refactoring.
  • Sophisticated editors.

Cons:

  • Recompilation is required when making changes (in most languages).
  • Format / grammar constrained by the host language.

Language Workbenches

Language Workbenches are complex tools that help implementing DSLs. They are based on the same model as post-IntelliJ IDEs, which consists of having four different representations of the code:

Abstract representation: In-memory representation. Helps with things like name completion and refactoring. It is the key persistent source that you manipulate (through the editable representation). It can persist incomplete or contradictory information.

Editable representation: Projection of the abstract representation in order to edit it. It does not have to be complete. There can be multiple projections, one for each aspect.

Storage representation: Serialization of the abstract representation, often as XML.

Executable representation: The CLR byte code. A code generator turns the abstract representation into the executable representation.

So, when defining a new DSL you need to:

  • Define the abstract syntax, the schema of the abstract representation.
  • Define an editor to let people manipulate the abstract representation through a projection.
  • Define a generator to translate the abstract representation into an executable representation. In practice the generator defines the semantics of the DSL.

04 May 2006

Configuration Files & NUnit

When using an application configuration file (App.config) in a class library project tested with NUnit, the following post-build event is needed to copy the config file to the target directory:
copy "$(ProjectDir)App.config" "$(TargetPath).config"
Once this is done, we can access any configuration item through the System.Configuration API.
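For example (the key name is hypothetical; for missing keys the indexer simply returns null):

```csharp
using System;
using System.Configuration; // requires a reference to System.Configuration.dll

class Program
{
    static void Main()
    {
        // Reads <appSettings><add key="connectionName" value="..." /></appSettings>
        // from the .config file copied by the post-build event.
        string value = ConfigurationManager.AppSettings["connectionName"];
        Console.WriteLine(value ?? "(not set)");
    }
}
```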

08 March 2006

Generating Documentation for .NET Project

Suppose we need to generate documentation for a .NET project in the Microsoft HTML Help style, better known as .chm files. Different tools can be used depending on what is being documented.

Documenting code

We can use XML documentation by placing comments inside XML tags (like <summary> or <remarks>) within lines beginning with three slashes (///). For a complete tag reference, see Tags for Documentation Comments.

The C# compiler processes the documentation comments in the code into an XML file. This happens during the build if the documentation XML file was specified in the project's build properties. (Notice that a lot of documentation warnings will show up if the warning level is set to 4 in the project properties.)
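For instance, comments like the following are pulled into the XML file at build time (the class and method are invented for illustration):

```csharp
using System;

/// <summary>Small math helpers used to illustrate XML documentation comments.</summary>
public static class MathHelpers
{
    /// <summary>
    /// Adds two integers.
    /// </summary>
    /// <param name="a">The first addend.</param>
    /// <param name="b">The second addend.</param>
    /// <returns>The sum of <paramref name="a"/> and <paramref name="b"/>.</returns>
    public static int Add(int a, int b)
    {
        return a + b;
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(MathHelpers.Add(2, 3));
    }
}
```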

Now, once we have the XML file, we want to process it in order to generate easy-to-read documentation. Luckily, there is a free tool that does just that: NDoc. NDoc provides several output formats, including the MSDN one.

Including custom content in the chm file

If we need to include custom content in the chm file, other than the code documentation generated by NDoc, we have to write it in HTML format and then compile it into the chm file. The HTML Help SDK can be used for this last purpose. The SDK ships with a tool, the HTML Help Workshop (\hhw.exe), that is used to create HTML Help projects.

An HTML Help project consists of three parts:

  • The Project itself (.hhp file)
  • The Table of Contents (.hhc file)
  • The Index (.hhk file)

Documenting XML schemas

It seems that documenting XML schemas is a slightly more complicated task. First we need to place comments inside the schema, and then we need to generate the HTML Help from them.

In order to place documentation inside an XML schema, the annotation component can be used.
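For example, a minimal annotated element might look like this (the element and its text are invented):

```xml
<xs:element name="Customer" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:annotation>
    <xs:documentation>Represents a single customer record.</xs:documentation>
  </xs:annotation>
  <xs:complexType>
    <xs:attribute name="Name" type="xs:string" />
  </xs:complexType>
</xs:element>
```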

But again, a way of generating the HTML Help from the documented schema is needed. This time I have found no free tool to do it. I did find a nice tool, Document!X, but it is neither free nor open source. So, I decided to generate the MSDN-style documentation myself by transforming the .xsd file into an HTML file through a style sheet. The generated HTML file is later included in the chm file like any other custom content.
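A minimal sketch of such a style sheet, which just dumps every xs:documentation element into an HTML page (a real one would need much more structure):

```xml
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xsl:template match="/">
    <html>
      <body>
        <h1>Schema documentation</h1>
        <!-- One paragraph per xs:documentation element found in the schema -->
        <xsl:for-each select="//xs:annotation/xs:documentation">
          <p><xsl:value-of select="." /></p>
        </xsl:for-each>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```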



06 March 2006

Blog's Birth

My blog is born at last!

What is this blog about? This is a developer's blog, so only technical issues will be posted. The idea is to use this blog as a sort of knowledge base, just to write down the daily little research and findings, for others to read and for me to remember. I definitely needed a place to store all my writings and links and stuff, so I am very happy about this beginning and hope to keep it updated.

Of course, everyone is welcome to contribute by posting comments or contacting me.