Posts From January 2013 - Musing, Rants & Jumbled Thoughts


This article is one in my series focusing on what I learned creating a Windows Installer using the WiX toolkit’s "Managed Bootstrapper Application" (Custom .NET UI) framework. Each post builds upon the previous posts, so I would suggest you at least skim through the earlier articles in the series.

Here’s an index into the articles in this series:

Running the action

Once the Engine's planning phase is complete (see previous post), we can call Engine.Apply(IntPtr.Zero) to execute the action. Note: the parameter to Apply() is a window handle. You could also pass a handle to your UI window, but getting that value is beyond what I want to get into here. I believe this handle is used by Windows when prompting for UAC, but everything works fine with IntPtr.Zero. The Apply action will provide status via events as well:

BootstrapperApplication.ExecuteMsiMessage += EventProviderOnExecuteMsiMessage; 
BootstrapperApplication.Progress += EventProviderOnProgress; 
BootstrapperApplication.ExecuteComplete += EventProviderOnExecuteComplete; 
BootstrapperApplication.ExecuteFilesInUse += EventProviderOnExecuteFilesInUse; 
BootstrapperApplication.ExecutePackageBegin += EventProviderOnExecutePackageBegin; 
BootstrapperApplication.ExecutePackageComplete += EventProviderOnExecutePackageComplete; 
BootstrapperApplication.ApplyComplete += BootstrapperOnApplyComplete;

this.Engine.Apply(IntPtr.Zero); 

The ExecuteMsiMessage callback provides non-user-friendly messages from the MSI engine, which I just log. I use ExecuteComplete to set my percentage variables to 100% (since the last progress event might only be at 95% or so), and ExecutePackageBegin and ExecutePackageComplete to set a UI label saying "Currently installing XXX package".
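For example, here's a minimal sketch of my ExecuteComplete handler, assuming the same progress properties that the Progress handler later in this post updates (those property names are my own conventions, not part of the WiX API):

private void EventProviderOnExecuteComplete(object sender, ExecuteCompleteEventArgs executeCompleteEventArgs)
{
    // force the progress displays to 100%, since the last Progress event
    // may have stopped short of it
    CurrentComponentProgressPercentage = 100;
    OverallProgressPercentage = 100;
}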

The ExecutePackageBegin event includes the packageId for the package currently being installed. You'll need to use that to look up the Name, etc, from the Package models you generated previously (see Part 2 of this series).

private void EventProviderOnExecutePackageBegin(object sender, ExecutePackageBeginEventArgs executePackageBeginEventArgs) 
{
    var inFlightPkgId = executePackageBeginEventArgs.PackageId;
    var inFlightPkg = BundlePackages.FirstOrDefault(pkg => pkg.Id == inFlightPkgId);

    if (inFlightPkg == null)
    {
        CurrentlyProcessingPackageName = string.Empty;
    }
    else
    {
        CurrentlyProcessingPackageName = inFlightPkg.Name;
    }
} 

While the action is being performed, progress will be provided via events, which tell you which package is currently being installed, percentage completion, etc. One note: if a rollback is initiated due to a failure or error during the install, the progress percentages will decrease as the rollback occurs. Also note that this is the best place to handle user cancellations (see the section below on that topic).

private void EventProviderOnProgress(object sender, ProgressEventArgs progressEventArgs) 
{ 
    //update local properties (which are likely bound to a ProgressBar or something)
    CurrentComponentProgressPercentage = progressEventArgs.ProgressPercentage; 
    OverallProgressPercentage = progressEventArgs.OverallPercentage;

    //... handle user cancellations here

} 

For ExecuteFilesInUse, I haven't actually been successful in forcing this error, but here's an example of the code I have to handle it:

private void EventProviderOnExecuteFilesInUse(object sender, ExecuteFilesInUseEventArgs executeFilesInUseEventArgs) 
{
    var message = new StringBuilder("The following files are in use. Please close the applications that are using them.\n");
    foreach (var file in executeFilesInUseEventArgs.Files) 
    { 
        message.AppendLine(" - " + file); 
    }

    var userButton = MessageBox.Show(message.ToString(), "Files In Use", MessageBoxButton.OKCancel, MessageBoxImage.Warning);

    if (userButton != MessageBoxResult.OK)
        executeFilesInUseEventArgs.Result = Result.Cancel;
} 

When the apply action is complete, it will fire the ApplyComplete handler. This is where you can set status, etc. Here is what my method looks like:

private void BootstrapperOnApplyComplete(object sender, ApplyCompleteEventArgs applyCompleteEventArgs) 
{ 
    BootstrapperApplication.ApplyComplete -= BootstrapperOnApplyComplete;

    //using "ActionResult" property to store the result for use
    // when I call Engine.Quit()

    if (applyCompleteEventArgs.Status >= 0)
    {
        ActionResult = ActionResult.Success;
    }
    else
    {
        ActionResult = ActionResult.Failure;
    }
} 

Finishing it out

When you're ready to close down the app, even if you haven't actually done anything (user cancellation, etc), you'll need to call Engine.Quit() with one of these ActionResult values (hint: you'll need to cast the enum to an int):

  • ActionResult.Success: The action was run and was successful.
  • ActionResult.UserExit: The action was cancelled by the user, and therefore was unsuccessful.
  • ActionResult.Failure: The action had errors and was unsuccessful.
  • ActionResult.NotExecuted: No actions were performed.

Be careful that you return the correct value, as it affects the return code of the executable (which the caller may be monitoring in a scripted install), how the Add/Remove Programs list shows (or doesn't show) your app, whether locally cached copies of your installer are deleted, etc. An incorrect return value may result in an Add/Remove Programs listing that never gets removed on uninstall, or never shows up in the first place, among other oddities.

Note too that you'll need to handle closing your UI windows, etc, on your own -- calling Engine.Quit() does not close your UI or terminate your threads, etc.

Engine.Quit((int) ActionResult.Success);
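Putting those last two notes together, here's a minimal sketch of a shutdown helper. It assumes a WPF UI with a _mainWindow field and the ActionResult property set in the ApplyComplete handler above -- both of those are my own conventions, not part of the WiX API:

private void ShutdownApplication()
{
    // close the UI on its own thread; Engine.Quit() won't do this for us
    _mainWindow.Dispatcher.Invoke(new Action(() => _mainWindow.Close()));

    // report the final result to the engine, casting the enum to an int
    this.Engine.Quit((int)this.ActionResult);
}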

Canceling while action is being performed

Most of the WiX events have an EventArgs parameter that includes a Result property. This can be used to cancel asynchronous engine operations. For instance, the progress event fires frequently, so it is a good candidate. If the user clicks your cancel button during the install, you can set Result = Result.Cancel on the next event to signal the engine to stop and initiate the rollback process. This assumes you provide the user with a cancel button. You'll also want to monitor whether the user closes your UI window using the "X" button in the top right corner.

private void EventProviderOnProgress(object sender, ProgressEventArgs progressEventArgs) 
{ 
    //.... 
    if (_userHasCancelled) 
        progressEventArgs.Result = Result.Cancel; 
} 
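As for the "X" button, a minimal sketch (assuming a WPF window and the _userHasCancelled flag used above) is to treat the window's Closing event as a cancel request:

private void OnMainWindowClosing(object sender, System.ComponentModel.CancelEventArgs e)
{
    // flag the cancellation; the next Progress event will pass Result.Cancel to the engine
    _userHasCancelled = true;
}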


This article is one in my series focusing on what I learned creating a Windows Installer using the WiX toolkit’s "Managed Bootstrapper Application" (Custom .NET UI) framework. Each post builds upon the previous posts, so I would suggest you at least skim through the earlier articles in the series.

Here’s an index into the articles in this series:

Detecting current state

In order to determine which, if any, of the bundled packages and features, as well as the bundle itself, are already installed on the system, and at what versions, we'll ask the WiX engine to detect current state by calling Engine.Detect() on the BootstrapperApplication base class. Note this is an asynchronous process, so the call will return immediately and we'll get our results via a series of events. As such, before calling Engine.Detect(), we need to register our event handlers for the detect events (these are also on the base class). Note: you may not need both DetectPackageComplete and DetectRelatedMsiPackage, depending on your needs.

//
// Call Engine.Detect, asking the engine to figure out what's on the machine.
// The engine will run async and use callbacks for reporting results.
//

// This is called when a related (e.g. previously installed) bundle is detected
BootstrapperApplication.DetectRelatedBundle += HandleExistingBundleDetected;

// This is called when detection of a package in the bundle completes (reports its current state)
BootstrapperApplication.DetectPackageComplete += SetPackageDetectedState;

// This is called when a related MSI package (e.g. an already-installed version of a package) is detected
BootstrapperApplication.DetectRelatedMsiPackage += HandleExistingPackageDetected;

// This is called when a Feature in the bundle's packages is detected
BootstrapperApplication.DetectMsiFeature += SetFeatureDetectedState;
BootstrapperApplication.DetectComplete += DetectComplete;
BootstrapperApplication.Detect();

As the engine determines that a package or feature from our bundle is already on the system, it will fire the associated event. The specialized EventArgs parameter provided to each event handler will have the packageId (and featureId for features) and the current state. As each of these fires, you'll want to search the package and feature models you've built to find the package/feature identified by the event args, and set the current state enum on each based on the incoming value.

Note: in these examples, the "BundlePackages" variable is a reference to the collection of Package model objects I suggested you create in Part 3 (Context Data)

private void HandleExistingPackageDetected(object sender, DetectRelatedMsiPackageEventArgs e)
{
    string existingPackageProductCode = e.ProductCode;

    RelatedOperation actionToBeAppliedToExistingPackage = e.Operation;
    string existingPackageId = e.PackageId;
    Version existingPackageVersion = e.Version;

    //update your model objects here (search models by PackageId)
}

private void HandleExistingBundleDetected(object sender, DetectRelatedBundleEventArgs e)
{
    Version existingBundleVersion = e.Version;
    string existingBundleProductCode  = e.ProductCode;
    RelatedOperation actionToBeAppliedToExistingBundle = e.Operation;

    //update your model object here
}

/// <summary>
/// when engine detects a package, populate the appropriate local objects,
/// including current installed state of the package on the system
/// </summary>
private void SetPackageDetectedState(object sender, DetectPackageCompleteEventArgs args)
{
    var package = BundlePackages.FirstOrDefault(pkg => pkg.Id == args.PackageId);
    PackageState currentState = args.State;
    package.CurrentInstallState = currentState;
}

/// <summary>
/// when engine detects a feature, populate the appropriate local objects,
/// including current installed state of the package on the system
/// </summary>
private void SetFeatureDetectedState(object sender, DetectMsiFeatureEventArgs args)
{
    var package = BundlePackages.FirstOrDefault(pkg => pkg.Id == args.PackageId);
    var feature = package.AllFeatures.FirstOrDefault(feat => feat.Id == args.FeatureId);
    FeatureState currentState = args.State;

    feature.CurrentInstallState = currentState;
}

Below are the values for RelatedOperation:

public enum RelatedOperation
{
    None,

    /// <summary>
    /// The related bundle or package will be downgraded.
    /// </summary>
    Downgrade,

    ///<summary>
    /// The related package will be upgraded as a minor revision.
    ///</summary>
    MinorUpdate,

    ///<summary>
    /// The related bundle or package will be upgraded as a major revision.
    ///</summary>
    MajorUpgrade,

    ///<summary>
    /// The related bundle will be removed.
    ///</summary>
    Remove,

    ///<summary>
    /// The related bundle will be installed.
    ///</summary>
    Install,

    ///<summary>
    /// The related bundle will be repaired.
    ///</summary>
    Repair,
};

Below are the values for FeatureState. For the most part, you'll likely only care about Unknown (state not yet discovered), Absent (not installed) and Local (installed).

public enum FeatureState
{
    Unknown, 
    Absent,  
    Advertised, 
    Local, 
    Source,
}

Below are the values for PackageState. For the most part, you'll likely only care about Unknown (state not yet discovered), Absent (not installed) and Present (installed)

public enum PackageState
{
    Unknown,
    Obsolete,
    Absent,
    Cached,
    Present,
    Superseded,
}

When the Detect action is complete, it will fire the DetectComplete handler, where you'll want to perform whatever UI actions you require before moving forward with the installation, such as prompting the user if they want to run a "typical" or "custom" install, or if packages are already installed, asking if they want to fully uninstall or just add/remove features. Here's what my method looks like:

    /// <summary>
    /// Once the engine completes the Detect phase, unregister event handlers,
    /// release the main thread and register the planning phase event handlers
    /// </summary>
    void DetectComplete(object sender, DetectCompleteEventArgs e)
    {
        BootstrapperApplication.DetectPackageComplete -= SetPackageDetectedState;
        BootstrapperApplication.DetectMsiFeature -= SetFeatureDetectedState;
        BootstrapperApplication.DetectComplete -= DetectComplete;

        //logic to continue here -- likely to allow user to select package state, etc, in the UI
    }

Planning future state

Before we tell the Windows Installer engine to go off and run the install actions, we need to tell it what we want it to do. This is achieved by calling Engine.Plan() on the BootstrapperApplication base class with an action enum (install, uninstall, etc). Similar to the Detect() sequence, this initiates an asynchronous process, so before calling it, we need to register event handlers:

        BootstrapperApplication.PlanPackageBegin += SetPackagePlannedState;
        BootstrapperApplication.PlanMsiFeature += SetFeaturePlannedState;
        BootstrapperApplication.PlanComplete += BootstrapperOnPlanComplete;
        this.Engine.Plan(LaunchAction.Install);

The engine will then fire an event for each package and feature in our bundle, each with a specialized event args parameter. Our job is to set the requested state on the event args for each.

    /// <summary>
    /// when engine plans action for a package, set the requested future state of
    /// the package based on what the user requested
    /// </summary>
    private void SetPackagePlannedState(object sender, PlanPackageBeginEventArgs planPackageBeginEventArgs)
    {
        var pkgId = planPackageBeginEventArgs.PackageId;
        var pkg = BundlePackages.FirstOrDefault(p => p.Id == pkgId);

        //I'm assuming a property "RequestedInstallState" on your model
        //of type RequestState.
        planPackageBeginEventArgs.State = pkg.RequestedInstallState;
    }

    /// <summary>
    /// when engine plans action for a feature, set the requested future state of
    /// the feature based on what the user requested
    /// </summary>
    private void SetFeaturePlannedState(object sender, PlanMsiFeatureEventArgs planMsiFeatureEventArgs)
    {
        var pkg = BundlePackages.First(p => p.Id == planMsiFeatureEventArgs.PackageId);
        var feature = pkg.AllFeatures.First(feat => feat.Id == planMsiFeatureEventArgs.FeatureId);

        //I'm assuming a property "RequestedState" on your model
        //of type FeatureState.
        planMsiFeatureEventArgs.State = feature.RequestedState;
    }

Below are the values for FeatureAction. For the most part, you'll likely only care about None (don't change from the current state), AddLocal (install it), Reinstall, Remove (uninstall).

public enum FeatureAction
{
    None,
    AddLocal,
    AddSource,
    AddDefault,
    Reinstall,
    Advertise,
    Remove,
}

Below are the values for RequestState. For the most part, you'll likely only care about None (don't change from current state), ForceAbsent (force uninstall), Absent (uninstall), Present (install) and Repair.

public enum RequestState
{
    None,
    ForceAbsent,
    Absent,
    Cache,
    Present,
    Repair,
}

When the plan action is complete, it will fire the PlanComplete event handler. This is where you'll want to start up the next set of actions. In my case, I'm going straight from Plan to Apply (see below), so my method looks like this:

private void BootstrapperOnPlanComplete(object sender, PlanCompleteEventArgs args)
    {           
        BootstrapperApplication.PlanComplete -= BootstrapperOnPlanComplete;

        //Code to initiate Apply action goes here.. See Part 5 post for more details.
    }


This article is one in my series focusing on what I learned creating a Windows Installer using the WiX toolkit’s "Managed Bootstrapper Application" (Custom .NET UI) framework. Each post builds upon the previous posts, so I would suggest you at least skim through the earlier articles in the series.

Here’s an index into the articles in this series:

Getting access to installer (and package) metadata

A lot of information is embedded in the WiX xml files, such as package/feature layout, names, descriptions, ids, etc, which we use to build out our bundle models, but almost none of it is made available at runtime via the event args. However, WiX does generate a BootstrapperApplicationData.xml file which includes a lot of that information and is included in the files available at runtime. We can parse that file at runtime in order to access that metadata, which I suggest you do before you run the detection logic (see below) in order to have a populated model to use in the event handlers. Since the file, along with all of our assemblies and .msi files, is placed in a randomly-named temp folder, we can't know ahead of time where the file will live, so we must use our assembly's path to find it.

You can then parse the XML to get the metadata. I would suggest running a makeshift installer in debug mode and setting a breakpoint here to inspect the contents of the XML in order to get a full list of what’s available. Here’s an example of how I get data from the file. Note: in this example, my domain objects are MBAPrereqPackage, BundlePackage and PackageFeature, each of which take an XML node object in their constructor and further parse the data into the object’s properties.

static readonly XNamespace ManifestNamespace = (XNamespace)"http://schemas.microsoft.com/wix/2010/BootstrapperApplicationData";

public void Initialize()
{

    //
    // parse the ApplicationData to find included packages and features
    //
    var bundleManifestData = this.ApplicationData;
    var bundleDisplayName = bundleManifestData
                              .Element(ManifestNamespace + "WixBundleProperties")
                              .Attribute("DisplayName")
                              .Value;

    var mbaPrereqs = bundleManifestData.Descendants(ManifestNamespace + "WixMbaPrereqInformation")
                                       .Select(x => new MBAPrereqPackage(x))
                                       .ToList();

    //
    //exclude the MBA prereq packages, such as the .Net 4 installer
    //
    var pkgs = bundleManifestData.Descendants(ManifestNamespace + "WixPackageProperties")
                                 .Select(x => new BundlePackage(x))
                                 .Where(pkg => !mbaPrereqs.Any(preReq => preReq.PackageId == pkg.Id));

    //
    // Add the packages to a collection of BundlePackages
    //
    BundlePackages.AddRange(pkgs);

    //
    // check for features and associate them with their parent packages
    //
    var featureNodes = bundleManifestData.Descendants(ManifestNamespace + "WixPackageFeatureInfo");
    foreach ( var featureNode in featureNodes)
    {
       var feature = new PackageFeature(featureNode);
       var parentPkg = BundlePackages.First(pkg => pkg.Id == feature.PackageId);
       parentPkg.AllFeatures.Add(feature);
       feature.Package = parentPkg;
    }
}

/// <summary>
/// Fetch BootstrapperApplicationData.xml and parse out the root element.
/// </summary>
public XElement ApplicationData
{
    get
    {
        var workingFolder = Path.GetDirectoryName(this.GetType().Assembly.Location);
        var bootstrapperDataFilePath = Path.Combine(workingFolder, "BootstrapperApplicationData.xml");

        using (var reader = new StreamReader(bootstrapperDataFilePath))
        {
            var xml = reader.ReadToEnd();
            var xDoc = XDocument.Parse(xml);
            return xDoc.Element(ManifestNamespace + "BootstrapperApplicationData");
        }
    }
}

Access to command line parameters (Install/Upgrade/Modify/Uninstall, Silent mode, etc)

Along with the Engine property provided in the base class, a Command property is also exposed. There are a few properties off that Command object that are very useful:

The Action property, which exposes a LaunchAction enum value, tells you how the installer was initiated. If the user just double-clicked on the executable, it will come in as Install, but if command-line parameters are used to execute a specific action, that will be translated into this enum. This includes clicking “Uninstall” from the Add/Remove programs list, etc.

/// <summary>
/// Requested action from the commandline
/// </summary>
public LaunchAction RunMode { get { return Command.Action; } }

public enum LaunchAction
{
  Unknown,
  Help,
  Layout,
  Uninstall,
  Install,
  Modify,
  Repair,
}

The Display property, which exposes a Display enum value, tells you if the user wants silent mode, etc. These map to the Windows Installer commandline allowed values.

/// <summary>
/// Requested display mode from the commandline
/// (Full, Passive/Silent, Embedded)
/// </summary>
public Display DisplayMode { get { return Command.Display; } }

public enum Display
{
  Unknown,
  Embedded,
  None,
  Passive,
  Full,
}

And then the CommandLine property exposes the rest of the command line. Note that WiX will actually remove several of the parameters that are exposed via other properties (Display, Action, etc)

/// <summary>
/// Full application command line
/// </summary>
public IEnumerable<string> CommandLine { get { return (Command.CommandLine ?? string.Empty).Split(' '); } }


This article is one in my series focusing on what I learned creating a Windows Installer using the WiX toolkit’s "Managed Bootstrapper Application" (Custom .NET UI) framework. Each post builds upon the previous posts, so I would suggest you at least skim through the earlier articles in the series.

Here’s an index into the articles in this series:

You’ll need at least two projects in your Visual Studio solution: The Bootstrapper project and a .Net assembly for it to run.  I would suggest adding a WPF project for your .Net assembly.

The Bootstrapper Project

Creating the bundle itself uses XML files similar to the Package WiX files you may already be accustomed to. You’ll need to install the WiX toolkit “Votive” component, which provides Visual Studio templates and integration.

In your Visual Studio solution, you'll first add a "Bootstrapper" project, which will add some placeholder .wxs files.

There's a <Bundle> element, which includes general metadata about the top-level installer (name, version, etc). If you're building your own UI using .NET, you'll need to include <BootstrapperApplicationRef Id="ManagedBootstrapperApplicationHost">, where ManagedBootstrapperApplicationHost is a pre-defined trigger for WiX to load your managed UI. Inside that tag, you'll need to define a <PayloadGroup> element (or <PayloadGroupRef> to define the group elsewhere). The PayloadGroup defines files that are unpacked at runtime along with your assembly and is used for other assemblies yours depends upon. You'll want to include BootstrapperCore.config and Microsoft.Deployment.WindowsInstaller.dll in any case, as well as your installer assembly.

Then, you'll need a <Chain> element to define the MSIs that will be installed by the bundle. You'll likely want to include one of the pre-defined .NET installers to ensure .Net is on the system before your .Net-based UI is loaded. In the example below, I'm using the NetFx40Web PackageGroup, which pulls the .Net 4.0 web installer, but only if .Net 4 (full) is not already installed on the system. The web installer UI will be shown to the user before any of your code is executed. One HUGE word of caution here: because the .Net installer will technically be part of your install chain, if the user installs .Net but then cancels your install, your installer will still be listed in Add/Remove Programs since one of its components (the .Net installer) completed. Tread with caution.

One final note: there is an <MsiProperty> element inside the MsiPackage tag that is used to allow Engine variables (things you can set in your code) to pass through to the MSIs.

<?xml version="1.0" encoding="UTF-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi"
     xmlns:util="http://schemas.microsoft.com/wix/UtilExtension" >

  <Bundle Name="My Super Great Product Bundle"
          Version="1.0.0.0"
          Manufacturer="John M. Wright"
          IconSourceFile="jwright.ico"
          UpgradeCode="{D4578DG3-ABCD-1234-8693-ACAAF4A3A785}"
          AboutUrl="http://wrightthisblog.blogspot.com"
          Compressed="yes" >

      <BootstrapperApplicationRef Id="ManagedBootstrapperApplicationHost" >
          <PayloadGroupRef Id="InstallerPayload" />
      </BootstrapperApplicationRef>

      <Chain>
           <!-- Install .Net 4 Full -->
           <PackageGroupRef Id="NetFx40Web" />

           <!-- my packages to install -->
           <PackageGroupRef Id="InstallerPackages" />
       </Chain>
   </Bundle>

 <Fragment>
    <PayloadGroup Id="InstallerPayload">
      <Payload SourceFile="$(var.jwright.Installer.TargetPath)"/>
      <Payload SourceFile="$(var.jwright.Installer.TargetDir)\BootstrapperCore.config" />
      <Payload SourceFile="$(var.jwright.Installer.TargetDir)\Microsoft.Deployment.WindowsInstaller.dll" />
    </PayloadGroup>
 </Fragment>

 <Fragment>
    <PackageGroup Id="InstallerPackages" >

      <MsiPackage SourceFile="$(var.MyProductInstaller.TargetPath)"
          Compressed="yes" EnableFeatureSelection="yes" Vital="yes">

        <MsiProperty Name="APPLICATIONFOLDER" Value="[MyInstallFolder]" />
      </MsiPackage>

      <MsiPackage SourceFile="$(var.AnotherProductInstaller.TargetPath)"
          Compressed="yes" EnableFeatureSelection="yes" Vital="yes" />

    </PackageGroup>
  </Fragment>
</Wix>

Your .Net Assembly Project

Once the native code bootstrapper loads up, it will attempt to load our managed code.  We must do two things in order to get the handoff and communication working.

First, we must create a class that extends the WiX BootstrapperApplication base class.  This class must then override the Run() method, which is what gets called by WiX once our class is loaded.

Important Note:  The BootstrapperApplication base class includes an Engine property, which is a reference to the WiX engine.  Throughout this blog, when I say you call Engine.Somemethod(), you would do this from within this custom class by calling this.Engine.Somemethod();

namespace jwright.Installer
{
    /// <summary>
    /// This is the main entry point into the installer UI, including communication with the
    /// installer engine process via the WiX base class.
    /// </summary>
    public class CustomBootstrapperApplication : BootstrapperApplication
    {

        /// <summary>
        /// Entry point of managed code
        /// </summary>
        protected override void Run()       
        {          
             //... do your thing here 
        }
    }
}

Then, we must add an attribute to the assembly (in our AssemblyInfo.cs) designating that class as the one we want WiX to load:

//WiX -- denotes which class is the Managed Bootstrapper
[assembly: BootstrapperApplication( typeof( CustomBootstrapperApplication))]

Additionally, you'll need to create a file called BootstrapperCore.config in your .Net project which will have app.config-style data for your installer. One key element to include is the <startup> node with its <supportedRuntime> child, which denotes which runtime(s) can be used for your app. Additionally, you'll want to include the <supportedFramework> tag under <wix.bootstrapper>/<host>, which further defines the .Net runtime version(s) you want to utilize, including Full vs Client designations. You can also include any other elements that would normally go into an App.Config file, such as web service endpoint declarations. In the below example, I'm stipulating that the full .Net 4 framework is the only one supported for my managed installer assembly.

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
 
  <configSections>
    <sectionGroup name="wix.bootstrapper" type="Microsoft.Tools.WindowsInstallerXml.Bootstrapper.BootstrapperSectionGroup, BootstrapperCore">
      <section name="host" type="Microsoft.Tools.WindowsInstallerXml.Bootstrapper.HostSection, BootstrapperCore" />
    </sectionGroup>    
  </configSections>
 
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" />
  </startup>
 
  <wix.bootstrapper>
    <host assemblyName="Your.Installer.Assembly.Name.Goes.Here">
      <supportedFramework version="v4\Full" />
    </host>
  </wix.bootstrapper>

</configuration>

Classes for keeping track of state at runtime

My suggestion is for you to create a set of Bundle, Package and Feature model objects to collect and track the metadata and instance data at runtime, since this will all come from several different sources at different times during execution. I'll assume you're going this route in the rest of this series; a minimal sketch of such model classes follows the list below.

Some thoughts on your models:

  • Bundle (there will only be one of these so you could just use your bootstrapper class as the model)
    • Has an Id
    • Contains a collection of Packages
    • Can have additional metadata you may want to display to the user, such as Name, Description, version, etc
  • Package
    • Has an Id
    • Contains a collection of Features, which may be empty depending on how you configure your bundle.
    • Has a current state (PackageState enum)
    • Has a future/requested state  (RequestState enum)
    • Can have additional metadata you may want to display to the user, such as Name, Description, version, etc
  • Feature
    • Has an Id
    • Has a current state (FeatureState enum)
    • Has a future/requested state (FeatureAction enum)
    • Can have additional metadata you may want to display to the user, such as Name, Description, etc
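Here's a minimal sketch of what those model classes might look like. The class and property names are my own conventions (the same ones assumed by the code samples throughout this series), not types provided by WiX:

public class BundlePackage
{
    public string Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }

    // current state on the machine, set during the Detect phase
    public PackageState CurrentInstallState { get; set; }

    // state the user wants, fed to the engine during the Plan phase
    public RequestState RequestedInstallState { get; set; }

    public List<PackageFeature> AllFeatures { get; private set; }

    public BundlePackage()
    {
        AllFeatures = new List<PackageFeature>();
    }
}

public class PackageFeature
{
    public string Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }

    // current state on the machine, set during the Detect phase
    public FeatureState CurrentInstallState { get; set; }

    // requested future state, fed to the engine during the Plan phase
    // (typed here as FeatureState to match the Plan example's comment; the list
    //  above mentions FeatureAction -- use whichever your WiX version's
    //  PlanMsiFeatureEventArgs.State expects)
    public FeatureState RequestedState { get; set; }

    // back-reference to the owning package
    public BundlePackage Package { get; set; }
}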


With the release of the Windows Installer XML Toolkit (WiX) v3.5, a new concept of chained installer bundles was added, and refined in v3.6. This brought two key features to the toolset:

  • The ability to group several individual Microsoft Installer (.msi) files together into a single installer executable and (optionally) have them show as only one product in the Add/Remove Programs list
  • The option to develop your own custom UI for the installer, including one written in .NET

While there are several places to find documentation on the more mature features of WiX, I found that there was almost no good information available around what it takes to write a custom .NET UI for my bundle, so voilà... a blog post (series) is born.

This article is one in my series focusing on what I learned creating a Windows Installer using the WiX toolkit’s "Managed Bootstrapper Application" (Custom .NET UI) framework. Each post builds upon the previous posts, so I would suggest you at least skim through the earlier articles in the series.

Here’s an index into the articles in this series:

In this series I will focus on what I learned creating a "Managed Bootstrapper Application" (Custom .NET UI) using the WiX v3.6 SDK.  I assume you already have a basic understanding of WiX and will touch on some of the other concepts in WiX, but if you're looking for information on how to build an individual MSI, or write custom actions for an MSI in .Net, here are some better references:

Quick concept overview

The general idea is this:  You have one or more "package" installer files (.msi or .exe) that you created or are from third parties (make sure they have redistribution licenses). You want to improve the installation experience for the end user, so you bind all of these installers into a single .exe which will install them in a predetermined order, optionally allowing the user to select which packages and sub-features should be installed. Note that the individual component MSIs are installed via "silent" mode, so their UI will not be shown (assuming they properly implement silent mode).

For example, let's take Microsoft Office. Office has several individual products (Word, Outlook, Excel, etc) which, for this example, we'll assume are each individual MSI files. But as an end user, I want to run a single installer, choose which products to install, agree to a single EULA, and have just one Add/Remove Programs entry. This can be achieved by creating a bundled installer that includes all of the individual product MSIs. Also note that Office has its own UI experience, which differs from the typical MSI battleship-gray UI.

[Screenshots: the Office 2010 installer with its more user-friendly UI, and the MySQL installer using the more typical installer UI]

While I don't know if the Office installer uses WiX (It doesn't, per Rob's comment on this post), I do know that the Visual Studio 2012 installer does, and it has a completely unique installer user experience, built in WPF and .Net on top of the WiX managed bootstrapper framework.

What happens at Runtime

When the user executes your bundled installer, a WiX-provided native (ie: C++) application is initially loaded. This is the "Engine". This portion of the installer is what actually interacts with the Windows Installer APIs.  The Engine does some initial checks to make sure the version of .NET required by our code has been installed, and if we've registered the .NET installer as part of our package chain, it will go ahead and run that installer.  Once .NET is ready, the WiX Engine loads the class we've registered via the [assembly: BootstrapperApplication] assembly attribute.

All communication with the Engine from that point forward is done via the events available through the BootstrapperApplication base class. There's a list of those events at the end of this post.

Once our managed code is loaded, we need to walk through some MSI-specific steps to make the installer work correctly:

First, we'll use Engine.Detect to determine the current state of the machine.  We'll use a set of events to get notifications about the bundle, packages and features that are a part of our installer.  Before we do that, however, we can create some objects to store details about our packages and features using an xml config file WiX encloses in our bundle's files.

Next, we'll use Engine.Plan to set the requested state (the state we want the component to be in when we're done) for each of our packages and features. Again, we'll register a set of events for the packages and features, and when they fire, we'll set the requested state values.

Then, we'll use Engine.Apply to tell the engine to apply the requested changes (ie: install or remove the packages and features). During this phase, there will be a set of events we'll use to get progress updates and error information.

Finally, we'll use Engine.Quit to notify the engine we're done, and whether the operation was successful, failed, or was cancelled by the user, etc.

I'll dig into each of these in more detail in the remaining posts in this series.

And much, much more...

The content of this posting grew quite large, so I’ve split it into multiple, more focused postings. But there's so much more than I have put into these posts that I’d like to write about, so I'll likely write some follow-up posts dealing with passing variables to your MSIs and into CustomActions, signing your installer so your UAC prompts show them as coming from a trusted source, etc.  And, hopefully, the WiX site will get some better documentation around the managed bootstrappers soon.

Here’s an index into the articles in this series:

Engine Events

For reference, below is a list of the events exposed on the Bootstrapper base class which are used for async communication with the Engine.

One important note: The events will run on a non-UI thread, so if you're manipulating UI-bound values in your event handlers, make sure to use Dispatcher.Invoke() to run that code on your UI thread and avoid exceptions.
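For example, a minimal sketch of the Progress handler with that marshalling added, assuming a WPF UI and a _uiDispatcher field captured from the UI thread at startup (both my own conventions):

private void EventProviderOnProgress(object sender, ProgressEventArgs progressEventArgs)
{
    // marshal the update onto the UI thread before touching UI-bound properties
    _uiDispatcher.Invoke(new Action(() =>
    {
        OverallProgressPercentage = progressEventArgs.OverallPercentage;
    }));
}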

        //Events related to the Apply method
        event EventHandler<ApplyBeginEventArgs> ApplyBegin;
        event EventHandler<ApplyCompleteEventArgs> ApplyComplete;

        event EventHandler<RegisterBeginEventArgs> RegisterBegin;
        event EventHandler<RegisterCompleteEventArgs> RegisterComplete;
        event EventHandler<UnregisterBeginEventArgs> UnregisterBegin;
        event EventHandler<UnregisterCompleteEventArgs> UnregisterComplete;

        //Events related to package acquisition. Really only needed if you're building a web installer
        event EventHandler<ResolveSourceEventArgs> ResolveSource;

        event EventHandler<CacheBeginEventArgs> CacheBegin;
        event EventHandler<CachePackageBeginEventArgs> CachePackageBegin;
        event EventHandler<CacheAcquireBeginEventArgs> CacheAcquireBegin;
        event EventHandler<CacheAcquireProgressEventArgs> CacheAcquireProgress;
        event EventHandler<CacheAcquireCompleteEventArgs> CacheAcquireComplete;
        event EventHandler<CacheVerifyBeginEventArgs> CacheVerifyBegin;
        event EventHandler<CacheVerifyCompleteEventArgs> CacheVerifyComplete;
        event EventHandler<CachePackageCompleteEventArgs> CachePackageComplete;
        event EventHandler<CacheCompleteEventArgs> CacheComplete;

        //Events related to the Plan method
        event EventHandler<PlanBeginEventArgs> PlanBegin;
        event EventHandler<PlanRelatedBundleEventArgs> PlanRelatedBundle;
        event EventHandler<PlanPackageBeginEventArgs> PlanPackageBegin;
        event EventHandler<PlanTargetMsiPackageEventArgs> PlanTargetMsiPackage;
        event EventHandler<PlanMsiFeatureEventArgs> PlanMsiFeature;
        event EventHandler<PlanPackageCompleteEventArgs> PlanPackageComplete;
        event EventHandler<PlanCompleteEventArgs> PlanComplete;

        //Events related to the Execute method
        event EventHandler<ExecuteBeginEventArgs> ExecuteBegin;
        event EventHandler<ExecutePackageBeginEventArgs> ExecutePackageBegin;
        event EventHandler<ExecutePackageCompleteEventArgs> ExecutePackageComplete;
        event EventHandler<ExecuteCompleteEventArgs> ExecuteComplete;

        event EventHandler<ProgressEventArgs> Progress;
        event EventHandler<ExecuteProgressEventArgs> ExecuteProgress;
        event EventHandler<ExecutePatchTargetEventArgs> ExecutePatchTarget;
        event EventHandler<ExecuteMsiMessageEventArgs> ExecuteMsiMessage;

        //Events related to error scenarios
        event EventHandler<ErrorEventArgs> Error;
        event EventHandler<ExecuteFilesInUseEventArgs> ExecuteFilesInUse;

        //Events related to start/stops (in the event a reboot is required)
        event EventHandler<StartupEventArgs> Startup;
        event EventHandler<ShutdownEventArgs> Shutdown;
        event EventHandler<SystemShutdownEventArgs> SystemShutdown;
        event EventHandler<RestartRequiredEventArgs> RestartRequired;

        //Events related to the Detect method
        event EventHandler<DetectBeginEventArgs> DetectBegin;
        event EventHandler<DetectPriorBundleEventArgs> DetectPriorBundle;
        event EventHandler<DetectRelatedBundleEventArgs> DetectRelatedBundle;
        event EventHandler<DetectPackageBeginEventArgs> DetectPackageBegin;
        event EventHandler<DetectRelatedMsiPackageEventArgs> DetectRelatedMsiPackage;
        event EventHandler<DetectTargetMsiPackageEventArgs> DetectTargetMsiPackage;
        event EventHandler<DetectMsiFeatureEventArgs> DetectMsiFeature;
        event EventHandler<DetectPackageCompleteEventArgs> DetectPackageComplete;
        event EventHandler<DetectCompleteEventArgs> DetectComplete;

        /// <summary>
        /// Fires when an MSI requests elevated permissions. Not really anything you can do with this.
        /// </summary>
        event EventHandler<ElevateEventArgs> Elevate;


In the world of software development, your job is to take the complex and make it manageable. Whether that's automating business processes, visualizing and aggregating massive data sets, or rendering life-like vector images in real-time, your job is to allow non-developers to interact with the application (or its outputs) in a way that simplifies the complexities and allows them to focus on the aspects of the business/data/entertainment that are important to them.

But many programmers forget that we have the same issues.  Our codebases begin to suffer from the same complexity problems over time. This is especially true for large systems that have aging code bases and large numbers of developers (over time) with their hands in the code.  So we, too, need to keep an eye on minimizing complexity in our own domain, our code, so that we can concentrate on the task at hand instead of suffering through technical debt and bit-rotting architectures.

I'm a huge fan of continuous improvement, and I've mentioned tools like Sonar and JetBrain’s ReSharper in the past that help quickly locate problematic areas and facilitate that continuous improvement process.

So when Patrick Smacchia (lead developer for NDepend, and I think sole developer/CEO) contacted me to request I review the NDepend tool in exchange for a free license, I was happy to do so.

Quick links to sections of this review:

General Conclusions:

NDepend is a very powerful static analysis tool for .Net codebases that provides HTML reports and interactive GUI(s) for finding overly complex or problematic areas of your code, performing analysis for refactoring and comparing changes over time.  In version 4, a LINQ-like query language was added that makes the tool an extremely powerful reporting engine and can be used to enforce coding standards rules on your build systems similar to FxCop or unit testing frameworks.

While NDepend provides a very powerful backend, its frontend, Visual NDepend, suffers from some usability issues that I feel distract the user from the power the underlying tooling provides. With some time, you can get proficient with the tool and have access to a world of data about your codebase. And hopefully the NDepend developers can address some of the usability issues in future releases.

I don't see NDepend as a tool that you'd buy for every developer on your team, but I do see value in having dev leads/architects utilize the tool, as well as having it integrated into your build/Continuous Integration process. This will allow you to 1) monitor changes over time to ensure the development team isn't adding unmerited complexity or incurring inappropriate technical debt, and 2) discover high-risk/difficult-to-maintain areas that you should prioritize while paying down your technical debt.

General Feature Overview:

NDepend is, at its core, a static analysis tool for .Net assemblies. It can be run against the assemblies themselves, or by way of a Visual Studio solution. If running against assemblies, and .pdb files exist, NDepend will pull the source code metadata from the .pdb files and import the source code directly as well. The analysis is pretty rich and includes dependency graph resolution, cyclomatic complexity calculations, coupling, abstraction, % comments, lines of code, etc. One interesting bit -- several of these are calculated for both the raw source code and the IL, so you can see where your code might end up generating some ugly IL even if the C# isn't so bad.

In addition to the current analysis, given access to the output of previous runs, NDepend can show you differences between snapshots.  This provides some really nice datapoints, such as breaking changes to public APIs. You can also import code coverage reports from some unit testing tools, which allows you to also gather metrics around unit test coverage.

All of this is exposed through several different reports and visualizations, from dependency matrices to heat maps, and when run in the NDepend GUI, they are interactive and configurable. These are great for exploring a system's architecture and evaluating potential refactorings, such as assembly consolidation/splitting.

But the really powerful piece seems to be the Code Query LINQ (CQLinq) feature, which allows you to write LINQ queries against the code analysis data to generate your own reports and create code quality rules to enforce good coding standards. A set of pre-defined rules is provided as well, including in-line comments explaining each query for quick reference. And with the NDepend console runner, you can easily integrate NDepend analysis into your CI builds. So imagine the ability to write a LINQ query to find all methods which have high cyclomatic complexity and low code coverage. Or write a rule that will fail the CI build if a public API has a breaking change. Or if a large method grows even bigger, etc.

Additionally, NDepend provides an API for writing your own tools to hook into pretty much any of NDepend's functionality, from creating NDepend projects to running analysis and interacting with the results. A set of sample "power tools" is also provided, including source code, that utilize the API to perform several tasks. So you could use this API to further automate your analysis, such as writing custom hooks for FxCop, Sonar, data warehousing, etc.

A larger list of NDepend features can be found on their website.

I’m a firm believer that people are better able to comprehend complex data when accompanied by a graphical representation of the data. I also understand that people vary as to how they process data, and there is typically not a one-size-fits-all data visualization that will sink in for everyone.  NDepend is a great tool for this, as it provides multiple visualization techniques for exploring your code and the metrics gathered during analysis.

Below is a listing of the major views within VisualNDepend and the HTML reports, of which I believe the Dependency Matrix and the Query and Rules Explorer are the most useful generally, and very powerful in "real world"/day-to-day development efforts. Additionally, you can flip between many of the views to get a different viewpoint on the same data – for instance, you can be working in the Dependency Matrix and double-click an assembly/member to jump to the Dependency Graph for that element. In many cases, particularly when looking at method-level data, you can jump directly to the code in Visual Studio, allowing you to fix the problems as you come across them, which is a nice touch. You can also run NDepend within Visual Studio for an even tighter integration.


Dependency Matrix

[Screenshot: Dependency Matrix view]

This view shows dependencies between assemblies (and their members) and other assemblies in the project, with all of the assemblies listed on both axes and the number of dependencies (usages) between any two assemblies noted in the box where the two axes intersect. For example, my "common.tests" project uses the "nunit.framework" assembly, so where the two intersect, I would see a number. In the case where my assembly is on the horizontal axis, the number would be in green, indicating a "uses" relationship. If nunit is the assembly on the horizontal axis, it would show the number in blue, indicating a "used by" relationship. The nice thing is that the context-sensitive help (the popup dialog shown in the above screenshot) explains all of this, so you don't have to remember it all.

Each assembly on both axes also has a tree-view expander (little + sign), where you can explode out the assembly's namespaces and see these dependencies at a namespace and member level. This is a tremendously helpful tool when moving types around between assemblies or attempting to merge or split assemblies. We're actually in the process of doing exactly this at work, where we are removing UI-dependent and platform-specific code from one of our "common" assemblies, and the person doing the work is using (an older version of) NDepend to inspect the relationships in order to know what can be moved and in what order to prevent build issues. Additionally, it greys out (or hashes out) the columns which are not accessible (ie: internal, protected, etc), which is helpful.

Here is another screen cap with the exploded tree views in the dependency matrix, showing my common.tests assembly on the horizontal axis and nunit.framework on the vertical, where the TestAdd() method in my assembly uses the Assert.IsNotNull(Object) method of the nunit.framework assembly.

Another really nice touch on the dependency matrix view, when looking at assembly-to-assembly dependencies, is that a "0" is shown in the intersection square if an assembly declares a dependency on another assembly (ie: includes a reference in Visual Studio) but doesn't actually use it. This is a clear code smell, since it is only really ever valid if you're loading the assembly via reflection, which has its own code maintenance issues.

Dependency Graph

[Screenshot: Dependency Graph view]

This view shows nodes, whose size is configurable by a drop-down at the top (options include: constant size, lines of code, complexity, etc), with links connecting dependencies. When you select/hover over a node, the other nodes that use it are highlighted in green, and the nodes it uses are highlighted in blue. There are also a couple of contextual information boxes that get populated: one (not shown here) that gives stats about the selected node, and the context help popup (shown at the bottom of the screen here). The context help popup is an example of where the user experience is a little rough, as I explain in the User Experience Issues section below.

Metrics (Heatmap)

[Screenshot: Metrics heatmap view]

This view allows you to see methods (grouped by assembly) and set the size of each method's square based on the metric of your choosing via a drop-down at the top. In the screen capture, I have the size based on IL Cyclomatic Complexity, allowing me to quickly see the largest offenders.

Additionally, you can see there are several squares highlighted in blue. This is because they are methods matched by the current code query I have selected in the Query and Rules Explorer view, showing again how the various views can be used with each other to gain additional insight into the code.

Query and Rules Explorer / Editor

[Screenshot: Query and Rules Explorer / Editor]

When it comes to differentiating this product from others in the same arena, the NDepend Code Query functionality is a huge feature, and the addition of CQLinq in v4.0 makes this even easier to use for .Net developers who are already familiar with LINQ syntax.

This view allows you to use LINQ-style queries to dig into your codebase and create ad-hoc reports using pretty much any code metric you can imagine.  And there are dozens of CQLinq queries packaged with NDepend, which include decent in-line documentation and links to additional support references, so writing your own queries is made that much easier as you have a library of examples to copy-and-paste from.

Here’s an example of one of the pre-loaded CQLinq queries:
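(The query below is a representative sketch in the spirit of the built-in rules, not the exact one from the original screenshot; it flags methods that have grown too large.)

// <Name>Methods too big</Name>
warnif count > 0 from m in JustMyCode.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }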

You can also import output from NCover, dotCover and TFS in order to include code coverage metric in your queries.

Further, you can compare the results of the current code against a previous run of NDepend to see how the results have changed over time.  For example, NDepend provides a set of very useful “Breaking API Changes” CQLinq queries which will show changes (adds, breaking modifies) to publicly accessible members.  For an ISV that sells an SDK, such as my employer, this is an extremely useful datapoint to ensure we’re not unintentionally causing breaking changes for our customers between releases, nor exposing additional types that we didn’t intend to expose.

Each of these queries can then be saved as a "rule" which NDepend will enforce. Note the very first line of the above CQLinq says "warnif" – that would result in a warning (vs. an error/failure) when run as a rule. Then, when analysis is run via VisualNDepend, the Visual Studio plugin, or the console app (say, as part of your build scripts), those rules will be run and the results provided to you (as a red/green indicator in the bottom corner of VisualNDepend and Visual Studio, or as a result value to fail/pass your builds through the command line).

Abstractness vs Instability

[Screenshot: Abstractness vs Instability chart]

This graphic, which is included in the HTML report but doesn't seem to be available in the VisualNDepend UI, was one of the hallmark features Scott Hanselman discussed in his review of NDepend in 2007. However, I personally don't find a lot of value in this report for large projects, since the density of the type names obscures the data, as seen in the screen cap. But that's one of the great things about this tool: it provides many, many different ways to look at the data – and just because I don't find this view useful doesn't mean others don't (case in point: Scott found it useful). I'm a little surprised to find it only in the HTML report and not in the GUI, though.

Feature Gaps:

I didn’t find myself saying “I wish it would do X” very often in relation to feature support. (User experience is another issue – see the User Experience Issues section below for that).  There were a few places I felt functionality was missing:

Trouble parsing some files:

[Screenshot: directive parsing error]

There were a few places where NDepend had trouble parsing the source code for my assemblies. Most cases were due to #if/#else/#endif directives in the code which included braces, like this:
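(The original example was shown as a screen capture; here is a hypothetical snippet of the same pattern, with made-up names, where each branch of the directive opens the same brace.)

#if USE_NEW_PARSER          // hypothetical conditional-compilation symbol
    if (TryParseWithNewParser(input)) {
#else
    if (TryParseWithOldParser(input)) {
#endif
        ProcessResult();
    }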

This is syntactically valid, but NDepend could not parse it, apparently because it sees two open braces for one close brace (ie: it doesn't take the directives into consideration). To be fair, other analysis tools, such as Sonar, also had trouble parsing these files for similar reasons.

Additionally, it appears to choke on the .xoml files used for Workflow XAML.

Support for decompilers other than Red Gate’s Reflector:

[Screenshot: Reflector integration dialog]

VisualNDepend will open Reflector for an assembly so that you can see further details about it, including decompiled code and assembly metadata. At the time this support was added, Reflector was freeware, but it was migrated to for-fee licensing well over a year ago. The dialogs in VisualNDepend still reference it as freeware, and it's the only decompiler supported. Given that there are a few prominent and free alternatives (JetBrains dotPeek and Telerik JustDecompile, for example), I think support should be added for those alternatives. I wrote NDepend support asking if they plan to support alternatives (as well as pointing out the now-incorrect freeware verbiage) and got the below response, which states I'm the first person to ask for this support. This surprises me and makes me wonder if people are actually using that feature.

[Screenshot: NDepend support email response]

Lack of an installer:

NDepend is distributed as a .zip file with everything packaged inside. This is likely sufficient for the vast majority of folks, but there are a couple of cases where having an installer option, in addition to the .zip file option, would be nice:

  • Licensing. In order to "install" your license, you must place an xml file into the NDepend folder. Your license email comes with these instructions. Having a smart installer could allow me to license the product (or download an existing license without having this email around) and not deal with email MIME issues, etc. [Screenshot: license email instructions]
  • In the same email, it mentions not installing the files into the Program Files folder due to Windows protection issues.  Having an installer publish the files to Program Files will resolve most (all?) of those issues, I believe.
  • An installer would allow registration into the Windows Add/Remove Programs database, which is also searchable via other Windows management tools. This would allow system admins to ensure machines have the appropriate versions installed without having to log in to each machine and check assembly versions. This could be particularly helpful for build machines utilizing NDepend.

Performance/Resource Utilization:

The analysis runs fairly quickly. For my testing, I used a set of 108 assemblies (1184 namespaces, 29316 types), including about 75 of my own assemblies and the rest being third-party modules my projects use. NDepend analysis took about a minute and a half. By comparison, it took over two and a half minutes to compile and link the assemblies using msbuild/nant scripts.  The analysis did consume a decent amount of memory, though. When run in the VisualNDepend GUI, the app's memory footprint started at about 125MB once I loaded my project, then analysis added about 900MB to the app's memory footprint, and about 620MB stayed in memory after the analysis was completed.  When run within Visual Studio 2012, the NDepend plugin didn't seem to add much memory footprint when loaded and when analysis was running, the memory usage increased from VS's initial 850MB to approx 1.3GB, but came back down once analysis completed. The console runner seemed to peak out at 350MB.  Note that I’m running this on a pretty beefy box with 8 logical processors and 8 GB of memory, running Windows 8 and Visual Studio 2012.

User Experience Issues (VisualNDepend)

As I mentioned in my summary above, the NDepend engine is powerful. But gathering the data is only half the job; reporting and analyzing the data is just as important. The primary tool for analyzing the data is VisualNDepend, and it leaves a lot to be desired from a user experience standpoint.

Ultimately, the tool provides you access to the data, and the interactive graphical representations in the various views are a powerful feature of the product.  However, I continually found myself cursing under my breath (and sometimes out loud) as I used the application, becoming increasingly frustrated when it didn’t behave the way I would expect (based on how most other Windows apps work) or when UI elements got in my way and made it harder to get to the information I was seeking.

Looking through old reviews and screenshots of NDepend on the web, it doesn’t look like the UI has changed much since 2007 (the earliest pictures I found in my brief Google search).  Honestly, with the many dockable internal windows and the lack of the operating system’s chrome and look & feel, it feels a bit like a Windows 3.1 app ported to Win95, or the typical cross-platform Java app that doesn’t really fit any one platform’s UI scheme.  It really makes the product feel unpolished, which I think unfortunately reflects poorly on the power of the underlying engine and the data it provides.

I captured a lot of notes on this topic, and while I’ll be providing those to the NDepend folks, I’ve decided not to laundry list them here. Instead, I’ll give a few key examples so you can get a taste of what type of issues I’m complaining about.

UI elements that get in your way:

The context-sensitive help dialog was the biggest offender here, particularly in the Dependency Matrix and the Dependency Graph, both of which contained long lists of data that scrolled off the page.  In these reports, if I was working with elements at the bottom of the visible window, the context dialog would pop up, blocking the data I wanted to see and placing itself directly under my mouse, where it stayed visible until I moved my mouse away from it.  This meant I was constantly having to move my mouse back and forth across the screen to get to the data elements I wanted to see.  Often, I would have to go click on the scrollbars to move the data higher on the screen to escape the dreaded context dialog, which was made worse by the lack of mouse wheel scrolling (see below). I think this could easily be improved by adding some smarts around where the context-sensitive help dialog is displayed, so that it moves to the side of the screen opposite the current mouse position and never places itself directly under the mouse; something like the sketch below.
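
To illustrate the idea, here's a minimal sketch of that placement heuristic. This is my own suggestion, not anything NDepend provides, and the class and method names are purely hypothetical:

// Hypothetical placement heuristic: anchor the help popup in the half of the
// screen opposite the mouse, so it never covers what the user is pointing at.
using System.Drawing;

static class HelpPopupPlacement
{
    public static Point GetPopupLocation(Rectangle screenBounds, Point mouse, Size popupSize)
    {
        // Vertical: mouse in the bottom half -> pin the popup to the top edge, and vice versa.
        int y = mouse.Y > screenBounds.Top + screenBounds.Height / 2
            ? screenBounds.Top
            : screenBounds.Bottom - popupSize.Height;

        // Horizontal: likewise, keep the popup on the side away from the mouse.
        int x = mouse.X > screenBounds.Left + screenBounds.Width / 2
            ? screenBounds.Left
            : screenBounds.Right - popupSize.Width;

        return new Point(x, y);
    }
}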

Mouse wheel usage inconsistent with most Windows apps:

Given that most .NET developers will be heavy users of Visual Studio and other Microsoft products like Office, I personally expect tools that cater to .NET devs to align their common keyboard shortcuts and mouse gestures with those in Microsoft products. When that's not the case, I find myself getting frustrated when I, almost by muscle memory, use one of those familiar shortcuts only to find the tool I'm using doesn't react the way I wanted.  Case in point: in most of the NDepend interactive views there's more data than can fit on the screen, so 1) data is compacted and 2) scrollbars are used to page the data off screen.  I find myself using the scroll wheel on my mouse to navigate the report. In most Microsoft applications, the scroll wheel alone controls vertical scrolling, Shift+wheel controls horizontal scrolling, and Ctrl+wheel controls zoom. So my first instinct is to use the scroll wheel to move up and down the report; however, VisualNDepend uses the scroll wheel to control zoom exclusively (even with Shift and/or Ctrl). Since I have over 100 assemblies in my reports, it's a bit of an annoyance to have to mouse over to the scrollbars in order to scroll up and down.  Honestly, this was one of the bigger annoyances, since I continually found myself using the mouse wheel to try to scroll, only to zoom instead.

Error dialogs that weren’t very helpful:

On multiple occasions, I received error messages that gave almost enough information but didn’t quite take me all the way, or were just missing data altogether.  For instance, from the heatmap view, if source code is available for a method, you can double-click on the method box and be taken directly to Visual Studio. But if the code isn’t available, you get this error message:

"Can’t open declaration in source file. Reason: N/A because the assembly’s corresponding PDB file N/A at analysis time."

You get a similar error if you try to set the heatmap’s metric to some of the options, such as Cyclomatic Complexity, and you don’t have source for every assembly:

"Can’t select the metric ‘Cyclomatic Complexity (CC)’. N/A because one or several PDB file(s) N/A"

It took me a while to realize “N/A” means “Not Available” instead of “Not Applicable”, but in any case I would have preferred the report be built for those assemblies where source was available and exclude the remaining assemblies (or at least give me that option).

Where it fits into a dev organization:

I don’t think it’s extremely useful or cost-effective for every developer in an organization to have a copy of NDepend, but I do see two scenarios where NDepend licenses would be a good investment for a dev shop:

  • Software architects (those responsible for the overall design of your codebase) and/or QA leads (those responsible for ensuring the quality of your code) should use the application to analyze the codebase, perform ad-hoc monitoring for issues, and generate rules for enforcing code quality standards (via CQLinq; see the example rule after this list).
  • Build servers in your (continuous) build environment should validate those quality rules on every build, failing the build and/or reporting the issues found.
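
To give a flavor of what such a rule looks like, here's a rough CQLinq example along the lines of what I have in mind; the metric and threshold are just illustrative, and you'd tune the query and its scope to your own standards:

// <Name>Avoid overly complex methods</Name>
warnif count > 0
from m in JustMyCode.Methods
where m.CyclomaticComplexity > 20
orderby m.CyclomaticComplexity descending
select new { m, m.CyclomaticComplexity }

When a rule like this is flagged as critical, its violations are exactly what the console runner (discussed below) turns into a failed build.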

Note that build server and developer editions are licensed separately, with build machine licenses priced roughly 50% higher. The license I was given was a “Pro” license, which covers both the developer and build server editions but isn’t available for purchase directly.

Build Server integration:

You can take advantage of the NDepend console runner to integrate with your continuous integration/build servers.  In my case I'm using TeamCity, which I personally believe is the best build server out there (and it's free for most users), but this would work just as well with Jenkins and others.  There are two key aspects of the console runner that I make use of:

1) If NDepend finds critical rule violations, it returns a non-zero exit code.  TeamCity (and most build systems) will pick up on this and mark the build as failed, which allows me to enforce coding standards on the build server (see the sketch after this list).

2) An HTML report is generated that I can expose within the build page of TeamCity.  There is some documentation on the NDepend website explaining how to do this; it's a bit dated (the settings it mentions have moved in recent versions of TeamCity), but it's close enough for now.  I'll write a future blog post explaining exactly how I integrated it in my environment.
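
As a rough illustration of the first point, here's a minimal sketch of a build-step wrapper that runs the console analyzer and propagates its exit code so the build server fails the build on critical violations. This is my own sketch, not an official NDepend or TeamCity integration, the install/project paths are hypothetical, and in practice a plain command-line build step that checks the exit code accomplishes the same thing:

using System.Diagnostics;

class RunNDependAnalysis
{
    static int Main()
    {
        // Hypothetical paths; adjust for your environment.
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\Tools\NDepend\NDepend.Console.exe",
            Arguments = @"""C:\Builds\MyProduct\MyProduct.ndproj""",
            UseShellExecute = false
        };

        using (var analysis = Process.Start(startInfo))
        {
            analysis.WaitForExit();

            // A non-zero exit code means critical rule violations (or an analysis failure);
            // returning it from Main lets TeamCity/Jenkins mark the build as failed.
            return analysis.ExitCode;
        }
    }
}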

[Screenshot: NDepend report tab in TeamCity]

This is a solicited review, for which I received a free license to the product.  While I made no promise of a good (or bad) review in return, and I attempted to take a completely neutral approach going into the review, you should be aware of my potential bias. That, and I don’t want the FTC to fine me $11,000.

Feedback from NDepend Developer:

Note: Before publishing, I gave Patrick at NDepend a chance to review this posting. Here are some excerpts from his feedback:

"Concerning the lack of installer, this is a long time debate. MSI makes sense indeed, but personally, as a programmer I hate when MSI takes minutes and put link and registry key everywhere behind my back. At least with XCopy deployement there is zero bad surprise, and this might be worth noting."

"Abstractness vs Instability will be included smoothly in the UI."

"Concerning the Context Sensitive Help, you can close it at any time. It is designed for larger screen, and indeed on small laptop screen it might overlap data. We take note of the idea of placing Context Sensitive help at the top when mouse is at the bottom. I see a problem luring though, since we need an heuristic to differentiate when the user wants to hover the tip with the mouse, with when user wants to hover data. I guess we'll have to see if the tips overlap the panel or not."

"Concerning the Ctrl+Mouse Wheel zoom, you are right. We'll propose Ctrl+Mouse Wheel zoom by default + an option to go back to the current state."

"Concerning "N/A" I always saw it as Not Available. Knowing it is N/A because a PDB file is missing, seems a very relevant info to me since it explain you why you don't get the data and hence, how you can get it, isn't it?"



Kevin P. Davis asked the question "Is There an IT Talent Shortage?" on his blog this week. I started to post a response as a comment, but it kept growing, so instead I'm posting it here. If you're a software developer/manager, I highly suggest you follow Kevin's blog.

To recap the question(s):

I keep hearing that there's a huge shortage of IT workers.  That there are businesses hiring, and they can't find good people....Is it that the IT worker rate of unemployment is that much lower than the general population rate?  Or are there plenty of IT workers that simply can't find work, or are in the wrong location....That is, are there more senior people available out there than junior people, but the openings are all for junior people (or vice versa)?

Like any good macroeconomic question, there are many variables at play. Here are my thoughts on the topic:

To be clear, I think we're only talking about the US market (in as much as it's separate from the global market), and specifically, US-based companies looking to hire software developers.

First, IT unemployment rates are absolutely lower than those of the general population; according to one report, they're about half. I think this is because companies continue to see the ROI of software development projects and customers continue to demand greater access (web, mobile, etc).  Of course, in some cases the ROI of a software project comes from reducing head count (or reducing the need for additional head count) in other areas of the business, which only furthers the imbalance in unemployment rates.

So demand for IT workers, especially software developers, has remained high.

Meanwhile, cheaper sources of IT workers (ie: offshoring) have proven to be less cost-effective than previously thought, so my perception is that fewer dev jobs are "going overseas".  After the dotcom bust at the turn of the century, the number of students pursuing computer science degrees leveled off (versus being inundated during the bubble).  This has the effect of keeping the supply of skilled workers fairly steady.

So, at a high level, we've got high demand and a limited, but not scarce, supply.

But what I've seen in the last couple of years, both as a person interviewing candidates and as a job seeker myself, is that hiring companies have changed their approach from a decade ago. With a large exception for consulting firms, I see fewer companies willing to train workers in the skills they want; instead, they look for candidates with existing experience in the specific technologies used on the job. Since the technologies available continue to grow at a high rate, and existing technologies are not going away, the likelihood of any given developer having the wanted skill(s) is shrinking. This drastically reduces the supply of workers from the larger pool. To some extent, I think some employers are setting their expectations too high.  Personally, I prefer to hire for the ability to think through complex problems rather than for years of Lisp/Ruby/PHP/Tcl/COBOL/Bash/Scala/Objective-C/FoxPro experience.

At the same time, salaries have generally stagnated for the last few years, and experienced developers have, in my opinion, become more selective about the work environments they are willing to consider (telecommuting, flex hours, commutes, etc, etc).

So for any given job listing, the number of people in the worker pool who have the skills sought, would be willing to work for the hiring company, and are willing to leave their current employer or are unemployed, is shrinking.

Supply vs demand.

My expectation is that within the next three years, as the economy recovers, there will be a disproportionate increase in salary levels for experienced developers, and more companies will be willing to hire junior and mismatched-skill workers and train them on the job. This will pull more people into tech jobs, increasing the supply and reducing demand, but over the course of a decade or so. This cyclical effect will continue for the foreseeable future.

I also see companies less willing to hire full-time developers, so consulting firms are able to draw in a lot of good talent (further reducing the pool for everyone else), and they are more willing to train new graduates and mismatched-skill developers. Along the same lines, I've seen several of my experienced developer peers move out of staff jobs and into independent contractor roles, where they can better dictate their work environments, hours, rates, etc. and/or more directly reap the rewards of their efforts. I suspect these are both temporary trends until the economy picks back up and companies are more willing to provide those things directly.

Lastly, I think geography plays a shrinking, but noticeable, role. Chicago has a hot tech job market right now; it looks like Austin does too. Omaha, not so much. But I've seen more companies willing to hire remote tech workers over the last decade, and the industry is trending in that direction, so I see this becoming less of a barrier. As an employer, if you really need that skillset, you'll hire the guy/girl in the next state; with modern Internet, Skype, IM, etc, they'll be just as productive as if you stuck them in the back cube.