Posts Tagged With Tools - Musing, Rants & Jumbled Thoughts

Header Photo Credit: Lorenzo Cafaro (Creative Commons Zero License)

TeamCity supports HTTPS access; however, they don't provide instructions for configuring it. Instead, they point you to a set of third-party instructions that are difficult to piece together and not particularly clear for people who 1) aren't familiar with Java and 2) are running on a Windows server. So in this post, I'm documenting the steps I followed to get a TeamCity 8.1 server up and running with an SSL cert purchased from a signing authority.

Step 1: Create a PKCS#12 Cert File

If you already have a version of your cert that ends with .p12 or .pfx, you can skip this step. Otherwise, you likely have a .cert, .cer or .crt file. You'll need to convert it to PKCS#12 format using the instructions I've provided in a separate post: Converting a SSL Certificate to PKCS#12 Format on Windows
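
For reference, if you happen to have OpenSSL on hand along with the certificate's private key, the conversion boils down to a one-liner along these lines (the file names here are placeholders -- adjust to your own):

    openssl pkcs12 -export -in yourcert.crt -inkey yourkey.key -out exportedCert.pfx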

I suggest placing the file in the /conf folder of your TeamCity installation.

Step 2: Configure the TeamCity server Connector

Open the /conf/server.xml file in your TeamCity installation folder with your favorite text editor and find the <Service name="Catalina"> section where it defines the <Connector> entry. Add an entry as follows:


    <Connector port="443" 
               protocol="HTTP/1.1" 
               SSLEnabled="true"
               scheme="https" 
               secure="true"
               clientAuth="false" 
               sslProtocol="TLS" 
               keystoreFile="C:/your.path/TeamCity/conf/exportedCert.pfx"
               keystorePass="yourpassword"
               keystoreType="PKCS12"
               maxThreads="150" 
               />

Where:

  • port is the listening port for HTTPS. The standard port for HTTPS is 443.
  • keystoreFile is the full path to the .pfx file (hint: Shift-Right-Click the file and choose "Copy as path"). Make sure to use forward slashes in your path here, not the standard Windows back-slashes.
  • keystorePass is the password for the cert (change yourpassword to your actual password).

Now save and restart the server!
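
If TeamCity is running as a Windows service, the restart can be done from an elevated command prompt with something like the following (the service name may vary depending on how your instance was installed):

    net stop TeamCity
    net start TeamCity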

If there are any issues, they will be logged to the /log/catalina*.log file, so take a look there if things don't "just work".

Also, don't forget to set the URL in the server's configuration page so that emails, etc, use the new URL.



If you use TeamCity for your builds and to run your SonarQube analysis, you can add a tab to the TeamCity build results page that includes your SonarQube results.

Here's what the end result will look like. Note the "SonarQube" tab next to the Artifacts tab, which includes a report showing new (by default) issues added in this build.

I'm using TeamCity 8.1 and SonarQube 4.1, but older versions of both should be able to use these same steps with little or no changes. I'm analyzing a C# project using the sonar-runner utility, but if you're using Java, you should be able to follow along too.

Setting this up is actually really easy!

In SonarQube, add the Issues Report plugin. The easiest way to do this is by clicking "Install" on its entry in the Update Center and restarting the SonarQube service.

Then, on your TeamCity server, modify your Sonar Runner build step to include -Dsonar.issuesReport.html.enable=true
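
For example, if your build step invokes the runner directly, the command line would look something like this:

    sonar-runner -Dsonar.issuesReport.html.enable=true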

This will cause the Sonar Runner to output an html report into the .sonar/issues-report folder after the SonarQube analysis is completed. The next step is exposing that report in a custom tab, which the TeamCity folks have made very easy. If you're using a version of TeamCity older than 8.1, you'll want to follow that link for alternate instructions.

Now, set up the project artifacts path to include the .sonar\issues-report folder, like this:
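
In text form, the artifact path rule would be something like this (the zip name is your choice; TeamCity's "folder => archive.zip" artifact path syntax packs the report into a zip):

    .sonar/issues-report/** => issues-report.zip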

Then, go into the project-level settings (not the build-level settings) and click the "Report Tabs" option. You'll use the Create new report tab button to add a "SonarQube" tab and use the issues-report.html file from the artifact you just created as the Start Page (issues-report.zip!issues-report.html if you put it in a zip, like I showed above):

Run a build, and you've got your tab! If you don't see the tab, double-check the artifacts path in the Reports Tab settings.



I'm happy to announce that v1.1 of my SonarQube plugin for integrating ReSharper analysis into SonarQube has been released. This is the first public release of the plugin and users can now install it from the built-in SonarQube Update Center.

For existing beta and v1.0 users, please read the important upgrade note later in this post.

If you are interested in installing this plugin, please read the official documentation and installation instructions.

Overview

The basic idea is this: The good folks at JetBrains have released a commandline-based tool (called “InspectCode”) allowing users to run ReSharper code analysis outside of Visual Studio and generate an XML-based file listing the various issues it detected in your codebase. Oh, and it’s free!

I felt this lined up very well with SonarQube, and wrote a SonarQube plugin to (optionally) execute the ReSharper analysis and consume the results, publishing them into SonarQube as Issues, allowing users to take advantage of the management, planning, reporting, etc, features of SonarQube to manage their code quality and technical debt.

The plugin supports reusing existing reports generated by the ReSharper "inspectcode.exe" tool, or, if the user has separately installed the tools, it supports running the analysis as part of the SonarQube runner. This mirrors the process used for the other .NET ecosystem tools such as FxCop and StyleCop. The JetBrains tools are not bundled with the plugin.

Related: See my other post on Setting up SonarQube analysis for C# projects

What You Can Expect To See

Something like this in Visual Studio:

Violation in Visual Studio

Will turn into this in SonarQube:

Violation in SonarQube

Installation

If you are interested in installing this plugin, please read the official documentation and installation instructions.

IMPORTANT NOTE FOR EXISTING v1.0 USERS:

Between v1.0 and v1.1 of the plugin, the format of the rule keys was changed to remove the use of colons (:). The change in rule keys will cause any customization to those rules to be lost after the upgrade and the new rules will need to be activated in the Quality Profiles.

Disclaimers

I am not associated with JetBrains in any way, aside from being a big fan of many of their products. I have, however, received free licenses to JetBrains products (ReSharper and dotPeek) over the years via raffles at the (now defunct) Chicago Alt.Net Users Group, which was sponsored by JetBrains.



UPDATE 2/1/2014: After I wrote this initial posting, the team at SonarSource who oversee SonarQube and the .NET plugins replied to me saying they have changed their mind on the need for this plugin, and on Unit Test reporting in general. Going forward, SonarQube will not be reporting unit test results, focusing only on coverage details. As such, this plugin is effectively dead. I do not plan on continuing any work on it at this time.

I've made some decent progress with a new SonarQube plugin, intended to be part of the community "C# Ecosystem" plugin package, which I'm calling the "sonar.dotnet.tests" plugin. The intent of this plugin is to support direct parsing of unit testing/coverage tools' reports instead of relying on Gallio, as the current ecosystem plugins do. The first (and currently only) supported framework is NUnit, allowing you to directly import NUnit test results into SonarQube without going through Gallio (see SONARDOTNT-372).

I posted this email to the SonarQube developers mailing list, but thought I'd also post the information here, so that it's easier to find via search engines and so that people can leave comments without having to subscribe to the mailing list.

If you're interested in taking a look at my work in-progress, you can find my code in the "dotnet-tests-plugin" branch of my GitHub clone of the dotnet repository. Note: I have multiple branches in my repo, so make sure you're looking at the "dotnet-tests-plugin" branch. There is not currently a publicly hosted build of the plugin, so if you want to try it out, you'll need to build it (and the other dotnet plugins in the same repo) yourself.

Some general notes:

  • There is one known open issue: It currently is unable to associate test results to the appropriate classes when the tests are defined in a base class of the test fixture.
  • The plugin utilizes the default SonarQube unit test widget (no need for the separate widget used in the gallio plugin)
  • The plugin is using the org.sonar.api.test classes (TestCase, TestPlan, etc), which required changing the sonar API version to v3.5 (from 3.0). Since 3.7 is the LTS version, I think this should not create too many complaints.
  • I have only implemented the nunit support so far, but it should be pretty straightforward to migrate the mbunit (gallio) parsing logic from the existing gallio plugin, as well as almost all of the coverage report parsing code.
  • The plugin supports only reuseReport and skip modes.

Implementation details:

  • This branch also includes Peter Steven's Pull request for SONARDOTNT-382, which is required for the NUnit support to work.
  • As part of this implementation, I made some changes to the base dotnet plugins:
    • Added a property (sonar.csharp.squid.includeTestSources) to include Test projects when performing squid parsing for C# projects (building out the AST). The default is false to mirror prior behavior. This allows consumers to look up file resources for types/members in test classes. This also required changes to the CSharpSquidSensor so that it will run for test projects and be able to look up files in the test sources tree, but not include the Test files when saving measures, complexity, issues, etc.
    • CSharpResourcesBridge now also indexes methods using their parameter signatures (ex: My.Namespace.MyClass#MethodName(bool, int)) as a second index. This allows callers to look up methods based on their usages without having to know the line number they are defined at.
      • The current indexing implementation includes the line number in the index key and the doc for DotNetResourceBridge currently says this: /!\ Do not use for the moment! For the moment, method key ends with ':XXXX', where 'XXXX' is the line number, so this API does not work. TODO: Need to work on that.
      • As such, the changes I made to include a secondary index for methods could be removed and made the primary index instead. This would remove the need for the custom SourceMethod class as well, but would break anyone who has implemented against the current line number based signature. I would appreciate feedback on whether this should be a second index or the primary index.
    • Similar changes would likely be required in the VB.NET plugin, which I don't have access to, so haven't tested/reviewed.
  • New unit test result parsers (such as MSTest and MBUnit/gallio) would need to extend o.s.p.dotnet.tests.parser.DotNetTestResultParser and be added to DotNetTestsSensor.getParsers() at the line that currently says "//TODO: add other parsers here". I would eventually like to refactor this to use dependency injection to discover the parsers, but haven't gotten there yet.

Usage:

For the NUnit parsing, the following properties need to be defined in the project.properties file (or equivalent):

#
# global flags to enable/disable all parsing. Integration tests have a separate flag.
#
sonar.dotnet.tests.mode=reuseReport
sonar.dotnet.tests.it.mode=skip

#
# each parser will have its own pair of report path keys (unit and integration):
#
sonar.dotnet.tests.nunit.reports.path=$(SolutionDir)/TestResult*.xml
#sonar.dotnet.tests.nunit.it.reports.path=$(SolutionDir)/AggregateResults.xml

#
# The NUnit parser requires the new c# squid property to be set to true
#
sonar.csharp.squid.includeTestSources=True

As I mentioned, I will be setting this aside for several months, as I will be spending my "side project time" working with the That Conference organizing team for That Conference 2014. I would appreciate anyone who could review things as they currently stand and provide feedback, particularly the current dotnet plugin maintainers and the SonarSource crew.



Update 2014-02-06: The SonarQube .NET ReSharper plugin has been released. See my release announcement.

For those of you following along with the SonarQube ReSharper plugin, here's an update on where things currently stand.

JetBrains released version 8.1 of the ReSharper commandline tools back in early December, so I consider version 1.0 of the plugin to have been "released" around that time... and there are a handful of folks out there using it! However, I was unsuccessful in getting the necessary votes on the SonarQube developers mailing list to get the plugin published to the official Update Center, which would make it uber simple for others to install and use the plugin. I must say, I was quite disappointed that I only managed one "+1" vote of the required three. (For the uninitiated, the rules for getting a plugin officially released require three "+1", or "yes" votes and no "-1" or "no" votes from the existing developers group.) As discussed on the dev mailing list, there's not much incentive for the developers on the list, who are, by definition, Java developers, to try, test and therefore vote on plugins for .NET (or other languages), thus making the bar very high for new developers to make the cut. Generally, the call for a vote stays open for under a week -- in my case, it was open for more than a month before I threw in the towel.

Since then, however, someone else reported an issue with the rule keys used by the plugin (which map directly to the ReSharper rule keys) that causes the Issues Report plugin to fail.

Additionally, several users have asked for the ability to provide a ReSharper dotSettings file while running analysis directly in the plugin.

And I've had a few people contact me to say they'd like to submit pull requests for features they'd like to see as well.

So, I'm planning to work on version 1.1 over the next month or so to fix the rule key issue and add a few of the user-requested features (especially if pull requests are submitted). And at that point, re-try to get the plugin officially hosted in the Update Center. I am putting a somewhat hard limit on a month to get this wrapped up, as my "volunteer" time will be shifting to support That Conference 2014, likely up until the actual conference, Aug 11-13th.



I was recently asked in an email if I knew of any tools that would translate the XML results file from the JetBrains ReSharper command line tool inspectcode into a human readable format, such as HTML. For your viewing pleasure, here was my response:

There aren't any tools out there yet to convert the XML to HTML, but the XML format is fairly simple, so I suspect it wouldn't take much to write your own.

The file is broken into two sections: <IssueTypes> and <Issues>.

Under <IssueTypes>, there's a collection of <IssueType> elements, which list all of the violation types that were discovered in your code. Each one has the following possible attributes:

  • Id
    • This is the unique identifier for the rule
  • Category
    • This is a general grouping for the rule types
  • SubCategory (optional)
    • some of the groupings are further split
  • Description
    • This is a general description of what the rule is checking
  • Severity
    • One of these values: ERROR, WARNING, SUGGESTION, HINT, DO_NOT_SHOW
  • WikiUrl (optional)
    • A link to a jetbrains webpage that has additional details about the rule

In the <Issues> section are the specific instances of rule violations found in your code. Under <Issues>, you will find a collection of <Project> elements, each with a Name attribute which will match your Visual Studio project name. Under each <Project> element will be a collection of <Issue> elements, each with the following attributes:

  • TypeId
    • A reference to the rule in the <IssueTypes> collection, where the TypeId here matches the Id of the corresponding <IssueType> element
  • File
    • The file path that contains the violation. This path is relative to the Visual Studio Solution's folder
  • Line
    • The line number in the file where the issue occurred
  • Message
    • A message about why this line of code violated the rule. The message is case-specific and often includes the variable name or other context information specific to this line of code.
  • Offset
    • I'm not 100% sure what this actually represents, but from what I can tell, it's the character range (offset from start of file) of the specific text in the file that violates the rule. In Visual Studio, this would be the text that is highlighted/underlined, etc. So a value of "1684-1722" would be the 1684th character in the file through the 1722nd character.
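
Since the format is that simple, a bare-bones converter really is just a screenful of code. Here's a minimal, illustrative Python sketch based on the element and attribute names described above -- treat it as a starting point, not a polished tool:

    # resharper_report_to_html.py -- illustrative sketch only.
    # Reads an inspectcode XML report and writes a single HTML table.
    import sys
    import xml.etree.ElementTree as ET
    from html import escape

    def convert(xml_path, html_path):
        root = ET.parse(xml_path).getroot()
        # Build a lookup of rule metadata from the <IssueTypes> section
        types = {t.get("Id"): t for t in root.iter("IssueType")}
        rows = []
        for project in root.iter("Project"):
            for issue in project.iter("Issue"):
                rule = types.get(issue.get("TypeId"))
                severity = rule.get("Severity", "") if rule is not None else ""
                rows.append("<tr><td>{}</td><td>{}</td><td>{}</td><td>{}</td><td>{}</td></tr>".format(
                    escape(project.get("Name", "")),
                    escape(issue.get("File", "")),
                    escape(issue.get("Line", "")),
                    escape(severity),
                    escape(issue.get("Message", ""))))
        with open(html_path, "w", encoding="utf-8") as out:
            out.write("<html><body><table border='1'>\n")
            out.write("<tr><th>Project</th><th>File</th><th>Line</th><th>Severity</th><th>Message</th></tr>\n")
            out.write("\n".join(rows))
            out.write("\n</table></body></html>")

    if __name__ == "__main__":
        convert(sys.argv[1], sys.argv[2])

Run it as python resharper_report_to_html.py resharper-results.xml resharper-results.html and open the result in a browser.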

Hope that helps



Update 2014-02-06: The SonarQube .NET ReSharper plugin has been released. See my release announcement.

Since my last post announcing my SonarQube plugin, it has been accepted into the SonarQube Community plugins: documentation, issue tracking, and builds are now hosted on the SonarSource servers, the code has moved into the SonarCommunity GitHub repository, and there's a path for eventual inclusion in the SonarQube Update Center for easy distribution to users.

Before I open the gates and push a formal release to the public, I'd like to get some beta testers to use the plugin and work out any rough edges. If you're using SonarQube for your .Net-based projects, I'd love to have you as an early adopter and get your feedback.

Overview



The basic idea is this: The good folks at JetBrains have released a commandline-based tool (called "InspectCode") allowing users to run ReSharper code analysis outside of Visual Studio and generate an XML-based file listing the various issues it detected in your codebase. Oh, and it's free!

I felt this lined up very well with SonarQube, and wrote a SonarQube plugin to (optionally) execute the ReSharper analysis and consume the results, publishing them into SonarQube as Issues, allowing users to take advantage of the management, planning, reporting, etc, features of SonarQube to manage their code quality and technical debt.

What You Can Expect To See

Something like this in Visual Studio:

Violation in Visual Studio

Will turn into this in SonarQube:

Violation in SonarQube

How To Get Started



For this beta, I'm assuming you're already a SonarQube user and have successfully configured analysis for your .Net solutions. If that's not the case, take a look at my post on setting up SonarQube analysis for C# projects.

The official documentation for installing this plugin can be found on the SonarQube Wiki page for this plugin, which I'm still building out, and will change over time to reflect the current state of things. For now, here's the process:

Download ReSharper EAP and run a local analysis

First, make sure the InspectCode tool is working for you. Download the ReSharper Command Line Tools (they come as a .zip file) and extract them into the folder of your choosing. For the purpose of this posting, I'm assuming you used C:\jetbrains-commandline-tools\.

Open a command prompt and run inspectcode.exe /help. You should see a usage statement printed to the console.

Now change into the folder with your VS Solution file and run the command c:\jetbrains-commandline-tools\inspectcode.exe /output=resharper-results.xml YourSolutionFile.sln

Warning: This may take a very long time to run, possibly a couple hours for very large projects. I would suggest starting with a small project and maybe grabbing some lunch.

Once this completes, you should have a resharper-results.xml file in your solution folder, which will look a little like this:

<?xml version="1.0" encoding="utf-8"?>
<!-- Generated by InspectCode 8.0.0.0 -->
<Report ToolsVersion="8.0">
    <Information>
        <Solution>Example.sln</Solution>
        <InspectionScope>
            <Element>Solution</Element>
        </InspectionScope>
    </Information>
    <IssueTypes>
        <IssueType Id="CSharpWarnings::CS0162" Category="Compiler Warnings" Description="CS0162:Code is unreachable" Severity="WARNING" />
        <IssueType Id="CanBeReplacedWithTryCastAndCheckForNull" Category="Potential Code Quality Issues" Description="Type check and direct cast can be replaced with try cast and check for null" Severity="SUGGESTION" />
        <IssueType Id="ClassNeverInstantiated.Global" Category="Potential Code Quality Issues" Description="Class is never instantiated: Non-private accessibility" Severity="SUGGESTION" />
        <!-- ... -->
    </IssueTypes>
    <Issues>
        <Project Name="Example.Application">
            <Issue TypeId="RedundantUsingDirective" File="Example.Application\\Program.cs" Offset="910-943" Line="22" Message="Using directive is not required by the code and can be safely removed" />
            <Issue TypeId="RedundantUsingDirective" File="Example.Application\\Program.cs" Offset="945-963" Line="23" Message="Using directive is not required by the code and can be safely removed" />
            <!-- ... -->
        </Project>
        <Project Name="Example.Core">
            <Issue TypeId="RedundantUsingDirective" File="Example.Core\\IMoney.cs" Offset="895-908" Line="21" Message="Using directive is not required by the code and can be safely removed" />
            <Issue TypeId="RedundantUsingDirective" File="Example.Core\\IMoney.cs" Offset="910-943" Line="22" Message="Using directive is not required by the code and can be safely removed" />
            <!-- ... -->
        </Project>
    </Issues>
</Report>

If that all works, then you're ready to start using the plugin.

  • Download the sonar-dotnet-resharper-plugin from the build server and place it in your SONARQUBE_SERVER_HOME/extensions/plugins folder.
  • Restart your SonarQube service
  • Login to the SonarQube server with an admin account and head over to the Quality Profiles page and select the profile used by your .Net project (this may be the "Sonar way" profile, or one you've created on your own).
  • In the search fields, set the Repository to ReSharper and the Activation to Any and click "Search". You should be presented with a list of approx 650 rules supported by ReSharper's tool. In the upper right corner of the results list is a Bulk Change option. Choose Activate all. This will (immediately) activate all of the ReSharper rules for this profile.
SonarQube Quality Profile Bulk Change
  • Modify your sonar-project.properties file to include the following:
    # ReSharper settings
    # sonar.resharper.mode: leave blank to run ReSharper during the sonar analysis,
    # use 'skip' to deactivate, or use 'reuseReport' (recommended) and provide a
    # previously generated results file
    sonar.resharper.mode=reuseReport
    sonar.resharper.report.path=$(SolutionDir)/resharper-results.xml
    # only needed if running ReSharper during the sonar analysis:
    #sonar.resharper.installDirectory=c:/jetbrains-commandline-tools
    # optional, and only applies when running during the sonar analysis:
    #sonar.resharper.timeoutMinutes=10
  • Execute sonar-runner and get yourself a cup of coffee. For really big projects this could take half an hour or more.
  • Bask in the glory of your new ReSharper-based Issues in the SonarQube web UI.

A Note About Rule Severity



ReSharper uses a four-tier severity system (plus a "Do Not Show" option), while SonarQube uses a five-tier system. I have mapped the ReSharper values like this:

    ReSharper      SonarQube
    ---------      ---------
    Error          Blocker
    Warning        Critical
    -              Major
    Suggestion     Minor
    Hint           Info
    Do Not Show    Info

While some of the ReSharper rules are marked as Do Not Show by default, the plugin maps them to Info so that you can modify their visibility via the Quality Profile. If you want them to not show, disable the rules in the Quality Profile. You can, of course, change any of these levels for any rule using the Quality Profile, and I fully expect most people will want to downgrade a lot of the Critical rules to Major and a lot of Info to be turned off.

Reuse Report vs Run During Analysis



The ReSharper analysis can be very time consuming for large solutions with many projects, sometimes taking multiple hours to run if the cache has to be rebuilt, so I would suggest users favor the reuseReport mode when possible. This reduces the number of times the analysis is executed, since running it during the sonar-runner will analyze each project separately, adding significant analysis time, even with heavy caching in the ReSharper tool.

To use the reuseReport option you'll need to use the inspectcode.exe utility to analyze your solution before executing sonar-runner. There are several command line options for the tool, but at minimum, you’ll need this:

inspectcode.exe -o=resharper-results.xml YourSolutionName.sln

Where -o=resharper-results.xml will write the results into the file name you provide. Note: by default (without the -o argument), inspectcode will write the file to the temp directory. You'll need to provide this file to the plugin.

Analysis times

The amount of time required to run the inspectcode analyzer and import the results into SonarQube can vary based on several factors. Here is some representative data to give you some guidance on how long it will take, based on two actual projects I'm analyzing.

                                           Small Project    Larger Project
    Lines of Code                          27,515           593,979
    VS Projects in Solution                10               39
    Standalone InspectCode                 4 min            1 hr 24 min
    Sonar-Runner (ReuseReport)             2 min            35 min
    Sonar-Runner (embedded InspectCode)    11 min           2 hr 58 min
All cases had an existing ReSharper cache. The counts exclude unsupported project types, such as WiX (installer) projects, and the analysis included only the .NET ReSharper plugin and the base C# plugins (ie: does not include FxCop, Gallio/Unit Tests, etc). Timings are based on running within a TeamCity CI build on a decently powerful server. Obviously, your times will vary; these represent only a guidepost for how much variation can happen between runs.

Known Issues / Limitations



The ReSharper command line tools are fairly new and still have a few kinks to work out. I would suggest always using the newest possible version, including new EAP builds of 8.1 while it is still in development.

Inconsistent results from inspectcode

From one run to the next, you may get a different list of rule violations from ReSharper, even without code changes. This means you will have violations come and go in SonarQube without the underlying issue actually being addressed. In my tests, for a large project like the Windows Azure .net library, I saw a 4% difference (~430 rules delta on ~12,000 total).

See RSRP-390005 - inspectcode provides inconsistent results

Data quality issues in available rules

With the 8.1 EAP builds, JetBrains has added a commandline switch to inspectcode to dump the list of possible issuetypes that could be reported. There are a couple of data issues in that output, which I have worked around in the plugin:

See RSRP-390380 - inspectcode /dumpIssuesTypes includes duplicate entries for "StructuralSearch" and RSRP-390375 - IssueType "InvocationIsSkipped" has undocumented "INFO" severity

"Sonar.UnknownIssueType" Violation

SonarQube requires that the rule database be populated up front, at server start. ReSharper provides a list of possible violation types for their tool via the /dumpIssuesTypes argument, and I've populated the SonarQube rule repository using these values. However, as new versions come out, or you add your own ReSharper plugins to generate new IssueTypes, it’s possible that you will have violations in your code that the plugin did not know about when the server started, so cannot import into SonarQube appropriately. When the plugin comes across one of these, it will write a log statement (to the runner’s log) and generate a "Sonar.UnknownIssueType" violation with instructions on how to add the rule information to the 'ReSharper custom rules' property in the Settings UI. After adding that setting and restarting SonarQube, future analysis runs will support those IssueTypes.

unknown violation setting

Other File Types

While the ReSharper tools support the full .Net stack, including .xaml and other file types, the sonar-dotnet API does not currently support more than one language (C# or VB.net) for a given project. For .xaml and other files, the plugin attempts to report the violation against the file, but it may end up attached to the project instead. In either case, the source code for the .xaml or other unsupported file type will not be imported into SonarQube.

Future plans and release timeline



There are a handful of things I want to see before I release version 1.0, which will go out in the Update Center for general public consumption:

  • ReSharper 8.1 released. I feel like the 8.0 release of the command line tools just isn't viable for use with SonarQube, primarily due to the inconsistent results from one run to the next. This also ensures any breaking changes they make before 8.1 ships are caught and addressed in the plugin. JetBrains has not publicly stated when they think 8.1 will be released.
  • A handful of early adopters are using the plugin and providing feedback (and are happy with it).
  • I've addressed any high-value tickets in JIRA. I'm using the NETRESHARPER-1.0 release to track things targeted for the public release.

Other features I'm hoping to include, but are not guaranteed:

  • additional configuration settings to modify the inspectcode arguments when running directly. For example: the ability to set the cache folder path, disable solution-wide analysis, use an R# settings file, etc.
  • utilize and document ReSharper plugins in the inspectcode analysis. This is a core feature we're looking to utilize at work to make the dev-time experience match the CI server analysis for company coding standards enforcement. This may "just work", but is dependent on JetBrains releasing the 8.1 SDK nuget packages, which will be "later this autumn".

Disclaimers



I am not associated with JetBrains in any way, aside from being a big fan of many of their products. I have, however, received free licenses to JetBrains products (ReSharper and dotPeek) over the years via raffles at the Chicago Alt.Net Users Group, which is sponsored by JetBrains.



Update 2014-02-06: The SonarQube .NET ReSharper plugin has been released. See my release announcement.

Update - I've moved my plugin into the "SonarQube Forge" and am now hosting the source, etc, under the SonarCommunity plugins. The information on this page is now stale. See my post SonarQube .Net ReSharper Beta Release for updated information.


I've created a fork of the SonarQube .Net ecosystem plugin's GitHub repository with the purpose of adding a new plugin to support the free JetBrains ReSharper command line code inspection tool. I fully intend to submit this back to the main SonarQube dotnet plugin repository once it's stabilized, under SONARDOTNT-348.

The fork can be found at https://github.com/johnmwright/sonar-dotnet (no longer hosted here), if you want to try it yourself. I provide no warranty, no guarantee and limited support (I'm just one guy, doing this in 30-minute chunks on my train ride to/from work). I'm using the LGPL v3 license for this code, same as the .Net ecosystem plugins, so the code is open for anyone -- thus the use of GitHub. You are free to modify it yourself, in accordance with the license, and I will accept pull requests, should you wish to provide them.

Currently, this plugin only supports reusing an existing report, although I will be adding support to run the analysis directly in the next month or so.

Installation Instructions:

Install .NET plugins 2.2-SNAPSHOT

This plugin requires you use the "2.2-SNAPSHOT" builds of the C#/.NET plugins for SonarQube. While you can build these from within my fork, I'm not actively integrating upstream changes there, so I suggest you build them from the official sources instead. I will provide some support for the ReSharper plugin, but for the rest of the .NET Ecosystem plugins you should go through the upstream maintainers.

Install ReSharper Command Line Tools and Analyze

To use the new sonar-dotnet-resharper-plugin, you'll need to download the ReSharper command line tools from JetBrains and use the inspectcode.exe utility to analyse your solution. There are several command line options for the tool, but at minimum, you'll need this:

    inspectcode.exe -o=resharper-results.xml YourSolutionName.sln

Where -o=resharper-results.xml will write the results into the file name you provide. Note: by default (without the -o argument), inspectcode will write the file to the temp directory. You'll need to provide this file to my SonarQube plugin.

You can also optionally provide the name of a single project in the solution to limit the analysis to just that project.

Install My Plugin

Install the sonar-dotnet-resharper-plugin into your SonarQube server's \extensions\plugins folder and restart the SonarQube server. In the server's log file, you should see lines like this:

    2013.09.12 22:22:08 INFO  org.sonar.INFO  Register rules [resharper-cs/cs]...
    2013.09.12 22:22:09 INFO  org.sonar.INFO  Register rules [resharper-cs/cs] done: 718 ms
    2013.09.12 22:22:09 INFO  org.sonar.INFO  Register rules [resharper-vbnet/vbnet]...
    2013.09.12 22:22:09 INFO  org.sonar.INFO  Register rules [resharper-vbnet/vbnet] done: 313 ms

Now, add the following to your sonar-project.properties file(s):

    #ReSharper
    sonar.resharper.mode=reuseReport
    sonar.resharper.reports.path=resharper-results.xml

Where the value of sonar.resharper.reports.path is the path to the ReSharper results file relative to each project file. This has the current limitation (same as the FxCop and other sonar-dotnet plugins) of requiring the same relative path be used for all projects. As such, if your filesystem layout has project files at different depths relative to the results file, you will have to copy the results file to each project's folder so that the relative path works.
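
To illustrate the limitation, consider a (hypothetical) layout where the project files sit at different depths relative to the results file:

    MySolution/
        resharper-results.xml
        ProjectA/ProjectA.csproj          <- report is at ..\resharper-results.xml
        src/ProjectB/ProjectB.csproj      <- report is at ..\..\resharper-results.xml

Because the same relative path is applied to every project, no single value of sonar.resharper.reports.path works for both -- hence the need to copy the results file into each project's folder.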

Right now, for sonar.resharper.mode it only supports reuseReport. In the (near) future I will add the ability to directly run project-level ReSharper analysis instead of reusing a report, which will work much better for people that have differing depth project files.

Modify Your Project's Quality Profile

If you install the sonar-dotnet-resharper-plugin on a clean system where you have not modified the "Sonar way" Quality Profile, then all of the new ReSharper-based rules will be included and enabled immediately. However, if you have modified that profile or have your own custom profile(s), you will need to enable the ReSharper rules for that profile. To do this, go into the SonarQube web UI, click the "Quality Profiles" link in the header bar, and choose the profile in which you wish to include ReSharper results.

In the filter boxes, choose "ReSharper" in the Repository field and "Any" from the Activation field and submit. I expect you'll see a list of approx 240 rules, all disabled. In the upper right corner of the results list is a "Bulk Change" option. Choose "Activate all". This will (immediately) activate all of the ReSharper rules for this profile.

A Note About Rule Severity

ReSharper uses a four-tier severity system, while SonarQube uses a five-tier system. I have mapped the ReSharper values like this:

    ReSharper      SonarQube
    ---------      ---------
    Error          Blocker
    Warning        Critical
    -              Major
    Suggestion     Minor
    Hint           Info

Note that there is no mapping to the SonarQube 'Major' category, but this is the fall-back/default value SonarQube uses for a rule, so if you see any (or all) of the ReSharper rules in this tier, please let me know -- there was likely an error importing the rules into your SonarQube installation.

You can, of course, change these levels for any rule using the Quality Profile, and I fully expect most people will want to downgrade a lot of the 'Critical' rules to 'Major'.

Run the sonar-runner!

Now run sonar-runner on your SonarQube project. You should now see your new ReSharper violations in the SonarQube web reports.

Current limitations

SonarQube requires that the rule database be populated up front, at server start. However, ReSharper does not (yet) publish a list of possible violation types for their tool, so the rules in the plugin are only those which I've come across so far in my own C# projects. This means that it's possible, likely even, that you will have violations in your code that the plugin does not know about, so it cannot import them into SonarQube appropriately. When the plugin comes across one of these, it will write a log statement (to the runner's log). In a future release, it will log more information on how to 1) submit that rule back to me for inclusion in a future plugin build and 2) add it to a local supplement file for inclusion in your local rules. For now, if you hit this, you can email me the relevant block of your ReSharper results file and I'll update the plugin. Also, I have an open request with JetBrains to provide a mechanism to get the full list of violations up front so that this issue will be less likely going forward.

The sonar-dotnet API does not currently work for .xaml files, so any violations in .xaml files are not reported in SonarQube.



In the world of software development, your job is to take the complex and make it manageable. Whether that's automating business processes, visualizing and aggregating massive data sets, or rendering life-like vector images in real-time, your job is to allow non-developers to interact with the application (or its outputs) in a way that hides the complexity and lets them focus on the aspects of the business/data/entertainment that are important to them.

But many programmers forget that we have the same issues.  Our codebases begin to suffer from the same complexity problems over time. This is especially true for large systems that have aging code bases and large numbers of developers (over time) with their hands in the code.  So we, too, need to keep an eye on minimizing complexity in our own domain, our code, so that we can concentrate on the task at hand instead of suffering through technical debt and bit-rotting architectures.

I'm a huge fan of continuous improvement, and I've mentioned tools like Sonar and JetBrains' ReSharper in the past that help quickly locate problematic areas and facilitate that continuous improvement process.

So when Patrick Smacchia (lead developer for NDepend, and I think sole developer/CEO) contacted me to request I review the NDepend tool in exchange for a free license, I was happy to do so.


General Conclusions:

NDepend is a very powerful static analysis tool for .Net codebases that provides HTML reports and interactive GUI(s) for finding overly complex or problematic areas of your code, performing analysis for refactoring and comparing changes over time.  In version 4, a LINQ-like query language was added that makes the tool an extremely powerful reporting engine and can be used to enforce coding standards rules on your build systems similar to FxCop or unit testing frameworks.

While NDepend provides a very powerful backend, its frontend, Visual NDepend, suffers from some usability issues that I feel distract the user from the power the underlying tooling provides.  With some time, you can get proficient with the tool and have access to a world of data about your codebase. And hopefully the NDepend developers can address some of the usability issues in future releases.

I don't see NDepend as a tool that you'd buy for every developer on your team; but I do see value in having dev leads/architects utilize the tool, as well as having it integrated into your build/Continuous Integration process. This will allow you to 1) monitor changes over time to ensure the development team isn't adding unmerited complexity or incurring inappropriate technical debt and 2) discover high-risk/difficult to maintain areas that you should prioritize while paying down your technical debt.

General Feature Overview:

NDepend is, at its core, a static analysis tool for .Net assemblies.  It can be run against the assemblies themselves, or by way of a Visual Studio solution.  If running against assemblies, and .pdb files exist, NDepend will pull the source code metadata from the .pdb file and import the source code directly as well.  The analysis is pretty rich and includes dependency graph resolution, cyclomatic complexity calculations, coupling, abstraction, % comments, Lines of Code, etc, etc.  One interesting bit -- several of these are calculated for both the raw source code as well as for the IL, so you can see where your code might end up generating some ugly IL even if the C# isn't so bad.

In addition to the current analysis, given access to the output of previous runs, NDepend can show you differences between snapshots.  This provides some really nice datapoints, such as breaking changes to public APIs. You can also import code coverage reports from some unit testing tools, which allows you to also gather metrics around unit test coverage.

All of this is exposed through several different reports and visualizations, from dependency matrices to heat maps, and when run in the NDepend GUI, they are interactive and configurable. These are great for exploring a system's architecture and evaluating potential refactorings, such as assembly consolidation/splitting.

But the really powerful piece seems to be the Code Quality LINQ (CQLinq) feature, which allows you to write LINQ queries against the code analysis data to generate your own reports and create code quality rules to enforce good coding standards.  A set of pre-defined rules is provided as well, including in-line comments explaining the query for quick reference. And with the NDepend console runner, you can easily integrate NDepend analysis into your CI builds. So imagine the ability to write a LINQ query to find all methods which have high cyclomatic complexity and low code coverage.  Or write a rule that will fail the CI build if a public API has a breaking change. Or if a large method grows even bigger, etc, etc.

Additionally, NDepend provides an API for writing your own tools to hook into pretty much any of NDepend's functionality: creating NDepend projects, running analysis, and interacting with the results.  A set of sample "power tools" is also provided, including source code, that utilize the API to perform several tasks.  So you could use this API to further automate your analysis, such as writing custom hooks for FxCop, Sonar, data warehousing, etc.

A larger list of NDepend features can be found on their website.

I’m a firm believer that people are better able to comprehend complex data when accompanied by a graphical representation of the data. I also understand that people vary as to how they process data, and there is typically not a one-size-fits-all data visualization that will sink in for everyone.  NDepend is a great tool for this, as it provides multiple visualization techniques for exploring your code and the metrics gathered during analysis.

Below is a listing of the major views within VisualNDepend and the HTML reports, of which I believe Dependency Matrix and Query and Rules Explorer to be the most useful generally, and very powerful in "real world"/day-to-day development efforts.  Additionally, you can flip between many of the views to get a different viewpoint for the same data -- for instance, you can be working in the Dependency Matrix and double-click an assembly/member to jump to the Dependency Graph for that element. In many cases, particularly when looking at method-level data, you can jump directly to the code in Visual Studio, allowing you to fix the problems as you come across them, which is a nice touch. You can also run NDepend within Visual Studio for an even tighter integration.


Dependency Matrix

dependencyMatrix

This view shows dependencies between assemblies (and their members) and other assemblies in the project, with all of the assemblies listed on both axes and the number of dependencies (usages) between any two assemblies noted in the box where their two axes intersect.  For example, my “common.tests” project uses the “nunit.framework” assembly, so where the two intersect, I would see a number.  In the case where my assembly is on the horizontal axis, the number would be in green, indicating a “uses” relationship.  If nunit is the assembly on the horizontal axis, it would show the number in blue, indicating a “used by” relationship.  The nice thing is that the context-sensitive help (the popup dialog shown in the above screenshot) explains all of this, so you don’t have to remember it all.

Each assembly on both axes also has a tree-view expander (little + sign), where you can explode out the assembly's namespaces and see these dependencies at a namespace and member level.  This is a tremendously helpful tool when moving types around between assemblies or attempting to merge or split assemblies. We're actually in the process of doing exactly this at work, where we are removing UI-dependent and platform-specific code from one of our "common" assemblies, and the person doing the work is using (an older version of) NDepend to inspect the relationships in order to know what can be moved and in what order to prevent build issues.  Additionally, it greys out (or hashes out) the columns which are not accessible (ie: internal, protected, etc), which is helpful.

Here is another screen cap with the exploded tree views in the dependency matrix, showing my common.tests assembly on the horizontal axis and nunit.framework on the vertical, where the TestAdd() method in my assembly uses the Assert.IsNotNull(Object) method of the nunit.framework assembly.

Another really nice touch on the dependency matrix view, when looking at assembly-to-assembly dependencies, is that a "0" is shown in the intersection square if the assembly defines a dependency to an assembly (ie: includes a reference in Visual Studio), but doesn't actually use the assembly.  This is a clear code smell, since this is only really ever valid if you're loading the assembly via reflection, which has its own code maintenance issues.

Dependency Graph

DependencyGraph

This view shows nodes, whose size is configurable by a drop-down at the top (options include: Constant size, Lines of Code, Complexity, etc), with links connecting dependencies. When you select/hover over a node, the other nodes that use it are highlighted in green, and the nodes it uses are highlighted in blue. There's also a couple of contextual information boxes that get populated: one (not shown here) that gives stats about the selected node, and then the context help popup (shown at bottom of screen here). The context help popup is an example of where the user experience is a little rough, as I explain in the User Experience issues section below.

Metrics (Heatmap)

heatmap

This view allows you to see methods (grouped by assembly) and set the size of each method’s square based on the metric of your choosing via a drop-box at the top.  In the screen capture, I have the size based on IL Cyclomatic Complexity, allowing me to quickly see the largest offenders.

Additionally, you can see there are several squares highlighted in blue. This is because they are methods matched by the current code query I have selected in the Query and Rules Explorer view, showing again how the various views can be used with each other to gain additional insight into the code.

Query and Rules Explorer / Editor

queriesAndRules

When it comes to differentiating this product from others in the same arena, the NDepend Code Query functionality is a huge feature, and the addition of CQLinq in v4.0 makes this even easier to use for .Net developers who are already familiar with LINQ syntax.

This view allows you to use LINQ-style queries to dig into your codebase and create ad-hoc reports using pretty much any code metric you can imagine.  And there are dozens of CQLinq queries packaged with NDepend, which include decent in-line documentation and links to additional support references, so writing your own queries is made that much easier as you have a library of examples to copy-and-paste from.

Here's an example along the lines of the pre-loaded CQLinq queries -- a representative sketch using standard CQLinq metrics (NbLinesOfCode, CyclomaticComplexity), not a verbatim copy of a bundled rule:
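
    warnif count > 0 from m in JustMyCode.Methods where
        m.NbLinesOfCode > 30 ||
        m.CyclomaticComplexity > 20
    select new { m, m.NbLinesOfCode, m.CyclomaticComplexity }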

You can also import output from NCover, dotCover and TFS in order to include code coverage metric in your queries.

Further, you can compare the results of the current code against a previous run of NDepend to see how the results have changed over time.  For example, NDepend provides a set of very useful “Breaking API Changes” CQLinq queries which will show changes (adds, breaking modifies) to publicly accessible members.  For an ISV that sells an SDK, such as my employer, this is an extremely useful datapoint to ensure we’re not unintentionally causing breaking changes for our customers between releases, nor exposing additional types that we didn’t intend to expose.

Each of these queries can then be saved as a "rule" which NDepend will enforce.  Note the very first line of the above CQLinq says "warnif" -- that would result in a warning (vs. an error/failure) when run as a rule.  Then, when analysis is run via VisualNDepend, the Visual Studio plugin, or via the console app (say, as part of your build scripts), those rules will be run and the results provided to you (as a red/green indicator in the bottom corner of VisualNDepend and Visual Studio, or as a result value to fail/pass your builds through the command line).

Abstractness vs Instability

abstractness

This graphic, which is included in the HTML report, but doesn't seem to be available in the VisualNDepend UI, was one of the hallmark features that Scott Hanselman discussed in his review of NDepend in 2007.  However, I personally don't find a lot of value in this report for large projects, since the density of the type names obscures the data, as seen in the screen cap.  But that's one of the great things about this tool.  It provides many, many different ways to look at the data -- and just because I don't find this view useful doesn't mean others don't (case in point: Scott found it useful).  I'm a little surprised to find it only on the HTML report and not in the GUI, though.

Feature Gaps:

I didn’t find myself saying “I wish it would do X” very often in relation to feature support. (User experience is another issue – see the User Experience Issues section below for that).  There were a few places I felt functionality was missing:

Trouble parsing some files:

directiveError

There were a few places where NDepend had trouble parsing the source code for my assemblies. Most cases were due to #if #else #endif directives in the code which included braces, like this:
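
Here's a hypothetical reconstruction of the pattern (the branch bodies are made up; the brace structure is the point -- each directive branch opens a brace, and a single close brace follows the #endif):

    #if DEBUG
            foreach (var item in items) {
    #else
            foreach (var item in items.AsParallel()) {
    #endif
                Process(item);
            }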

This is syntactically valid, but NDepend could not parse it, apparently because it sees two open braces for one close brace (ie: it doesn't take the directives into consideration). To be fair, other analysis tools, such as Sonar, also had trouble parsing these files for similar reasons.

Additionally, it appears to choke on the .xoml files used for Workflow XAML.

Support for decompilers other than Red Gate’s Reflector:

reflector

VisualNDepend will open Reflector for an assembly so that you can see further details about it, including reverse-compiled code and assembly metadata.  At the time this support was added, Reflector was freeware, but it migrated to for-fee licensing well over a year ago.  The dialogs in VisualNDepend still reference it as freeware, and it's the only decompiler supported.  Given that there are a few prominent and free alternatives (JetBrains dotPeek and Telerik JustDecompile, for example), I think support should be added for those alternatives.  I wrote NDepend support asking if they plan to support alternatives (as well as pointing out the now-incorrect freeware verbiage) and got the below response, which states I'm the first person to ask for this support. This surprises me and makes me wonder if people are actually using that feature.

supportemail

Lack of an installer:

NDepend is distributed as a .zip file with everything packaged inside.  This is likely sufficient for the vast majority of folks, but there are a couple of cases where having an installer option, in addition to the .zip file option, would be nice:

  • Licensing.  In order to “install” your license, you must place an xml file into the NDepend folder.  Your license email comes with these instructions.  Having a smart installer could allow me to license the product (or download an existing license without having this email around) and not deal with email MIME issues, etc.
    licenseemail
  • In the same email, it mentions not installing the files into the Program Files folder due to Windows protection issues.  Having an installer publish the files to Program Files will resolve most (all?) of those issues, I believe.
  • An installer would allow registration into the Windows Add/Remove Programs database, which is also searchable via other Windows management tools. This would allow system admins to ensure machines have the appropriate versions installed without having to login to each machine and check assembly versions.  This could be particularly helpful for build machines utilizing NDepend.

Performance/Resource Utilization:

The analysis runs fairly quickly. For my testing, I used a set of 108 assemblies (1184 namespaces, 29316 types), including about 75 of my own assemblies and the rest being third-party modules my projects use. NDepend analysis took about a minute and a half. By comparison, it took over two and a half minutes to compile and link the assemblies using msbuild/nant scripts.  The analysis did consume a decent amount of memory, though. When run in the VisualNDepend GUI, the app's memory footprint started at about 125MB once I loaded my project, then analysis added about 900MB to the app's memory footprint, and about 620MB stayed in memory after the analysis was completed.  When run within Visual Studio 2012, the NDepend plugin didn't seem to add much memory footprint when loaded and when analysis was running, the memory usage increased from VS's initial 850MB to approx 1.3GB, but came back down once analysis completed. The console runner seemed to peak out at 350MB.  Note that I’m running this on a pretty beefy box with 8 logical processors and 8 GB of memory, running Windows 8 and Visual Studio 2012.

User Experience Issues (VisualNDepend)

As I mentioned in my summary above, the NDepend engine is powerful. But gathering the data is only half the job -- reporting and analyzing the data is just as important. The primary tool for analyzing the data is VisualNDepend, and it leaves a lot to be desired from a user experience standpoint.

Ultimately, the tool provides you access to the data. And having the interactive graphical representation of the data in the various views is a powerful feature of the product.  However, I continuously found myself cursing under my breath (and sometimes out loud) as I used the application and became increasingly frustrated when it didn't behave the way I would expect (based on how most other Windows apps work) or where UI elements got in my way and made it harder for me to gain access to the information I was seeking.

Looking through old reviews and screenshots of NDepend on the web, it doesn’t look like the UI has changed much since 2007 (the earliest pictures I found in my brief google search).  Honestly, with the many dockable internal windows and not making use of the operating system’s chrome / look & feel, it feels a bit like a Windows 3.1 app ported to Win95, or the typical cross-platform Java app that doesn’t really fit into any one platform’s UI scheme.  It really makes the product feel unpolished, which I think unfortunately reflects poorly on the power of the underlying engine and the data it provides.

I captured a lot of notes on this topic, and while I’ll be providing those to the NDepend folks, I’ve decided not to laundry list them here. Instead, I’ll give a few key examples so you can get a taste of what type of issues I’m complaining about.

UI elements that get in your way:

The context-sensitive help dialog was the biggest offender here, particularly in the Dependency Matrix and the Dependency Graph, both of which have long lists of data that scroll off the page.  In these reports, if I was working with elements at the bottom of the visible window, the context dialog would pop up, blocking the data I wanted to see and placing itself directly under my mouse, where it stayed visible until I moved my mouse away from it.  This meant I was constantly having to move my mouse back and forth across the screen to get to the data elements I wanted to see.  Often, I would have to go click on the scrollbars to move the data higher on the screen to escape the dreaded context dialog. This was made worse by the lack of mouse wheel scrolling (see below). I think this could easily be improved by having some smarts around where the context-sensitive help dialog is displayed so that it moves to the side of the screen opposite the current mouse position, ensuring it never places itself directly under the mouse and gets in the user's way.

Mouse wheel usage inconsistent with most Windows apps:

Given that most .Net developers will be heavy users of Visual Studio and other Microsoft products like Office, I personally expect tools that cater to .Net devs to try to align their common keyboard shortcuts and mouse gestures to those in Microsoft products. When that's not the case, I find myself getting frustrated when I, almost by muscle memory, use one of those familiar shortcuts only to find the tool I'm using doesn't react the way I wanted.  Case in point: in most of the NDepend interactive views there's more data than can fit on the screen, so 1) data is compacted and 2) scrollbars are used to page the data off screen.  I find myself using the scroll wheel on my mouse to navigate the report. In most Microsoft applications, the scroll wheel alone controls vertical pagination (up/down scrollbars), Shift+wheel controls horizontal pagination, and Ctrl+wheel controls zoom. So my first instinct is to use the scroll wheel to move up/down the report; however VisualNDepend uses the scroll wheel to control zoom exclusively (even with Shift and/or Ctrl). Since I have over 100 assemblies in my reports, it's a bit of an annoyance to have to mouse over to the scrollbars in order to scroll up/down.  Honestly, this was one of the bigger annoyances, since I continually found myself using the mouse wheel to try to scroll, only to zoom instead.

Error dialogs that weren’t very helpful:

On multiple occasions, I received error messages that gave almost enough information, but didn’t quite take me all the way. Or, were just missing data altogether.  For instance, from the heatmap view, if source code is available for the method, you can double-click on the method box and be taken directly to Visual Studio. But if code isn’t available, you get this error message:

"Can’t open declaration in source file. Reason: N/A because the assembly’s corresponding PDB file N/A at analysis time."

You get a similar error if you try to set the heatmap’s metric to some of the options, such as Cyclomatic Complexity, and you don’t have source for every assembly:

"Can’t select the metric ‘Cyclomatic Complexity (CC)’. N/A because one or several PDB file(s) N/A"

It took me a while to realize “N/A” means “Not Available” instead of “Not Applicable”, but in any case I would have preferred the report be built for those assemblies where source was available and exclude the remaining assemblies (or at least give me that option).

Where it fits into a dev organization:

I don’t think it’s extremely useful or cost effective for every developer in an organization to have a copy of NDepend, but I do see two scenarios where NDepend licenses would be a good investment for a dev shop:

  • Software architects (those responsible for the overall design of your codebase) and/or QA Leads (those responsible for ensuring the quality of your code) should use the applications to analyze the codebase, perform ad-hoc monitoring for issues, and generate rules for enforcing code quality standards (via CQLinq).
  • Build servers in your (continuous) build environment should validate the quality rules and fail the build / report issues for each build.

Note that build server and developer editions are licensed separately, with build machine license pricing being about 50% more expensive. The license I was given was a “Pro” license, which supports both developer and build server editions, but isn’t available for purchase directly.

Build Server integration:

You can take advantage of the NDepend console runner to integrate with your continuous integration/build servers.  In my case, I'm using TeamCity, which I personally believe is the best build server out there (and it's free for most users), but this would work just as well with Jenkins and others.  There are two key aspects of the console runner that I can take advantage of:

1) If NDepend finds critical rule violations, it will return a non-zero status code.  TeamCity (and most build systems) will pick up on this and mark the build as failed. This allows me to enforce coding standards on the build server.

2) An HTML report is generated that I can expose within the build page of TeamCity.  There is some documentation on the NDepend website explaining how to do this; it's a bit dated (the settings it mentions have moved in recent versions of TeamCity), but it's close enough for now.  I'll write a future blog post explaining exactly how I integrated it in my environment.
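As a rough sketch, the TeamCity build step boils down to invoking the console runner against your NDepend project file. The paths and the .ndproj file name below are hypothetical, so check the NDepend documentation for the exact console arguments in your version:

    REM Run the NDepend analysis; the console runner takes the full path to the .ndproj file
    "C:\Tools\NDepend\NDepend.Console.exe" "C:\BuildAgent\work\MyProject\MyProject.ndproj"
    REM A non-zero exit code (critical rule violation) causes TeamCity to mark the build failed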


This is a solicited review, for which I received a free license to the product.  While I made no promise of a good (or bad) review in return, and attempted to take a completely neutral approach going into the review, you should be aware of my potential bias. That, and I don’t want the FTC to fine me $11,000.

Feedback from NDepend Developer:

Note: Before publishing, I gave Patrick at NDepend a chance to review this posting. Here are some excerpts from his feedback:

"Concerning the lack of installer, this is a long time debate. MSI makes sense indeed, but personally, as a programmer I hate when MSI takes minutes and put link and registry key everywhere behind my back. At least with XCopy deployement there is zero bad surprise, and this might be worth noting."

"Abstractness vs Instability will be included smoothly in the UI."

"Concerning the Context Sensitive Help, you can close it at any time. It is designed for larger screen, and indeed on small laptop screen it might overlap data. We take note of the idea of placing Context Sensitive help at the top when mouse is at the bottom. I see a problem luring though, since we need an heuristic to differentiate when the user wants to hover the tip with the mouse, with when user wants to hover data. I guess we'll have to see if the tips overlap the panel or not."

"Concerning the Ctrl+Mouse Wheel zoom, you are right. We'll propose Ctrl+Mouse Wheel zoom by default + an option to go back to the current state."

"Concerning "N/A" I always saw it as Not Available. Knowing it is N/A because a PDB file is missing, seems a very relevant info to me since it explain you why you don't get the data and hence, how you can get it, isn't it?"



I've got another dev tool that I wanted to pass along:

A few months ago, Jon Skeet posted a tweet about a new tool he was using called NCrunch.  Since then, I've been playing with the tool and working with the author to resolve some of the issues (here and here) that were blocking it from working smoothly in my environment.  I believe it's now at the point where my coworkers who wish to take advantage of it can do so, and where I can promote its use to the world.

NCrunch, at its core, is a TDD extension for Visual Studio.  It will run your unit tests in the background and provide real-time unit test results (no need to even save your file – it runs as you type) by way of color-coded dots to the left of each line of code (green = passing, red = failing, black = not covered).  It will also provide details for exceptions that are thrown, along with many other cool features.  This allows you to get immediate feedback on whether the changes you are currently typing break or fix any unit tests.

There's a very good demo video on the NCrunch homepage (about 6 minutes long) that I think is worth watching to get a feel for what the tool can do.

Key features (or at least "My favorites"):

  • Line-by-line, real-time status of unit test coverage
  • Context menu access to applicable unit test
  • Tool-tip/hover bubble with details on:
    • number of covering tests,
    • performance,
    • exception details/stack trace
  • Visual indicators for performance metrics (slow tests have yellow centers -- with transparency based on level of slowness)
  • Quickly run covering tests, debug into a given line
  • Ability to configure how much CPU it will use.
  • It's FREE!! (update: NCrunch will be going for-pay soon)

Cons:

  • Feature overlap with TestDriven.Net and ReSharper test runners (although, the need for those may go away if you use NCrunch)
  • Some rough edges still (see below), but the developer is very actively updating and fixing bugs, and very responsive to users on the forum, twitter, etc.

There are a couple of things to note:

  • NCrunch does a lot of background compilation and running unit tests. It appears to be smart enough to only compile/run tests that are affected by changes you are making. In any case, if you have a slow machine, you may want to disable the automatic testing and run in manual mode.  I have a very beefy development box (8 cores, 8GB memory) and don't see any issues (I also run ReSharper with full solution analysis mode with no issues).
  • NCrunch lets you designate which unit tests to run/ignore.  In our case, we have both unit tests and system tests (db dependent) in the same solution, so developers would want to enable the unit tests but ignore the system tests.  When you first enable NCrunch for a project it asks you if you want to ignore all tests – I'd suggest doing that and then using the Tests window (accessible from the NCrunch menu) to unignore the tests/assemblies you care about from the right-click context menu.
  • NCrunch has the option of running tests linearly or in parallel.  If your tests are written such that they do not have side effects and don't share singletons, etc., then you should be better off running in parallel; otherwise, you run the risk of having tests interfere with each other.  For our code, we need to run the tests one at a time. (Update 2012-02-02: per the comment from the author, each test is run in a separate process, so there's no memory/static property sharing, and thus little risk in running the tests in parallel.)
  • NCrunch compiles each project in an isolated, shadow-copied environment in the background. There are some cases, though, where the Visual Studio configurations are such that NCrunch doesn't automatically determine all of the referenced assemblies that need to be copied. In those cases, you can flip a configuration setting to have NCrunch copy the output folder over into the shadow environment.  This resolves the issue, but does have a performance impact.   This shows itself as an error in the NCrunch Tests window with the message "Cannot register assembly XXX or one of its dependencies. The system cannot find the file specified."

To resolve this, you need to enable the "CopyReferencedAssembliesToWorkspace" option for the project by going to the NCrunch Visual Studio menu and choosing the Configuration option, selecting the project in the Configuration window and changing the property.



NOTE: This post was last updated for SonarQube v4 in Dec 2013. The support for .NET has changed quite a bit since then, so this guide is probably not going to be very useful anymore. Sorry.

In an effort to better understand some of the problematic areas of the C# codebase I work on, I recently set up an instance of the SonarQube code analysis platform. SonarQube was originally written for Java analysis and later added C# support. This posting walks you through my experience setting up, configuring and running the analysis.

Note: SonarQube changed its name from "Sonar" in mid-2013, so older references to this posting may use the old name.

I periodically update this post to reflect changes with newer versions of the tools. Most recent update was 12/18/2013 based on a fresh install of SonarQube v4.0.

I've also written a SonarQube plugin to use ReSharper as a source for quality metrics. Once you have your SonarQube instance up and running for your .NET project, see my post SonarQube .Net ReSharper Beta Release for details on importing ReSharper results into SonarQube.

SonarQube Overview:

So why did I even do this? Once up and running, SonarQube provides some useful metrics for pointing out hotspots in your code that may be making it more difficult to maintain and extend your functionality. Through the web interface, you can drill-down on any of the metrics to the module, class, and method level, including full source code. Some of the metrics provided for each C# project include (screenshots are from the "nemo" demo site mentioned below, my own project, or other sources):

  • General analysis
    • Uses several rule-based static analysis tools
    • Details and statistics, with drill-downs, on rule violations
  • Comments:
    • Percentage of code commented
    • Percentage of public APIs that are (un)documented
  • Duplication:
    • Percentage of code that is duplicated
    • Counts by duplicated lines of code, blocks, files
  • Unit tests:
    • Test counts and pass/fail results
    • Code coverage percentage
  • Counts:
    • Lines of code
    • Count of files
    • Count of classes
    • Count of methods

In addition, it will track changes over time, so you can see where issues are increasing/decreasing in your code.

SonarQube is open sourced under the LGPL and free to use, however some of the plugins used to perform the analysis are only commercially available, and in some cases come with steep licensing fees. For this blog, I focus only on the freely available (ie: no fees) aspects of the product. Each of the tools I used is also freely available (FxCop, StyleCop, Gendarme, Gallio, OpenCover, MySQL -- all have licenses that allow no-fee usage for most people).

There is a demo site provided by one of the commercial plugin providers to demo the system (including their not-free SQALE plugin) which shows analysis for several open-source Java projects and can give you a feel for the UI and the data that can be provided. Beware -- this includes data from some of the commercial plugins, so don't expect to see everything on that site after following this posting.

You can also reference the [official installation notes](http://docs.codehaus.org/display/SONAR/Setup+and+Upgrade).

General SonarQube Technical Architecture:

Generally, there is a "runner" that consumes the source code and analyzes it via plugins. This information is published to the SonarQube database directly. Before each run, the runner makes a call to the server to download configuration settings stored there, and mashes those with configuration settings stored in a local config file, as well as a project-specific config file.

The SonarQube (web) server pulls the results from the database and provides a UI for reviewing and managing them, as well as a UI for managing configuration settings, user authentication, etc. The server also has its own config file where you define database settings, user authentication settings, etc.

Step 1: Prerequisites and Assumptions:

This posting assumes you have a working Debug build of your codebase that compiles with no errors and generates .pdb files. You need access to the source code and the output, as they exist at the completion of your build. In my case, I have a CI server (TeamCity for my playground site -- free for small/mid-scale configurations, and Jenkins for our production BI build servers) running and just added another build step at the end to kick off the SonarQube analysis from the command line.

For many of the plugins (such as FxCop and Gendarme), you'll need to install the .NET Reference Assemblies, which are part of the Win7 SDK for versions of .NET 1.0 through 4.0 and the Win8 SDK for .NET version 4.5. Note that the FxCop installer is included (exclusively) with the Win SDK. Also note that a full Win7 SDK install is several Gigs in size and will take some time to download, so plan accordingly. You only need to install the Reference Assemblies (under the .NET Development header in the installer) which ends up being about 151MB to download and 505MB to install if you use the web-based installer I linked to above.

If you have any Silverlight components in your build, you will need to install the Silverlight SDK for the version of Silverlight you are targeting.

Install the Java JDK. Yes, the JDK. It's big and ugly, I know. You're a .Net dev, I know. Do it anyway. Set the JAVA_HOME environment variable to point to the jre folder inside the Java installation – for example, on my machine the value is set like this: JAVA_HOME=C:\Program Files\Java\jre7. Note: You don’t need to install the Source Code component from the installer, which will save you about 500MB of disk space.
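If you want to set this from the command line, something like the following should work from an elevated prompt (adjust the path to match your own Java install; mine is just an example):

    REM Set JAVA_HOME machine-wide; takes effect in newly opened console windows
    setx JAVA_HOME "C:\Program Files\Java\jre7" /M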

If you’re using Gallio for running your unit tests, you’ll also need to make sure .NET 2.0 is installed. For newer versions of Windows (Win8 and Win Server 2012), this requires going into the “Turn Windows features on or off” settings and adding the “.NET Framework 3.5 Features” feature (note: in WinServer 2012, this is part of the Application Server role).
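As an alternative to the GUI route (this isn't from the original setup, just a command-line option I'm aware of), DISM can usually enable the same feature; on Server editions you may also need to point /Source at your install media:

    REM Enable the .NET Framework 3.5 feature (which includes 2.0) from an elevated prompt
    DISM /Online /Enable-Feature /FeatureName:NetFx3 /All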

Step 2: The SonarQube Server

Download the SonarQube server. At the time of this (revised) posting, it was version 4.0.

This will give you a .zip file. Decompress it into the location you want; I used C:/sonar-server

For the initial configuration/setup, I'd suggest running it at the command line until you know you have it fully configured. To do that, open a command window and run bin\windows-x86-64\StartSonar.bat (or bin\windows-x86-32 for 32-bit machines). This will run in your current window (no real output, but the command blocks) until you hit Ctrl-Z or Ctrl-Break. You'll see it pulling down additional packages as needed, such as JRuby, but eventually you'll see something like this, which means it's working:

For SonarQube 3.7:

jvm 1 | 2012-04-27 08:45:44.254:INFO::Started SelectChannelConnector@0.0.0.0:9000

For SonarQube 4.0:

jvm 1    | 2013.12.18 21:04:17 INFO  Web server is started

Once you get things working, I'd suggest installing it as a Windows service. To do this, first register SonarQube as a service using the provided script: bin\windows-x86-64\InstallNTService.bat (there's an UnInstallNTService.bat script too). Then, start the service from the Windows Services control panel, or using the provided StartNTService.bat script. Personally, I would suggest configuring it to start automatically so it will survive reboots, etc.

With my SonarQube install on Windows Server, running the service failed initially, so I used the logs/sonar.log file to troubleshoot and found this error:

Encountered an error running main: java.lang.IllegalStateException: Unable to create file in temporary directory, please check existence of it and permissions: C:\Windows\system32\config\systemprofile\AppData\Local\Temp\

Initially, I just took the lazy route and configured the service to run under my user account instead of a system account, so didn't really "fix" that issue. Update: In a later install, I had to create the “Temp” folder in the systemprofile\AppData\Local folder and grant security rights to the LOCAL SERVICE user.

The server has a sonar.properties config file that drives some of its functionality, including which database to use. You can find it in the \conf folder. The only changes I made to this file were the port used for web connections (default is 9000 – I changed it to 80) and the database (see Step 3) -- but if you want to use the built-in Derby database, no changes are needed.
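For reference, the port change amounts to a single line in conf/sonar.properties (roughly like this; the property ships commented out with the 9000 default):

sonar.web.port: 80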

Step 2a: Verify and Change Password

At this point, you should be able to access the SonarQube server at http://localhost:9000 (or whatever port you set in the sonar.properties file). The default username/password is admin/admin. I would highly recommend changing the admin password at this time. Passwords are changed via the user profile page.

Step 3: The Database

SonarQube server comes with a built-in Derby database. While it is quick and easy to set up (it comes pre-configured), I would suggest moving away from it almost immediately: I had MAJOR performance problems using it. For instance, for my solution of about 400K lines of code (50+ Visual Studio projects), when using the Derby database it took about 2 hours to perform the analysis, or 5 hours if the runner was on a different host. During that time, the server CPU would be pegged at 100% for 5 - 15 minutes per project as the runner reported its results. Once I moved to a MySQL database, it takes about 20 minutes total.

So, I would suggest downloading and installing the free MySQL Community Edition from Oracle. At the time of this (revised) writing, it was at version 5.6.13.

Once installed, you'll need to create the SonarQube database instance. This is pretty straightforward, but SonarQube provides a script to make it even easier. For v3.7, from the SonarQube installation folder, the script is in extras/database/mysql/create_database.sql. It's no longer included in the v4.0 release, but you can find it on the SonarQube GitHub repository. The command to execute the query is at the top of the file. This will create the SonarQube database instance, as well as grant access from localhost and remotely. The SonarQube schema is created the first time you run the server.
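If you'd rather not dig the exact command out of the file, running the script looks something like this from a command prompt (the path assumes the C:/sonar-server install location used above):

    REM Create the sonar database and user as defined in the script; MySQL prompts for the root password
    mysql -u root -p < C:\sonar-server\extras\database\mysql\create_database.sql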

You'll need to configure the SonarQube server to use this db. This is as easy as changing a couple of lines in the sonar.properties config file. Chances are, the lines you want are already in the file (commented out), as samples are there for MySQL, PostgreSQL, Oracle and MS SqlServer. (Note: MS SqlServer is not officially supported, although people on the web have posted ways to get it working. I did not venture down this path).

Update: While getting SonarQube up and running in our production environment, the person doing the work was able to get up and running with MS SqlServer and provided me with this feedback. Note, he was using v.2.13.1, which is newer than what I was using at the time:

I followed the link you had on your page (http://www.sezok.de/sonar/sonar.html) but differed in the following ways:

  • The database must be setup with Case Sensitive and Accent Sensitive rules, so I created the database with collate option SQL_Latin1_General_CP1_CS_AS
  • The jdbc drivers that come bundled with 2.13.1 actually work, and the ones listed in the above link don’t.

For me, I commented out the Derby/H2 lines and used these for my MySQL instance:

#----- MySQL 5.x/6.x
# Comment the embedded database and uncomment the following properties to use MySQL.
# The validation query is optional.
sonar.jdbc.url: jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true

# Optional properties
#sonar.jdbc.driverClassName: com.mysql.jdbc.Driver
#sonar.jdbc.validationQuery: select 1
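You'll also need the connection credentials in the same file. Assuming you used the create_database.sql script above (which sets up a sonar/sonar account), the matching entries would look like this:

sonar.jdbc.username: sonar
sonar.jdbc.password: sonar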

Restart the SonarQube server, if it's running.

I should note that there isn't a quick-and-easy way to migrate from the Derby db to another, so expect any data collected in Derby will be lost when you move. I found this page where someone found a way, if you really must try.

Note: at one point, I was getting a com.mysql.jdbc.PacketTooBigException error, so I followed the suggestions here and ran the sql command: SET GLOBAL max_allowed_packet = 1024*1024*14;

Step 3a: Verify and Change password, again

At this point, the SonarQube server should be up and running again, albeit with no data. I would suggest verifying you can reach the server at http://localhost:9000 (or whatever port you used in the sonar.properties file). The user accounts will have reset with the database change, so again I would highly recommend changing the admin password at this time!

Step 4: The Runner

SonarQube was built for Java, so the docs almost all assume you're using Maven. You may be able to get this working with NAnt, but there's an easier route: a small, Java-based runner has been created that can be kicked off from the command line. This is what I used from within my TeamCity and Jenkins builds. There’s also a Jenkins plugin that looks promising, but I haven’t used it.

To use this runner, you will need to do a few things:

  1. Download the runner. At this (revised) writing, the version was 2.3. This will come as a Zip file. Decompress it in an appropriate location. For me, I put it in C:/sonar-runner/
  2. Test that it works. Open a command line to the sonar-runner/bin folder and run sonar-runner -h. This will either show you the usage statement (if things are working fine), or blow up. If it blows up, check that your JAVA_HOME is set correctly to point at the jre folder.
  3. Once it works, set another environment var: SONAR_RUNNER_HOME to the root installation folder for sonar-runner. (for example: C:/sonar-runner/) This is used when you want to run the sonar runner from a folder other than the /bin folder. (ie: when you run sonar-runner from your source folder).
  4. Change the conf/sonar-runner.properties file. This has general configuration items used by the runner, some of which can be overridden in the project's config file (we'll get to that later). At a minimum, you'll need to set the sonar.host.url to point to where you have your sonar server running, and the database config. For the db, just copy/paste the lines from the sonar.properties file you created in Step 2. Again, example values are provided in the file (commented out) which will likely work just fine. Don't forget to comment out the Derby lines if they exist. A minimal sketch follows after this list.
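As a rough sketch, a minimal conf/sonar-runner.properties for the setup described in this post might look like the following (the host, port and sonar/sonar credentials are the ones used elsewhere in this post; adjust them to your environment):

# Server location (default is http://localhost:9000)
sonar.host.url=http://localhost:9000

# Database settings - copy these from the server's sonar.properties
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar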

Step 5: The C# Plugins

You'll need to install the "C# Plugins Ecosystem" (ie: The plugins to analyze C# code). Login to the SonarQube website as an Admin user and click the "Settings" link in the upper right-hand corner, then click "Update Center" in the left-hand navigation pane. You should see an "Available Plugins" tab, under which you will find all of the officially supported plugins. Search down the list to find "C#" under the Additional Languages section and click on the "C#" link. You should now see an "Install" button -- click it!

Under the hood, this is installing each of the C# plugins .jar files into the /extensions/plugins folder of your SonarQube server installation folder. You will need to restart the SonarQube service after you place the files for them to be available.
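If you ever need to do this by hand (for example, on a server with no internet access), it amounts to copying the plugin .jar files into that folder yourself and restarting the service – something like this, with a hypothetical version number and the install paths used in this post:

    REM Manual plugin install: drop the jar into the plugins folder, then restart the SonarQube service
    copy sonar-csharp-plugin-2.1.jar C:\sonar-server\extensions\plugins\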

The plugins consist of the following: (as of this writing, which uses v2.1 of the plugins)

  • sonar-dotnet-plugin - .NET Core (support for the .NET-based languages - general API used by other plugins -- required)
  • sonar-csharp-plugin - C# Core (support for parsing C# language -- required)
  • sonar-dotnet-fxcop-plugin - FxCop support (general analysis)
  • sonar-csharp-stylecop-plugin - StyleCop support (formatting analysis)
  • sonar-dotnet-gendarme-plugin - Gendarme support (general analysis)
  • sonar-dotnet-gallio-plugin - Gallio support (unit testing / coverage)
  • sonar-dotnet-ndeps-plugin - NDeps (assembly dependency parsing) (new in v1.3)

Note: If you use VB.NET, there is a separate plugin entry for that, which will reuse several of these plugins.

If you're going to use FxCop, you'll need to install FxCop 10 (the FxCop installer is installed as part of the Win7 SDK install).

If you're going to use StyleCop, you can either use a bundled version included with SonarQube, or install your own.

If you're going to use Gendarme, you can either use a bundled version or install your own.

If you're going to use Gallio, you'll need to install it as well as OpenCover (or you can use NCover, if you have a license). When installing OpenCover, I suggest using the Advanced mode and setting it to install for all users. Also note: Gallio does not support NUnit v2.5.4 or higher, so you’ll either need to update the Gallio plugin or jump to NUnit 2.6 (or other supported version).

Step 6: The sonar-project.properties File

Each Solution will need to have its own sonar-project.properties file. This file will need to exist in the folder from which you execute the sonar-runner. To make this easy, I would suggest putting the file in the same folder as your .sln file.

The file will have a few sections, which I will describe here.

Important Note: Any folder names in the config file will need to either escape the backslash with another backslash (\\) or use a forward slash (/). I've chosen the latter.

Project Identification:

This section will provide the project key used by the SonarQube server to group analysis results over time, as well as provide a useful name in the UI, etc. This should be unique across projects. The project version can be used to track different branches, etc.

# Project identification
sonar.projectKey=Jwright:DemoApp
sonar.projectVersion=trunk
sonar.projectName=DemoApplication

Then, describe the source code layout. The "sources" field points to the top-level folder where the source code exists. If your .sln and .csproj files have relative paths internally, then this should be the top-level folder. Assuming you don't have any strange layouts, this will likely be the same folder as your .sln file (which is likely where your .properties file exists), so it can just be ".". Additionally, you need to denote that the language is C# using the "cs" value.

# Info required for Sonar 
sonar.sources=.
sonar.language=cs 

C#-specific settings:

Here, you'll need to provide information about where the .sln file is located and where key libraries are located, and which version of .Net you're using.

#Core C# Settings
sonar.dotnet.visualstudio.solution.file=DemoApp.sln
sonar.silverlight.4.mscorlib.location=C:/Program Files (x86)/Reference Assemblies/Microsoft/Framework/Silverlight/v4.0
sonar.dotnet.excludeGeneratedCode=true
sonar.dotnet.4.0.sdk.directory=C:/Windows/Microsoft.NET/Framework/v4.0.30319
sonar.dotnet.version=4.0 

# To prevent any issues while analyzing multiple solutions containing projects with similar keys
# Will be set by default to safe starting at version 2.2: http://jira.codehaus.org/browse/SONARDOTNT-339
sonar.dotnet.key.generation.strategy=safe

Plug-in Specific Sections:

For each plugin, there is a "mode" setting. If blank, then the plugin will run. If you want to skip/not run a plugin, set the mode to "skip".

For Gallio, you can stipulate if you want to use OpenCover (free) or NCover (not free). You can also stipulate the runner mode. I had trouble using anything other than "Local". You will also need to stipulate the naming pattern (regular expressions, I believe) for the Visual Studio projects that include unit tests. You can have multiple patterns, separated by semicolons.

#Gendarme
sonar.gendarme.mode=

# Gallio / Unit Tests
sonar.gallio.mode=
sonar.gallio.coverage.tool=OpenCover
sonar.gallio.runner=Local
sonar.dotnet.visualstudio.testProjectPattern=*UnitTest*;Testing*
sonar.opencover.installDirectory=C:/Program Files (x86)/OpenCover/ 

# FXCop 
sonar.fxcop.mode=skip 

# StyleCop 
sonar.stylecop.mode=

# NDeps
sonar.ndeps.mode=

Step 7: Running an Analysis

From the folder with the sonar-project.properties file, run the command %SONAR_RUNNER_HOME%\bin\sonar-runner (which resolves to the sonar-runner.bat script on Windows).

You'll see the runner start up, listing some details like the working folder, etc, then it will kick off the source code parsing, then the plugins (such as running the unit tests). If there are any errors, a Java exception will be thrown. Sometimes these contain enough details to troubleshoot, sometimes not, so you may need to run with the -X command line argument to get additional details when errors occur.
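For example, a run with debug-level output looks like this, from the solution folder:

    REM Run the analysis with verbose/debug output for troubleshooting
    %SONAR_RUNNER_HOME%\bin\sonar-runner -X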

One error I run into frequently is UTF-8 Byte Order Mark causing the source file parsing to fail. You will see this in two ways: the files will not end up in the SonarQube web UI and something like the following will show in the runner log:

  07:42:56.938 ERROR - Unable to parse source file : C:\\Users\\...\\CodeFile.cs
  07:42:56.969 ERROR - Source Snippet:
  ---------------
    --> n++using System;
      2 using System.Collections.Generic;
      3 using System.Diagnostics;
      4 using System.IO;
      5 using System.IO.Pipes;
      6 using System
  ---------------

You can fix this by setting the encoding in your sonar-project.properties or the sonar-runner.properties file:

   #----- Default source code encoding
   sonar.sourceEncoding=UTF-8

After a successful run of the analysis, you can see the results in the SonarQube server webpage. Unless you have changed the port in the sonar.properties file, this will be at http://localhost:9000/

The homepage will have a link for each project (the value provided as sonar.projectName), which will take you to the project overview. Clicking on any of the links on the overview page will take you to the drill-down data for that metric.

References:

I used this page to start my journey, in addition to the SonarQube project docs: http://www.ifunky.net/Blog/post/Install-and-Configure-Sonar-on-Windows-2008.aspx

The C# Plugins page: http://docs.codehaus.org/display/SONAR/C-Sharp+Plugins+Ecosystem

The Sonar installation page: http://docs.codehaus.org/display/SONAR/Install+Sonar

Update: The Sonar Source page, complete with mailing lists, issue trackers, etc. http://www.sonarsource.org/support/support

Hope this helps!



Microsoft has released a new Visual Studio Power Tool called “Debugger Canvas” that looks to be a very useful way to debug your apps.  Basically, instead of jumping from file to file while stepping through your code in debug mode, it will lay out the call stack as method “bubbles” on a single tab, with local variable information available for each bubble:




There’s a really cool video demo on the download page that I would suggest you watch.

As well as this overview blog announcement by Mary Jo Foley

Which points to this announcement page