I Have A New Job Title: Director of Development

Two years ago I joined InRule Technology, in part, because I saw tremendous potential in the company and its software. During those years, we’ve seen great sales growth (with a 67% year-over-year increase in revenue in 2013), and all signs indicate that growth will continue for a very long time. Accordingly, the company is poised to grow in size and maturity as well, and in my new role as Director of Development, I will be helping guide the ship.

I’m a huge believer in continuous improvement and frequent incremental change, not just for our software codebase and feature set, but also for our people, processes, and practices, at both the micro (individual contributor) and macro (department and company) levels. In the Director of Development role, I’ll help drive strategic initiatives within the Product Development (aka software development) group aimed at improving our processes, software architectures, system infrastructures, and people, making it easier to expand our products, grow the department, and venture into new technologies and new markets. I’ll also play a key role in growing the team itself and in representing the development group in cross-departmental concerns.

Many of these are things I’ve already been doing in the background, because I’m passionate about and enjoy the work I do. This title change just makes that part of my official role. Of course, Jeff Key, Sr. VP of Engineering, will continue to oversee the department overall, and I will continue to report to him. While I will be absorbing some of the work that is currently on his plate, we both felt it was important that I continue to spend a large portion of my time writing code.

As such, I will still be doing development/architecture work the majority of the time. This is my “happy place,” and I wouldn’t want it any other way. Too often, companies take high-performing members of the team and “promote” them into a leadership role that takes them away from development and the things they enjoy about their jobs. This ends up hurting both the individual and the company in the long run; it’s something I’ve experienced in my own career, and something Jeff and I both wanted to avoid. That I work with a management team that understands this dynamic is yet another of the great things about working for InRule!

Hmm… Guess it’s time I dusted off my Thoughts on Employment series.

Alright, enough self-aggrandizing. Check back with me in a year when I’ve got something to show along with my big words!

Photo Credit: Michael Heiss (Creative Commons License) Some rights reserved

Drastically Improving WCF Download Times with Gzip + SSL

I put this together last year while looking into ways to reduce the time it took to download an approximately 45 MB SOAP response payload from a Microsoft Dynamics CRM service (i.e., a giant XML document).

In the process, I was surprised by how much better the user experience became when GZIP compression was combined with SSL encryption.

While the below write-up is specific to CRM in IIS, it should apply much more generally. I hope you find this helpful.

Summary / Real-World Proof

I implemented the SSL + GZIP IIS configuration described below to speed up metadata downloads on an internal server. While I saw no measurable difference in download times from my dev machine on the local network (which downloaded the metadata in approx. 25 seconds), my coworker, who connects over a VPN from two time zones away, saw download times go from approx. 5 minutes before the change to approx. 30 seconds after.

This can be attributed exclusively to the SSL + GZIP config change: we were already running just GZIP on one of our servers and just SSL on another, and both were taking the full amount of time to download metadata. It was only once I enabled GZIP and SSL together that the times dropped so significantly. Ultimately, this is due to the drastically reduced payload size (data going across the wire) when you combine the two technologies.

Overview

After some investigation into how to improve the download times for the CRM metadata, I think one of our best options is to suggest users enable dynamic compression for SOAP data and utilize SSL. This reduces the payload size going across the network by ~96%, and that payload represents the overwhelming majority of the user’s wait time.

Findings

Out-of-the-box, Dynamics CRM will enable the dynamic (GZIP) compression setting for the web interfaces (including WCF services), but IIS7’s default configuration does not consider SOAP to be compressible. You must manually add SOAP to the list of dynamicTypes, which is a host-wide config change. Further, enabling SSL with compression significantly reduces the payload size.

Estimated download payloads and timings:^

  • Default install (IIS7, no dynamic compression, no SSL): 44.5 MB = 8 min
  • With GZIP compression for SOAP: 33 MB = 6 min
  • With SSL only: 33 MB = 6 min
  • With GZIP and SSL: 1.5 MB = 17 sec

^Times are best-case, assuming a network connection with a 768 Kbps (~0.09 MB/s) download speed, the average DSL speed in America. Actual times will likely be slower.
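The arithmetic behind those estimates is easy to sanity-check. A rough sketch, assuming a steady 96 KB/s (768 Kbps) link and ignoring protocol overhead (the small differences from the table above are just rounding):

```shell
# Best-case transfer time = payload size / link speed
awk 'BEGIN {
  speed = 96  # KB/s (768 Kbps)
  printf "Uncompressed (44.5 MB): %d sec (~8 min)\n", 44.5 * 1024 / speed
  printf "GZIP or SSL  (33.0 MB): %d sec (~6 min)\n", 33.0 * 1024 / speed
  printf "GZIP + SSL   ( 1.5 MB): %d sec\n",          1.5  * 1024 / speed
}'
```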

That’s not a typo – enabling both SSL and GZIP took the time down to 17 seconds, or ~3.5% of the original time.

How To:

Step 1: Enable dynamic compression for SOAP data in the IIS applicationHost.config

Enable compression by manually updating applicationHost.config:

  • On the CRM server, navigate to C:\Windows\System32\Inetsrv\Config\applicationHost.config and open it with Notepad.
  • Search for the <dynamicTypes> section; in it, you should find an entry that looks like this: <add mimeType="application/x-javascript" enabled="true" />
  • Below that, add the following line: <add mimeType="application/soap+xml; charset=utf-8" enabled="true" />
  • Save the file and restart IIS for the setting to take effect.
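After the edit, the section should look something like the excerpt below (the surrounding entries are the IIS7 defaults; the SOAP line is the addition):

```xml
<!-- inside <system.webServer> / <httpCompression> in applicationHost.config -->
<dynamicTypes>
  <add mimeType="text/*" enabled="true" />
  <add mimeType="message/*" enabled="true" />
  <add mimeType="application/x-javascript" enabled="true" />
  <add mimeType="application/soap+xml; charset=utf-8" enabled="true" />
  <add mimeType="*/*" enabled="false" />
</dynamicTypes>
```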

Step 2: Ensure dynamic compression is enabled for the Dynamics service:

Note: This should already be enabled in the default configs, but it may have been changed by a sysadmin.

In IIS Manager, open the compression settings for the host:

Ensure dynamic compression is checked.

Open the Dynamics site compression settings:

Ensure dynamic compression is enabled:
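If you prefer the command line to IIS Manager, the same settings can be checked and set with appcmd. This is a sketch run from an elevated command prompt; the site name "Microsoft Dynamics CRM" is an assumption and may differ in your install:

```shell
cd %windir%\system32\inetsrv

:: Show the current server-level compression settings
appcmd list config /section:urlCompression

:: Enable dynamic compression server-wide
appcmd set config /section:urlCompression /doDynamicCompression:True /commit:apphost

:: Enable dynamic compression for the CRM site (site name assumed)
appcmd set config "Microsoft Dynamics CRM" /section:urlCompression /doDynamicCompression:True
```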

Step 3: Enable SSL using a self-signed cert

Follow these instructions to enable SSL with a self-signed cert.

Step 4: Export the cert and install on desktop

The CRM SDK won’t connect to a site with certificate errors, so if using an untrusted (self-signed) cert, you’ll need to add it to the desktop’s trusted certs.

In IIS Manager, from the Server Certificates page, click Export, select a location to save the file, and enter a password.

Copy that file to your desktop machine and double-click the file, which should open the certificate import wizard.

Select Current User (or, to make the cert apply to all users on the machine, select Local Machine) and complete the wizard, using the same password when prompted as you entered on the server during export.

When prompted for which certificate store to use, select “Place all certificates in the following store” and browse to “Trusted Root Certification Authorities”. Finish the wizard and agree to all of the security warnings (there may be several).

You may need to restart your desktop machine for the certificate settings to take effect.


My SonarQube ReSharper Plugin is Officially Deprecated

Well, it seems the perceived success of my SonarQube ReSharper plugin was its own downfall. Less than 30 days after its official release, that plugin, as it exists today, is now officially deprecated. The SonarSource team has taken it upon themselves to completely rewrite the ReSharper integration for the next release of what is currently called the “.net ecosystem” plugins (coming this month), and I am bowing out of developing for SonarQube.

And as I was writing this post, I received an email from SonarSource with this:

I didn’t want to touch your repository at the time I started this work, because I would indeed have needed to discuss this with you first, and I was under some time pressure.

You also correctly identified that this implies that your plugin is de facto end of lifed.

I am going to have to ask for the removal of the repository, and to prefix your Jira component name by “(Deprecated)”.

This will remove any possible confusion between your plugin and the library.

Seems they are more than happy to run me out of town, but want me to hang my own eviction notice — perhaps as some sort of extra humiliation?

Unfortunately, it is unlikely I can even use the new plugin, as it will not support reuseReport mode. Additionally, SonarSource’s stated (and restated) goal is to eventually drop support for ReSharper and all third-party tools, so this is a temporary solution.

While I am in a “wait and see” mode right now to determine if I will even continue utilizing SonarQube at all once these changes go live with v4.2, I will not be providing additional support or fixes for the current sonar-dotnet-resharper plugin.

This makes me very sad — and angry. But I guess I should have seen this coming, considering just getting the ReSharper plugin approved was like pulling teeth, and then my NUnit integration plugin was killed. Shame on me for not seeing this coming. Or maybe my Java skills are just that bad.

So, that’s all for my involvement with the SonarQube ReSharper integration. I appreciate those who helped while it lasted, and hope those of you who are using it are able to continue with the new version.

If you want the details, just take a look at the last two weeks on the SonarQube dev mailing list. There are too many threads to list here.

Personally, I have two other side projects currently active. I’m helping out with the That Conference organization team and I’m working with a couple of friends to write an Azure Cloud plugin for TeamCity.