Monday, March 14, 2016

The anatomy of a unit


Back in the winter of 2005 I started working on a flight planning system for commercial airlines. There I was presented with a shit-load of legacy code written to perform functions, not to be understood. Suffice it to say that it took 9 months to get new developers on board - even brilliant ones like my good friend Adrian. Sure, the domain is just hard. All the physics, advanced maths, algorithms, spherical calculations and God knows what else, combined with 10+ years of code originally converted from Fortran, produced a mixture very hard to work with. Around that same time (late 2005/early 2006) I was introduced to the idea of unit testing through a library called DUnit.

A funny (judging from the perspective of 2016) thing happened in the flight planning product. I was introduced to this idea that would make the Pervasive-based database backend (which was actually just pure BTrieve, with no SQL over it to sugar-coat it) interchangeable with a "real" SQL database like Oracle or Microsoft's SQL Server. Looking at it from today's perspective, switching away from a NoSQL database that performed really well for the sake of "we need to support X because our clients require that of us" sounds like pure nonsense, but it was what it was. And the most disturbing thing about that project wasn't the what, as you might have guessed, but the how of it. Basically what happened was that BTrieve API calls were literally translated into an extensive SQL builder. It was a disaster for a number of reasons: first, the idea that the SQL calls would perform anywhere near the speed of native BTrieve was just wrong. Nowadays we know that, for example, fine-tuned Redis handles fast data inserts from multiple sources better than, let's say, MySQL - it is a simple fact. But back in the day the desire to run the flight planner off of SQL Server was more important than speed. And even though the project ultimately failed, with the Japanese government not approving it due to performance reasons (surprise!), that wasn't the biggest sin of the project in my opinion.

Another sub-project of that solution happened in the meantime, having to do with parsing weather messages coming off a satellite dish. Nowadays that'd be a service built mainly on regular expressions (as is the case with text parsing), with clear separation of pretty much everything, unit tested to death. Back then it was a piece of work created by just one programmer - the lead programmer of that project - consisting of just one procedure in just one Delphi unit with a cyclomatic complexity of around 6000 (that's six thousand!). It proved to be a fantastic testing ground for my tool that calculated the metric using McCabe's simplified method, and it gave me a ton of fun to work with. There was just one problem with the entire thing: it just didn't work as it was supposed to.
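For the curious: McCabe's simplified method boils down to counting a routine's decision points and adding one. Here is a minimal Delphi sketch of that idea - the unit and function names are mine, not the actual tool's, and it naively splits lines on whitespace while ignoring comments, string literals and individual case labels, all things a real tool has to handle:

unit Complexity;

interface

uses
  Classes, SysUtils;

function SimplifiedComplexity(Source: TStrings): Integer;

implementation

{ One plus the number of decision points. Naive on purpose:
  comments, string literals and case labels are not handled. }
function SimplifiedComplexity(Source: TStrings): Integer;
const
  Keywords: array[0..4] of string = ('if', 'while', 'for', 'until', 'case');
var
  I, J, K: Integer;
  Words: TStringList;
begin
  Result := 1; { a straight-line routine has complexity 1 }
  Words := TStringList.Create;
  try
    Words.Delimiter := ' ';
    for I := 0 to Source.Count - 1 do
    begin
      Words.DelimitedText := LowerCase(Source[I]);
      for J := 0 to Words.Count - 1 do
        for K := Low(Keywords) to High(Keywords) do
          if Words[J] = Keywords[K] then
            Inc(Result);
    end;
  finally
    Words.Free;
  end;
end;

end.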

What both of those pieces had in common was code that was hard to understand, hard to read and fucking hard to fix. What you don't know is that the first one actually had unit tests! The coverage wasn't great (around 60%) but its readability was no better than that of the 6000-complexity WAlpha unit for parsing weather data. Why was that the case? Why were both solutions so bad, and how could they have been made better?

To answer that, one needs to first understand what a unit in Pascal-like languages is. Let's take a look at its anatomy below:

unit MyUnit;

interface

uses
  Classes, SysUtils;

const
  SOME_CONSTANT = 123;

type
  TMyClass = class(TObject)
  private
  protected
  public
    constructor Create;
  published
  end;

function GetMyObject: TMyClass;

implementation

uses
  SomeOtherUnit;

{ TMyClass }

{ Private declarations }

{ Protected declarations }

{ Public declarations }

constructor TMyClass.Create;
begin
  inherited Create;
  { construction code goes here }
end;

{ Published declarations }

{ Global declarations }

var
  MyObject: TMyClass;

function GetMyObject: TMyClass;
begin
  Result := MyObject;
end;

initialization
  MyObject := TMyClass.Create;

finalization
  FreeAndNil(MyObject);

end.

Without going into too much detail, one can clearly see the separation of the interface and implementation sections, and the explicit lists of units the code depends on - the public dependencies in the interface, the internal ones in the implementation. But what is most important is that a unit in this form describes a piece that is, in point of fact, self-sufficient. Let's take a look at the sins of the BTrieve API rewrite and the WAlpha madness.

The SQL-isation of the BTrieve API was initiated by this guy Nathan. Nathan was an architect back at the company and was high on Java, which back in the day was the best thing since sliced bread. Nathan was also a very buzzword-oriented person, so naturally when TDD became a thing in the industry he quickly realized that all pre-TDD code was shit and that all his future inventions would finally be good. Nathan led the project with another colleague of mine who got strung out on TDD the same way. Never mind the fact that the project was actually carried out in Delphi and not Java - since DUnit was already around, they decided to take it to the next level. And that they did. Each and every class got an interface, each interface and class was put into a separate file, each had a unit test - all according to best practices. What it meant for a developer using their code was a screen-long uses clause, anemic tests and a system so complex even they had no idea how it worked. Debugging took weeks, and even though the system had such great code coverage (of which they were so proud!) it failed when it came to real-world usage.

The WAlpha case is on the other end of the spectrum. It was carried out by an experienced programmer, Irene, who had been with the project for years. I think she might have had the longest participation in the project besides the original author. She was used to the codebase, never paid any attention to suggestions from younger teammates and, what is even more frightening, she was in a position of power, axe in hand, able to expel you from the project in a heartbeat. So as I said before, she did the coding on WAlpha all by herself. She wasn't very big on the whole TDD buzz that was going around, so she did what she did best - she tested all the code inside the Delphi integrated debugger (a phenomenal piece of software compared to anything else I knew back in 2006!). And when it finally worked she called it a day and collected the awards coming her way for a job, obviously, well done.

For a very short time I took part in the BTrieve API thingy but couldn't stand the stink of Java in Delphi. It was just too much. I said to myself that I could write something better over a weekend - something that would work faster and have less code than what all those geniuses produced. And I was right! A weekend and a six-pack later I had a fully working read-only solution to the problem, with the write portion 80% done and left unfinished only because the weekend ran out. Leaning on the shoulders of the ADO drivers for SQL Server and Oracle I was able to navigate the tables, search through them and do all of that blazing fast. The original project used the same drivers but was dead set on the SQL aspect of it, which turned out to be a disaster. Soon after I presented my solution to the team I heard that it was very nice but (and here comes the best part!) "we have invested so much time already that we won't back out now". Funnily enough, my little side project turned out to be a fully working solution that I was able to offer to other companies, and their solution didn't make it to production at all.
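To give you an idea of what leaning on the shoulders of ADO looked like, here is a rough sketch of the read-only approach - the connection string, table and field names are made up for illustration, and the real code obviously dealt with indexes, errors and the unfinished write path:

program AdoPeek;

{$APPTYPE CONSOLE}

uses
  SysUtils, DB, ADODB;

var
  Conn: TADOConnection;
  Table: TADOTable;
begin
  Conn := TADOConnection.Create(nil);
  try
    { made-up connection string - the real one targeted SQL Server or Oracle }
    Conn.ConnectionString :=
      'Provider=SQLOLEDB;Data Source=FLTPLAN;Integrated Security=SSPI';
    Conn.LoginPrompt := False;
    Conn.Open;

    Table := TADOTable.Create(nil);
    try
      Table.Connection := Conn;
      Table.TableName := 'Waypoints'; { hypothetical table }
      Table.Open;
      { BTrieve-style positioning: find a record by key instead of
        hand-building SQL statements }
      if Table.Locate('Ident', 'EGLL', []) then
        Writeln(Table.FieldByName('Latitude').AsString);
    finally
      Table.Free;
    end;
  finally
    Conn.Free;
  end;
end.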

Those are just two examples of projects that failed to stand the test of time. Both were very different in their design, the concepts used to create them, and the developers and their prior experience. What they have in common is that in both cases the developers focused their attention on something other than what was actually needed for them to succeed. The thing I am referring to is clarity. Back in 1994 I read an article about different developers on the demoscene (Amiga and C64 were my thing back then) and what they viewed as the most important thing in software development. One of them stated that the code doesn't need to work and be bug-free right away, but it needs to be written so that it is easy to navigate and fix; the other stated that he doesn't care at all about those qualities, because all that counts is that it looks cool when shown at a copy party on the big screen. In my opinion both of those guys were right in their own areas. When you write code once, make money on it and throw it away (not even pass it on for further development - just throw it away), concentrating on clarity, test coverage, readability and whatever else comes to mind when we talk about properly engineered code makes absolutely no sense. It is pure waste and everyone should understand that. On the other hand, if the code will be maintained for months and years to come, forgetting about readability and concentrating only on how big the coverage is and how fast the tests run will bring all kinds of curses from your fellow programmers.

There's one universal truth to software development that has never changed: code is read much more often than it is written. It's that simple. If you write code that is tested like Fort Knox but nobody can understand what the hell you meant, everyone will be in trouble (most importantly you, if you're still around!).

There's another truth that I think is the mother of all statements: in software development there is no substitute for thinking. No discipline is going to make you a professional programmer, no design pattern is going to make your code readable, even though we like to tell ourselves that design patterns are the vocabulary of modern software development. My friends, you can use 10 design patterns and still make everybody hate you with a passion when you don't pay attention to readability and clarity.

There's another piece that I find irritating about the unit testing paradigm - especially the test-first approach. When I code I usually have no idea what will come out of what I am doing. I explore ideas and options, and usually figure stuff out as I go. I might give a library a go if I think it might help me out, or I might put together some code from stackoverflow.com to see if it actually does the thing I want it to do. At the time of writing I have no idea if it will be production-quality-top-notch-super-duper or if I'm going to flush it down the drain in a few minutes/hours/days. And so I follow my heart and don't write tests (much less test-first). I do YAGNI, because I assume that what I created is shit and nobody will want to see it. Later on, when it turns out to be valuable, I tend to write system tests to lock the end result in place. I test the whole thing in as much isolation as makes sense - but not an inch more. I seldom write real unit tests as such (except when the architect of a solution is still strung out on code coverage - then I do it for his pleasure). I think that testing code in isolation makes absolutely no sense whatsoever. Single methods are useless pieces of a whole system that, if exercised in separation, give one no clue whether they work as part of the whole. In Delphi the concept of a unit allows a developer to put together the implementation of a fully functioning unit of work that can be nicely tested through the provided interface. No other language that I know of goes about this the way Delphi does. And the funny thing is that Pascal wasn't even created with that in mind! It was a remedy for the constant switching between header and source files in C and C++! But the definition of a unit is, in my personal opinion, the best there is among all the languages I have worked with. Those units make sense to test.
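To make that concrete, here is how the MyUnit from before could be exercised with DUnit through nothing but its interface section - the hidden MyObject variable stays hidden, and the test still tells you whether the unit as a whole does its job (the test class and method names are mine):

unit TestMyUnit;

interface

uses
  TestFramework, MyUnit;

type
  TMyUnitTests = class(TTestCase)
  published
    procedure TestGetMyObject;
  end;

implementation

{ The test sees only what MyUnit's interface section exports:
  TMyClass and GetMyObject. The implementation stays private. }
procedure TMyUnitTests.TestGetMyObject;
begin
  Check(Assigned(GetMyObject), 'the unit should hand out its object');
  Check(GetMyObject = GetMyObject, 'and always the same one');
end;

initialization
  RegisterTest(TMyUnitTests.Suite);

end.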

Remember: think before you write, and read what you wrote - right after you wrote it and again in two weeks' time. If what you wrote makes no sense, re-write it until it is readable. Refactor, extract, rename, test, unit-test, re-test - do whatever you need to make sure it's not going to be an ordeal for whoever works with that piece next. For all you know it might be you!

Saturday, March 12, 2016

Unit test coverage means nothing

If you're like me or any other person that got hooked on TDD, you might want to take a look at this presentation by the creator of Ruby on Rails from RailsConf 2014:

RailsConf 2014 - Keynote: Writing Software by David Heinemeier Hansson

I think this is the missing piece of revelation that I have been looking for for years on end. It always felt like there was something wrong with the world of TDD but I just couldn't figure it out myself. I have had a project with 100% code coverage and tests running bloody fast that broke down in its first week of being online, and I have written software with no tests whatsoever that has been working for 10+ years and still performs its duties today without a hiccup! I have also written a ton of software that I'm not proud of, but some pieces from 17 years back, when I read through them now, look as though I had a genius by my side in terms of readability and clarity. It is a pure stunning experience.

Go write some code that you'll be proud to read 10+ years from now!

Windows and Git - config --global not working

When you use Git behind a corporate firewall that blocks access to remote repositories via the git:// protocol, many sites advise you to swap the git:// protocol for https://

git config --global url.https://.insteadOf git://

That's all nice and dandy until you're on Windows, where this simply doesn't work when git is invoked by npm. I only experienced the problem on Windows 10 but I'm pretty sure it's equally fucked up on any other version. The reason seems to be that npm runs git as some kind of different user, so the per-user --global configuration never gets picked up. Let's face it: whatever the reason is, Windows just sucks big time anyway.

The only way I found to make it work and allow installing packages using npm is to apply the same configuration system-wide, like so

git config --system url.https://.insteadOf git://
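
To double-check that the rewrite landed where every user - including whatever account npm runs git under - will pick it up, you can list the system-wide url settings:

git config --system --get-regexp url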

I'm far, far away from saying that life's good again, but that piece works now.