Flame wars have been with us for God knows how long. Christianity vs all other religions, Islam vs everyone else, socialism vs capitalism, Atari vs Commodore, Amiga vs PC, Macintosh vs the World, Linux vs everyone else. Fighting for our beliefs seems to be at the core of our nature. We just can’t help thinking that what we think is best must be the best - because we deemed it to be so.
Developers are a particular strain of oddities: whatever technology we happen to work with at the time seems to be either evil incarnate or a gift from God - depending on the current hype among the friends we trust to know better. Very seldom do we grow big enough balls to actually climb over the fence, dig into the dirt and figure out for ourselves whether the grass on the other side is really as gray as we deem it to be.
Every time my career took a turn in terms of the tools I worked with, it was an eye-opening experience.
I started out in seventh heaven, owning an Atari 800XE and 3 games, barely able to hold still while a game loaded from tape storage. For that reason I started to learn what the keyboard and TV set had to offer beyond playing Mr Robot and Fort Apocalypse, and I discovered that, in point of fact, BASIC was part of the package.
Soon after that I became a sworn 6502/C64 freak at the age of 14. I know. I betrayed. I was the outcast. But I was doing ora-dycpys going over the side borders while playing ripped tunes and waving Dedal logos. And I did it all in machine language - not even assembly! All I had was the Final Replay 2 cartridge, but for reasons I cannot fathom it was the best thing that ever happened to me. I was able to see the results of my work upon issuing a single sys command. And it looked great!
At that time the Amiga was “the better game machine for me”. I mean, with all due respect, I still think that the playability of Giana Sisters, the Mario clone for Commodore computers, was way better on C64 than it was on any other platform (even XBox!). Having said that, I was kind of socially pressured into wanting the A500 with the 512Kb Slow RAM extension to be able to play Pinball Dreams and IK4+. But playing games was never my thing. After I saw how much Giana Sisters sucked by comparison I started looking for things to do with my shiny new computer. Pascal was there, but the animation example I saw was visually so bad that I couldn’t stand watching it. The C example didn’t even compile, so I thought it was a waste of time to get interested in it. But the good old low-level language, Motorola 68k assembler, was quite a nice fit for me after a few years of experience programming registers on its older uncle. Man, those plasma screens I loved so much! I stared at them for hours! I was finally home!!!
Around that time I remember reading a quite far-reaching article about code quality. A couple of demo scene Gods discussed whether it makes sense to write good-quality code or whether it is more important to just code it the fastest way possible, win the compo at a demo party and move on to the next one. A question I sure hope the industry has since answered to everyone’s satisfaction.
Bored with writing sinus scrolls, 3D animations and plasmas, I started looking into the promised land that Amos was said to be. With its Amal animation language targeting the Amiga’s coprocessors, it was said to be even better than asm itself. I remember it being the first IDE I encountered with an integrated debugger, forward function declarations and (upon pressing F9) code folding. Man! I missed that for years afterwards!
Then one day everything changed. My beloved A1200 was (again, under pressure from friends) exchanged for a 486SX with a 50MB hard drive. It was running DOS and Norton Commander, it looked bad (compared to the A1200’s Workbench) and, although I had no idea of it then, it was the first computer I owned running an OS that was not Unix-like. Apparently, for what seemed like forever, I had fallen into the Redmond dream that I was unable to wake up from.
I remember, a few years after that, trying to install Red Hat Linux from a 23-floppy-disk installation - and failing miserably. I thought those “Linux” guys must be insane to be using something like that. I was a sworn DOS enthusiast! I discovered Windows 3.1, and the only thing it was good for was multitasking: running the BBS software and at the same time being able to code in Turbo Pascal, which I had fallen in love with in the meantime. Pascal wasn’t fast enough to write intros/demos though, so my friend and I resorted to “db 66” asm instructions to speed up double buffering.
Not long after that I learned that programming isn’t really something lots of people do particularly well - me included. That was when I discovered The Almighty Internet. Suddenly the knowledge that I had craved so much for so long was within my reach. But it was so overwhelming!!! Just going through some of the examples I found on SWAG took me a lot of time. Those were the times when I first saw QNX - a Unix-like, free-to-use, real-time operating system that fit on a single floppy disk, windowing system included. So Unix did have some appeal after all, I thought. I went even deeper when I learned about the Linux Router Project - a one-floppy-disk Linux distribution that did IP routing and masquerading out of the box. I knew Linux was the one - but it was so different from the Windows and Dos Navigator I had grown so accustomed to!! Everything was different. That was just a hassle I was not ready to go through.
Fast forward a few years: the Turbo Pascal I worked in became Delphi and my professional career was booming! I was moving to a different country, founding my second business - I was on a roll! To keep everything in check, instead of buying Windows XP I decided to try out this “SUSE Linux”, as it was promised to “just install and work” on my PC. Well, it did. But it was soooo different from Windows XP and… Delphi didn’t run there. I was still deep in the Redmond dream.
I don’t remember when it really happened, but it must have been when I received an Ubuntu 7-something CD for free. It really delivered on the promise of being approachable enough for everyone - this time me included. But Delphi still didn’t run there. By coincidence, that was the time I went to a seminar in Warsaw where the successor to Delphi 7 was announced, and I realized that this was the end of what I was able to get out of the platform. So I started looking…
At that time I worked at a corporation that offered me the option to learn either Java or .NET. C# looked like the natural choice (being designed by the creator of Delphi himself), but for reasons I cannot fathom to this day I decided to go with Java. The first years were a disaster! Nothing was like it should have been. Java developers spoke of things I had no idea even had names (like refactoring, unit testing, clean code, design patterns), although upon deeper investigation it turned out we were talking about the same things - just naming them differently.
I fell in love with Groovy and Maven (I know - I’m different that way). I use both of those tools to this day with great proficiency. I think Maven was the best thing that ever happened to Java. It made it approachable for mere mortals and freed us from Ant hell. Groovy, on the other hand, was for me the Pascal equivalent on the JVM. It had the concept of properties that I had missed so much ever since I left Delphi behind. But at the same time I unwittingly became independent of the environment I worked in. Linux, Windows, OSX - I didn’t care anymore. So I realized one of my long-lived dreams and switched fully to Ubuntu - the Linux platform for the rest of us :) It was a natural step, because all production servers were running Linux, so using SSH, which was not available out of the box on Windows, felt completely natural. And so, Windows became “the OS I sometimes run”. I do remember the day when I installed Linux exclusively, without dual-boot. It felt weird - but good.
Since then I have moved from Java to frontend development. Something I thought I had some idea about, but I was proven sooooooo wrong. I learned that what you see is not really what you will get (as in, I finally learned what the hell everyone else was talking about in relation to Internet Explorer 6). But I love every bit of it. It gives you out-of-the-box tooling that on other platforms you need to pay good money for. To some extent I am even happy I learned about frontend development from the Ember.js perspective. It was the worst thing since getting struck by JavaServer Faces, but it made me explore the domain to see if there was something that could substitute for this horrible piece of machinery. So I learned about React and fell in love with it; I also learned about Angular and how it makes development feel more like what I was used to from my Java days; and I ended up falling completely in love with Vue.js.
Then I decided to move to a company that thinks very little of frontend development but is really big on Sitecore. If you don’t know what Sitecore is, think WordPress on .NET that you pay for because you think it is better than X or Y. For me this meant taking a round trip to the .NET world and the C# language.
At each step, after a few months of digging in, I felt my passion for software development giving way to an understanding of capabilities. Every week/month/year I meet sworn enemies of technology X who can give me 5, 10, sometimes even 20 reasons for not using other, legacy frameworks, languages and platforms. The truth, however, is that everything has its reason for existence. Yes, even jQuery and goto. I just wish I had learned that years ago. The only thing that really counts is writing code for other developers to read (regardless of the platform/language) and questioning the status quo when it makes your life harder.
So going from 6502, through M68k, Amos/Amal, Pascal, Delphi, Java, JS and the browser, and now to .NET and Sitecore, I have learned only one thing: developing software is easy. But doing it right is hard. And if you don’t use whatever means necessary to help you out, then sinking in a pool of your own blood and excrement is as certain as the sun rising in the east. The rule for me now is to first find out for myself whether a piece of tech is useful in a given context - not “in general”. And like every rule it has its exception for me: JSF. Everything else I learned over the years made me a better programmer, person, husband and father.
Happy years!