Sunday, November 25, 2018

Making a UV image setter for PCBs

This is going to be a quick one. I wanted to build one of these for a long time but somehow it kept eluding me (too long, if you ask me).

The problem

I need to make a PCB for a small circuit at home. Conveniently, by the time I'm done designing the circuit it is usually late and the sun doesn't shine anymore. I want to be able to use photoresist dry film (foil) and I want it to be fast!

The solution

I used to use a 500W halogen lamp, then an energy-saving fluorescent lamp, but all of those were just cumbersome. Finally I decided to make use of UV-emitting diodes.

See the following picture of a ready-made product:

The device is powered by a 3-cell LiPo battery (I have a bunch of them for my flying toys). The time it takes to properly expose the foil is BY FAR THE SHORTEST OF ALL THE LIGHT SOURCES I HAVE USED!!!

The 500W halogen needed circa 30 minutes in the setup I used. The energy-saving light bulb needed 12.5 minutes. This little toy does the same job, and better, in 90 seconds!!!

The making

Making such a toy is extremely simple. You just cut as many pieces of UV LED strip to length as will fit in the housing of your choosing (I used the Z23 housing, in which I cut a full-size hole in one of the parts).

Next you cut a piece of paper that fits inside the housing. Once you do that, take the pieces of strip and glue them onto the paper in a similar way to what you see in the picture above. The point here is to make them as evenly spaced as possible.

Then you solder them together to form one long strip (plus with plus, minus with minus). On one end solder any kind of connector (I went for a small JST one because I have a lot of them). Then just make a hole on the side where the leads with the connector come out.

Finally place that paper with glued strips inside, apply power and enjoy!

Happy PCBing!

Friday, November 23, 2018

The Flame Wars - story of my life

Flame wars have been with us for God knows how long. Christianity vs all other religions, Islam vs everyone else, socialism vs capitalism, Atari vs Commodore, Amiga vs PC, Macintosh vs the World, Linux vs everyone else. Fighting for our beliefs seems to be at the core of our nature. We just can't help thinking that what we think is best must be the best - because we deemed it to be the case.

Developers are a particular strain of oddities: whatever technology we work with at the time seems to be either evil incarnate or the impersonation of God - depending on the current hype among the friends we trust to know better. Very seldom do we grow big enough balls to actually go over the fence, dig into the dirt and figure out for ourselves whether the grass on the other side is really as gray as we deem it to be.

It has been an eye-opening experience every time I took a turn in my career when it came to the tools I worked with.

I started in seventh heaven, owning an Atari 800XE and 3 games, barely able to hold still while a game loaded from tape storage. For that reason I started to learn what the keyboard and TV set had to offer beyond playing Mr Robot and Fort Apocalypse, and I discovered that, in point of fact, Basic was part of the package.

Soon after that, at the age of 14, I became a sworn 6502/C64 freak. I know. I betrayed. I was the outcast. But I was doing ora-dycpys going over the side borders while playing ripped tunes and waving Dedal logos. And I did it all in machine language - not even assembly! All I had was the Final Replay 2 cartridge, but for reasons I cannot fathom it was the best thing that ever happened to me. I was able to see the results of my work upon issuing a single sys command. And it looked great!

At that time the Amiga was "the better game machine" for me. I mean, with all due respect, I still think that the playability of Giana Sisters, the Mario clone for Commodore computers, was way better on the C64 than on any other platform (even Xbox!). Having said that, I was kind of socially pressured into wanting the A500 with the 512KB Slow RAM extension to be able to play Pinball Dreams and IK+. But playing games was never my thing. After I saw how much the Amiga version of Giana Sisters sucked by comparison I started looking for things to do with my shiny new computer. Pascal was there, but the animation example I saw was visually so bad that I couldn't stand watching it. The C example didn't even compile, so I thought it was a waste of time to get interested in it. But the good old low-level language, Motorola 68k assembler, was quite a nice fit for me after a few years of experience programming registers on its older uncle. Man, those plasma screens I loved so much! I was staring at them for hours! I was finally home!!!

At that time I remember reading a quite far-reaching article about code quality. A couple of demo scene gods discussed whether it makes sense to write good quality code or whether it is more important to just code in the fastest way possible, win the compo at a demo party and move on to the next one. A question I sure hope the industry has by now answered to everyone's satisfaction.

Being bored with writing sine scrollers, 3D animations and plasmas, I started looking into the promised land Amos was said to be. With its AMAL animation language targeting the Amiga's coprocessors it was said to be even better than asm itself. I remember it being the first IDE I saw with an integrated debugger, forward function declarations and (upon pressing F9) code folding. Man! I missed that for years afterwards!

Then one day everything changed. My beloved A1200 was (again, under pressure from friends) exchanged for a 486SX with a 50MB hard drive. It was running DOS and Norton Commander, it looked bad (compared to the A1200's Workbench) and, what I had no idea of back then, it was the first computer I owned running an OS that was not Unix-like. Apparently, for what seemed like forever, I had fallen into the Redmond dream that I was unable to wake up from.

I remember, a few years after that, trying to install Red Hat Linux from a 23-floppy-disk installation set - and failing miserably. I thought those "Linux" guys must be insane to be using something like that. I was a sworn DOS enthusiast! I discovered Windows 3.1 and the only thing it was good for was multitasking: running the BBS software while at the same time being able to code in Turbo Pascal, which I had fallen in love with in the meantime. Pascal wasn't fast enough to write intros/demos though, so me and my friend resorted to "db 66" asm instructions to speed up double buffering.

Not long after that I learned that programming isn't really something lots of people do particularly well - me included. That was when I discovered The Almighty Internet. Suddenly the knowledge that I had craved so much for so long was within my reach. But it was so overwhelming!!! Just going through some of the examples I found on SWAG took me a lot of time. Those were the times when I first saw QNX - a windowing-system-enabled, one-floppy-disk, Unix-like, free-to-use, real-time operating system. So Unix did have some appeal after all, I thought. I went even deeper when I learned about the Linux Router Project - a one-floppy-disk Linux distribution that did IP routing and masquerading out of the box. I knew Linux was the one - but it was so different from Windows and Dos Navigator, which I had grown so accustomed to!! All the things were different. That was just a hassle I was not ready to go through.

Fast forward a few years: the Turbo Pascal I worked in became Delphi and my professional career was booming! I was moving to a different country, founding my second business - I was on a roll! To keep everything in check, instead of buying Windows XP I decided to try out this "SUSE Linux" as it was promised to "just install and work" on my PC. Well, it did. But it was soooo different from Windows XP and… Delphi didn't run there. I was still deep in the Redmond dream.

I don't remember when it really happened but it must have been when I received the Ubuntu 7-something CD for free. It really delivered on the promise of being approachable enough for everyone - this time me included. But Delphi still didn't run there. By accident that was the time I went to a seminar in Warsaw where the successor of Delphi 7 was announced, and I realized that this was the end of what I was able to get out of the platform. So I started looking…

At that time I worked in a corporation that offered me the option to learn Java or .NET. C# looked like the natural choice (being designed by the creator of Delphi himself) but for reasons I cannot fathom to this day I decided to go with Java. The first years were a disaster! Nothing was as it should have been. Java developers spoke of things I had no idea even had names (like refactoring, unit testing, clean code, design patterns), although upon deeper investigation it turned out we were speaking about the same things - just naming them differently.

I fell in love with Groovy and Maven (I know - I'm different that way). I use both of those tools to this day with great proficiency. I think that Maven was the best thing that ever happened to Java. It made it approachable for mere mortals and freed us from Ant hell. Groovy, on the other hand, was for me the Pascal equivalent on the JVM. It had the concept of properties that I had missed so much ever since I left Delphi behind. But at the same time I unwittingly became independent of the environment I worked in. Linux, Windows, OSX - I didn't care anymore. So I realized one of my long-held dreams and switched fully to Ubuntu - the Linux platform for the rest of us :) It was a natural step, because all production servers were running Linux, so using SSH, which was not (and still isn't) present on Windows, felt so natural. And so, Windows became "the OS I sometimes run". I do remember the day when I installed Linux exclusively, without dual-boot. It felt weird - but good.

Since then I have moved from Java to frontend development - something I thought I had some idea about but was proven sooooooo wrong. I learned that what you see is not really what you will get (as in: I finally learned what the hell everyone else was talking about in relation to Internet Explorer 6). But I love every bit of it. It gives you out-of-the-box tooling that on other platforms you need to pay good money for. To some extent I am even happy I learned about frontend development through the Ember.js perspective. It was the worst thing since getting struck by JavaServer Faces, but it made me explore the domain to see if there's something that could substitute for this horrible piece of machinery. So I learned about React and fell in love with it, I also learned about Angular and how it makes development more like what I was used to from my times in Java, and I ended up falling completely in love with Vue.js.

Then I decided to move to a company that thinks very little of frontend development but is really big on Sitecore. If you don't know what Sitecore is, think WordPress on .NET that you pay for because you think it is better than X or Y. For me this meant taking a round trip back to the .NET world and the C# language.

At each step, after a few months of digging in, I felt my passion for software development giving way to an understanding of capabilities. Every week/month/year I meet sworn enemies of technology X that can give me 5, 10, sometimes even 20 reasons for not using the other, legacy, frameworks, languages and platforms. The truth, however, is that everything has its reason for existence. Yes, even jQuery and goto. I just wish I had learned that years ago. The only thing that really counts is writing code for other developers to read (regardless of the platform/language) and questioning the status quo when it makes your life harder.

So going from 6502, through M68k, Amos/AMAL, Pascal, Delphi, Java, JS and the browser, and now .NET and Sitecore, I have learned only one thing: developing software is easy. But doing it right is hard. And if you don't use whatever means necessary to help you out, then sinking in a pool of your own blood and excrement is as certain as the sun rising in the east. The rule for me now is to first find out for myself if a piece of tech is useful in a given context - not "in general". And as every rule it has its exception for me: JSF. Everything else I learned over the years made me a better programmer, person, husband and father.

Happy years!

Wednesday, October 31, 2018

Creating PCBs

I have been working lately on perfecting a formula for preparing PCBs. If you're like me and like your designs tried out in days rather than weeks, read on.

Assumptions

I am assuming you are either using the applications and materials that I used, or you are happy to realize the same functions in the application of your choice. I will be using Linux, KiCAD, a PDF viewer for printing, and chemicals available in Poland, where I live.

Preparing the printout

The following assumes you can use KiCAD. KiCAD's printing functionality is basically broken - the quality of printouts is so low you can't do anything with them. The following mitigates that by using Plot instead.

When creating designs that contain through-hole components make sure to check the size of the pads. By default they are tiny and can easily break when heat is applied. To change the parameters of a pad, right-click on it, select Properties and set the Hole shape to Oval and Size X and Size Y to at least 2. With that there is enough space to solder comfortably.

  1. Create your design
  2. Select File / Plot
  3. Set Plot format to PDF
  4. Select the copper layers
  5. Select Mirrored plot and Negative plot
  6. Set Drill marks to Actual size
  7. Click Plot
  8. Open the created PDF and print it on Canson calque (tracing paper)
  9. Use Density Toner to even out the traces on the layout. Density Toner is basically rectified gasoline which melts the toner, making it evenly black in all spots. This prevents the printout from leaking UV light through

Depending on your printer (I use an OKI C332 laser printer) it is important to set the printout density to maximum. That black toner is all that will block the UV rays from the light-sensitive material!

Preparing the PCB

  1. Cut out a piece of laminate to the size of your project. You don't need any excess - that's waste I don't like
  2. Cut out the printed layout to the size of your PCB. Again, I do that precisely, which then makes aligning things easy
  3. Cut out a piece of light-sensitive foil with an additional 5-10mm. That excess is needed for the board to be covered exactly from side to side
  4. Clean the PCB. I mean like really, really well. I use a special sanding block for cleaning PCBs that I bought ages ago but a very fine sandpaper will do just fine. Don't overdo it - you don't want the copper layer to get too thin!
  5. Wipe the board with a clean tissue so that any dust is removed from the surface
  6. Peel the cover off of the foil on the inner side. I do that with a piece of tape - works every time with absolutely no hassle
  7. Place the foil on the PCB, with the side you just peeled facing the board
  8. Make sure there are no air bubbles or specks of dust between the foil and the PCB
  9. Put the PCB between a folded sheet of paper (PCB towards the fold!) to form a sandwich
  10. Run the sandwich through a laminator at least 3 times. Make sure the laminator is properly heated!
  11. Remove the paper from the sandwich, leaving only the PCB. Be careful not to peel the foil off the PCB
  12. Put the printed layout toner-down on the PCB
  13. Spray the PCB, with the layout on it, with Transparent and remove all air bubbles
  14. Cover the PCB with a piece of plexiglass and make sure there are no air bubbles. If there are, you either used too little Transparent or the surface is not even.
  15. Place a 32W energy-saving bulb 22cm from the PCB.
  16. Turn on the light source for 12-13 minutes
  17. Remove the PCB and put it in a dark place for 10 minutes (necessary for the chemical process to finish)
  18. Remove the outer protective foil - watch out not to break the layout!
  19. Use foil developer for 2 minutes. While developing, use a soft brush to remove the dissolved foil. Do NOT exceed 2 minutes!
  20. Wash the PCB under warm running water, using your hands to delicately remove any remaining pieces of foil.
  21. Put the PCB under the light for 5 minutes. That way, if there are any parts of the foil left, they will turn blue and you'll know you have to start again
  22. Put the PCB in a dark place for 5 minutes for the chemical process of hardening the foil to settle
  23. Use B327 etchant to remove the non-covered parts of the copper; don't keep it in for too long.
  24. Once all the uncovered copper is dissolved, put the PCB under running water and wash thoroughly
  25. Pour acetone on the board, using surface tension to flow the liquid from one side to the other. Do that until the foil peels off from the traces
  26. Wash under running water

That's it! With luck the PCB can have extremely small traces (that requires a good printer, which the C332 is not) and very delicate handling, especially when washing out the unexposed foil. I stick to 0.25mm traces and above, but even that can sometimes go wrong. 0.5mm traces are guaranteed to be 100% successful.

Important remarks

Don't rush things! With the exception of the exposure and development times (those you must not exceed), take your time! Rushing is the worst thing you can do. This process requires you to be gentle with the board, precise and methodical. Rushing will only get you broken traces.

That is it! Happy etching!

Wednesday, September 19, 2018

POI 10, tests and coverage

The problem

You're using POI 10 with @poi/plugin-karma and want to get coverage of your unit tests. The way to do it is to specify:

  plugins: [
    require('@poi/plugin-karma')({
      coverage: true,
    })
  ]

and theoretically you should be all set. Only that it won't work:

TypeError: Cannot read property 'preLoaders' of undefined

I've looked everywhere and it seems that it is, unfortunately, not a known problem. What is even worse, the GitHub repo with the POI version 10 sources is GONE! I mean, I understand everything, but removing the history of an open-source project - that is a bit too much for me. SHAME ON YOU, EGOIST!

The solution

The problem is that @poi/plugin-karma taps directly into the options of vue-loader, which seem to be undefined at that point. The quick and dirty solution is to add those options yourself:

  plugins: [
    require('@poi/plugin-karma')({
      coverage: true,
      chainWebpack (config) {
        config.module.rule('vue').use('vue-loader').options({})
      }
    })
  ]

I need to dig more into why vue-loader isn't configured there and why the hell testing a Vue.js SFC still causes errors, but for now, for testing plain JavaScript code, the above solution works.

Happy coding!

Friday, September 14, 2018

Replacing strings in proxied response using Apache mod_substitute

This is another quick one but it took me forever to finally figure it out - maybe someone else will have the same problem…

The problem

You're working on a site that uses Apache2 proxy to merge your local frontend development environment with a remote site providing the content. That is easy to setup:

ProxyRequests Off
ProxyPreserveHost Off

ProxyPass "/path-to-a-resource" "http://localhost:3000/path-to-local-resource"
ProxyPass / http://remote-host/
ProxyPassReverse / http://remote-host/

Now all is nice and dandy as long as all you want to do is merge the remote site with local resources on localhost. It starts to get interesting when you want to modify the content sent by the upstream server. Why? Well… let's say the links in the HTML being sent are absolute and you'd like them to be relative. There's a ton of options available once you can do that!

The solution

To be able to substitute strings in the response sent by upstream server there are 3 things that need to happen:

  • Enable the mod_substitute ($ sudo a2enmod substitute)
  • Enable filtering using mod_substitute
  • Specify replacement rules

The second part is what gave me a headache today. Basically you need to understand 2 things: when Apache does the substitution it needs to have received the full (and decompressed) page to be able to perform it - and that doesn't happen automatically. And then, once the substitution is done, Apache needs to put it all back together and send it to the browser. It's all done using the following, for HTML files only:

AddOutputFilterByType INFLATE;SUBSTITUTE;DEFLATE text/html

And last but not least: we need some substitutions to see the effect. For the purpose of demonstration we'll replace every occurrence of the string "class" with "TEST":

Substitute "s|class|TEST|i"
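Coming back to the motivating example of making absolute links relative: a rule along these lines would do the trick (remote-host here stands in for the actual upstream host from the proxy configuration above):

```apache
# Rewrite absolute links pointing at the upstream host into relative ones
Substitute "s|http://remote-host/|/|i"
```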

That's it! It's easy as pie when you know the deal :) I know I'll be using it more and more in the future!

Happy coding!

Friday, July 6, 2018

Handling new POI's publicPath

Today I wasted over 3 hours frantically looking for an explanation of why the hell POI's publicPath setting thinks I am an idiot and that it knows better what I want. So this is a quick post to let you know what to do if you really want to change output.publicPath in a POI-managed project.

The problem

Imagine you're working on some project that has pre-existing sources and their location is (surprisingly) not in the root of your server. That's the case every single time when you're working on existing apps trying to induce some build system on them to bundle up the scripts and stylesheets.

Usually, for those cases, Webpack has a special configuration option, output.publicPath, that handles everything for you and life's good. POI, on the other hand, has that option exposed at the top level - but with a twist: it only works in the final build and not in development mode.

The problem is, however, that if you'd like to work against an existing server that produces HTML and other artifacts you need to comply with that structure regardless of the mode. Unfortunately, POI fucking knows better! And here's the relevant point in POI's code that does it:


function getPublicPath(command, publicPath) {
  if (command === 'build' && typeof publicPath === 'string') {
    return /\/$/.test(publicPath) || publicPath === ''
      ? publicPath
      : publicPath + '/'
  }
  return '/'
}

module.exports = getPublicPath
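To make the surprise concrete, here is the function above exercised for both commands (the 'develop' command name is just an illustrative stand-in for anything that isn't 'build'):

```javascript
// A copy of POI's getPublicPath logic from above, exercised for both modes
function getPublicPath(command, publicPath) {
  if (command === 'build' && typeof publicPath === 'string') {
    return /\/$/.test(publicPath) || publicPath === ''
      ? publicPath
      : publicPath + '/'
  }
  return '/'
}

// In a production build the setting is honored (a trailing slash is appended)...
console.log(getPublicPath('build', '/my/app'))   // → '/my/app/'
// ...but outside of 'build' it is silently ignored
console.log(getPublicPath('develop', '/my/app')) // → '/'
```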

I have no idea what was going through EGOIST's head at the time of writing but it must have been something really, really strong. Not good for open source, as it breaks the fundamental principle of least surprise.

The fix

Instead of setting publicPath in poi.config.js or via a command-line parameter, override it in the resulting Webpack configuration like so:

module.exports = {
  configureWebpack(config, context) {
    // we're not setting publicPath as the general configuration option
    // because some sick bastard decided this will only have an effect
    // in production mode and we need it everywhere. Talk about predictability...
    config.output.publicPath = '/path/where/everything/lives/'
    return config
  }
}

With that, life's good again (even though it was so damn frustrating today).

The architecture

Working on projects where the content is delivered from some sort of CMS or other app with a fat backend isn't easy. It usually boils down to having everything running locally, even if we don't need it. There is, however, a better way of doing it! Use a reverse proxy to route traffic to your browser from 2 different sources: the backend and the frontend in development mode. Here's an example configuration for Apache2, but you can do much, much more than that:

<VirtualHost *:80>
    ServerName my-app.local

    # Make sure no caching is induced so that refresh always brings the latest version
    # run "sudo a2enmod headers" to enable headers module
    Header set Cache-Control no-cache

    # run "sudo a2enmod proxy_http" to enable http proxy
    ProxyRequests     off
    ProxyPreserveHost off

    # Serve all theme files from local development server
    ProxyPassMatch    "^\/path\/where\/everything\/lives\/(.*)$" "http://localhost:10001/path/where/everything/lives/$1"

    # Required by Hot Reload (run "sudo a2enmod proxy_wstunnel" to enable ws proxy)
    ProxyPassMatch    "^/(.+)\.hot-update\.js$" "http://localhost:10001/$1.hot-update.js"
    ProxyPassMatch    "^/(.+)\.hot-update\.json$" "http://localhost:10001/$1.hot-update.json"
    ProxyPassMatch    "^/sockjs-node/(.+)/websocket" "ws://localhost:10001/sockjs-node/$1/websocket"
    ProxyPassMatch    "^/sockjs-node/(.+)$" "http://localhost:10001/sockjs-node/$1"

    # Serve everything else from remote server
    ProxyPass         / http://dev-machine.somedomain.com/
    ProxyPassReverse  / http://dev-machine.somedomain.com/

</VirtualHost>

Also, don't forget to set NameVirtualHost *:80 (needed on Apache 2.2 and older; 2.4 does it automatically) or name-based virtual hosting just won't work. Now with that in place add the following entry to your /etc/hosts file:

127.0.0.1   my-app.local

and you're all set.

You can of course fiddle with it, add SSL support if your backend server requires it, but the point here is that it is possible, and you can work on the frontend without installing a heavy backend!

Happy coding!

Saturday, June 23, 2018

Scaffolding a new Vue.js component

I've been working for the past few months on a travel portal, rewriting parts of it from old jQuery code to Vue.js - and learning a lot in the process. One thing that I learned the hard way is that the scaffolding built into Vetur is just not for me. It is too simple.

The naming convention we have is for components to bear the same name as the file they live in. So basically TestMe.vue becomes:

<template>
  <div class="test-me">
    ...content
  </div>
</template>

<script>
import Vue from 'vue'
import Component from 'vue-class-component'

@Component({})
export default class TestMe extends Vue {
}
</script>

<style lang="scss" scoped>
.test-me {
}
</style>

Makes sense?

The problem is that having to type this in every time I create a new component (and the number of those grows like crazy!) is tedious. So I finally got to it and created a snippet that works for me:
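For reference, a sketch of what such a snippet can look like (it goes into your VS Code user snippets; the exact tab-stop order, placeholder defaults and the list of style preprocessors below are my own choices, so adjust to taste):

```json
{
  "Vue Single File Component": {
    "prefix": "sfc",
    "body": [
      "<template>",
      "  <div class=\"${2:test-me}\">",
      "    ${5:Hello, world!}",
      "  </div>",
      "</template>",
      "",
      "<script>",
      "import Vue from 'vue'",
      "import Component from 'vue-class-component'",
      "",
      "@Component({})",
      "export default class ${1:TestMe} extends Vue {",
      "}",
      "</script>",
      "",
      "<style lang=\"${3|scss,less,css|}\" ${4:scoped}>",
      ".${2:test-me} {",
      "}",
      "</style>"
    ]
  }
}
```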

That's it! You type "sfc", press [Tab] and the component is ready to be worked on, by the magic of the VS Code snippet engine.

The first thing you'll edit is the name of the exported class. If it is OK, just press [Tab] to move to the name of the CSS class. Edit it if you must, then press [Tab] again and you'll be able to select the preprocessor for styles. The next thing you get to customize is whether the styles should be scoped. Once on it, either press [Tab] to go with scoped styles or [Delete] and then [Tab] to go without them. Finally you arrive at the Hello, world! text, where the editing of your component begins.

Have a nice day!

Saturday, June 16, 2018

Vue.js - editor components

When talking about creating applications in Vue.js it is not hard to find one that does something to data. Obviously, for state management there's Vuex, Redux and other stores, but in this post we're going to focus strictly on passing state via props to components that one could generally call controls - so more along the lines of custom inputs.

The 3 cases

There are 3 cases that we will encounter when passing data down to controls:

  • We're passing in primitive values (strings, numbers, booleans)
  • We're passing in complex reactive objects (think: list of people)
  • We're passing in a complex reactive object but we'd like to treat it as a primitive and have atomic changes to all its fields

Primitives

In the case of primitives the situation is dead simple: use v-model and react to changes by hooking into the input event - done. The documentation is fantastic in that area, so if you'd like to know more about it, dig in!
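As a quick refresher, all it takes for a custom control to play along with v-model is accepting a value prop and emitting an input event (MyInput here is just an illustrative name):

```html
<!-- MyInput.vue - a minimal v-model-compatible control -->
<template>
  <input :value="value" @input="$emit('input', $event.target.value)">
</template>

<script>
export default {
  props: {
    value: String
  }
}
</script>
```

With that in place the parent can simply write <MyInput v-model="firstName" />.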

Reactive complex objects

Imagine you have a data structure like this:

data = { firstName: 'John', lastName: 'Doe' }

In this case, if you create a control (we'll call it DataEdit.vue) in which you'd like to edit the fields with immediate effect, you could do something like this:

<template>
  <div>
    <input v-model="value.firstName">
    <input v-model="value.lastName">
  </div>
</template>

<script>
export default {
  props: {
    value: Object
  }
}
</script>

This means that if used in a parent component (ContactForm.vue) like so:

<template>
  <div>
    <h1>{{ person.firstName }} - {{ person.lastName }}</h1>
    <DataEdit :value="person" />
  </div>
</template>

<script>
import DataEdit from './DataEdit.vue'

export default {
  components: {
    DataEdit
  },
  data() {
    return {
      person: { firstName: 'John', lastName: 'Doe' }
    }
  }
}
</script>

then changes to the values of the inputs in the DataEdit.vue component will immediately be reflected in ContactForm.vue. Vue's reactivity system at its best.

Treating complex objects like a value

This is the trickiest one because even though we're passing a complex, reactive object, we'd like to get all the changes at once or none at all. You might ask: why would you want such a thing? The answer is quite simple: you'd like to implement "OK/Cancel" functionality or (if the edits drive some kind of Ajax requests) limit the number of actions upon edits. It is quite obvious that there will be a need for a copy of the reactive object. For that I use the cloneDeep method from Lodash and it has worked just great so far.

<template>
  <div>
    <input v-model="internal.firstName">
    <input v-model="internal.lastName">
    <button @click="$emit('input', internal)">Save</button>
  </div>
</template>

<script>
import cloneDeep from 'lodash/cloneDeep'

export default {
  props: {
    value: Object
  },
  data () {
    return {
      internal: cloneDeep(this.value)
    }
  }
}
</script>

Now if that component is used in the ContactForm.vue (note the change from :value to v-model)

<template>
  <div>
    <h1>{{ person.firstName }} - {{ person.lastName }}</h1>
    <DataEdit v-model="person" @input="personUpdated" />
  </div>
</template>

<script>
import DataEdit from './DataEdit.vue'

export default {
  components: {
    DataEdit
  },
  data() {
    return {
      person: { firstName: 'John', lastName: 'Doe' }
    }
  },
  methods: {
    personUpdated(newValue) {
      console.log('Person has been updated to: ', newValue)
    }
  }
}
</script>

you won't see any changes to the header until they are saved by clicking the Save button. Pretty neat, right? On top of that, you can be notified when the change occurs, so if some additional action needs to take place (like updating the list of people in an external database) you can do it by listening to the input event. That is just pure awesome!
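A side note: v-model on a component is just syntactic sugar for binding value and listening for input, so the v-model part of the template above is conceptually equivalent to:

```html
<DataEdit :value="person" @input="person = $event" />
```

The extra @input="personUpdated" listener simply rides along - Vue calls both handlers.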

One more thing...

One gotcha: after clicking Save, the parent's value becomes the very same object as our internal copy. So if the editor persists, it will now share the internal and value objects, which will make it behave like the case where everything is reactive. Not good - let's do something about it:

<template>
  <div>
    <input v-model="internal.firstName">
    <input v-model="internal.lastName">
    <button @click="$emit('input', internal)">Save</button>
  </div>
</template>

<script>
import cloneDeep from 'lodash/cloneDeep'

export default {
  props: {
    value: Object
  },
  watch: {
    value: {
      handler (newValue) {
        this.internal = cloneDeep(newValue)
      },
      deep: true
    },
  },
  data () {
    return {
      internal: cloneDeep(this.value)
    }
  }
}
</script>

The introduced watch updates the internal state so that it is again disconnected from ContactForm.vue. Of course, in a situation where the DataEdit.vue component is removed from the DOM, due to, let's say, closing a popup, the watch is completely unnecessary. It does, however, come in handy when the data object in question (or some of its parts) can be modified from the parent component - the internal state would be out of sync in such a case. This might happen if some of the details come from an Ajax request or a timer. The watch covers both cases, so it is basically a universal way of synchronizing data on changes from the parent component.

Working example

I know this is a lot to take in at once. Therefore I have prepared a test application for you that illustrates all the pieces. You can find it at https://github.com/padcom/vue-editor-components

That's it, folks!

Monday, April 9, 2018

Vue.js and functional single file components

Single file components (SFCs) are probably the most powerful feature (structure-wise) in Vue.js. They are simple in nature but very powerful when it comes to bundling things together:

<template>
  <h1 class="my-header">Hello!</h1>
</template>

<script>
export default {
}
</script>

<style>
.my-header {
  color: red;
}
</style>

There are times when you want your component to be stateless. Such components are great for bundling together functionality that would otherwise be created by combining other components. I call them template components, because they come predefined with places where you can override the defaults. Since those components do not hold state, we call them stateless or functional.

The problem with functional SFCs is that, unlike regular .vue components, they don't pass data down automatically; in particular, the class and style properties are not forwarded. To mitigate that I went through a few iterations to find the nicest solution. Here's what I came up with, and it seems to do the job very well. Let's assume we're creating a predefined component for animation. We want the styles to be bundled with the component, so an SFC is the right way to go.

<script>
export default {
  functional: true,
  render: (h, { data, children }) => (
    <transition-group { ...data } tag="ul" name="slide">
      { children }
    </transition-group>
  )
}
</script>

<style>
.slide-move {
  transition: transform 1s cubic-bezier(0.68, -0.55, 0.265, 1.55);
}
</style>

A quick explanation: { data, children } is a destructuring assignment that picks the two fields we'll use later from the second parameter. Then { ...data } means "apply everything that was passed in (including, but not limited to, class and style)".
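Assuming the component is registered under a name like slide-list (a hypothetical name for this sketch), it can then be used like any other component, and class/style flow through thanks to the spread:

```html
<template>
  <!-- class="menu" ends up on the rendered <ul> via { ...data } -->
  <slide-list class="menu">
    <li v-for="item in items" :key="item.id">{{ item.label }}</li>
  </slide-list>
</template>
```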

That's all there is to it.

Thursday, March 22, 2018

HttpHandler registration on XSP4 vs IIS

The problem

You're developing an ASP.NET HttpHandler on Mono using XSP and now you need to deploy your app on IIS. What worked on XSP4 isn't working anymore and strange errors appear.

Let's start with the config that works on XSP4

<?xml version="1.0"?>
<configuration>
  <system.web>
    <httpHandlers>
      <add verb="*" path="*" type="MyHandler" />
    </httpHandlers>
  </system.web>
</configuration>

But when you put that same Web.config file on IIS, it throws errors.

The solution

As you can see, the registration happens inside the system.web/httpHandlers section. That works on XSP4 and older versions of ASP.NET, but not on modern IIS. There you need to switch to the system.webServer/handlers section, like this:

<?xml version="1.0"?>

<configuration>
  <system.webServer>
    <handlers>
      <add name="name" verb="*" path="*" type="MyHandler" />
    </handlers>
  </system.webServer>
</configuration>
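If the same Web.config has to serve both environments, one approach (a sketch I would verify against your IIS version) is to keep both sections and tell IIS integrated mode not to reject the legacy one:

```xml
<?xml version="1.0"?>
<configuration>
  <system.web>
    <httpHandlers>
      <add verb="*" path="*" type="MyHandler" />
    </httpHandlers>
  </system.web>
  <system.webServer>
    <!-- stops IIS integrated mode from erroring on system.web/httpHandlers -->
    <validation validateIntegratedModeConfiguration="false" />
    <handlers>
      <add name="MyHandler" verb="*" path="*" type="MyHandler" />
    </handlers>
  </system.webServer>
</configuration>
```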

That's it! Happy coding!

Wednesday, March 21, 2018

Getting started with POI and React

This will be a quick one and, quite frankly, a very obvious one. But since I forgot how it is done I better write it down for posterity (and me).

Setting things up

As always you'll need a folder. Let's call it my-example

$ mkdir my-example
$ cd my-example

Then we need to initialize our project:

$ npm init

and accept all the defaults.

Next we need a few packages installed:

$ npm install poi react react-dom --save-dev

Creating poi.config.js

Since by default POI compiles JSX down to Vue's interpretation of JSX, we need to tell it to use React's JSX compilation instead. We do that by adding a poi.config.js file with the following content:

module.exports = {
  jsx: 'react'
}

Creating start script

Finally, to make our project setup complete we should add a start script to package.json to make use of the locally installed POI:

"scripts": {
  "start": "poi"
}

Time for a simple app

With all that set up, all that's left is to test it with some React components. For that we will create index.js with the following content to display a Hello, world!-type app:

import React from 'react'
import ReactDOM from 'react-dom'

ReactDOM.render(<h1>Hello, world!</h1>, document.getElementById('app'))

And just like that, in under 3 minutes, by the magic of POI, we have a fully-working application that can be developed, tested and built for production deployment. I like it!

Happy coding!

Big Fat Ass - Windows + Visual Studio

Warning: this post contains bad language, obscene descriptions and a lot of frustration. Don't read it if you're sensitive.

The fucking nightmare

In the past I used to work with Windows. I liked it - especially Windows 2000 and then XP. It was kind of fresh, you know? Still looking the same as the old win but kinda... better. And after a few small upgrades to the hardware, like a tiny bit more RAM, it worked really fast. I was reluctant to look for other OSes for more reasons than just I got used to it: Delphi, the IDE I was working most of the time in, was not available on other platforms. And I loved Delphi. It was the best thing ever! Especially the Personal edition which was bloody fast and easy to use.

At one point I gave Linux a try. Slackware, to be precise. I felt this was not just not for me, but not for any sane human being. I mean come on: compile the kernel just to be able to install the distro? Are you fucking kidding me? Then I gave RedHat a try. Well.. better but still: 24 floppies just to install the basic OS? Are you fucking kidding me?!!! And then Ubuntu came along a few years later. I ordered the CD (good old days :D), installed it and liked it instantly. It looked different, foreign... but nice. I got along with the keyboard quite fast, cmd was finally bash and way more humane to work with... Basically, for the first time I didn't immediately go with "You have to be shitting me, man! What a shit hole!". But I was still a Delphi developer so it was more for entertainment than anything else...

Fast forward a few years and that stopped being a problem since I moved to Java.

2018

It's 2018. I work exclusively in Linux and I love every bit of it. I like that I can choose not to butcher my computer with CPU/GPU-intensive operations just to show the contents of a fucking folder. I'm loving XFCE and everything about it. My main occupation is frontend development, so Windows is pretty much useless anyway. I play on Linux, I watch movies on Linux, I work on Linux. And everyone in my family does too, and they like it! My distro of choice is Linux Mint because it is the closest approximation of Windows XP I can get out of the box. And it is good!

And then I decided to give .NET a spin... Man... there are decisions that one regrets moments after they are made. At first it wasn't so bad. I still used Linux with Mono; I tried .NET Core but it wasn't the target I was looking for, so Mono was the option. Everything seemed to work just fine, MonoDevelop rocks! Fast, easy, nice... I like it. I even completed the entire project and was able to test it using XSP...

All was good just up to the point where I needed to test it in real conditions. NOTHING FUCKING WORKED! NOT A SINGLE FUCKING THING! JAVA KILLER MY ASS!!!

So basically creating a solution in MonoDevelop does not mean you will be able to load it into the latest Visual Studio. Windows, when I turned it on, immediately started taking over my CPU fan, spinning it so fucking fast the whole machine became more than just a bit warm. It was FUCKING HOT!!! And still... nothing fucking worked!!!

After much trial and error, looking at Stack Overflow and other enlightened sites, I was finally able to throw away all the project files and recreate them in VS 2017... and then it still didn't fucking work! All because some idiot decided to switch from one XML configuration section to another. I mean, for the love of God, really?! WHAT THE FUCK!?!?

It took me the better part of this evening to just get it started, then about 3 minutes to add a portion to my node module that converts from fucking backslashes to forward slashes (as though the rest of the fucking world were morons and used the wrong character for a path separator) and I was finally on my merry way to start hating Windows on my blog.

Post mortem

Unfortunately, as a frontend developer I need to have access to Edge which, surprise surprise!, is only available on fucking Windows. And this is the only reason I have a separate, dual-boot notebook that runs a legal (my God, what a fucking waste of fucking hard-earned money) copy of Windows 10.

I need a fucking drink after all this. No wonder .NET developers are so strange... there is a damn good reason for it!

God save us from Windows and .NET. Please!!!

Monday, January 29, 2018

Side project: getting eyes far away

Last year I became obsessed with FPV. I don't mean games - that's kinda for kids. I'm talking here about getting an object, a plane in this case, equipped with a camera and other equipment that can be controlled from a distance. I was trying to get into the multi-rotor business but that seems not to be my thing. Planes, on the other hand, are. So I tried my luck there...

My first time was... well... exciting wouldn't even begin to cover it! It was a mysterious experience! It felt like I was actually in a plane flying around, hundreds of meters away from me, seeing real life flashing before my eyes. It was simply a blast! But as soon as I flew the first time and saw the land I had already covered, something woke up in me: the nagging question of what is out there? I guess deep down I am kind of an explorer :) And so I began to cover more and more ground. It soon became pretty clear to me that the equipment I had at the time, with a 200mW VTX, was not going to quite cut it. All that was made crystal clear once I lost the plane after a flight of just several hundred meters in the neighborhood. It sucked big time and about 600 PLN was forever lost. Sometimes when I drive around I hope to spot this plane hanging in some tree, but that hasn't happened.

Someone could say: this is way too expensive and too dangerous - but not me. I came back home, late at night that day, and immediately ordered a stronger VTX (used to ride a 200mW one, now I ordered a 600mW ImmersionRC that people said was good for several kilometers!). It took a while before I was able to get back up in the sky but it eventually happened. With the new VTX, a changed site and a new plane I was rocking the sky again!

Over the course of last summer I didn't lose a plane (well, technically I did, but more on that later). I first flew 1.3km to a landmark I knew and was passionate to get to. That didn't work because of the antennas that I had. That day I learned that power is not everything: you need something to convey it with. And in the case of video transmission (and also the control link from your radio to the plane) antennas are the holy grail! So I went ahead and bought 3 of the most recommended (and most sexy) ones, the Aomway clover leaf antennas. Alongside the 600mW VTX I also bought a 14dBi directional patch antenna. Using this setup I was finally able to easily break the 2km range!

But my appetite (and the addiction) grew...

So I started experimenting, first with going further than 2km. It turned out that the 100mW OpenLRS system wasn't all that good at ground level, so I built a 2m mast that gave me more range. A few weeks later I tested it up to 4.8km and let me tell you, the wiggles I had that day were nothing compared to the first time with a woman! I was so fucking stressed out I almost fainted!

All things considered it was a pretty good flight. I used a 3S 3000mAh Li-ion pack on that flight and it almost lasted long enough to get the plane home. Just a few tens of meters over a field and the plane was safe and sound in my hands. That meant I needed more power. Lots of it.

In a situation like that, power comes from a combination of input voltage, amps, ESC, motor and propeller. So I started looking for the best combination. Just out of sheer luck I had a few 2212 810kv motors around (from my failed attempt to build a 450-sized quadcopter). Since they had the required max thrust (around 1kg), I thought: why don't I give it a shot? And so I did!

The first attempt was very promising. On 4S and a 10x4.5 prop it flew great and I was finally able to reach the 5km mark. I turned back to see how the battery would behave on the way, but there was plenty left. That encouraged me to give it another shot the next day and... that is how I achieved my up-to-date record of 8.2km out and safely back!

Oh boy was I excited! All my efforts were finally paying off! My dream was to take the plane from the place where I was usually flying back to the place I live, which was about 6.5km away. With the record flight of 8.2km I thought it was a no-brainer. Man, was I wrong! It's not just the tx/rx/antennas combination but first and foremost the lay of the land that allows (or not) for longer distances. Luckily, the day before I had put my phone number on the plane, and since it automatically landed on a busy street, a good man picked it up and called me. The plane was recovered, and my ambition was greatly humbled.

This year I'm trying my luck in the same direction as my record flight of 8.2km. I am hoping to break through the 20km barrier with some new equipment that I have bought:

  • 500mW Wolfbox TX (with potential to be modded to 1W)
  • 1W VTx and a biquad 11dBi antenna that I tested up to 12km with crystal clear picture on the ground
  • an all new Mini Talon with a 4S2P Li-ion battery pack

The plan is as follows:

  • First do a maiden flight of the plane with the minimum electronics needed to get it to fly
  • Add a flight controller for stabilization and get it to work as expected (F4 AIO FC with OSD)
  • Do a test flight over a 1km distance with the least powerful antennas I have to check the viability of the video downlink and control uplink
  • Do an endurance test with all the equipment to see how long the battery will last and at what speeds (I hope to get at least 40km of range before the batteries are dead)
  • Get back to the 5km mark and back
  • Get past the 10km mark and back
  • Get past the 15km mark and get back
  • Get past the 20km mark and get back

I have no idea if this is going to work or not. I truly don't! But I hope one day I'll have a video to show you how the flight went and what emotions it gave me!

Happy flying!

Sunday, January 21, 2018

Creating your own ESLint shareable configuration

This is going to be a quick one: How to share ESLint configuration between projects.

Step 1: create a separate project

By separate I mean really separate with its own package.json and own repository.

$ mkdir eslint-config-config-name
cd eslint-config-config-name
npm init
...
About to write to .../package.json:

{
  "name": "eslint-config-config-name",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}


Is this ok? (yes)

Step 2: create your configuration

Since main in package.json points to index.js, this will be the spot where we put our shareable configuration. Let's start small - you can always extend it further later on as you work your way through the available rules:

module.exports = {
  rules: {
    'no-var': 'error',
  }
}

To make sure you don't forget about installing ESLint in your target project add the following lines to the package.json of your configuration project:

  "peerDependencies": {
    "eslint": ">= 4"
  }

This will ensure that when the target project doesn't have ESLint as a dependency, an error will be thrown telling you that it needs to be present.

Step 3: publish your repository

In order for the configuration to be available to everyone, you need to have it somewhere everyone can access it. In any case, it is recommended to add the following keywords to your package.json:

  "keywords": [
    "eslint",
    "eslintconfig"
  ]

If you choose to publish it to NPM then that's fine, but honestly those config files can get pretty specific for your company therefore I would suggest just tagging and pushing to your Git repository, like so:

$ git init
git add .
git commit -a -m "Initial import"
git tag 1.0.0
git remote add origin url-to-your-repo
git push --set-upstream origin master

Make sure your repository is accessible via HTTP or HTTPS - that's going to be needed for the next step, which is...

Step 4: use the config

In order to use that preset you need to add it as a dependency or devDependency to your project:

  "devDependencies": {
    ...
    "eslint-config-config-name": "git+https://url-to-your-repo#1.0.0"
  }

That way you are specifying not just the latest version but a specific one. This is done so that projects can adopt changes to your common configuration as they get the time for it.

Next we need to make use of the configuration in our local project. We do that either by creating a local .eslintrc.js or by adding the configuration to package.json. Since we have already created the shareable configuration in the previous steps, let's see it being used in package.json:

  "eslintConfig": {
    "extends": [
      "eslint-config-config-name"
    ]
  }

That's it! Now when you npm install your dependencies then the configuration and all its dependencies will be installed along with it.

If you would like to see a working example, take a look at the Vue.js + Electron example on GitHub that uses the common configuration I prepared for my company.

Happy linting!

Sunday, January 14, 2018

Electron - where Vue.js meets desktop

Today I was pushed by some unseen force to check out Electron, the desktop platform that embeds Chrome for everything. Let me tell you: it is amazing! Not only do you have just one browser to care about, but it automatically runs on all sorts of platforms! Well, I am sold on the idea. How about you? If you're like me then keep reading.

Getting started

First we need a sort of web application. For obvious reasons I am choosing Vue.js (don't even get me started on Angular X or React - I could bitch about them forever). So Vue.js it is. Obviously we need a build system for it. And what better can we get than POI! So let's do a Vue.js app.

$ npm init
This utility will walk you through creating a package.json file.
It only covers the most common items, and tries to guess sensible defaults.

See `npm help json` for definitive documentation on these fields
and exactly what they do.

Use `npm install ` afterwards to install a package and
save it as a dependency in the package.json file.

Press ^C at any time to quit.
package name: (vue-electron) 
version: (1.0.0) 
description: 
entry point: (index.js) main.js
test command: poi test
git repository: 
keywords: 
author: 
license: (ISC) 
About to write to .../vue-electron/package.json:

{
  "name": "vue-electron",
  "version": "1.0.0",
  "description": "",
  "main": "main.js",
  "scripts": {
    "test": "poi test"
  },
  "author": "",
  "license": "ISC"
}


Is this ok? (yes) 

$ npm install --save-dev poi poi-class-component

Since this will be an Electron app we need the electron package too:

$ npm install --save-dev electron

Now we're all set to create our app. In the src folder let's create 2 files: index.js and App.vue

src/index.js

import App from './App.vue'

new App({ el: '#app' })

src/App.vue

<template>
  <h1>{{ message }}</h1>
</template>

<script>
import Vue from 'vue'
import Component from 'vue-class-component'

@Component
export default class App extends Vue {
  message = "Hello, world!"
}
</script>

<style>
h1 {
  background-color: green;
}
</style>

Easy, right? Now here comes the hard part: embedding the application in Electron. I mean, it's not really hard if you know how :). There is good documentation for both Electron and POI on how to do it, but nowhere is there one place that shows the whole solution. So there you have it.

poi.config.js

module.exports = {
  entry: './src/index',
  webpack(config) {
    config.target = 'electron-renderer'
    return config
  },
  homepage: './',
}

main.js

const electron = require('electron')
const isDev = require('electron-is-dev')
const app = electron.app
const BrowserWindow = electron.BrowserWindow
const path = require('path')
const url = require('url')

let mainWindow

function createWindow () {
  mainWindow = new BrowserWindow({width: 800, height: 600})

  const entryDev = 'http://localhost:4000'
  const entryProd = url.format({
    pathname: path.join(__dirname, 'dist/index.html'),
    protocol: 'file:',
    slashes: true
  })
  mainWindow.loadURL(isDev ? entryDev : entryProd)

  if (isDev) mainWindow.webContents.openDevTools()

  mainWindow.on('closed', function () {
    mainWindow = null
  })
}

app.on('ready', createWindow)

app.on('window-all-closed', function () {
  if (process.platform !== 'darwin') {
    app.quit()
  }
})

app.on('activate', function () {
  if (mainWindow === null) {
    createWindow()
  }
})

Now we need a few things to be able to run it. First there needs to be a way of starting the web application and then the Electron container hosting our app. We'll do that by using the concurrently module and by adding a start script to our package.json. While we're at it let's also install electron-is-dev package used in the main.js script:

$ npm install --save-dev concurrently
$ npm install --save electron-is-dev

Since the electron-is-dev package will be required by our Electron main process and will not be bundled with the rest of the app, it needs to be added as a regular (runtime) dependency.

Now the promised start script. Remember, that goes into the "scripts" section in your package.json

"start": "concurrently \"poi\" \"sleep 10 && electron .\""

Please note the sleep 10 before starting electron. That is to give the initial bundler run enough time to start serving our app.
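If the fixed delay proves flaky on slower machines, an alternative worth trying is the wait-on package (a sketch, untested here), which polls the dev server URL instead of guessing a delay. Install it with npm install --save-dev wait-on, then:

```json
"scripts": {
  "start": "concurrently \"poi\" \"wait-on http://localhost:4000 && electron .\""
}
```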

And that is it! Your Vue.js app runs in Electron! And yes, it does hot reloading! Isn't that cool? Well... how about giving your app to other people? You wouldn't ship it the way you run it in development, would you? We need some bundling. Luckily for us there is a package called electron-packager that has our backs covered.

Bundling

Let's start by installing the electron-packager module:

$ npm install --save-dev electron-packager

With that covered let's add a few more scripts:

"build": "npm run clean && npm run compile && npm run package",
"clean": "rm -rf dist $npm_package_name-*",
"compile": "poi build --no-clear",
"package": "electron-packager . --ignore=\"src|poi.config.js|.gitignore\""

I think they are pretty self-explanatory. When you run the npm run build command from the console, it will first remove all leftovers, then build the client application using POI, and then package the application for your platform. If you would like to additionally package your app for a different operating system, you can always start the package target with additional parameters, like so:

$ npm run package -- --platform=win32 --arch=all

And that is it! Electron is super cool - go check it out! And as a reward for getting to those last words here's a link to a GitHub repository that has a ready-made working example: https://github.com/padcom/vue-electron-example. Besides the mentioned settings it also contains ESLint integration that I described in my previous post.

Happy coding!

Saturday, January 13, 2018

POI, Vue and ESLint

Writing code is hard. It is however much harder to write code that conforms to standards that are already in the project. Fortunately for us there is ESLint with all the bells and whistles that help in that tedious task. And believe it or not there are ready-made presets and plugins that make the configuration really simple.

Installation

First, as usual, we need some packages added to our project:

$ npm install --save-dev eslint poi-preset-eslint eslint-plugin-vue babel-eslint

Then create or update your existing poi.config.js as follows:

module.exports = {
  presets: [
    require('poi-preset-eslint')({ mode: '*' }),
  ]
}

Configuration

All that is left is to configure ESLint. I prefer a set of rules that don't get in the way during development but hold the fort when building the final version. That is why some of my settings are environment-dependent:

// This defines the mode of operation for checks depending on the environment
const mode = process.env.NODE_ENV === 'production' ? 'error' : 'warn'

module.exports = {
  parserOptions: {
    'parser': 'babel-eslint',
    'sourceType': 'module'
  },

  extends: [ 'eslint:recommended', 'plugin:vue/recommended' ],

  rules: {
    'camelcase': [ 'error', { 'properties': 'always' } ],
    'func-name-matching': 'error',
    'func-names': [ 'error', 'never' ],
    'object-shorthand': [ 'error', 'always' ],
    'prefer-const': 'error',
    'prefer-template': 'error',
    'template-curly-spacing': [ 'error', 'never' ],
    'no-useless-rename': 'error',
    'no-useless-constructor': 'error',
    'arrow-spacing': [ 'error', {
      'before': true,
      'after': true,
    } ],
    'arrow-parens': [ 'error', 'as-needed' ],
    'arrow-body-style': [ 'error', 'as-needed' ],
    'no-const-assign': 'error',
    'prefer-numeric-literals': 'error',
    'indent': [ 'error', 2, {
      'SwitchCase': 1,
    } ],
    'semi': [ 'error', 'never' ],
    'quotes': [ 'error', 'single' ],
    'comma-dangle': [ 'error', 'always-multiline' ],
    'no-console': mode,
    'no-debugger': mode,
    'no-alert': mode,
    'no-var': 'error',
    'one-var': [ 'error', 'never' ],
    'space-before-function-paren': [ 'error', {
      'anonymous': 'never',
      'named': 'always',
      'asyncArrow': 'always',
    }],
    'object-curly-spacing': [ 'error', 'always' ],
    'array-bracket-spacing': [ 'error', 'always' ],
    'computed-property-spacing': [ 'error', 'never' ],
    'key-spacing': [ 'error', {
      'beforeColon': false,
      'afterColon': true,
    } ],
    'keyword-spacing': 'error',
    'space-infix-ops': 'error',
    'space-unary-ops': 'error',
    'space-in-parens': [ 'error', 'never' ],
    'comma-spacing': [ 'error', { 'before': false, 'after': true } ],
    'no-whitespace-before-property': 'error',
    'no-multi-spaces': 'error',
    'no-multiple-empty-lines': [ 'error', { 'max': 1 } ],
    'dot-location': [ 'error', 'property' ],
    'getter-return': 'error',
    'consistent-return': [ 'error', { 'treatUndefinedAsUnspecified': true } ],
    'valid-jsdoc': 'error',
    'eqeqeq': 'error',
    'no-return-assign': [ 'error', 'always' ],
    'vue/max-attributes-per-line': [ 'error', {
      singleline: 5,
      multiline: { max: 5, 'allowFirstLine': true },
    } ],
    'vue/html-indent': 'off',
    'vue/attribute-hyphenation': [ 'error', 'never' ],
  },
}

As for the rules (and remember, this is a personal preference): I don't like being forced to write every single attribute of a component on a separate line. I think this looks awful. Sure, if there are more than X attributes, then each should go on a separate line; that's why the number is so high (5). Also, I found that hyphenation of bound attributes, like :selection-mode, causes compilation errors, so this one is (at least currently) broken.

Practice

A hint for transitioning projects from non-ESLint to ESLint: when you start, there will be a massive number of violations. I mean hundreds of them. Before you commit to fixing them all, consider selectively disabling ESLint in all files by means of the following directive:

/* eslint-disable */

Then, whenever you get the time, remove that directive from one file and fix the warnings. Then move on to the next, until ESLint is enabled in all the files. That will make the job much more approachable!

And that is it! Of course you can add or change more rules as you see fit. Whatever works for you.

Happy coding!