Make sure you’re using the right command processor when running Incredibuild

Quick hack/warning for those who use an alternative command line processor like TCC and also use Xoreax's Incredibuild for distributed builds. Incredibuild is awesome, by the way, and if you have a larger C++ project that takes a long time to build, you should use it. And no, I'm not getting paid or receiving free stuff for writing that.

However, if you have to start your Visual Studio instance from the command line because you need to set some environment variables first, or because of your general awesomeness, make sure you're starting it from a stock Windows shell. Either cmd.exe or PowerShell will do nicely, thank you. If you start VS from TCC and have a couple of build tasks that spawn out to the shell, Incredibuild wants to shell out into TCC to run these tasks, and the shelled-out tasks don't seem to return control to Incredibuild again. Yes, I was too lazy to investigate further as the method described above works.

Install your basic Emacs packages via a single function call

If you, like me, tend to carry around or “cloud around” a single .emacs file so you end up with similar environments wherever you have an Emacs install, you know it's a little painful to ensure that you have the same set of basic packages installed on each one of your Emacs installations. As I mentioned before, I don't use that many third-party packages, so my Emacs configuration isn't that complicated, but I always prefer to have the computer remember things so I don't have to.

As I started using ELPA last year, I decided to investigate whether I could use ELPA and a little custom code to download and install the packages that I know I want everywhere. One evening I got bored enough to write a function that tests for the presence of a bunch of basic packages that I want to have on every Emacs install. It's not trying to be fancy and automate things to the n-th degree (I didn't want it running every time I start Emacs), so I just have to remember to invoke the function after a new install. Here is the function, in all its simple glory:

(defun install-required-packages ()
  "Install the basic set of packages I want on every Emacs install."
  (interactive)
  (if (>= emacs-major-version 24)
      (progn
        ;; package.el ships with Emacs 24; load and initialise it before
        ;; talking to the archives.
        (require 'package)
        (setq package-archives '(("ELPA" . "http://tromey.com/elpa/")
                                 ("gnu" . "http://elpa.gnu.org/packages/")
                                 ("marmalade" . "http://marmalade-repo.org/packages/")
                                 ("melpa" . "http://melpa.milkbox.net/packages/")))
        (package-initialize)
        (package-refresh-contents)

        ;; Try to load each package and install it only if the require fails.
        (when (not (require 'bm nil t))
          (package-install 'bm))
        (when (not (require 'icicles nil t))
          (package-install 'icicles))
        (when (not (require 'smex nil t))
          (package-install 'smex))
        (when (not (require 'zenburn-theme nil t))
          (package-install 'zenburn-theme)))))
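
Should the list of packages grow, a data-driven variant saves some repetition: keep the package names in a list and loop over it. The following is just a sketch with made-up names (it isn't what I actually run); it assumes the archive setup from the function above has already happened and checks installation state with package-installed-p instead of trying to load each package:

;; Hypothetical variant: the variable and function names below are made up.
(defvar required-package-list
  '(bm icicles smex zenburn-theme)
  "Packages that should be present on every Emacs install.")

(defun install-missing-packages ()
  "Install every package in `required-package-list' that isn't present yet."
  (interactive)
  (package-refresh-contents)
  (dolist (pkg required-package-list)
    (unless (package-installed-p pkg)
      (package-install pkg))))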

Unique buffer names in Emacs

A common annoyance with Emacs when working on a code base that has duplicate file names is that the mode line tends to display the buffer names as “one.py:<1>”, “one.py:<2>” and so on. That doesn't help much with telling them apart, and I find it confusing.

I was introduced to the Uniquify library a while ago. It gives you some control over how the names of buffers visiting identically named files are displayed. I use the following configuration in my .emacs:

(require 'uniquify)
(setq uniquify-buffer-name-style 'post-forward uniquify-separator ":")

The above code will display duplicate buffers as “one.py:left” and “one.py:right”, with “left” and “right” being the directories that contain the file. The screenshot below shows Emacs with two files with the same name opened from two different locations:

[Screenshot: Emacs showing two buffers with the same file name, uniquified with their containing directories]
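
Uniquify also supports a few other naming styles if post-forward isn't to your taste. For example, the forward style puts the distinguishing directory in front of the file name instead:

;; Show the containing directory before the file name,
;; e.g. "left/one.py" and "right/one.py".
(setq uniquify-buffer-name-style 'forward)

;; Or keep the directory after the name, wrapped in angle brackets,
;; e.g. "one.py<left>".
;; (setq uniquify-buffer-name-style 'post-forward-angle-brackets)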

Timo’s occasional link roundup, late July edition

A couple of interesting articles about debugging. Debugging doesn't seem to get a lot of attention when people are taught programming; I assume you're supposed to acquire the skill by osmosis. It deserves much greater attention, though, because it's one of the skills that separate highly productive developers from, well, not so productive ones.

Why I'm Productive in Clojure. I've long been a fan of Lisp and Lisp-like languages, even though I wasn't originally that happy with having Lisp inflicted on me at university, because it was weird and back then I didn't much appreciate non-mainstream languages. These days I do, because that's where you usually find better expressiveness and ideas supposedly too strange for mainstream languages. I guess that makes me a language hipster.

And while we're on the subject of Lisp-like languages – I'd never heard of Julia, but this blog post made me wonder if it should be on my list of languages to look at.

We have a Nest thermostat and I wasn't too keen when I heard that Google had bought them. I'll probably have to look into securing it (aka stopping the data leakage). While I understand the “your data for our free service” trade from an economics perspective, I do take some issue with the “we'll sell you a very expensive device and your data still leaks out to us” model. Nests aren't exactly cheap to begin with.

Debugging on a live system that shouldn't be live. Been there, done that, on a trading system…

Netflix and network neutrality, as seen from the other side. I'm an advocate of regulating ISPs (especially the larger ones) as public utilities and essentially enforcing network neutrality that way. Netflix has obviously been going on about network neutrality for a while now, but the linked article does make me wonder if those supposed “pay to play” payments were actually more like payments for server hosting. You know, like the charges that we mere mortals also have to pay if we want to stick a server into someone's data centre.


Sprinkler controller upgrade part III – setting it up

Putting the OpenSprinkler and Raspberry Pi together was easy; getting them to run showed my inexperience when it comes to playing with hardware. The overall install went pretty smoothly and the documentation is good and easy to follow, so I'm not going to ramble on about it for very long, just throw up some notes.

First, my old card reader didn't want to play with any of my computers. Now, the card reader is ancient, but it should have been able to work with an SD card. No joy under any available OS, so I ended up having to get a new SD/microSD-only card reader.

When writing the OSPi image file to the SD card using Mac OS, make sure you write to the raw device and not to the slice (in my case /dev/rdisk4 and not /dev/disk4s1), otherwise you'll end up with a non-booting OSPi and wonder why. Don't ask me how I know…

Also, the OSPI image doesn’t have Emacs pre-installed, so I obviously had to fix that. I mean, how would I be editing configuration files otherwise?

The hardware installation (aka screwing the controller to the wall and wiring it up) was pretty simple; to make it easier I had taken photos of the way the old controller was wired and used them as a guide.

The whole install went pretty smoothly and the controller has been running our sprinklers for a while now. Unfortunately the sprinkler_pi program that I really wanted to use seems to have a bug that has it trigger multiple valves at the same time; I'm planning to upgrade to the latest version and, if necessary, debug it a bit because I like its UI better than the default interval_program. The latter, however, just worked out of the box.

The only concern so far is that the CPU temperature on the Raspberry Pi seems a little high (it usually hovers around 60-65° Celsius, as it sits outside in the garage). I might have to experiment with a CPU heat sink on that one.

Sprinkler controller upgrade part II – the Pi(e)s have arrived

The Raspberry Pis have landed. Guess which box contains the sensitive electronics and is worth about twice as much as the other one:

[Photo: the two parcels]

That’s right:

[Photo: the badly packed parcel holding the Raspberry Pis]

Geez Amazon, what is it about the shoddy packing when it comes to items bought via Amazon Fulfillment Services? This is not the first time I've received something that can only be described as badly packaged.

The OpenSprinkler kit has also arrived; all I'm currently waiting for is a smaller memory card, as the regular SD cards I bought are a little too big to fit into the OpenSprinkler case. Anyway, I should have the new hardware up and running on Friday.

The latest project – improving the home’s sprinkler system, part I of probably a lot

I normally don't play much with hardware, mainly because there isn't (and wasn't) much I want to do that requires hardware other than a regular PC, or maybe a phone or tablet. This one is different, because no self-respecting geek would want the usual rotary-control “programmable” timer to run their sprinkler system, would they?

We do live at the edge of the desert and we have pretty strict watering restrictions here. I'm all for it – water being a finite resource and all that – and I want to improve our existing sprinkler system at the same time. It doesn't help that the people who set up the sprinklers were probably among the lowest bidders, to put it politely. OK, to be blunt, they seem to have failed the “giving a shit” test when they put the system together. I've spent a lot of last year's “gardening hours” just trying to make it work somewhat. Not well, just “somewhat”. Time to fix that.

The first step was researching hardware. I'm comfortable with Unix-type OSs (obviously) and, with seemingly the world and their dogs releasing small, low-power embedded Linux devices, I figured one of them would be perfect. The original plan was to get a Raspberry Pi or a BeagleBone with a relay shield/cape and drive the sprinkler valves that way. A bit more poking around the web led me to the various OpenSprinkler modules (standalone, Raspberry Pi shield and BeagleBone cape) and they look ideal for what I have in mind. I'm planning to order the Raspberry Pi version, as one of the nice touches is that the Raspbian repository has packages for the Java JDK, which gives me bad ideas of hacking parts of the sprinkler system in Clojure or Armed Bear Common Lisp. I'm not sure that the system is powerful enough to run either, but one can dream.

The good thing about the various OpenSprinkler systems is that they have the 24V to 5V converter on board so the power supply isn’t a problem. There is already open source software for them that covers the normal requirements and either of them can control enough valves for our current needs without resorting to genius solutions like running two valves off the same controller output because someone installed a wiring loom that is one wire short of being able to control all valves individually. Apparently the fact that the water pressure wasn’t high enough to run two zones at the same time fell in the category of “not giving a shit”.

The next step after getting the hardware is to convert the existing system to run off the new controller, with some additional wiring so that all zones can be controlled individually. This will require fixing up some of the wiring issues and will also have to tie in with my project of running some Ethernet wiring around the house, unless I decide to go wireless for the sprinkler controller. I haven't figured that part out yet. Given that the controller is “headless”, I'm tempted to hide it away out of sight and just run Ethernet and 24V power to it.

Once it's all up and running I'll look into adding some sensors for a bit more fine-grained control over the system. Rain sensors are not really helpful out here as it hardly ever rains during irrigation season. I'm thinking about adding at least a couple of moisture sensors for some of the more sensitive plants to ensure that they get the appropriate amount of water, but not more than necessary. I'm not sure I'll get around to that part this year; first the system needs to be up and running reliably before I go and break it again.

Stay tuned.

I prefer ConEmu over Console2, and so should you…

OK, I admit it – I'm a dinosaur. I still use the command line a lot, as I subscribe to the belief that I can often type faster than I can move my hand off the keyboard to the mouse, click, and move my hand back. Plus, I grew up in an era when the command line was what you got when you turned on the computer, and Windows 2.0 or GEM was a big improvement.

One of the neat features of the console emulators on both Linux and Mac OS X was, and is, that you can run a set of shells in a single tabbed console window. A post on Scott Hanselman's blog put me onto Console2. That was more like it, and I pretty much immediately housed my Windows shells – either cmd.exe or PowerShell – in there. Much better, but over time the pace of development slowed and the last beta release dates from 2011. It's not like the beta is buggy or anything – in fact, in my experience it works very nicely indeed – but of course as a software engineer I like shiny new things.

Enter, via another post on Scott Hanselman's blog, ConEmu – or ConEmu-Maximus5, to give it its full name. If Console2 is the VW Golf to the stock Windows console emulator's 1200cc VW Bug, then ConEmu is the VW Phaeton to Console2's VW Golf. It's got a lot more features, it's actively developed, it works well with Far Manager if you miss the Norton Commander days, and it's highly configurable. Of course, it can also handle transparent backgrounds, but so can Console2.

For me, it has one killer feature – recent versions detect which shells you have installed on your machine and offer you a selection via the green “new tab” button (the one that looks a bit like a French Pharmacy sign), with a choice of running them either as a regular user or admin user:

ConEmu with visible command line processor menu

Why is this such a big deal? Well, it's neat if you're using both PowerShell and cmd.exe, but for me it's a killer feature because I like using TCC/LE, at least at home. At first glance TCC/LE looks like the familiar Windows command prompt, but in the same way that ConEmu is a much expanded console emulator compared to the regular Windows one, TCC/LE is a much expanded command processor that is a lot more feature-rich and has a lot of sensible extensions. And because I'm such a dinosaur, I've actually been using its predecessors (4DOS and 4NT) way back when they were distributed as shareware on a floppy disk and you had to buy the manuals to get the registration code. And yes, I still have at least the 4DOS manual.

Back to console emulators, though. If I wanted to go nitpicking, both ConEmu and Console2 work less well over an RDP connection than the stock console, which is noticeable if you tend to remote into machines quite frequently. It’s not that they work badly, but Microsoft clearly spent a lot of time optimising the stock console to work well over RDP (or to have RDP work well with the stock console), so there is a bit of lag when scrolling. It doesn’t make either tool unusable but you notice it’s there.

Anyway, if you check out one new tool this week, make it ConEmu.

The coder/programmer/software engineer debate seems to be rising from the undead again

First, a confession – I actually occasionally call myself a coder, but in a tongue in cheek, post-modern and ironic way. Heck, it does make for a good blog title and license plate.

Nevertheless, with all the recent “coding schools” cropping up all over the place – at least if you are in the Bay Area – it does seem that being able to code in the context of a reasonably sought-after web technology, without much further formal training, is the path to new, fulfilling careers and of course untold riches, in an economy where recent graduates in all fields have problems finding work. Well, at least a career that allows you to rent a room instead of crashing on somebody's couch.

There are some problems with the whole “mere coder” thing. Dave Winer has some interesting thoughts on his blog, and I agree with a lot of what he says. Scott Radcliff has some additional thoughts, which I also find myself agreeing with a lot.

The process of building a software system is a lot more than just coding, unless you subscribe to the view that all the coders do is take the gospel handed down by A Visionary Leader and convert it into software. Anybody who's ever built a moderately complex system knows that software doesn't happen that way, at least not the type of software that doesn't collapse under its own weight shortly after its release. Of course that doesn't sit well with the notion that The Visionary Genius is all that is required to build the next big thing, and the narrative doesn't work at all once you recognise that building software is, in most cases, a team sport.

Expecting someone who has learned to write code to create a great piece of software is like expecting someone who's just gone through a foreign language course of similar length to go and write the next great novel in that language. Sometimes you get lucky, but most of the time the flow isn't there, the language isn't yet used in an idiomatic way, and all that together makes for a less than enjoyable experience. Some of the people who manage to enter the profession of software development will learn on the job and grow into people who can build robust, maintainable systems, and those are to be congratulated.

The way most people use the term coder, i.e. not in a postmodern, ironic way, reminds me very much of the time when a programmer's job was to program unthinkingly as a little cog in the giant waterfall development machine, which led to the WIMP (Why Isn't Mary Programming) acronym. Basically, if the keyboard wasn't constantly clattering, there was no programming going on. After all, how hard could it be? Or, as Scott Adams put it so nicely:

That said, maybe we are at a point in time where an army of coders, as the modern equivalent of the typing pool, is able to create good enough software. Given that most users are conditioned to believe that software is infuriating and buggy, we as an industry might well get away with it. Is that the world we want to live in, though?

As creators of software, we do have the ability to choose the environment that we work and live in. If you care about quality, work with like-minded people.

Me? I prefer to call myself a software craftsman. I'm not an engineer; I write code, but that's almost an afterthought once the design is figured out. At the end of the day, what I do is build something that wasn't there before, using vague guidelines that are wrong as often as they are right, while trying to tease out what the customer needs rather than what they tell me they want. In most cases there is no detailed blueprint, no perfect specification that only needs to be translated 1:1 into lines of code. Instead, I get to use my experience and judgement to fill in the blanks that most people – myself included – probably didn't even think were there.

Sounds like a job for a craftsman to me.