Quick tip if you see “bad DLL or entry point ‘msobj80.dll’” when building software with VS2008

If it is still running, try stopping mspdbsrv.exe, the process that generates the .pdb files during a build. My understanding is that it’s supposed to shut down at the end of the compilation, but it seems that it can turn into a zombie process. If that happens, you can get the error shown in this post’s title when linking your binaries.

Anyway, I just ran into this issue and stopping the process via the Task Manager resolved it for me.

On combining #import and /MP in C++ builds with VS2010

I’m currently busy porting a large native C++ project from VS2008 to VS2010, and one of the issues I kept running into was build times. The VS2008 build uses a distributed build system; unfortunately, the vendor doesn’t support VS2010 yet, so I couldn’t use the same infrastructure. In order to get a decent build speed, I started exploring MSBuild’s ability to build projects in parallel (which is fairly similar to VS2008’s ability to build projects in parallel) and the C++ compiler’s ability to make use of multiple processors/cores, aka the /MP switch.
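To show why the two don’t combine cleanly, here’s a minimal repro sketch of mine (not code from the project in question; the #import target is an illustrative assumption, any type library would do):

```cpp
// mp_import_repro.cpp -- hypothetical minimal repro.  Compiling, say,
//
//   cl /MP /c mp_import_repro.cpp other.cpp
//
// fails with error C2813 ("#import is not supported with the /MP option"):
// #import writes its generated .tlh/.tli headers as a side effect of
// compilation, which is not safe while several source files compile
// concurrently in a single cl invocation.
#import "C:\Program Files\Common Files\System\ado\msado15.dll" rename("EOF", "adoEOF")

int main()
{
    return 0;
}
```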

Some clarifications regarding last week’s anti-VC6 rant

This post started out as a comment to Len Holgate’s post referencing my anti-VC6 rant. The comment got a little out of hand size-wise, so I’ve decided to turn it into a separate blog post. I think I mixed together a couple of issues that should have been separated more clearly but weren’t – after all, my blog post was more of a rant.

First, if your client or employer is maintaining an existing system and the system is in maintenance mode only – we’re talking about the odd bug fix or a small enhancement – then it doesn’t make sense to upgrade the compiler. That’s what I was referring to as a system that is “on life support”. Yes, it goes against the grain for me personally as a software engineer who likes to improve software, but the effort spent on making the transition does not make sense from a business perspective.

What I do take issue with is when you are developing a new system, or are working on a large refactor of an existing system that is in “embrace and extend” mode, and for whatever reason the client or employer decrees that it shall be written using VC6. That’s where the penalties come in, and that is where the technical debt builds up – right at the start of a new project, or at the exact point in time when you should be addressing the debt instead of adding to it.

The understanding of C++ and the use of its multi-paradigm nature have changed since VC6 was released; we have both new programming techniques and new libraries that (should) improve the quality of the code, its expressiveness and programmer productivity. The prime example of these libraries, and the one I was thinking of when writing the rant, is of course Boost. The earliest MS compiler Boost tests against in 1.40 is VC7.1, aka VS2003, which is certainly a big improvement over VC6.
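To make the gap concrete, here’s a small example of mine (not from the original rant): partial template specialization, a building block that large parts of Boost depend on, which VC7.1 compiles happily and VC6 famously rejects.

```cpp
// Partial template specialization: fine on VC7.1 (VS2003) and later,
// rejected by VC6, and something much of Boost relies on.
#include <iostream>

template <typename T>
struct is_pointer { static const bool value = false; };

template <typename T>
struct is_pointer<T*> { static const bool value = true; }; // the part VC6 chokes on

int main()
{
    std::cout << is_pointer<int>::value << '\n';  // prints 0
    std::cout << is_pointer<int*>::value << '\n'; // prints 1
    return 0;
}
```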

Yes, VC6 is likely to create smaller executables and build them faster. C++ compilers certainly are not getting any faster, and long compile/link times have been a problem on projects I worked on. Shorter build times and especially smaller executables can be a benefit depending on your particular use case. However, a lot of the projects I worked on in the recent past are maths-heavy and the calculations are performance critical. For these projects, a compiler whose optimizer can squeeze a 5% performance improvement out of the existing code on a modern CPU at the expense of 20% larger code is a no-brainer. In at least one case it was cheaper to buy the Intel compiler to get better performance instead of putting more engineering time into performance improvements.

Yes, developers like shiny new tools, and yes, I’ve worked with developers who considered GCC’s CVS HEAD the only compiler recent enough to complement their awesomeness. That is not an attitude I generally agree with, although I did update my own copy of Visual Studio from 2003 to 2008 (yes, I did skip 2005) when that version came out, simply because it was so much better than its predecessors.

I still think that programmers get needlessly hobbled when someone insists on the use of positively ancient tools, and it is part of our job as programmers who care about what we do to educate the people who make these decisions as to why they aren’t necessarily good decisions from an engineering point of view. I don’t think any of the Java programmers I work with would put up with having to use Java 1.2 and forgo the improvements both in the language and in the available libraries, yet C++ programmers are regularly asked to do exactly that.

Why oh why do people insist on using compilers that are way out of date?

Why are so many companies hobbling their programmers with positively ancient and often positively crappy tools? For once I’m not ranting about companies that are too cheap to provide their C++ programmers with important tools like profilers and leak detectors – the usual “if these were important, the tool vendor would include them” argument – but about the one tool right at the heart of the matter, the one none of us can work without in C++ space. I am, of course, talking about the compiler.

The joy of using outdated C++ compiler versions

Thud, thud, thud…

The sound of the developer’s head banging on the desk late at night.

What happened? Well, I had a requirement to make use of some smart pointers to handle a somewhat complicated resource management issue that was mostly being ignored in the current implementation, mainly on the grounds of it being slightly too complicated to handle successfully using manual pointer management. The result – not entirely unexpected – was a not-so-nice memory leak.

No smart pointer implementation was found lurking behind the sofa, so I went where other people had gone before (and failed) – I bravely ignored the status of the Sun CC support in the Boost library and downloaded the latest version (1.32.0 at the time I originally wrote this). The compiler I’m using is marked as ‘horribly broken’ in the context of Boost, but hey, I only wanted to use smart pointers, so it can’t be that bad, right?
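For context, what I was after is nothing exotic – something along these lines (a minimal sketch of mine, assuming Boost’s headers are on the include path):

```cpp
// boost::shared_ptr (already present in Boost 1.32) releases the resource
// when the last owner disappears, so the manual delete bookkeeping goes away.
#include <boost/shared_ptr.hpp>
#include <cstdio>

struct Resource
{
    Resource()  { std::printf("acquired\n"); }
    ~Resource() { std::printf("released\n"); }
};

int main()
{
    boost::shared_ptr<Resource> p(new Resource);
    {
        boost::shared_ptr<Resource> q = p; // shared ownership, use count is now 2
    }                                      // q goes away, p keeps the resource alive
    return 0;                              // "released" is printed exactly once, here
}
```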

First attempts with a newer compiler (WS8/5.5) proved encouraging: the smart_ptr tests compiled, but a lot of them failed. After an extended printf debugging session, it appeared that the temporaries generated by the compiler were destroyed rather later than both the writers of the C++ standard and the Boost developers expected. Employing some advanced google skillz soon brought to light that by default, the SUN compiler destroys temporaries not at the end of the statement, as the standard requires, but when it encounters the end of the scope.
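Here’s a small demonstration of mine (not one of the Boost tests) of what that difference looks like: under the standard rules the temporary dies at the semicolon, while the old Sun default keeps it alive until the end of the enclosing scope – exactly the sort of thing that confuses reference-counted smart pointers.

```cpp
// A conforming compiler prints
//   temporary destroyed
//   next statement
// whereas the old Sun CC default delays the destructor until the end of
// main(), reversing the order (and inflating shared_ptr use counts in
// the meantime).
#include <cstdio>

struct Noisy
{
    ~Noisy() { std::printf("temporary destroyed\n"); }
};

void use(const Noisy&) {}

int main()
{
    use(Noisy());                    // per the standard, the temporary dies at this ';'
    std::printf("next statement\n");
    return 0;
}
```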

Great. In fact, this shouldn’t have come as that much of a surprise, as SUN makes a big song and dance about the compiler’s backward compatibility – they state that code which compiled on previous versions of the compiler will definitely still compile on the newer versions. This I found true almost all the time. Unfortunately, in this particular case the feature turned into a stumbling block, as the backward-compatible behaviour pretty much sabotaged the standard-mandated behaviour.

Fortunately the cure is at hand – the compiler supports a command line option (-features=tmplife) that makes it behave like every other modern C++ compiler on the face of the earth. And hey presto, the tests suddenly pass. Well, obviously only those that are supposed to pass!

Unfortunately the compiler currently used in the production environment is 5.3/WS6.2, not 5.5/WS8. At least it also supports the tmplife feature, so I’m obviously only a stone’s throw away from getting working smart pointers, right?

Wrong. The smart-pointer’d code did compile, but did it link? Of course not – that would be too easy. So back to the tests, but this time armed with the old compiler. The older SUN compilers use a template instantiation database (the infamous SunWS_cache directory) to store the object code that results from the compiler instantiating templates. For some reason, the compiler or linker fails to pull in the necessary object code for the smart pointer externals. Grrr. Closer inspection of the compiler’s man page suggested that the compiler can be convinced to put this information into the object file instead (using -instances=static instead of the default behaviour). This behaviour is the default on the 5.5 compiler, but optional in the 5.3 compiler…

So finally, the smart_ptr tests successfully complete using the Sun 5.3 C++ compiler. And the application – with a bit more tweaking – is leaking considerably less memory. The joy of small victories.

Playing with SunStudio 11

This is by no means a review of SunStudio 11, even though I’ve used it for production software. There’s an awful lot of power in the IDE, but I’m one of those old-skool guys who’s spent a lot of time learning and customising XEmacs, and it’s still the editor I’m most comfortable with, so why change? For that reason, I’ve only ever used the IDE for debugging, for which it seems to be decent enough. As it’s written in Java, as so many IDEs are these days (cue Eclipse), it’s not exactly the fastest IDE I’ve ever worked with, but once it’s loaded up and running it’s perfectly usable.

The compiler, however, is a big step forward from just about any of the older SUN compilers I’ve used. It still has some quirks, but comparatively few of them (a couple still show up in Boost), so for most applications it now really looks like a proper standards-conforming C++ compiler – a big improvement over the previous efforts – and you’re much less likely to stumble across the remaining quirks.

Overall I’d say that whichever compiler version you’re currently using, you should probably upgrade to this one. At least if you’re interested in writing reasonably modern C++, that is.