Broken C++: Old Compilers

By Dean Michael Berris

Until recently I had only had to deal with standards-compliant C++. I have pretty much enjoyed writing C++ code that abides by an accepted and well-defined standard, which means I have, for the most part, been free to use the features of the Standard Template Library. That changed recently when I had to deal with older compilers that did not support standard C++, and I finally felt the pain that many others went through a while back.

This post explores why C++ has been given a bad name, largely by spineless compiler vendors with deep hooks into various parts of the industry.

Do not get me wrong, there are a lot of good compilers out there – many more good vendors than bad ones. C++ is a good programming language if you abide by the standard – but if you are using tools that don't support the standard 100%, you will get bitten and you will see the ugly side of history. Here I lay out some reasons why the baggage of history is sometimes too heavy a burden for progress.

History of Vendors

Some C++ compiler vendors have been around since the beginning – the 1980s, when C++ was first unveiled and released. However, the international standard for C++ wasn't ratified until 1998. Some compiler vendors built empires around server systems where customers were already spending tons of money for the stability and support these companies offered. You'd get huge contracts selling machines and tools that let developers write applications on these platforms, guaranteed not to change from under them while support was active – the bigger firms usually wanted support contracts that spanned multiple years, were renewable at the end, and cost a metric ton of money. It was all good business for the server vendors (particularly the UNIX server vendors), who made money by ensuring that the systems were stable and that any changes would be isolated in newer versions of the software rather than made backward-compatible with older ones (mainly because they'd like to sell more licenses for the new versions – upgrades are a big business too).

Whole industries have been built around this ecosystem of server vendors, relying on the intricate mix of hardware, software, and support. The business dollars were mostly poured into the hardware and support parts, while the software was mainly something that had to be maintained. If the software didn't have to change, so much the better for these server companies: they could just keep selling hardware and support and all would be good. Until the software guys caught wind of the strategy and figured out that there was big business in selling software upgrades – and that is when the trouble started.

Backward Compatibility and Interface Stability

The biggest headache for software releases is backward compatibility. Hardware manufacturers never had to deal with this because they could simply ignore the old hardware already in the field; they never had to patch what was already out there. Two platforms in particular had issues with backward compatibility and huge deployment bases: Microsoft and Sun Microsystems. On the server side, Sun had an especially big problem with breaking backward compatibility – mostly because it was deeply entrenched in industries where change was frowned upon: telecom and finance. Almost every telco in the world has a sprawling Sun server farm and 10-year-old installations of Solaris and SunOS – it would have been suicide for Sun to force them to migrate C++ applications built against the (broken) Sun-provided "standard" library, whose Application Binary Interface Sun fought very hard not to change. Other vendors in the same situation (IBM with AIX, DEC with Alpha, etc.) had to deal with the same issue: how do they upgrade their systems to support the C++ standard ratified in 1998, and then revised again in 2003?

On the flip side, Microsoft had a different strategy: backward-compatible changes were a minimum priority; instead, they maintained multiple versions of the compiler and accompanying infrastructure. There were different platform SDKs (for Windows XP, Windows Server 2003, and later releases) and different compiler releases (Visual C++ 6, 7, 7.1, 8, 9, and so on). This allowed developers to stick with the compilers they were comfortable with and not worry about later releases until they wanted to upgrade. Because these were separate products, and developers were involved in determining the features and fixes that went into every new version of the platform SDK and the compilers, you'll still see Windows 2000 installations using the Visual C++ 6 compiler, still running 11 years later.

The big problem in the UNIX world is this insistence on maintaining ABI backward compatibility – usually the standard library's interface would be "frozen" while the underlying implementation changed to accommodate bug fixes, so that from the outside it looked the same. On Windows you rarely had to worry about this because you just said which version of the API you were dealing with – see DirectX 6, 7, 8, and so on – and even the standard C runtime was versioned – vc6, vc7, vc8, etc. – so you could have multiple "standard" libraries installed on the same system. Mixing versions within the same application would be a problem, but that was beyond the compiler/system vendor's concern.
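When code has to build across several of these vendor toolchains, the usual coping mechanism is to gate non-portable constructs on each compiler's documented predefined macros. A minimal sketch (the function name here is mine, purely for illustration):

```cpp
// Identify the compiler family at preprocessing time using each vendor's
// documented predefined macros: _MSC_VER (Visual C++), __SUNPRO_CC
// (Sun Studio), and __GNUC__ (GCC and GCC-compatible compilers).
// Real projects hang feature workarounds off these same checks.
const char* compiler_family() {
#if defined(_MSC_VER)
    return "Visual C++";
#elif defined(__SUNPRO_CC)
    return "Sun Studio";
#elif defined(__GNUC__)
    return "GCC";
#else
    return "unknown";
#endif
}
```

The order of the checks matters: some compilers (Intel's, Clang later on) define `__GNUC__` for compatibility, so vendor-specific macros should be tested before the generic GCC one.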

You see this with the GNU Compiler Collection, where the 3.x line is not binary-compatible with the 4.x line, and with the Sun compiler, where Studio 11 outputs are not binary-compatible with Studio 12 outputs. Sun Studio 11 is notoriously non-compliant with the standard because, well, Sun chose to keep ABI compatibility with older releases of the standard library that ships with the compiler. And because it would have been suicide to force customers stuck on an older version of the Sun standard C++ library to upgrade, well, they got killed in other ways.

Another good model for dealing with ABI compatibility is what the Mac guys have in the universal binary format. I think this is a good way of going about it, and if LLVM gets supported on more platforms then we might have a better way of distributing applications in LLVM bytecode instead of native platform executables. That’s another article for another time though.

Multiple Implementations

One of the biggest problems C++ faced (and continues to face) is that there are many different implementations of the same language. Even with a C++ standard, different vendors have different interpretations of it, along with their own extensions that encourage customers to write non-standard C++. Unlike languages such as Python, Ruby, or Java, C++ has no "de facto" implementation against which other implementations can be measured. This is much the same problem Perl 6 is trying to address, and I think Lisp has suffered from it too, causing it to stagnate in the face of other, more dynamic languages. There are some merits to having multiple implementations, but having a reference implementation is also a good thing.

Because I have been developing mostly on Linux with an older compiler (GCC 3.4) while making sure the code also builds with the Sun Studio 11 compiler and the libCstd standard library Sun provides, dealing with just these two implementations of C++ is really painful. It greatly constrains what kind of C++ I can write and which features of the standard library I can rely on. This, I think, is one reason why some cross-platform projects restrict the features of C++ allowed in their code. Dealing with it first-hand has given me an eye-opening look at how bad things are for developers stuck with older and non-standard compilers.
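To make the constraint concrete, here is the kind of thing "restricting the C++ you write" means in practice – my own illustrative sketch, not code from any particular project. Pre-C++11 parsers treat the `>>` that closes a nested template argument list as a right-shift operator, so portable code has to write `> >` with a space, and dependent type names must carry an explicit `typename` keyword even though lax older compilers accepted the omission:

```cpp
#include <vector>

// Return the first element of a container, or a default-constructed value
// if the container is empty. "typename" is required on the return type and
// on the temporary below because value_type depends on a template
// parameter; strictly conforming compilers reject the code without it.
template <typename Container>
typename Container::value_type first_or_default(const Container& c) {
    if (c.empty())
        return typename Container::value_type();
    return c.front();
}

// Nested template arguments: note the space in "> >", mandatory before
// C++11 because ">>" would otherwise lex as the shift operator.
typedef std::vector<std::vector<int> > Matrix;
```

Code written this way compiles the same on old and new toolchains, which is exactly the lowest-common-denominator style cross-platform projects end up mandating.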

Bad Policies and Bad Decisions

History is written by the victors, and unfortunately in the case of C++ the history is written by the compiler and platform vendors. These vendors have had many chances to rectify the situation by driving adoption of newer, standard C++ features. Sun, while it was still an independent company, could have overhauled its compilers to support the C++03 standard as soon as that standard came out. It should have followed Microsoft's lead when Visual Studio 7.1 started supporting more C++03 features. Developers on the Sun platform deserved better than a notoriously non-standard C++ implementation on which to build their applications. Sun should have pushed the standards far more aggressively, since it had much to gain from having more applications build and run on its platforms.

What standards allow you to do is develop solutions with a reasonable level of confidence that they will apply to a broader base. Unfortunately, because Sun has been late to the standard C++ party, applications like Google Chrome – which build fine in standard C++03 mode on later versions of GCC – won't build with the Sun Studio compilers, and thus won't run on Sun workstations. Things like KDE, which would be a good CDE replacement, won't build with the Sun compilers either (though they build with GCC). And when library vendors on Sun still develop and release against the broken standard library, application developers are stuck with a broken implementation that does not help the community.

A Plea

Because C++ is a programming language crucial to the implementation of system services and applications where performance matters: compiler vendors, please give us better tools so we can develop the next great application with 100% of the language's features available. Please be drivers of innovation instead of the ones holding progress back. Please be more aggressive in pushing for change and shipping up-to-date implementations of the language, for the benefit of all the C++ developers in the world.

If you're a paying customer of these compiler vendors, please write to them and tell them you deserve more – everybody knows that standards are good, and standards-based implementations let you build standards-based solutions. If you are yourself a vendor of applications that still target older compilers, please push your customers toward upgrading their systems, or take the step of adopting a saner compiler implementation for the systems you support.

For the sake and sanity of the generations of C++ developers yet to learn the language, let's stop using non-standard compilers *today*. There is a saying:

There are always excuses if you don’t want to do it; there are always reasons if you do want to do it.

- Anonymous

So if you're like me and want to further the use of standard C++, find the reasons to support it and upgrade *today*. The only reason these broken compilers are still in use is that we keep using them.

Share your stories of your C++ nightmares with broken compilers and maybe we can get the vendors to finally stop supporting these broken compilers.


About Dean Michael Berris
Dean Michael Berris is the writer of C++ Soup!, a blog about what's new, upcoming, and what's going on in C++. He is a C++ developer with years of experience building high-performance applications and implementing multi-threaded, highly scalable systems.

