More on the importance of backward compatibility

There were a lot of good comments on my post about the importance of backward compatibility the other day (both here and in the blogosphere), and a lot more of them were positive than I was expecting, which I find encouraging.

A fair number of people called me out for using such a bad example—come on, changing the OS to fix a buggy application? Fair enough. Perhaps I diminished the point I was trying to make by referencing that extreme example, but it’s a worthwhile example for one reason: At Microsoft, the user experience comes first, not developer sensibilities.

Fortunately, as others (rightly) point out, it’s hard to imagine a situation where such extremes would be necessary in the Linux world—for one thing, we don’t have the sheer number of legacy binary applications to deal with, nor do we have the same volume or average user profile as Windows. But the point stands. User experience should always win.

If you want another example, from a company that any developer would agree is an outstanding engineering organization, here’s one: “Sun has maintained binary compatibility between operating system releases for nearly a decade, enabling existing Solaris applications to run unmodified on Solaris 10. This means that Solaris applications developed ten years ago will run on Solaris 10 unchanged, taking full advantage of new and advanced Solaris features.”
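
To make the Sun guarantee a bit more concrete, here is a minimal sketch of the discipline that makes that kind of promise possible. It is purely illustrative, written in C with made-up names (it is not Sun’s actual interface or mechanism): once a function has shipped, its name, signature, and observable behavior are frozen, and new capabilities are added alongside it under new names, so binaries built against the old interface keep working.

    /* report.c -- hypothetical library, for illustration only */
    #include <stdio.h>

    /* Shipped in version 1 of the library.  Its signature and observable
     * behavior are frozen: existing binaries link against this symbol,
     * and they must keep working, release after release. */
    int report_write(const char *path, const char *text)
    {
        FILE *f = fopen(path, "w");   /* v1 has always truncated the file */
        if (f == NULL)
            return -1;
        fputs(text, f);
        return fclose(f);
    }

    /* Added in version 2.  Rather than changing report_write(), the new
     * behavior arrives under a new name, so old callers are untouched
     * and new callers can opt in. */
    int report_write_append(const char *path, const char *text)
    {
        FILE *f = fopen(path, "a");   /* new capability: append mode */
        if (f == NULL)
            return -1;
        fputs(text, f);
        return fclose(f);
    }

The same additive discipline, applied to exported symbols and system interfaces and kept up over many releases, is essentially what a guarantee like Sun’s amounts to: the old entry points never change out from under the binaries that already depend on them.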

11 comments on “More on the importance of backward compatibility”

  1. Skatox

    Another example: if you have a file created with program X version 1.0, and a co-worker has a file created with program X version 2.0, which doesn’t have backward compatibility, the two of you can’t exchange files, even though they were created with the same program.

    It’s funny to see people in offices working with different MS Office versions who can’t exchange some documents because of the lack of backward compatibility.

    Sorry for bad English.

  2. Forest

    “At Microsoft, the user experience comes first, not developer sensibilities.”

    At Microsoft, the money comes first. Sometimes it takes a while for that monetary incentive to translate into improving the user experience.

  3. Joe Buck

    Ian: the main reason that Windows is not a rock-solid platform, one far better than anything the free software world could provide, is *exactly* that Microsoft goes to such extremes to make backward compatibility work. Trying to keep every API working, going back to Windows 3.1 or even earlier, complete with bug-for-bug compatibility, results in an over-complex and brittle platform, and that detracts from the user experience.

    Microsoft has hired a lot of brilliant researchers, and work coming out of their SLAM project allows one to prove that device drivers are free of a whole variety of bugs that afflict us. If they could get rid of all of their old cruft, we would have a very difficult time competing with them; their code would be virtually crash-free and ours wouldn’t be. Fortunately for us, they have a huge handicap: their installed base, their need to be backward-compatible, all the shoddy drivers shipped by third parties. Many of those old interfaces are broken by design.

    So yes, we do need to care about backward compatibility, but it has to be recognized that there is a cost, and that the cost might be so great that some other consideration dominates.

  4. Ian Murdock Post author

    Wouldn’t Solaris be the counterexample to that broadly held view, namely that binary compatibility leads to over-complex and brittle platforms? -ian

  5. Steve McIntyre

    Solaris also has severe problems with backwards compatibility, for example the *horrendously* old, buggy, non-POSIX version of the Bourne shell that it still ships with as /bin/sh. I’ve personally seen several people have problems shipping simple shell scripts in their packages because they have to work around this. There’s a point where backwards compatibility needs to be dropped in order to allow compatibility with newer standards that everybody else has already moved to.

    Also, “run on Solaris 10 unchanged, taking full advantage of new and advanced Solaris features” is a little contradictory. To use many of the newer features, you’d have to at least rebuild…

  6. stephen o'grady

    i agree with Joe Buck that there’s a definite cost to preserving ABI compatibility, but not that the result is necessarily over-complex and/or brittle software.

    i see instead the primary cost as throttling innovation. ISVs and OSs alike can be held back by strict adherence to backwards compatibility – either through an inability to provide new features or a lengthened development cycle to provide them.

    whether that cost is justified or not depends on your particular aims. will write more on this shortly.

  7. Luis Villa

    Solaris most definitely does not provide a counterexample. Leaving aside how little Solaris hardware and software is actually supported to the extent MS supports Windows, the only reason Solaris has been able to move into the 90s at all is by expending massive amounts of engineering effort, and throwing out many of their own internal rules re: backwards compatibility. GNOME/JDS doesn’t have the same compatibility guarantees as CDE did, and for good reason: you can’t (or at least open source can’t yet) build modern software with the complexity and power expected by modern users without gigantic amounts of engineering effort that even Sun can only barely afford.

  8. Jim Butler

    I generally agree with Stephen O’Grady: throttling innovation is a potential cost of maintaining backward compatibility.

    In a network protocol standards committee that I have served on, we have debated the backwards compatibility issue more than once. Most of the committee members recognized that it is sometimes necessary to break backwards compatibility in a limited way in order to make significant improvements to the protocol that improve the user experience.

  9. George

    I understand and agree that backward compatibility is a highly desirable and important feature of an OS; however, I think that having an optimised and stable kernel is by far the higher priority! More often than not, optimising the existing software will break backwards compatibility, and similarly, ensuring backward compatibility often places hard limits on the optimisations possible.

    If I had to choose, I would definitely take an optimised kernel even if that means losing all backward compatibility.

    Let us not lose the “evolution” concept that makes Linux so much better than its competitors.

  10. Tortanick

    Wouldn’t it be better to use virtualisation or compatibility layers like Wine than to actually have native backwards compatibility?

Comments are closed.