October 20, 2006 ·
About 5 minutes
First of all, sorry for not completing the comparison between systems earlier. I had to work on some university assignments and started to play a bit with Haskell, which led me to start a rewrite of a utility (more on this soon, I hope!).
Let's now compare the development platforms provided by these operating systems. This is something most end users will never care about, but it certainly affects the availability of some applications (especially commercial ones), their future evolution, and how applications behave during, e.g., installation.
As you may already know, both systems are Unix-like. First of all, they provide a comfortable command-line interface with the usual utilities and development tools, so it is very easy to get started. They also come with the common POSIX interfaces to manage files, sockets, devices, etc., which allow a great deal of compatibility among the operating systems that support them. The problem is that these interfaces are too low-level, C-specific and "console-based"; there is no way to develop visual applications with them alone. This is why almost all programs use some sort of abstraction layer over these interfaces, plus some library that provides a graphical toolkit; otherwise development times could be extremely long and there could be lots of portability problems. These extra libraries bring us the biggest difference between the two OSes.
When you are coding for Linux, the de facto standard graphical interface is the X Window System, which comes with its own set of graphical libraries (Xlib) for programming applications. The problem is that these are, again, too low-level for general usage, so developers have come up with some nice abstractions that provide widgets, layouts, etc. Among them are the well-known Qt and GTK+ toolkits. These, on their own, also lack the functionality needed to build complete desktop environments (DEs), so KDE and GNOME were born on top of them. They not only provide a consistent graphical interface but also a development platform on which to build applications: each DE has a set of services and components that make the implementation of shiny tools a breeze.
However, application developers are faced with the difficult task of choosing the adequate subset of libraries for their application, which at its root means choosing one of the two major development platforms (KDE or GNOME) — unless they implement their own, something not that uncommon. For tiny programs this may not be an issue (as can be seen from the duality of tools available), but it certainly is one for big applications (you do not want to rewrite, e.g., The GIMP for KDE) and commercial ones. In a way, you are coding for KDE or GNOME, not for Linux. You may argue that competition is good but, in my opinion, not at this level.
On the other hand, Mac OS X has three frameworks: Cocoa, Carbon and Cocoa on Java (I'm not sure this last name is correct, but you get the idea). Carbon is a holdover from the Mac OS 9 days, and Cocoa on Java is not recommended for anything other than learning. Even if you chose Cocoa on Java, in the end you would be using plain Cocoa, so you needn't consider it in the equation. In other words, the only reasonable choice when developing an application for Mac OS X is Cocoa. This brings a lot of consistency between applications, keeps a single set of services available for all programs to use and allows easy interoperability between components. (Not to mention that you either use Cocoa or you don't; you cannot do strange mixes... or I haven't seen them.)
Oh, and before you tell me that Qt is also available for Mac OS X... yes, it is, but it is built on top of Cocoa. So there is a common, high-level layer beneath all APIs that provides consistency among them.
As a side effect we have the problem of application redistribution. End users do not want to deal with source code, so you have to provide them with binaries. But how do you do that on Linux and ensure they will work on any system? Keep in mind that "any system" does not mean any version of a specific distribution; it means any distribution! Well, the thing is... it is almost impossible: there are problems everywhere that prevent binary applications from being moved between systems. I'm not going to discuss this here because it is a rather long topic; check out the linked article for more details (and I think it is missing some).
Contrariwise, Mac OS X is simpler in this regard. There is just one operating system with a consistent set of libraries, so you build software explicitly for those. You only need to care about the compatibility of some APIs between versions. And if your application uses any non-standard library, you can bundle it into the final binaries for easy redistribution (OK, OK, you could also use static binaries on Linux). This, of course, also has its own drawbacks, but in general it is nicer in the developer's eyes.
There are other differences, but the point I want to make (and which is entirely my own view) is that the diversity in Linux hurts development. Different distributions make it hard to package software for each of them (can you conceive the amount of time wasted by the package maintainers of every single distribution out there?) and bring many binary compatibility issues. Because, you know, Linux is just the kernel. Aside from that, different desktop environments force hard decisions on developers, and there is a lot of duplicate code in them to manage common stuff; fortunately, Freedesktop.org is solving some of these points.
Systems such as Mac OS X (or the BSDs, or Solaris, etc.) are better in this regard because the system is a single unit distributed by a single group of people. So, whenever I say I use "Mac OS X Tiger", developers know exactly what my system has available for them.
Yeah, this is a rather generic rant against Linux and is possibly not that important in our comparison, but I had to mention it because I've faced the above issues multiple times.