The Unix Carapace

In a previous post, I said (or at least implied) that Mac OS X should not be considered “Unix with a pretty GUI on top.”

But why not, you ask? What’s the big deal? After all, in some ways it’s a pretty accurate technical description.

It’s all about perceptions, and priorities.

Another way of phrasing it would be to say “Unix with a pretty GUI shell.”

In my estimation, a shell is an add-on used for convenience. Shells are interchangeable: the important thing is the underlying functionality. Shells just change how end users access it.

Software becomes less shell-like the more people use it and develop for it based on the functionality it adds, or on a combination of the new and the base functionality. Windows 95 is (shudder) a good example of this. The real money was, and is, in Windows apps, not DOS apps.

Software becomes more shell-like when you can discard it at will.

GNOME and KDE are large, complex…shells. Sure, there are some apps that are built on top of GNOME or KDE, but they can be exchanged for others. Linux, and the ability to run Unix applications, is what matters.

When people talk about how wonderful it is that you can run Unix applications, mostly unchanged, on Mac OS X, what they’re saying is that Mac OS X is a really nice Unix shell.

If Apple’s biggest achievement with OS X is attracting lots of Unix developers who think that, then it has failed. Because those developers can, and will, discard OS X at will. And their Unix apps, which will run just as well on cheaper Intel boxes, won’t sell Macs.

In my opinion (and all of this is just my opinion), Apple’s priority should be the applications that will.

The Unix Complacency

I understand the benefits, mentioned by the commenters to my last post, of the “Unix Bargain” that Apple struck when it bought NeXT.

I’m very happy about what it’s done for the platform:

The vastly improved underpinnings (fewer crashes, etc.) have prevented a complete exodus to Windows.

The ports of Unix-based developer infrastructure have allowed Macintosh developers to be more productive.

But my unease remains, so here’s my second attempt to explain why.

The way I see it, the reason the Mac received a Unix graft isn't that it's somehow the best of all possible worlds to piggyback on another platform, a platform with wildly different assumptions about its user base. It's that the Mac doesn't have the market share to pay for or attract these benefits otherwise.

The tendency to think of the current situation as nirvana, rather than a compromise, blinds people to the costs of the compromise.

A parting thought I’ll come back to later: the Macintosh is not Unix with a pretty GUI on top. The more developers think like that, the more chance there is that OS X becomes just another Linux variant that happens to cost $129.

1/6: Edited for clarity

The Unix Ghetto

There are several very helpful comments on my last post filling me in on the Unix justification for putting Subversion’s utilities in /usr/local/bin, and on why that location isn’t part of the default PATH environment variable.

All helpful, all logical. All, as one commenter put it, “Mac OS X’s BSD heritage shining through.”
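
For anyone who hasn’t bumped into it, the practical symptom is just that a stock install drops svn into /usr/local/bin while the default PATH doesn’t include that directory, so the shell can’t find the command. Here’s a minimal sketch of the check in Python (the install location and binary name are my assumptions about a typical setup, not anything the commenters specified):

    # Sketch: is /usr/local/bin, where Subversion's tools are assumed to live,
    # actually on the current PATH? If not, "svn" won't be found from the shell.
    import os
    import shutil

    install_dir = "/usr/local/bin"   # assumed install location for svn
    path_dirs = os.environ.get("PATH", "").split(os.pathsep)

    if install_dir not in path_dirs:
        print(install_dir + " is not on PATH, so 'svn' will not be found")

    # shutil.which() searches PATH the same way the shell does
    print("svn resolves to:", shutil.which("svn"))

The usual fix is a one-line PATH addition in a shell profile; simple enough, once someone tells you.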

Still, it makes me feel uneasy, and what follows is a first attempt to describe why.

Don’t get me wrong. I was being paid to program on Unix long before I was paid to program on the Mac. Emacs before CodeWarrior, CGI before Toolbox. Unix has an elegance and a rich history that’s hard to match.

But I don’t want Unix to replace the Mac.

For end users, the debate has pretty much ended. Apple’s APIs are the expected way for applications to reach them. Anyone who seriously suggests that X11 or a command-line interface is an acceptable way to package consumer applications is, well, not taken seriously.

But as a developer, I’m frustrated by how all that is thrown away when discussion turns to the tools I use on a day-to-day basis. Here, Unix is expected. To use Subversion, I need to know the command line and, as the commenters pointed out, the Unix filesystem hierarchy and its history. I need to know Unix, even if what I’m trying to build is a straight Cocoa application with no Unix API or convention dependencies. I’m expected to know Unix.

I don’t want a ghetto to arise where end users get helpful, easy-to-use applications and tools, but the infrastructure for making and selling similar GUI-based tools to developers withers away because, well, everything’s Unix. Just compile it yourself and go.

What happens when developers all live on the command line, but they’re the ones in charge of writing the elegant, GUI-based applications for the rest of us?