Here are two articles about Apple, both by seasoned software experts ("hackers" in the original, good sense of the term).
The first is an old (2000) article by Jordan Hubbard, one of the founding developers of the FreeBSD project. FreeBSD is a Unix-like operating system that was once almost as popular as Linux. It evolved from BSD, which began as a set of enhancements, developed at Berkeley, to AT&T's original Unix operating system; over time BSD became a nearly full-fledged operating system in its own right, and in the early 1990s gave rise to several operating systems that achieved considerable respect among the cognoscenti. (Among other things, the reference implementation of TCP/IP, the network protocol suite that governs the Internet, and several widely used pieces of internet software came from BSD.) Apple borrowed heavily from FreeBSD (and continues to credit it on its developer page) in creating Mac OS X, which is a fairly traditional Unix system under its shiny hood. Hubbard's article is a review of a beta version of OS X, written from the perspective of a Unix enthusiast and FreeBSD developer. It turned out that he liked OS X so much that, shortly afterwards, he left the FreeBSD project and went to work for Apple (as did several others).
The second is a recent essay by Paul Graham, perhaps best known as an evangelist of the Lisp programming language, but also a very influential online essayist. It is prompted more by the iPhone than by Apple's computers, and the way Apple has handled the iPhone "App Store" leads him to claim that they "don't understand software". Moreover, he argues that the loss of goodwill among talented programmers (i.e., people like Hubbard, who were attracted to Apple ten years ago by the vision of a solid, Unix-based, user-friendly desktop system) will cost Apple dearly in the future.
My previous post on Linux "just working", compared to the Mac, was titled partly in jest, though the point about the different development models was serious. There is no question that the Mac offers a more user-friendly interface for beginners than the best Linux systems, includes a better set of applications (especially the multimedia stuff), and -- perhaps most crucially -- allows you to install most well-known open-source Linux/Unix applications via third-party efforts like the Fink project.
Macs also have a far better reputation for stability than Windows, and one comparable to Linux's. Here, however, we have to consider the great advantage Apple enjoys in this respect: Apple builds the hardware, and can ensure that it works optimally with its software. With external peripherals (such as the HP printer/scanner I mentioned in my previous post), things aren't so certain. Linux and Windows, meanwhile, work fine on the vast majority of PCs out there. Microsoft works closely with every important PC hardware vendor to ensure this. Linux is more volunteer-driven, but these days several of the major manufacturers -- notably Intel and IBM, but also many others -- fund Linux development, employ individual developers, and make the specifications for their hardware available. Still, testing every possible combination of hardware devices is impossible. Computer makers will generally test that their machines work with existing releases of Windows, but Microsoft cannot ensure that every existing computer will work with Windows 7. Linux, meanwhile, receives little testing support from manufacturers. So it is remarkable that these systems work as well as they do.
Apple has received brickbats over the years for its proprietorial control of its hardware and its refusal to permit installation of Mac OS X on third-party systems. I personally think the world would have been much worse off if Apple, and not Microsoft, had dominated the personal computer industry. But Steve Jobs, Apple's CEO, is a rare example of a hard-headed businessman who also understands the intricacies of software. (Bill Gates is another. But Gates was never interested in the hardware side of things, so MS-DOS and, later, Microsoft Windows drove the widespread availability of cheap, "commodity" hardware, which in turn spurred the growth of Linux and other free operating systems. But I digress.)
Many people these days forget -- or are not even aware -- that Apple went through dark days for nearly a decade from the mid-1980s, after power struggles between founder Steve Jobs and CEO John Sculley (a former Pepsi executive hired by Jobs) resulted in Jobs being ousted from the company. Sculley's tenure was not a resounding success. Jobs was not much more commercially successful with his next venture, NeXT, but it was a highly influential platform in the academic community (among other things, it was the platform used by Tim Berners-Lee to develop the World Wide Web). After Apple bought NeXT in 1997, Jobs returned and soon became CEO of the unified company, and Apple's fortunes have soared since.

On the hardware side, he brought out revamped Macintosh computers and laptops, the iPod audio player, and the iPhone; on the internet side, he launched iTunes, the online music store that (together with the iPod) changed the music industry; and on the software side, he decided to scrap the "classic" MacOS entirely and develop a new system, Mac OS X, from scratch. Well, not quite from scratch: as noted above, it uses FreeBSD (and the Mach microkernel, and much other free/open-source software) at its core, and its graphical interface and underlying infrastructure are heavily based on the NeXTstep system developed by his previous company. Today, OS X powers not only Apple's computers but also their iPhone and iPod Touch handheld devices.
Two software-side decisions of Jobs are particularly interesting. When choosing an underlying "base" operating system for OS X, he did not go with Linux, which was then (as now) the best-known free version of Unix: he instead picked FreeBSD, which was lesser-known but highly respected by those in the know. And some years later, when Apple decided to develop a web browser, they chose as a base not Mozilla (whose ancestor was Netscape, the dominant browser of the 1990s, and whose Firefox browser is widely used today), but Konqueror, a browser used by only a small subsection of the Linux community. Specifically, they used that browser's HTML and JavaScript engines (KHTML and KJS), on top of which they built their own interface. They also released their modified versions of these engines as the open-source WebKit project, which is now used by several other projects, including Google Chrome.
In both cases, the decision to spurn the well-known name in favour of a much lesser-known one was justified by pointing to the smaller, cleaner, more maintainable source code; and in both cases, the result has been wildly successful. I wonder how many CEOs would have taken such gambles on relatively "unknown" software -- indeed, in the case of OS X, betting the future of the company on it. Google, in contrast, has chosen to use Linux, the "safe" choice, as a base for their operating systems (Android and Chrome OS); and though they chose WebKit for their Chrome browser, they did so only after it had already been "proven" by Apple.
Small wonder, then, that Apple's share price takes a hit every time worries about Jobs' health surface.
Though a co-founder of Apple, Jobs was never a hacker (that was Steve Wozniak), but he understood software and software hackers. That is what makes Paul Graham's essay above a bit worrying. Has Jobs lost touch with the "hacker ethic"? Or has he lost full control of his company, perhaps due to his still-mysterious health issues?