Copland 2010 revisited: Apple’s language and API future

Predicting the future of technology is a tricky business—just ask Bill Gates—but the allure of prognostication is strong. I've been known to try my hand at it. Sometimes I get a good read on things, like in 2008 when I wrote, "in the grim future of Apple/Adobe, there is only war." Vague, humorously hyperbolic, and with no explicit timescale: all the essential ingredients of a successful prediction.

Other times, I'm not so lucky. Five years ago, I wrote a three-part series of articles entitled Avoiding Copland 2010. This time, the message was earnest, specific, and had a year right in the title. In other words, a perfect setup for failure. Well, here we are in the year 2010—the future!—so it's time for me to take my lumps…or perhaps crow triumphantly? But first things first. What was this "Copland 2010" thing about, anyway?

Background

Copland was the code name for the most infamous of Apple's several failed attempts at creating a next-generation operating system. In the 1990s, when Copland was initiated, "next-generation" meant supporting memory protection and preemptive multitasking, both of which classic Mac OS lacked. Since then, Copland has become the poster child for Apple's nearly company-destroying failure to acknowledge and successfully address a serious technical gap in its software platform in a timely manner. It was only through the improbable acquisition of both a viable modern operating system and a formerly exiled company founder that Apple was saved.

In part one of the series, I put forward my thesis: that the Objective-C language and the Cocoa API are the parts of Mac OS X that are the most in danger of falling behind the competition, and that by the year 2010, Apple could find itself facing another Copland-like crisis due to its lack of a memory-managed language and API. In part two, I elaborated on the assumptions underlying my thesis. They were:

  • that fully automatic memory management will eventually be an expected feature of the primary application development environment for a desktop OS;
  • that by 2010, the rest of the industry will have adopted languages and APIs that feature fully automatic memory management;
  • and that existing (2005) technologies, and obvious evolutions thereof, do not adequately fill Apple's need for a memory-managed language and API.

These assumptions were hotly contested.

In part three, I surveyed the landscape for languages and APIs that could supersede Objective-C and Cocoa. I also tried to encourage those who doubted the specific timeline to at least look at the bigger picture.

After all, everyone can agree that Cocoa and Objective-C will be obsolete someday. Okay, maybe someone out there thinks that won't happen until the year 2050, but someday, right? […] What should replace Cocoa? What can replace Cocoa? What's Apple's next big move in the language and API wars?

In the article, I considered Objective-C with garbage collection, Java/JVM, C#/.NET/Mono, and even obscure efforts from Apple's past, like Dylan, rejecting them all for some combination of practical, technological, and political reasons. Apple, I concluded, needed to start down what looked to be a long, difficult road to finding or devising its own successor to Cocoa/Objective-C as soon as possible.

The future is now

So, how did things turn out? If we are to take the title and timeline literally, the conclusion is clear: Apple is not currently experiencing a Copland-like software platform crisis. It may be on the cusp of a very different kind of crisis, but that's another story. As far as Wall Street (and Apple's own balance sheet) is concerned, the future looks bright for Apple.

How did I get it so wrong? Or did I? Let's reconsider my assumptions. Is fully automatic memory management now an "expected feature" of desktop software development? Not according to most Mac OS X developers, it seems. Garbage collection was indeed added to Objective-C, and Apple has made a considerable effort to promote its use. But the "second-class citizen problem" I described five years ago has also come to pass. Most Cocoa developers, including Apple itself, are still using manual retain/release memory management in most of their applications. Garbage collection is not a no-brainer choice for all Mac developers today, and is still sometimes seen as a potential performance risk.
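
For those who haven't written Cocoa code, here's a minimal sketch of what that manual discipline looks like (the helper function and its arguments are invented for illustration):

    #import <Cocoa/Cocoa.h>

    // Traditional (non-GC) Cocoa memory management: every alloc, copy,
    // or retain must be balanced by a release, by hand.
    static void updateTitle(NSWindow *window, int count) {
        NSString *title = [[NSString alloc] initWithFormat:@"Window %d", count];
        [window setTitle:title];
        [title release];  // omit this line and the string leaks
        // In an application that opts into Objective-C 2.0 garbage
        // collection, -release is a no-op and the collector reclaims
        // the string automatically; the bookkeeping simply disappears.
    }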

Contrast this with the most prominent competing desktop platform, the Microsoft .NET framework and C# language on Windows, where memory-managed code is the default and everything else is considered risky, literally being denoted with the "unsafe" keyword in the source code.

Nevertheless, Mac developers and users are not panicking like they did in the Copland era about memory protection and preemptive multitasking. If there's a crisis coming, it's definitely not here yet. So much for "2010." But why?

Now the future is later

Microsoft started working on the .NET Common Language Runtime over ten years ago. Since then, it's had four major releases which have included significant new C# language features as well as increased support for dynamic languages like Python and Ruby. If this is the desktop software platform competition, then Apple is getting its ass handed to it, technologically speaking.

Yet despite this reality, these technical issues are not exactly at the forefront of Mac developers' minds. The reason can be summed up in three words: mobile, mobile, mobile. The ascent of Apple's iOS (formerly iPhone OS) platform has been and continues to be dizzying. With it comes a set of constraints not seen in the desktop computer market in years: 128 to 256 MB of RAM, in-order CPUs that max out at 1 GHz, and a complete lack of swap in the virtual memory system. It's been more than a decade since Apple shipped a desktop or laptop computer that was limited to so little RAM, and even longer since a Mac was sold that did not support virtual memory paging to disk. Oh, and by the way, there's also no Objective-C garbage collection in iOS.

This new hardware reality has effectively set the clock back on higher-level programming languages and frameworks in the minds of Apple developers, and Objective-C's nature as a superset of C has renewed its status as a perceived advantage. It's hard to get worked up about still dealing with low-level, per-byte-precise entities like pointers and C structs when your application is constantly receiving low-memory warnings from the OS.
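
In concrete terms, the daily grind looks something like this hypothetical UIViewController subclass (the class name and the cachedImage instance variable are invented for illustration):

    #import <UIKit/UIKit.h>

    @interface PhotoViewController : UIViewController {
        UIImage *cachedImage;  // a decoded, memory-hungry image
    }
    @end

    @implementation PhotoViewController
    // iOS sends this when physical memory runs low. With no swap to
    // fall back on and no garbage collector to invoke, the application
    // must release whatever it can spare or risk being terminated.
    - (void)didReceiveMemoryWarning {
        [super didReceiveMemoryWarning];
        [cachedImage release];  // retain/release, alive and well on iOS
        cachedImage = nil;
    }
    @end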

Then there's the magnified importance of user interface responsiveness on mobile devices. Apple's ruthless dedication to maintaining a direct, lively user interface is a big part of what distinguished the iPhone from all earlier touchscreen phones and many of the copycat models that followed. Even today, the fractional second of latency that separates a new iPhone from lesser devices when scrolling a list or flicking through screens remains a subtle but perceptible differentiator. And as with the memory constraints, developers' minds can't help but draw at least a dotted line from the admirably reactive user interface to the low-level nature of iOS's native API.

Reality check

There's a problem with this narrative, however. Just like its biggest desktop competitor, Apple's fiercest mobile-market rival one-ups Apple in the modern development technology department by offering a memory-managed language and API on its mobile platform. And make no mistake, Google's latest Android release, with its don't-call-it-Java Dalvik virtual machine, is no slouch. (I'll claim a tiny nugget of foresight for having endorsed the idea of a register-based VM, the design approach that would eventually be used in Dalvik.)
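
For the curious, the design distinction is easy to sketch: a register machine's instructions name their operands directly, so work that costs a stack machine several push/pop steps collapses into a single instruction. Here's a toy interpreter (plain C, and thus also valid Objective-C; it illustrates the general approach, not Dalvik's actual instruction set):

    // Toy register machine: "add" names its destination and source
    // registers directly, where a stack-based VM would need a sequence
    // like push, push, add, store to do the same work.
    enum { OP_ADD, OP_HALT };
    typedef struct { int op, dst, src1, src2; } Insn;

    static int run(const Insn *pc, int *regs) {
        for (;;) {
            switch (pc->op) {
                case OP_ADD:
                    regs[pc->dst] = regs[pc->src1] + regs[pc->src2];
                    pc++;
                    break;
                case OP_HALT:
                    return regs[0];  // result left in register 0
            }
        }
    }

With registers initialized to {0, 2, 3}, the two-instruction program { {OP_ADD, 0, 1, 2}, {OP_HALT, 0, 0, 0} } returns 5. Fewer instruction dispatches per unit of work is precisely the efficiency argument for the register-based design.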

To add insult to injury, Google is even building on some of the low-level libraries that Apple has helped to develop over the past few years, adding its own performance enhancements and embarrassing even Apple's mighty iPad with a mere Android phone in an old-school-Apple-style performance bake-off. Yes, WebKit is written in C++, and that's the point: providing a higher-level API to application developers does not preclude taking advantage of high-performance, lower-level libraries.

And it's not just Google. Microsoft, predictably, has brought over its .NET platform and added some even higher-level languages and APIs to its latest mobile efforts. Even poor Palm offered more abstraction and safety to its developers. This is the actual competitive landscape Apple faces.

Obviously, such technical details are dwarfed by larger issues when it comes to determining mobile-market success. Things seem to have ended pretty badly for Palm, for example, friendly web-technology-based SDK and all. But they were still one of the most credible threats to Apple's mobile user interface dominance. Google's still out there, of course, and it's not going anywhere. And Microsoft…hey, you never know, right?

The fate of individual competitors aside, the fact that the most dangerous players are all coming out of the gate with languages and APIs a generation ahead of what Apple offers should be a huge warning sign. And again, this is all happening in the memory-starved, CPU-constrained mobile world. On the desktop, Apple is even farther behind.

It is 2010, after all. "The future" or not, it's getting a bit silly for GUI application developers to perpetually be one bad pointer dereference away from scribbling all over their application's memory. The world has moved on; Apple should too.
