Regarding Objective-C & Copland 2010

“Copland 2010” refers to a series of articles, written in 2005, and a recent episode of Hypercritical, in which John Siracusa argues that at some point down the line Objective-C and the Cocoa APIs are going to have to be replaced with more modern technologies. Failing to address this sooner rather than later, as was the case with the Classic Mac OS, will leave Apple in a vulnerable, perhaps insurmountable, position relative to their competitors who have adopted “modernized” development technologies. Apple’s last-ditch attempt to bring modern technologies to the Classic Mac OS was codenamed Copland. Siracusa hints at the severity of the situation, if his argument turns out to be sound, by calling his projected development platform nadir “Copland 2010”.

In broad strokes: Siracusa is right and Siracusa is wrong. There are two parts to the Copland 2010 argument — a statement of the problem and a call to action with some suggestions. In the broadest sense it’s basically impossible to seriously debate the merits of the problem as he lays it out. Undoubtedly, at some point, Objective-C and the Cocoa APIs will be in the dust-bin of history. At issue, however, is the time scale. It’s unfair to hold Siracusa to 2010; in Avoiding Copland 2010, part 2 he writes, “I actually think the year 2010 is a bit too early, but I didn’t want to use a date that was too far in the future”. We’ll be fair and take 2010 to mean “near future”, in computer industry terms. Let’s call it between five and fifteen years.

I don’t think it’s at all likely that we’ll see the utility of Objective-C and the Cocoa APIs become totally eroded over the next ten years (the argument was first made in 2005, so we’ve used up five years of the near future). I do think it’s likely that during that period it will become more and more apparent what the next big thing is — where the industry is going. I don’t, however, think that any of the popular “modern” languages fit any better with the future of computing than Objective-C does. We’ll revisit this point.

Siracusa, during his “Dark Age Of Objective-C” episode of Hypercritical, lays out the following criteria for a modern language:

  • Automatic memory management
  • Native strings
  • Regular Expressions native (but a library is acceptable)
  • Native object system
  • Named parameters
  • Succinct syntax for common operations
  • Acknowledgement of concurrency
  • Single vendor driving the development

I agree with most of these, as I think many developers would. We could quibble over the weights we assign to each but, by and large, that’s a decent list. Let’s use it as a score-card for modern Objective-C.

✓ Automatic memory management
Objective-C 2.0 can be garbage collected.
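Concretely, under the Objective-C 2.0 collector the release bookkeeping simply disappears. A minimal sketch (the class and strings here are illustrative, not from the source):

```objc
#import <Foundation/Foundation.h>

// Compiled with -fobjc-gc (or -fobjc-gc-only): the collector reclaims
// unreachable objects, so no -release or -autorelease calls are needed.
NSMutableArray *log = [NSMutableArray array];
[log addObject:@"app launched"];
// No [log release] here — when nothing references the array any longer,
// the garbage collector frees it.
```

The same source compiled without the GC flag would leak or require manual retain/release, which is exactly the drudgery this criterion is about.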

✓ Native strings
Objective-C has @"" strings which are mapped, via a compiler option, to NSString instances. If the requirement is for special operators to work on strings then I’d argue that’s actually an issue of not having a “succinct syntax”.
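For illustration, a string literal behaves as a full NSString object from the moment it appears (the variable names here are made up):

```objc
#import <Foundation/Foundation.h>

NSString *name = @"Copland";  // the @"" literal is already an NSString instance
NSString *greeting = [NSString stringWithFormat:@"Hello, %@!", name];

// Content comparison goes through a method, not an operator —
// which is the "succinct syntax" gripe, not a missing string type.
BOOL same = [name isEqualToString:@"Copland"];  // YES
```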

½✓ Regular Expressions native (but a library is acceptable)
Regular expressions are not native to Objective-C. There are libraries available and, as of iOS 4.0, Foundation includes a regular expression facility.
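The Foundation facility in question is NSRegularExpression. A short sketch of how it reads in practice (the pattern and input are my own examples):

```objc
#import <Foundation/Foundation.h>

NSError *error = nil;
NSRegularExpression *regex =
    [NSRegularExpression regularExpressionWithPattern:@"\\d+"
                                              options:0
                                                error:&error];

NSString *input = @"Copland 2010";
NSUInteger matches =
    [regex numberOfMatchesInString:input
                           options:0
                             range:NSMakeRange(0, [input length])];
// One match: the digit run "2010".
```

Serviceable, but clearly library-shaped rather than native — compare a Perl or Ruby one-liner — hence the half mark.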

✓ Native object system
Objective-C has a native object system. Note well that Siracusa was not saying that everything had to be an object; he was referring to prototype-based languages like Lua or JavaScript, where developers roll their own inheritance systems.
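Classes, inheritance, and dispatch all come built in rather than being a convention you assemble yourself. A toy class (my example, not from the source):

```objc
#import <Foundation/Foundation.h>

// The language supplies the object model: interfaces, a root class,
// inheritance, and message dispatch are all built in.
@interface Sheep : NSObject
@property (copy) NSString *name;
- (void)bleat;
@end

@implementation Sheep
@synthesize name;
- (void)bleat {
    NSLog(@"%@: baa", self.name);
}
@end
```

In Lua or JavaScript the equivalent requires choosing (or inventing) a prototype/metatable scheme first; here there is one blessed object system and everyone uses it.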

½✓ Named parameters
The Objective-C message-sending syntax provides half a solution to this feature. Ideally, one could simply name each parameter, omit any that aren’t required in the call, and have it all just work. Objective-C doesn’t go that far, but it goes far enough to significantly increase the readability of any given message send.
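The half that works: selector pieces interleave with arguments, so every argument is labelled at the call site. The half that’s missing: none of those pieces are optional. A quick sketch:

```objc
#import <Foundation/Foundation.h>

NSString *haystack = @"the wolf ate the sheep";

// Each argument sits next to its selector piece — it reads like
// named parameters and is hard to misorder:
NSString *edited =
    [haystack stringByReplacingOccurrencesOfString:@"wolf"
                                        withString:@"dog"];

// But every piece is mandatory; there is no way to omit one and
// take a default, which true named/optional parameters would allow.
```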

When The Boy Cries Wolf

It’s one word with an exclamation mark followed by a list of dated quotations from various sources running back through the past seven years. It’s the most on-point, refined, minimalistic yet communicative piece of writing I’ve seen. You don’t need to read it; you absorb the point by how long it takes to scroll through it to get to the punchline at the end. There isn’t one. Using even one more word to explain or expand upon the joke would be uneconomical. “Wolf!” is concision enabled by a common understanding. It’s the eye-roll between friends, where the friends are everyone who can read. Brilliant stuff. The fewer words that writer uses, the better the reading gets. This has to be the sweet spot.

Here’s the thing about the story of The Boy That Cried Wolf, though. We all know the story; it’s been handed down as a parable for countless generations. It goes like this: a boy is tasked with tending a flock of sheep for a village. Lonely, he cries “Wolf!” for attention. After a few rounds of this the villagers stop responding. Eventually a real wolf arrives and eats the boy and all the sheep. Moral of the story: don’t tell lies, because when it’s real nobody will believe you.

Here’s an alternate perspective on that story: bored by false positives, the villagers stopped investigating possible failures and, as a result, lost a young boy and all their sheep. If you’re a developer, think of this as letting a bunch of warnings get past you in your code. One day one of those warnings will kill your application dead and you’ll say to yourself, “I really should’ve paid more attention to all that screaming”.

With regard to the security issues at hand: I’m with Gruber and believe the current issue is overblown, and, yes, there’s been a constant refrain of fear mongering about this kind of thing. That said, every new potential threat should be taken seriously and steps should be devised to counter it. There was a new kit released recently that’ll make creating Mac malware easier than ever. The holes it exploits should be considered grave and addressed as quickly as possible. In the story of the Boy That Cried Wolf, the village ultimately paid the price for not being vigilant. The interpretation has always been to take it as a parable about improving personal behaviour, but what I enjoy most about that tale is that it works both ways. There are two parties at fault: the attention seeker, and those who took the cognitive shortcut of disregarding what the attention seeker was saying because they’d been wrong in the past.

On the whole, despite being neither very communicative nor responsive, I don’t believe Apple disregards this kind of thing. It’s my belief that one is able to write a piece titled “Wolf!” because we’re still at the stage where this kind of report gets a response. Slower, perhaps, than many would prefer, but still, it is eventually addressed.

I really liked this piece at Daring Fireball but the ambiguity of the statement troubled me. Yes, these people are likely calling “Wolf!” without it really being a threat — but how much have the previous threats been diminished by the townsfolk showing up with torches when called upon?

My argument, in a nutshell, is this: as a customer, you’re more than likely OK to ignore these dire prognostications; as one of the people tending the sheep, this kind of thing should be on your mind and something you aim to address quickly.

Crying “Wolf!” too much bites both ways — the crier ends up looking like an idiot until they’re right and then the whole village loses out.

Something, something, broken clock, huzzah.