
Place Your Bets

Matt Neuburg had a great piece that drew a lot of attention recently titled Lion Is a Quitter. Neuburg argues persuasively that the auto-termination behaviour of Mac OS X 10.7 Lion is inconsistent, unreliable and poorly integrated with the user interface and, worse, runs counter to user expectations. I agree: the situation as it stands is less than desirable and in a few ways simply broken. Neuburg is right.

It doesn’t matter. Many complained, correctly, that window resizing was unreasonably slow on early versions of Mac OS X. Auto-termination in 10.7 is similarly clumsy: it mostly works, but usability issues remain that taint the experience.

The argument for manually managing which applications are running and which aren’t comes down to the conceit that the operator knows what’s best for the machine. The people who feel this way will say they understand how various processes are affecting their CPU, RAM and even page-in & page-out usage. They believe they’re capable of knowing which processes are most negatively affecting their machines and terminating them as required.

If this doesn’t sound familiar to you then I suggest you Google for “config.sys”, “autoexec.bat” and “how the fuck did Mac OS ever work with all those crazy extensions?”. The answer is not, and will increasingly never be, to give the user more control over the processes that are active on a given device. Consider all the factors involved in a modern user-facing process. Can you know which ones are putting the most pressure on the system? You may guess, based upon your last assessment of hard drive, RAM and GPU engagement, but you can’t be sure. There’s one thing I’m sure you don’t know — which process has consumed the most power?

That’s where the best bets lie — power consumption. CPU, GPU, RAM and internal bandwidth all keep increasing. Battery life isn’t keeping pace. Given where the technology in today’s computers is headed, the best bet is to make plans to preserve battery life, bank on video and CPU memory and bandwidth, and bet that persistent storage read and write speeds will continue to improve.

The vast majority of the products Apple sells are not tethered to a power supply except when they need to recharge. Power consumption is the key issue going forward. Applications that can be killed as needed, and that includes App Store granted entitlements, can be better governed, better understood and better planned for by the operating system.

The next battlefield will be the ability to leverage battery life. Apple is betting on that now just as it bet on GPU memory and bandwidth back when it first ran with Quartz.

If you’re not yet thinking of power conservation as the next big thing in personal technology then I implore you to start. Yes, patents are murderous, mind-numbing and progress-dulling. But they’re also old news. Batteries, power usage and ways to balance them: that’s what’ll count.

Oh, and the patent that covers that will be worth a fortune too.

Progressive Refinement

Latency … is the worst. At the root of our cognitive abilities is our appreciation for cause and effect. The tighter the loop between an action we initiate and the effect we precipitate, the better we are able to correlate them. In many ways this serves as a cognitive shortcut: the shorter the time between input and output, the less room there is for interference from a third party. An instantaneous failure implies that we’ve misjudged the situation as it is; a delayed result suggests that the environment in which we’ve responded may have changed. Every millisecond of delay forces us to ask ourselves, “Have I done something wrong?”

As people who make things that others interact with we are responsible for reducing this potential window of fear and replacing it with a level of comfort our users can embrace. Our goal is to have effect follow cause as immediately and as responsively as possible. This may be old hat and a restating of a previous piece but it’s still a worthwhile point to make.

Using a rough approximation of a visual during animation is a subset of what I’ll call progressive refinement. When animating we’re trying to accomplish two things. First, and if you’ve elected to employ an animation this should be foremost, we’re communicating to the user a change of state or modality. Secondarily, we’re using the time we’ve spent entertaining our user to figure out what we actually want to show them. The state of our application has changed and we’re trying to communicate that to our user progressively. When zooming in on some text in a Safari column, we’re shown a lower-resolution image scaled up, an interpolation (a bilinear filter, from the looks of it) of the original rendering, as a progressive refinement of what we can expect the final result to be. After the zoom, the letters are basically the same size and in the same positions, and once the final rendering is done the view is replaced with a higher-resolution version of the page that correlates well to the preview that’s been presented.
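The cheap-preview half of that trick is easy to sketch outside of Cocoa. Here’s a toy illustration in Python of stretching a small bitmap up with bilinear interpolation, the way a snapshot gets stretched while the real re-render is still cooking. The function and data here are illustrative only, not anything lifted from WebKit:

```python
# A toy "cheap preview": upscale a small grayscale bitmap with bilinear
# interpolation. The result is blurry but costs almost nothing to produce,
# which buys time for the real rendering to finish.

def bilinear_upscale(src, new_w, new_h):
    """Upscale a 2D list of pixel values using bilinear interpolation."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        # Map the destination pixel back into source coordinates.
        fy = y * (src_h - 1) / (new_h - 1) if new_h > 1 else 0
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, src_h - 1)
        row = []
        for x in range(new_w):
            fx = x * (src_w - 1) / (new_w - 1) if new_w > 1 else 0
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, src_w - 1)
            # Blend the four neighbouring source pixels.
            top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
            bot = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out

# A 2x2 bitmap stretched to 3x3: the new centre pixel is the average
# of all four originals.
preview = bilinear_upscale([[0, 100], [100, 200]], 3, 3)
```

The point isn’t the math; it’s that an approximation this crude is nearly free, and free is what you want while the user is mid-gesture.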

We take progressive refinement for granted in iOS apps. When we scroll in Twitterrific or Tweetie the scrolling is instantaneous. Filling in the user avatars can come later, or even highlighting the links in the Tweets. These are considered secondary refinements upon what is most important to the user: when they scroll the list it must be responsive. This is drawn from the iPod and other media-listing apps, but is especially evident when using Home Sharing — a user will accept a slight delay in the progressive refinement of their view of the data but they’ll be frustrated by a delay in delivering that data to them.

Consider human vision. We’re accustomed to quickly changing our focus or entering a lightened or darkened room. We’re adapted to the idea that details may well present themselves gradually. We don’t find this onerous, we find it natural. If you’re writing software that interacts with humans this is worth keeping in mind — present an outline as quickly and as responsively as possible and from there refine your presentation of the data. Present buttons as disabled and then enable them once you know they’re applicable. Present a list of posts quickly but fill in the avatars individually as you’ve finished fetching them. Present a body of text and then present the result of the Data Detectors that have discovered that “Tomorrow 1pm” is a meaningful phrase. Progressive refinement is a hidden HIG directive that all of you should be aware of.
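The list-of-posts case above can be boiled down to a pattern: draw the outline now, queue the expensive details, apply each refinement as it lands. Here’s a minimal, hypothetical sketch in Python — the `Post` and `Timeline` names are mine, not any real API:

```python
# Progressive refinement: the first draw is instant and uses placeholders;
# expensive details are queued and applied one at a time afterwards.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    avatar: str = "placeholder.png"   # rough version, shown immediately
    links_highlighted: bool = False   # refined later

class Timeline:
    def __init__(self, posts):
        self.posts = posts
        # Expensive work is deferred; it's never allowed to block first_draw.
        self.pending = [self._fetch_avatars, self._highlight_links]

    def first_draw(self):
        # Immediacy: render the outline right away, placeholders and all.
        return [p.text for p in self.posts]

    def refine_step(self):
        # Accuracy, then fidelity: apply one queued refinement at a time.
        if self.pending:
            self.pending.pop(0)(self)

    @staticmethod
    def _fetch_avatars(timeline):
        for p in timeline.posts:
            p.avatar = "real_avatar.png"

    @staticmethod
    def _highlight_links(timeline):
        for p in timeline.posts:
            p.links_highlighted = True

tl = Timeline([Post("Tomorrow 1pm works for me")])
outline = tl.first_draw()   # instant; avatars are still placeholders
tl.refine_step()            # avatars arrive
tl.refine_step()            # links get highlighted
```

In a real app the pending work would run on a background queue and call back to the main thread, but the shape is the same: nothing in the queue is ever allowed to delay the first draw.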

The hierarchy of information interaction is this: Immediacy, Accuracy, Fidelity. When interacting with information you want it to be fast, then you want it to be accurate, and then you want it to be exactly what was originally expressed.

It’s up to you to decide how the information you’re presenting to your users fits into these categories. If you think stalling a scroll gesture to pull in album artwork or user avatars is the right thing to do though — you’re very wrong and everything that’s good on iOS proves that. Consider what you’re presenting to your customer, consider the costs involved in providing each detail, optimize your interaction to provide the most value in the least amount of time. Then, fill in the blanks.

There’s no magic in software, there’s only ever forethought.

Luckily thinking is free but, hey, feel free to charge your clients for the time you’ve spent reading this.

Don’t Be A Dick: Compiled Flash and You.

Appeal to Authority

I post here infrequently. My intent with each post has been to show some “clever” abuse of Cocoa or the Objective-C runtime APIs. I don’t want to portray what is presented here as the canonical or “best” way to approach problems – my goal has been to offer readers who are amply qualified to disagree with what’s written something that I feel is worth their time to think about. I hold the Cocoa developer community, and particularly those I associate closely with, in high regard. Coming up with novel ideas and finding the time to adequately document them is, by virtue of the caliber of the audience I hope to address, a difficult task which doesn’t fit into a regular publishing schedule. If you’d like to read something smart at least once a week please consider reading my friend mikeash. Don’t bother clicking that because if you’re here and not already reading everything Mike has to say then there’s something wrong with you that one hyperlink will never solve.

With that preamble out of the way I’d like to assert that I am an avowed fan of the Cocoa frameworks and a believer in the language middle-ground that is Objective-C. The argument that follows is a novel addition to this infrequent spamming of ideas I call a Web-Log in that it is not a tightly focused technical piece. Rather, it is an opinion piece – an appeal to authority.

And to whose authority do I appeal? Mine. Because I know what the fuck I’m talking about.

Non-Native Toolchains vs iPhone OS

May the best man win.

Beyond the well-known issues with Flash, letting arbitrary interpreted code run on the iPhone is problematic. Primarily, it precludes Apple from vetting the code executed on the device after they’ve approved the product for sale on the AppStore. One could argue, and it’s a decent argument, that they should not exert this level of control over the apps that run on the device they sell you but, given that they’ve decided to do so, permitting interpreted code to be executed would be a loophole in this policy. What Adobe announced today, however, is not a Flash runtime – it is a compiler for Flash code that produces first-class iPhone (ARM, specifically) native code.

Jeff LaMarche has examined the Adobe generated .ipa files and you can learn a little more by reading his Tweets starting here. In full disclosure Jeff has bought me more beers at WWDC than I feel I deserve but I trust his analysis of these files nonetheless. Jeff is a smart fellow and uncovers a few clever things in his dissection of these Flash based iPhone apps.

My take is this: if you produce a binary that honours the iPhone Developer contract (uses only public APIs, is not a Flash interpreter, applies the appropriate code signing, etc) then you have exactly as much right to be on the AppStore as I do. If you compiled some Flash code or even used Mono (.NET) to write your app then so be it. If the final product is playing by the same rules I am then that’s good. Perhaps your tool is better for the job you selected it for.

“Be Thou Not Afraid”

(During my favourite x-files episode an “alien” says, “Be thou not afraid, Rocky”. I tried finding a link to that but this link came up first. Dig the horizontally scrolling text and take time to read the poem too. It won’t hurt, even though it does mention God. And the MIDI. I just noticed the MIDI. Click that link just for the time warp effect).

Let’s be frank here – fuck the tools. If you’re as deep a fan of the Cocoa tool-chain as I am then you likely came to them as I did – after years of dealing with the drastically inferior. Do I believe Cocoa is still the best tool? Yes, I do. But let’s not pretend that it’s the only tool. Some crazy people may prefer other tools, and we may well think they’re insane for doing so. The proof, however, is in the pudding. And it’s the pudding that our customers buy. I’m in love with my oven and at this point I doubt I’ll ever change it but I have no illusions that fashion won’t pass me by. If Adobe, or anyone else, can produce tools that provide a more compelling application on the iPhone then good for them.

Ridiculing their efforts, or others’, is ultimately self-defeating. If you love software as I love software you should think twice about being so dismissive of alternative approaches.

Do I think the “Ahead Of Time” compiled Adobe Flash will take over the AppStore? No, I don’t. And I know that because I can write better software than that.

But one day I’ll be wrong. And if there’s one thing I hate more than being wrong it is having dismissed as trivial that which has defeated me.

“I’ve Been Wrong Before”

($0.99. Go buy Dusty Springfield’s I’ve Been Wrong Before. And if you don’t after having heard the 30 second demo then quit bitching about Alyssa. She is carrying my child after all…)

I like to believe I’m a clever fella and here’s my secret to being clever – I am always wrong. Always.

Programming is an exercise in overcoming how wrong you’ve been in the past. At first you’ll overcome the syntax errors, then you’ll overcome the structural errors, and then you’ll come to align your code with the standards of a greater community and you’ll feel safe and like you’ve made it. You haven’t – you’re still wrong because you’re always wrong. You are playing a game you cannot win. And let’s face it – if it was a game you could win you’d not be playing at all.

If you’ve written code then you’ve looked back at your work and known it could have been done better. Think bigger. Think beyond the micro improvements that could be made within the domain we understand best and consider the scope of our experience. If you’ve been around this clock as I have then you’ll recognize a good idea comes every five minutes and one that sticks every fifteen. Let’s not pretend that we know what time it is because we know what we saw last time we checked our watches.

At the top I claimed to be an authority and that I knew what the fuck I was talking about. That’s still very true – If you think you’re an authority you’re wrong. Take it from me, I’m the authority on authority and if you think you know better then eventually you’ll be in for a really big shock.

Be wrong as often as possible – It’s the only way you’ll ever be right.

Love, Guy