Apple's ban on intermediate platforms, and what this means for web apps

Dear web developers hoping to build apps for the iPhone: we're fucked. But Apple is shooting itself in the foot.

Some background

There's a big fuss right now because as part of the iPhone OS 4.0 release, Apple has explicitly banned the use of intermediate platforms to create iPhone apps (and hence presumably iPad apps, since they run the same operating system).

Their motivations for doing so are the subject of debate. The supremely well-informed John Gruber of Daring Fireball thinks Apple is doing it to lock in the iPhone as the de facto standard for mobile development, in the same way that Microsoft managed to get a lock on the PC market despite the many flaws of Windows -- by attracting a critical mass of developers, and hence apps, and hence users, and hence developers, in a virtuous, monopoly-creating feedback loop.

This interpretation has been tacitly acknowledged by Steve Jobs himself. However, Jobs placed the emphasis on another aspect of Gruber's post, saying:

"intermediate layers between the platform and the developer ultimately produces sub-standard apps and hinders the progress of the platform."

This spins it as a user-friendly decision rather than a ruthless business one, but there's no reason it can't be both, and one imagines that being both would suit Mr. Jobs just fine.

Whether Apple is actually positioned to dominate mobile apps a la Microsoft is a subject for another post. But right now, I think the idea that intermediate platforms are unwelcome on the iPhone raises an important question for web-native developers like me.

Does the web count as an intermediate platform?

When the iPhone first launched, Apple announced that apps would be web apps. They were supposed to be first-class citizens and were, in fact, the only sanctioned way of producing apps for the phone. There is even still a web apps directory: a neglected, poor man's App Store.

Since then, the real SDK has been introduced. It's unclear whether it was planned all along, or whether it was a strategy adopted after Apple saw the enthusiasm and creativity going into jailbreaking, which let developers run custom apps on the iPhone before that was officially allowed. Meanwhile, the APIs web developers were promised for the iPhone never materialized: location is now available, but a hundred others are not, and with the release of iPhone OS 4.0 the list of native-only capabilities has only grown.
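To be concrete about the one promise that was kept: mobile Safari does support the standard W3C Geolocation API, so a web app can at least ask where the phone is. A minimal sketch (nothing here is iPhone-specific -- which is rather the point):

    // Ask mobile Safari for the device's position via the standard
    // W3C Geolocation API -- the one promised capability that shipped.
    if (navigator.geolocation) {
        navigator.geolocation.getCurrentPosition(
            function (position) {
                console.log('lat ' + position.coords.latitude +
                            ', lon ' + position.coords.longitude);
            },
            function (error) {
                console.log('geolocation failed: ' + error.message);
            }
        );
    }

Camera, compass, contacts and the rest have no web equivalent on the iPhone; that is the list that keeps growing.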

And now the official word is that intermediate platforms are not a welcome way to make apps for the iPhone. I can't think of a more obvious and widespread intermediate platform than the browser environment, and whether you believe the motivation is a better user experience or a hard-nosed attempt to monopolize mobile development, web apps lose.

Because there's no denying it: web apps provide worse user experiences than native apps on the iPhone right now. They don't have to -- Apple could expose all the APIs via the web, and add extensions and libraries to Safari that would allow the beautiful, fine-grained UI controls currently available to native apps. In fact, Apple has already built such a library, called PastryKit. That PastryKit remains unreleased, despite being high quality and apparently ready for inclusion, is another indicator that Apple is deliberately neglecting web apps as a platform for the iPhone.
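To see why a library like PastryKit is needed at all, consider native-feeling scrolling, which mobile Safari simply doesn't offer for regions within a page. Here's a minimal sketch of the kind of trick such libraries resort to (the 'scroll-area' element id is hypothetical, and a real library adds momentum, bounce, and so on):

    // Fake native scrolling: intercept touch events and move the
    // content with a hardware-accelerated CSS transform, because
    // mobile Safari offers no smooth scrolling for page regions.
    var content = document.getElementById('scroll-area'); // hypothetical id
    var startY = 0;
    var offsetY = 0;

    content.addEventListener('touchstart', function (e) {
        startY = e.touches[0].pageY - offsetY;
    }, false);

    content.addEventListener('touchmove', function (e) {
        e.preventDefault(); // stop the whole page from panning
        offsetY = e.touches[0].pageY - startY;
        content.style.webkitTransform =
            'translate3d(0, ' + offsetY + 'px, 0)';
    }, false);

That web developers must reimplement scrolling by hand is exactly the kind of gap Apple could close, and chooses not to.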

Our one hope as web developers for building iPhone apps with full API access was web apps that get compiled down to native code (or, on clever platforms like Appcelerator Titanium, run as WebKit instances inside a customized, lightweight native shell). With this change of the rules, the future of platforms like these looks very uncertain, and the door has been slammed in our faces.
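For those who haven't seen it, this is roughly what the Titanium approach looks like from the developer's side: ordinary JavaScript driving native UI from inside a native shell (API names as of Titanium Mobile 1.x, reproduced from memory, so treat this as a sketch):

    // A JavaScript 'hello world' in the Appcelerator Titanium style:
    // the script creates and drives native UI components.
    var win = Titanium.UI.createWindow({
        backgroundColor: '#fff'
    });

    var label = Titanium.UI.createLabel({
        text: 'A web developer\'s JavaScript, running as an iPhone app',
        textAlign: 'center'
    });

    win.add(label);
    win.open();

It's precisely this kind of bridge -- web-native skills in, native app out -- that the new rules put at risk.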

The illustrious ppk thinks all iPhone apps should be web apps. I think the chances are slim, and getting slimmer all the time.

It's a damn shame. And the wrong call.

Everyone knows the largest development platform in the world isn't Windows, or Mac, or desktop or mobile: it's the web, the only platform that runs on all of those, plus nearly everywhere else. To ignore the giant and ever-growing contingent of web-native developers -- people who grew up writing apps for the web, have never written apps for anything else, and see little reason to start -- is to ignore the tide of history.

The unstoppable march of technology has taught me that what seemed like a ludicrously inefficient idea ten years ago soon becomes standard practice: running an entire IDE as a Java app, for instance, or installing each major component of my development environment in its own separate virtual machine. Computational efficiency is repeatedly sacrificed for speed of development, because computers are cheap and getting faster all the time, while developers remain expensive and oh-so-slow.

So it doesn't matter if, right now, native mobile apps are faster. That advantage is momentary. It does matter that the native experience is better, but that just means there's an opening in the market for a platform that really does treat web apps like first-class citizens -- Android, say, or perhaps Palm, if it gets acquired by somebody more capable of building out a platform.

At no point will web apps be faster than native apps. And the experience might never be quite as good. But one day it will be "good enough". The desktop hit that tipping point more than five years ago -- what's the last really exciting new desktop app you installed? In my case it was Chrome, and that was because it was a better browser. To pretend that won't ever happen on mobile devices is silly.

Once it happens, the web will win again. Attempts to lock web apps to your platform with useful but proprietary extensions will fail, as Microsoft failed with ActiveX. Developers will put up with building simpler apps because they run everywhere, really everywhere. Developers go where the users are, and the users, no matter who made their hardware or wrote their software, are always on the web.

The web will win, eventually. But in the meantime, find somebody who knows Objective-C.