Many people are debating which platform is better to develop for: Android or iOS. You can look at the current market share among smartphone owners, at the type of people who buy devices on one platform versus the other, or at how easy it is to develop for one platform compared to the other.
Each of these points is interesting in that there is no definitive answer. In fact, there is probably no one-size-fits-all solution. The only way there could be one is if there were only one platform to choose from. Clearly, that isn't going to happen, so there will always be a division among developers.
The latest comScore numbers show that there are more iOS owners than Android owners, so clearly it is better to go with iOS. But at the same time, Android apps can be written in Java, a language almost all programmers know, as opposed to Objective-C, so it is easier to write for Android. On the flip side of that argument is fragmentation, which plagues Google but which Apple doesn't have a problem with. Then there is the barrier to entry: developing for Apple means you have to buy a Mac, while with Android you can develop on any platform.
Each of these issues is important, no doubt. Is there an answer to every concern? No, because the moment Android stops being fragmented, its growth stops, and the same goes for Apple the moment it opens up its development process. There are two opposing platforms, and developers have to choose.
If there is no easy answer, then how can developers choose? By looking to the future of the platform.
Let's start with the development process Apple established when it released its initial SDK. For 95% of applications, the code you wrote back then for your first app will run with little or no change on the devices released since. Even devices with a radically different screen size, such as the original iPad and the iPad 2, run those apps as they were originally designed. There have been no radical changes to screen resolution, and no problems making pixels match up across devices and generations. That is all in the past, though, and we should be looking to the future. Is there any reason why Apple would want to change this process? I see none. It would make things harder for developers, and, more importantly to Apple, applications that used to work would stop working, which is bad for users. In the near future, barring any monumental changes to the iOS platform, Apple will continue to support older applications, and the applications being created right now will work on the devices still in Apple's pipeline.
The same isn't true for Android. I don't know of a study with definitive numbers, but based on purely anecdotal evidence, applications written for first-generation Android devices don't work as well on current-generation devices. Every time a new device is released by a manufacturer and carrier that aren't worried about cross-device compatibility, a developer is forced to change his application dramatically so that it doesn't crash and so that it looks the same as it did on older devices. Why is this? Consider a hypothetical. Motorola releases the Droid Super Awesome, the greatest Android device ever, for only $100 unlocked. A developer clearly wants his application to work on this device, so while writing the initial version he makes sure it does. The application works great, the developer is happy, Motorola and Google can advertise the application, and everyone involved makes money. Six months later, HTC comes out with the Revolution X. It has double the specs of the Droid Super Awesome, costs the same, and is now the latest and greatest.
If this were iOS, the developer would quickly check that the application still works (which it more than likely would) and go about his business of creating his next application. Sadly, this is Android. He tests it on the Revolution X, and the application doesn't look right: the elements are misaligned, and it crashes. Instead of having time to work on his next killer application, the developer has to spend that time making his current one work on the new device, which runs the same firmware as the last one, because these new consumers expect the application to work on all Android devices.
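What that extra work often looks like in practice is per-device special-casing. The sketch below is purely illustrative, using the made-up device names from the hypothetical above (on real Android you would read the model string from `android.os.Build.MODEL`); it shows the kind of lookup-table patching that accumulates as each new handset needs its own layout tweak.

```java
import java.util.Map;

public class LayoutTweaks {
    // Hypothetical per-device padding overrides, in pixels. Each new
    // incompatible handset tends to add another entry to a table like this.
    static final Map<String, Integer> PADDING_BY_MODEL = Map.of(
            "Droid Super Awesome", 8,
            "Revolution X", 12);

    // Return the override for a known model, or a default for everything else.
    static int paddingFor(String model) {
        return PADDING_BY_MODEL.getOrDefault(model, 10);
    }

    public static void main(String[] args) {
        System.out.println(paddingFor("Revolution X"));  // prints 12
        System.out.println(paddingFor("Some New Phone")); // prints 10
    }
}
```

The table grows with every device launch, and every entry is code the iOS counterpart of this developer never has to write.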
This is the problem application developers face. With iOS, you can look to the future without fearing that your application will require major changes to work on the next major device. On Android, not so much. This might not seem like a big deal if you have one team that only does Android and another that only does iOS, but it is. Most app developers don't have the disposable income to hire as many programmers as they need. And even those who can afford separate Android and iOS teams still face an uphill battle: over time, the iOS version will get features first, because the iOS developers can focus on the next release while the Android team has to worry about the next release *and* all previous versions (and *all* devices).
It comes down to a battle over the future. Maybe Google and Andy Rubin will come around and make that future look a lot more certain, but they don't have the track record. Apple does, and doesn't need to do anything extra to prove it. This isn't a battle over what Google and Apple might do to put developers at ease about the future of their platforms, though. Developers are creating applications right now, based on market reality, not on what might happen. Looked at that way, it is clear that Apple and iOS have a much steadier future than Google and Android.