I hear and read a lot about native apps versus HTML5-based apps: that native is best, that HTML5 is slow, that HTML5 is faster to develop but produces substandard apps. It always ends up reduced to the argument that native is better because… it is native!
I’ve been developing applications for quite some time now, from desktop software to systems integration, from webpages to web applications, and from native to “not so native” apps, and in all that time I’ve never actually seen a client or a user complain that an application wasn’t “native”. OK, let’s step back a bit; before thinking about native applications, let’s try a different type of comparison.
Do our users know the difference between an application made in Java and one made in C++ that runs on MS Windows? And do they have any idea what makes an application made in C++ with, say, the GCC compiler collection different from one made in Visual C++ running on top of the .NET platform? I don’t think so. I’ve never seen a user complain that my applications were bad because I used Java instead of C++ (mind you, no user of my applications ever complained :D ).
Being “native” means only one thing to me: that the user doesn’t find the application different from what the supporting platform offers. This means that an application for iOS will look like what an application for iOS should look like, and an application for Android will look like… well, like the usual mess they look like (just kidding, I do like Android applications, but they were messy looking before 4.0). Nowhere in my definition (and I should point out it is my definition) does it say anything about the programming language or the technologies used to develop the app. This is what matters to me: the user experience.
Well, I have a lot to say about HTML5-based applications (PhoneGap/Cordova is great), and about apps written in Objective-C or Java, but today I just wanted to rant about this topic.