
Native vs HTML5, does it really matter? Just a rant…

29 May 2013

I hear/read a lot of things about native apps and HTML5-based apps: that native is best, that HTML5 is slow, that HTML5 is faster to develop but delivers substandard apps. It always ends up reduced to the argument that native is better because… it is native!

I’ve been developing applications for quite some time now, from desktop to systems integration, from webpages to web applications, and from native to “not so native” apps, and in all that time I’ve never actually seen a client or a user complain that the applications were not “native”. OK, let’s step back a bit: before thinking about native applications, let’s try a different type of comparison.

Do our users know the difference between an application made in Java and an application made in C++ that runs on MS Windows? And do they have any idea what makes an application made in C++ with, say, GCC, different from one made in Visual C++ on top of the .NET platform? I don’t think so. I’ve never really seen a user complain that my applications were bad because I used Java instead of C++ (mind you, no user of my applications ever complained :D ).

Should it matter if I develop an iOS application using HTML5, JavaScript and CSS3 instead of Objective-C? As long as I’m able to provide the proper user experience, the same experience the user expects to find on the platform (in this case iOS), with either technology, it doesn’t really matter if the app is a bunch of HTML and JS files wrapped in a zip file (or an IPA, if you prefer).
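To make that “HTML and JS wrapped in an IPA” idea concrete, here is a minimal sketch of what a hybrid shell boils down to: a native view controller whose whole UI is a web view loading files bundled inside the app. This is an illustration in modern Swift/WKWebView, not what PhoneGap/Cordova literally ships (Cordova adds a JS bridge to device APIs, and back in 2013 it used UIWebView and Objective-C); the "www/index.html" path is a hypothetical bundled resource.

```swift
import UIKit
import WebKit

// Minimal hybrid-app shell: the entire UI is a web view rendering
// HTML/JS/CSS files packaged inside the app bundle.
class HybridViewController: UIViewController {
    private var webView: WKWebView!

    override func loadView() {
        webView = WKWebView(frame: .zero)
        view = webView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Hypothetical layout: the web assets live in a "www" folder in the bundle.
        if let indexURL = Bundle.main.url(forResource: "index",
                                          withExtension: "html",
                                          subdirectory: "www") {
            webView.loadFileURL(indexURL,
                                allowingReadAccessTo: indexURL.deletingLastPathComponent())
        }
    }
}
```

From the user’s point of view there is nothing to give this away as “not native”: it is installed from the store, launches from the home screen, and lives inside a signed IPA like any other app.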

Being “native” means only one thing to me: that the user doesn’t find the application different from what the supporting platform offers. This means that an application for iOS will look like what an application for iOS should look like, and an application for Android will look like… well, like the usual mess they look like (just kidding, I do like Android applications, but they were messy looking before 4.0). Nowhere in my definition (and I should point out it is my definition) does it say anything about the programming language or the technologies used to develop the app. This is what matters to me: the user experience.

Yes, there are valid and important reasons to choose Objective-C or Java, and there are equally valid reasons to choose HTML5 and JavaScript. But, as with any other application, it should be a choice based on the requirements, not on the idea that “native” (as in written in Objective-C/Java) is better just because it is “native”.

Well, I have a lot to say about HTML5-based applications (PhoneGap/Cordova is great), and about apps done in Objective-C or Java, but today I just wanted to rant about this topic.