As HTML5 becomes more popular, the misinformation surrounding this new standard grows. It has become a catchall phrase for the mobile Web, and its features and capabilities are widely misunderstood.

The problem: Everyone wants HTML5, but they’re not quite sure what it is. Some view it as the answer to mobile apps. Others think they need to convert their applications to it.

How can you separate the myths from reality?

Working at a software company, we encounter HTML5 misconceptions nearly every day. So let’s examine the most common of them and explain why they’re false. This article should paint a clearer picture of HTML5 and give you a better understanding of how it can improve your Web applications. But before we dive into the myths, let’s quickly review HTML5’s history to see what it is and where it came from.

A brief history of HTML5
After publishing HTML 4.0 in 1997, the World Wide Web Consortium (W3C) discontinued work on HTML, believing the language would be difficult to extend further. Instead, in 1999, the organization began work on XHTML, a reformulation of HTML in XML.

Unhappy with the direction HTML was taking, a group of developers from Opera and Mozilla proposed a different vision for the Web at a W3C workshop in June 2004. They argued that Web applications were not adequately served by existing technologies, and they outlined seven design principles they viewed as critical to the future of the Web:

1. Backward compatibility and a clear migration path: Web applications should be based on technologies that developers are familiar with. Any solution that does not offer a clear migration path, or requires the use of binary plug-ins, will likely be unsuccessful.

2. Well-defined error handling: Error handling must be clear and consistent across different browsers and user agents.