Ease of use has always been central to great UI design, but what “ease of use” means, how it is implemented, and who needs to implement it all evolve over time. Some think business and consumer application UI designs necessarily differ, while others think they are converging.

“Usability has expanded to things that are not as well defined and measurable, like ‘appeal,’” said Tobias Komischke, director of user experience at Infragistics.

While it is possible to measure how much time it takes to complete a task, it is more difficult to accurately gauge users’ emotional responses to a UI’s color scheme, aesthetics and the perceived value of the application.

“Rich interactivity goes beyond look,” said Komischke. “The feel of richness is not addressed by the old philosophy, which is hard on developers because aesthetics are hard to define. You need [the] specialized skills of a visual designer and interactive designers who understand the input/output relationships.”

According to Rich Dudley, technical evangelist at ComponentOne, modern UIs feature clean layouts, smooth rounding, and simple glyphs that guide and enlighten the user. 3D effects are also gaining popularity.

“Controls and content should be logically grouped and neatly arranged, and text should be very readable,” he said.

The idea is to create applications that are easy to use, intuitive and immersive, so consumers and business users will remain engaged with them.

“Users have short attention spans, so you have to present lots of information quickly and allow them to drill down. Otherwise, they won’t pay attention,” said Julian Bucknall, CTO of Developer Express. “Long lists of numbers and text are not great.”


Microsoft and other platform providers have established UI design guidelines that can help streamline UI design and development efforts, although some find them too constraining.

“A modern application should be simple, look cool and work smart on every platform,” said David Intersimone, VP of developer relations and chief evangelist at Embarcadero Technologies. “The days of following the guidelines provided by the platform vendors are over. Some desktop applications are using HTML and CSS to render their UIs. Others are using Direct2D, Direct3D or OpenGL to render their user interfaces, instead of just settling for the common UI controls that the operating system provides.”

Native experiences vs. experiential consistency
User experience has to balance many factors, some of which seem to be at odds with each other. While users expect applications to work well on individual devices, they also expect to have a consistent experience.

“A consistent user experience could be defined by the operating system or environmental factors such as whether you are standing at home or sitting at the office,” said Daniel Jebaraj, VP at Syncfusion. “You need to make sure your application is consistent with its purpose. You also need to make sure it’s consistent with other applications on the platform such as Microsoft Office so they are easy to use and learn.”

Users get frustrated by applications and windows that defy their expectations, a detail that can get lost amid endless aesthetic possibilities and technological implementations. Todd Anglin, chief evangelist at Telerik, said users increasingly expect applications to feel native.

“There’s a huge validation with the explosion of mobile,” he said. “The idea is to [address users’ desires for] a native, interactive look and feel rather than one size fits all.”

As much as businesses would prefer to write an application once and deploy it across platforms and devices, in practice it is difficult to achieve superior user experiences on individual devices without doing some refining along the way.

“Your design needs to contemplate older legacy users. And if you’re building mobile apps you can’t just build them for the iPhone,” said Telerik’s Anglin. “Windows has a Metro UI while the iPhone has a Bubbly UI. You need to consider that because your applications have to fit into the ecosystem.”

Human and non-human factors to consider
Theories about “effective” UI design have changed over time based on what is technologically possible. For example, serif fonts like Times Roman, which are common in printed books and newspapers, are considered more readable for body copy than sans serif fonts like Helvetica. Online, screen resolution determines whether serif fonts are practical.

“Best practices change based on technology,” said Infragistics’ Komischke. “The golden rule is to use sans serif fonts because screens are incapable of rendering the serifs. But now that Kindles have super resolution, it’s possible to do.”

Accessibility—a user’s ability to understand and use an application—is an important issue, but it is sometimes ignored in favor of aesthetics, according to Anglin.

“It’s the Wild West with the cool kids doing cool things, but they don’t care about accessibility,” he said. “Popular applications fall into the trap of building client-side using CSS3, JavaScript, or HTML5, so the UI is initialized in the browser. A pure client-side UI can be problematic because it may look great in a perfect scenario, but then a searcher ends up seeing a blank page.”

ComponentOne’s Dudley agreed, adding that a great design means little if, for example, an assistive reading device is unable to interpret the screen properly.

“Only a small percentage of the population uses an assistive reading device when browsing the Web, but a huge percentage of the population has some sort of visual limitation,” he said.

Different types of color blindness exist, but the congenital red-green type, the most common form among males, means that something other than, or in addition to, color coding should be used to distinguish screen elements.
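As a rough illustration, the sketch below (hypothetical names and values) pairs each status color with a distinct glyph and text label, so elements remain distinguishable to users with red-green color blindness or on a grayscale display.

```typescript
// A minimal sketch (hypothetical names) of encoding status with more than color:
// each state pairs a hue with a distinct glyph and a text label.
type Status = "ok" | "warning" | "error";

interface StatusStyle {
  color: string;  // color alone is not sufficient...
  glyph: string;  // ...so add a distinct shape...
  label: string;  // ...and a readable text label.
}

const STATUS_STYLES: Record<Status, StatusStyle> = {
  ok:      { color: "#2e7d32", glyph: "✓", label: "OK" },
  warning: { color: "#f9a825", glyph: "▲", label: "Warning" },
  error:   { color: "#c62828", glyph: "✕", label: "Error" },
};

// Render a status as text that stays meaningful even without color.
function renderStatus(status: Status): string {
  const s = STATUS_STYLES[status];
  return `${s.glyph} ${s.label}`;
}

console.log(renderStatus("error")); // "✕ Error"
```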

How users consume information must be considered when designing a UI. It is important to understand where and how people search for information on a screen and how that can be influenced by the presentation of aesthetic elements.

Some UIs use so many saturated (pure) colors that every element visually competes for users’ attention. According to Komischke, it is better to use one or two base hues, then coordinate the color scheme. Although bright colors may work well for Web pages that are viewed for short periods of time, they are less effective for applications that are designed for use throughout the day. Bright colors are harder on the eyes, which can cause users to “burn out.”

It is also a mistake to create screens or pages that are cluttered from top to bottom with text and visual elements, because the clutter can confuse or disorient users. A better approach is to emphasize the most important elements, de-emphasize the less important ones, and avoid extraneous elements altogether.

Tables and data grids are a good example. Users care about the information contained in tables and data grids, but they are less interested in the X-Y grid lines that separate the information. The lines can serve as noise if they are visually equivalent to the information contained within them. One way to reduce the noise is to choose softer colors for infrastructural elements that work well within the environment, such as gray or light blue instead of black.

“If you know users’ eyes travel from the top left to the bottom right, then you can apply that knowledge to ensure that the most important information does not get lost on the page,” said Komischke. “The signal-to-noise [ratio] is important where signal is the intention, and noise is the infrastructure like menus and scroll bars. The question is how to best highlight what’s important.”
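As one concrete, purely illustrative way to apply that advice, the sketch below defines a hypothetical grid theme in which the data text keeps high contrast while grid lines and headers use softer grays and blues, so the infrastructure recedes behind the signal.

```typescript
// A minimal sketch (hypothetical theme shape) of turning down grid "noise":
// data text stays dark and readable, grid lines and headers are softened.
interface GridTheme {
  cellTextColor: string;   // the signal: keep it high contrast
  gridLineColor: string;   // the infrastructure: soften it
  headerBackground: string;
}

const quietTheme: GridTheme = {
  cellTextColor: "#212121",   // near-black text
  gridLineColor: "#d0d7de",   // light gray lines instead of black
  headerBackground: "#eef3f8" // pale blue header band
};

// Produce a CSS string that could be applied to a plain HTML table.
function gridCss(theme: GridTheme): string {
  return [
    `td { color: ${theme.cellTextColor}; border: 1px solid ${theme.gridLineColor}; }`,
    `th { background: ${theme.headerBackground}; border: 1px solid ${theme.gridLineColor}; }`,
  ].join("\n");
}

console.log(gridCss(quietTheme));
```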

Determining what is most important from the users’ point of view requires user involvement, prototyping and testing. DevExpress’ Bucknall said that, in the interest of designing a better UI faster, his customers are making a point of getting their UIs in front of users sooner than in the past.


“Don’t try to create the greatest UI off the bat,” he said. “Show [users] something and then refine it using a combination of prototyping and usability testing.”

Those intimately familiar with UI design also consider Fitts’ Law, which predicts the time it takes to hit a target area based on the size of the target and the distance between the target and the starting point. The easier it is to hit the target, the better the user experience.

“If I’m sitting in front of a computer screen filling out a form that requests my name and e-mail address, the Submit button should be just below [the data fields] because it’s easy to hit the target,” said Komischke.
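For readers who want the formula itself, the sketch below computes movement time using the common Shannon formulation of Fitts’ Law. The constants a and b are device- and user-dependent; the values shown are illustrative placeholders, not measured data.

```typescript
// Fitts' Law in its Shannon formulation:
// movement time MT = a + b * log2(distance / width + 1).
function fittsMovementTimeMs(
  distancePx: number, // distance from the starting point to the target center
  widthPx: number,    // size of the target along the axis of motion
  a = 100,            // intercept in ms (assumed placeholder)
  b = 150             // slope in ms per bit (assumed placeholder)
): number {
  const indexOfDifficulty = Math.log2(distancePx / widthPx + 1); // in bits
  return a + b * indexOfDifficulty;
}

// A Submit button placed directly under the form is a big, nearby target...
console.log(fittsMovementTimeMs(80, 120).toFixed(0));  // fast to hit
// ...while a small button far across the screen takes noticeably longer.
console.log(fittsMovementTimeMs(900, 24).toFixed(0));
```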

To speed up and ease users’ understanding of complex information, data visualization controls have become popular in the form of pie, bar, radar and other chart types. According to Komischke, pie charts are the de facto standard for business intelligence applications even though research shows other chart types communicate the data more clearly. Infragistics and some other control vendors let users toggle between pie charts and bar charts for exactly that reason.

“You can design a UI based on what you think will work, but if you use research, you can save time,” he said.

Understanding the target audience is an important part of research: who comprises it, what its members are trying to do, and why their expectations are what they are.

“A user profile needs to include the business process if it’s a business application. If it is a consumer application, you need to understand what the consumer is trying to accomplish,” said Syncfusion’s Jebaraj.

“You also have to consider how the user expects to interact with information, their prior experience [accomplishing the same tasks], the experience the device delivers, and the target deployment environment.”

The effect of the user experience movement
The recent focus on user experience has caused designers and developers to think and behave differently. While UIs have always reflected some combination of form and function, the approach is much more user-centric than it once was.

“User experience made designers and developers rethink how people interact with applications rather than just how the applications look,” said ComponentOne’s Dudley. “You could have a great looking Web page, but if it’s page one of 17 in a wizard, the user experience isn’t practical.”

Users are easily frustrated by confusing UI designs, which are tempting to create given the wide availability of themes, buttons, menus and other elements that are easy to implement. A common pitfall is applying the newest and seemingly coolest tools and tricks without stopping to think about the effect the changes may have on user experience and process efficiency. A perfect historical example is Flash abuse in the form of spinning logos that actually degraded the user experience, according to Telerik’s Anglin.

“It’s a mistake to focus on discrete steps or details rather than the larger process, because you can overcomplicate the UI,” said Jebaraj. “You also need to make sure that your applications are not locking up because of disk and network access problems.”
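A minimal sketch of what “not locking up” can look like in a browser-based UI appears below; it assumes a hypothetical /api/orders endpoint and status element, and simply runs the request asynchronously with a timeout so a slow connection degrades into an error message rather than a frozen screen.

```typescript
// Keep the UI responsive during network access: the request runs
// asynchronously and is abandoned after a timeout instead of blocking.
async function loadOrders(): Promise<void> {
  const status = document.getElementById("status"); // hypothetical element
  if (status) status.textContent = "Loading orders…";

  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 5000); // give up after 5s

  try {
    const response = await fetch("/api/orders", { signal: controller.signal });
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    const orders = await response.json();
    if (status) status.textContent = `${orders.length} orders loaded`;
  } catch (err) {
    // The UI stays interactive; the user just sees that loading failed.
    if (status) status.textContent = "Could not load orders. Retry?";
  } finally {
    clearTimeout(timeout);
  }
}
```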

While some designers overemphasize aesthetics and underemphasize usability, it is also possible to overemphasize usability at the expense of aesthetics.

“Users are or should be at the center of every software and UI decision,” said Embarcadero’s Intersimone.

“User and usability testing have to go hand-in-hand with architecture and implementation decisions. User-centered design and the user experience are a must for successful software projects, unless they are server-side, service or infrastructure software projects [that require] a minimal UI, if any.”

Although the concept of usability is not new, tool limitations have made it hard to apply some of what are now considered best practices. The integration of tools like the Microsoft Visual Studio IDE and the Microsoft Expression Blend UI design tool helps designers and developers transform designs into working code more efficiently, although there is still room for improvement.

“Before, designers used to e-mail their designs to developers, but now the processes are not completely separate,” said Komischke. “Visual designers need to be more technical to generate XAML assets, and we’re now seeing more iteration between designers and developers.”

As new technologies and techniques emerge, it is easy to lose sight of best practice fundamentals in favor of pushing the limits of technology.

“As graphic processor speeds increase, designers have been coming up with fancier UIs that include things like alpha blending and gradients,” said Jebaraj. “Showing off GPU power doesn’t necessarily result in a good user experience [if it’s] just blind application of processing power.”

Communities also influence UI design by voting with their budgets and ratings. Anglin said the Internet reduced barriers to idea sharing while app stores have lowered the barriers to platform adoption. (Consider the recent and rapid growth of the Android platform, for example.)

“Consumers are emotionally driven and want to be entertained, so they’re pushing applications in that direction, which changes the flow of application design and hardware adoption,” he said. “You can’t just build something that’s OK. UIs, even in business, need to emotionally satisfy users if they’re going to be successful. Although consumer and business adoption cycles may differ, the impact on the application space means you as an application designer or developer can’t afford to be complacent. Don’t make excuses based on past experiences or personal beliefs. Otherwise, your applications will not be well adopted.”

Of course, like fashion, UI designs reflect trends that are sometimes adopted en masse, which can lead to redundancy.

“You could argue too many applications look a lot alike because they use lots of white space and contain little bits of text,” said Bucknall. “The focus on user experience has allowed users to get to the information they want more simply.”

Just don’t be too minimalist. Failing to provide users with enough information to make decisions can backfire.

“Security issues are a prime example of asking users questions when they have little information to act on,” said Jebaraj. “You can address that with trust and signed applications as opposed to prompting users to make decisions. It’s a mistake to [overly complicate] a user experience.”

A related error is to inundate users with process steps that appear to have little or no context. A visual progress indicator lets users readily identify where they are in a process and where they are going. Just be sure to save the information they have entered automatically so they do not have to reenter it.

“A sign of a bad UI is any case in which the user sails along and then the application or process crashes. Then when the user gets back to where he was, the information has disappeared,” said Bucknall. “A bad UI seriously frustrates users.”
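One way to avoid that frustration is to persist form input as the user types. The sketch below is a minimal, hypothetical example that saves a draft to the browser’s localStorage and restores it when the page loads; the form id, field names and storage key are made up for illustration.

```typescript
// Autosave form input so a crash or accidental reload does not wipe it out.
const DRAFT_KEY = "orderFormDraft"; // hypothetical storage key

function saveDraft(form: HTMLFormElement): void {
  const data: Record<string, string> = {};
  new FormData(form).forEach((value, key) => {
    data[key] = String(value);
  });
  localStorage.setItem(DRAFT_KEY, JSON.stringify(data));
}

function restoreDraft(form: HTMLFormElement): void {
  const raw = localStorage.getItem(DRAFT_KEY);
  if (!raw) return;
  const data: Record<string, string> = JSON.parse(raw);
  for (const [key, value] of Object.entries(data)) {
    const field = form.elements.namedItem(key);
    if (field instanceof HTMLInputElement || field instanceof HTMLTextAreaElement) {
      field.value = value;
    }
  }
}

const form = document.querySelector<HTMLFormElement>("#order-form"); // hypothetical id
if (form) {
  restoreDraft(form);                                      // bring back anything saved earlier
  form.addEventListener("input", () => saveDraft(form));   // save as the user types
}
```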

The state of the art
Menus are morphing into things like ribbons and buttons while static images are giving way to animations, video and 2D/3D images that can be manipulated by users. Regardless of vertical market, a general theme is ease of use and understanding.

In the healthcare market, hospitals are using mapping and imaging controls to quickly visualize hospital floors and the patients who populate them. Medical professionals are also using smartphones and tablets to monitor vital signs and patient records, which were previously available only on desktop computers.

In the financial services industry, stockbrokers can monitor trends in real time using sophisticated dashboards that are capable of tracking, drilling down into, and analyzing richer forms of information.

Organizations across industries are also extending their applications to different types of devices to accommodate internal and external user requirements. Embarcadero’s Intersimone said his company is noticing increased use of 2D and 3D vector graphics; flexible display screens that can bend around surfaces; and touch, gesture, voice and biometric UIs.

“Who doesn’t want one of those large touch-screen walls or tabletop touch surfaces like you see on CNN and in the movies like Minority Report?” he said. “Eventually the user interface will appear inside our brain, and the user interface will be our mind and body. For now, we don’t even use our feet as part of the office/desktop user interface.”

DevExpress’ Bucknall thinks the Sparrow e-mail client for the Mac is a good example of what can be done with a UI. It is capable of logging into AOL, Gmail, MobileMe and Yahoo accounts to deliver a unified, Facebook-like stream of e-mail messages.

“They’ve taken something like e-mail, which people think of as set in concrete, and not only made it better but completely different,” he said. “How to present information is a pressing question. It’s better to present lots of information in a format that can be quickly understood and provide drill-down capabilities.”

Context is also getting more attention because it helps weed out the relevant from the irrelevant in a sea of virtually endless information.

“Contextual applications show you the choices you can make based on your exact situation, never showing irrelevant options,” said Odi Kosmatos, VP of research and development at Xceed. “Wearable computers will pave the way for this more than ever.”

Richer applications mean richer experiences that respond to users and keep them engaged. While there are debates about which technologies are best for building immersive applications (Flash, HTML5 or Silverlight), there are certain common features that users have come to expect such as support for rich media, webcams and microphones.

Because Web, mobile and desktop applications are all influencing each other, developers must keep in mind how that is affecting user expectations. For example, users expect desktop applications to provide printing and saving capabilities, and they also expect the same functionality from Web applications. Similarly, mobile behaviors like gesturing are being implemented on larger-screen devices because it is considered more efficient than using a mouse.

“People get used to metaphors in daily use like search fields, back buttons, cancel buttons, and automatic save capabilities,” said Bucknall. “Web and mobile metaphors are affecting desktop applications [because] people are used to them.”
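To make the point concrete, the sketch below shows one way a Web application can offer the print and save behavior users know from the desktop, using only standard browser APIs; the report content and file name are made up for illustration.

```typescript
// Print and "save as file" from a Web page using standard browser APIs.
function printCurrentPage(): void {
  window.print(); // opens the browser's native print dialog
}

function saveTextAsFile(text: string, fileName: string): void {
  const blob = new Blob([text], { type: "text/plain" });
  const url = URL.createObjectURL(blob);

  const link = document.createElement("a");
  link.href = url;
  link.download = fileName;  // triggers a download instead of navigation
  link.click();

  URL.revokeObjectURL(url);  // release the temporary object URL
}

// Example: export a report the user is viewing (hypothetical content).
saveTextAsFile("Quarterly summary…", "report.txt");
```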

The best UIs are those that deliver user experiences that yield positive emotional responses. To get there, it is important to achieve a balance between aesthetics and usability, rather than sacrificing one for the other.

Those interviewed for this article recommend considerable user involvement and testing with an emphasis on iteration, so designs meet the actual (rather than imagined) requirements of end users. Best practices and research save time, but no one understands user tastes better than the users themselves.

Seven tips for creating better UIs
A good UI design keeps the user front and center from the earliest concept stages all the way through to maintenance. Note that most of the component providers interviewed intentionally separate the initial concept from the prototype, because the two involve different levels of detail.

1) Try to understand the problem you are trying to solve from the users’ perspective.
It is better to start with a hypothesis that is based on facts rather than personal assumptions. Who are the target users or user groups? What type of application is it (e.g., an e-mail client, a trading system, etc.)? Will the users stand or sit when they use the application? What is the average period of time the application will be used? What is currently used to accomplish the same task? What is/are the target device(s)? If you’re building a business application, make a point of understanding the business process and how the application fits into it.

“From a developer’s perspective, it starts with being application users ourselves,” said ComponentOne’s Dudley. “You really wouldn’t want someone building an e-commerce website without ever having made an online purchase firsthand. You really have to put yourself in the end users’ shoes, which can be difficult.”

2) Start with a sketch, storyboard, wireframe or pictures.
It is better not to be too literal when presenting initial concepts, so stakeholders can remain focused on the concept rather than the nuances of specific functionality.

“If you use Visual Studio at this stage, people will ask you why stuff doesn’t work and why they have to wait 12 weeks for a product,” said DevExpress’ Bucknall. “You’re better with a hand-drawn sketch that gets the point across.”

3) Get feedback from stakeholders.
Listen closely to what stakeholders like and don’t like about the concept. How does it differ from what they use now? How does it differ from their expectations?

“The more a developer or designer listens to feedback, the better we become at anticipating how design should look in future projects,” said Dudley.

4) Build a prototype.
Make sure it includes the workflow or process. Which functions or steps are used most often? Which can be consolidated or eliminated? Would it improve user experience to assume defaults or ask relevant questions?

5) User-test it.
Where are users attempting to find information? Does that differ from how the information is presented? How easy is it for users to complete tasks and interact with information? What is their reaction to the aesthetic look and feel? What are they saying? What are they not saying, and what does their body language convey?

6) Iterate until you achieve a “final” implementation.
“If you want to build a great UI, you have to do it in an iterative way,” said Telerik’s Anglin. “If you’re building line-of-business applications, you need to be consistent with what users expect. And if you’re developing consumer applications, you want to entertain them and get an emotional response.”

7) Test it.
This includes UI testing, usability testing and stress testing. Stress testing is important because some applications may work well in a lab environment under “normal” circumstances, but falter when network latency, data access or connectivity issues arise.
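One simple way to approximate those conditions in a test build is to wrap network calls with artificial latency and intermittent failures. The sketch below is a hypothetical example; the delay and failure rates are purely illustrative, and the wrapper belongs in test builds only.

```typescript
// Inject artificial latency and intermittent failures into fetch calls
// so the UI can be exercised under conditions worse than the lab's.
const realFetch = window.fetch.bind(window);

function enableNetworkChaos(maxDelayMs = 3000, failureRate = 0.1): void {
  window.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
    const delay = Math.random() * maxDelayMs;
    await new Promise((resolve) => setTimeout(resolve, delay)); // simulated latency

    if (Math.random() < failureRate) {
      throw new TypeError("Simulated network failure"); // what a dropped connection looks like
    }
    return realFetch(input, init);
  };
}

// Call this in a test build only, never in production.
enableNetworkChaos();
```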