Interactive software needs users to guide it through a process. But many of those steps have been, or can be, automated, and the promise of machine learning is to automate the steps that remain manual. How should a software architect find the limits of automation and the right role for people in a system?

In the early days of interactive word processing, the user had to do it all — typing, layout and checking. Now, spelling and grammar checking are standard, and automatic layout according to a template is routine. What next? Will natural language generators absorb our ideas and data, and then generate a suitable narrative? It’s been done for weather forecasts, and NLG technology is anticipated to provide a text narrative alongside the widely used ‘dashboards’ of analytics systems.

The point I want to make is that the role of the user in every interactive system will change. If the software you work on is interactive, you need a vision for opportunities to add automation that makes your users more productive. You also need to remember that machine learning can enable automation without the developers ever having to understand the task.

No understanding?

Consider old-school data entry — yes, transferring the information from handwritten forms into a computer system. Unlike an optical scanner, a human will look at a blurred handwritten form at different angles, squint at it, hold it closer to the light, and perhaps ask a colleague for their opinion. More than that, humans understand context.

Machine learning theory says you can add a learning layer. It would capture the scanned input images and build correlations between those images and the outputs generated by the data entry clerks. When the correlations are good enough, the learning layer becomes an automation layer, generating outputs directly from new images using everything it has learned. Hey presto — automated data entry, and all the task know-how came from existing users doing their assigned jobs.
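As a toy illustration of that learning-layer idea, here is a minimal sketch in pure Python. Everything in it is hypothetical: the feature vectors standing in for scanned images, the clerk-keyed labels, and the nearest-neighbour `predict` helper, which is just one simple way to correlate new inputs with past human outputs.

```python
# Hypothetical sketch: a nearest-neighbour "learning layer" trained on
# clerk output. Assumes each scanned form field has already been reduced
# to a small numeric feature vector.
from collections import Counter
import math

def predict(train, features, k=3):
    """Correlate a new scan's features with clerk-labelled examples:
    find the k closest past scans and return their majority label."""
    nearest = sorted(train, key=lambda ex: math.dist(ex[0], features))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Training data harvested from clerks doing their normal job:
# (feature vector of the scanned field, value the clerk keyed in).
clerk_output = [
    ((0.10, 0.90), "A"), ((0.20, 0.80), "A"), ((0.15, 0.85), "A"),
    ((0.90, 0.10), "B"), ((0.80, 0.20), "B"), ((0.85, 0.15), "B"),
]

# Once accuracy on held-out forms is good enough, the same function
# becomes the automation layer, applied to scans no clerk has seen.
print(predict(clerk_output, (0.12, 0.88)))  # prints "A"
print(predict(clerk_output, (0.82, 0.18)))  # prints "B"
```

The point of the sketch is the workflow, not the algorithm: the training data is a by-product of the existing manual process, so the automation layer is built without anyone writing down rules for the task.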

That’s the theory. We all know the limitations. I have never heard of machine learning being the whole solution to automation of a manual data entry task, and I’m not trying to promote this idea. But this approach transformed software for language translation, and it also re-opens the question of whether every user action is needed in interactive software.

Interactive means now

Engineering software is fertile ground for interactive software — dynamic images, context-aware menus, rapid click-response sequences to create and manipulate design data. Except in simulation. The numerical calculations behind structural or fluid flow simulations take too long.

Until now. This year, Ansys demonstrated its ‘Discovery Live’ software, which uses the parallel processing power of graphics processing units (GPUs) to deliver simulation results as part of the interactive response to design changes.

This will trigger change in the way product development teams work.

Interactive simulation will make simulation part of the up-front process, improving early-stage design choices and eliminating (or, more likely, reducing) the need for a later, longer simulation cycle that covers only a selected subset of the potential choices.

The right role for people

Smartphones have led the way, finding shortcuts and convenient interactive sequences. Now many systems make similar assumptions about user intent and make most option selections unnecessary.

Though important, these are quite low-level, detailed design and optimization choices. There are also bigger-picture choices.

The examples above point to three types of bigger choice: direct task automation (often implemented as new functions); machine-learning based automation (likely to capture and duplicate an existing way of working); and moving existing ‘batch’ software into the interactive sequence (perhaps enabled by special hardware, cloud capacity or data access).

To make these choices, a development team needs to be able to see how their interactive system fits into its environment of other systems and business processes. The important wall to break down is the assumption that users will continue to do the same things in the same ways.

About Peter Thorne

Peter Thorne is a director at the research analyst and consulting firm Cambashi.