Human-computer interaction is an increasingly important part of our technological world.

Dr Beran Necat from Girne American University looks at how this hugely significant interactive junction has evolved over time and suggests ways in which it might be improved.

Human-Computer Interaction (HCI) is a multidisciplinary science drawing on cognitive psychology, behavioural psychology, psychometrics, systems engineering, computer science and ergonomics.

These disciplines contribute to the study, research, methodology and practice of designing and implementing interactive computer-based user interfaces (software, hardware, training and documentation) that can be used efficiently, safely and with satisfaction.

Not too long ago, computer usage was considered esoteric: its users were mainly technically oriented people who intentionally kept interfaces unfriendly as a form of job protection. But as computers became accessible to more users, interest in user-interface design and the need to improve the user experience grew rapidly.

More effective interfaces enable pilots to fly airplanes more safely, help doctors diagnose medical conditions more accurately, make education more effective, and let designers and artists be more creative.

A bad interface can be dangerous. A computer-controlled radiation treatment device called the Therac-25 killed several of the patients it was supposed to be treating. Its tragic flaw was traced to complex software and a poorly designed interface (Shneiderman, 2003).

A good application interface hides this complexity and allows the user to work more naturally.

Early HCI initiatives arose from the use of the cathode ray tube (CRT) and pen devices, and many of the field's techniques date from Ivan Sutherland's 1963 Sketchpad PhD thesis, which marked the beginning of computer graphics as a discipline (Myers, 1996).

Since then, work in computer graphics has produced complex algorithms and sophisticated hardware that allow the manipulation of realistic objects.

Most CAD/CAM systems use interactive graphics to manipulate solid models, a technique called direct manipulation that was popularised by Alan Kay of XEROX PARC in 1977.

Direct manipulation is a key component of the graphical user interface (GUI). An icon is a bitmap metaphor that represents a computer object. The term 'icon' was first coined by David Canfield Smith in his Stanford PhD thesis, Pygmalion.

AMBIT/G was another early system, with an interface technique consisting of icons, dynamic menus and items selected with a pointing device.

The phrase 'what you see is what you get' (WYSIWYG), which refers to content appearing on screen almost exactly as it will when finally printed, originated with the Bravo text editor and the Draw drawing program (Hewett et al., 2004).

Twenty-five years of government-funded HCI research at universities, together with contributions from corporate research laboratories at IBM, XEROX, AT&T and others, have made applications easier to use.

Early work on man-machine symbiosis, the augmentation of human intellect, and the Dynabook led to the development of the mouse, point-and-click editors, bitmap images, personal computers, windows and the desktop metaphor (Myers, 1996).

Developments in operating systems introduced techniques for interfacing input and output devices, and windowed environments gave rise to 'user interface management systems' and 'user interface toolkits' (Myers, 1996).

Multiple tiled windows were demonstrated in Engelbart's NLS demonstration in 1968. Alan Kay proposed the idea of overlapping windows in his 1969 thesis and realised it in his Smalltalk system at XEROX PARC in 1974. Tiled windows were also demonstrated by COPILOT and the EMACS editor in 1974.

The XEROX PARC Cedar Window Manager was the first major tiled window manager; it was followed in 1983 by the IBM-funded Andrew Window Manager from Carnegie Mellon University's Information Technology Center.

The XEROX Star, released in 1981, the Apple Lisa in 1982 and the Apple Macintosh in 1984 were early commercial window managers. The UNIX-based X Window System was developed at MIT in 1984 (Myers, 1996), and the Intel-based windowing platforms were IBM's OS/2, first released in 1987, and Microsoft's Windows, first released in 1985.

User interface toolkits (software that supports programming reuse, in this case for graphical user interfaces) are often referred to as libraries of widgets.

Widgets are the next evolution of the operating system device handlers once used to process user input and output in old-style, text-based terminal interactions.
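
To make the idea concrete, here is a minimal sketch (using Java's Swing toolkit purely as an illustration) of an interface assembled entirely from library widgets rather than hand-drawn graphics:

    import javax.swing.*;
    import java.awt.BorderLayout;

    public class WidgetDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Widget toolkit sketch");

                // Each element below is a ready-made widget from the
                // toolkit's library; the application composes them
                // rather than drawing them itself.
                JMenuBar menuBar = new JMenuBar();
                JMenu fileMenu = new JMenu("File");
                fileMenu.add(new JMenuItem("Open"));
                menuBar.add(fileMenu);
                frame.setJMenuBar(menuBar);

                JPanel form = new JPanel();
                form.add(new JLabel("Name:"));
                form.add(new JTextField(12));                       // entry field
                form.add(new JComboBox<>(new String[] {"A", "B"})); // combo box
                form.add(new JButton("OK"));                        // push button

                frame.add(form, BorderLayout.CENTER);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }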

Modern graphical interfaces go beyond simple text entry. They enable the direct manipulation of objects through techniques like drag-and-drop and cut-and-paste, which allows for opportunistic and incremental task planning: users can try something and see what happens.

This type of interface, called display-based interaction (Payne, 1991), provides many avenues for interactive problem solving (Hartson, 1996).
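
As a small illustration of this style of interaction (a sketch of my own, not taken from the article's sources), Swing's text widgets already support dragging text from one field into another once dragging is enabled:

    import javax.swing.*;
    import java.awt.FlowLayout;

    public class DragDropDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Direct manipulation sketch");
                frame.setLayout(new FlowLayout());

                JTextField source = new JTextField("drag this text", 12);
                JTextField target = new JTextField(12);

                // Enabling drag lets the user select text in one field and
                // drop it into the other: try something and see what happens.
                source.setDragEnabled(true);
                target.setDragEnabled(true);

                frame.add(source);
                frame.add(target);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }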

Building applications with graphical user interfaces needs a different programming approach. Event languages, like the one used in Visual Basic, allow developers to connect code segments to interface and system events.
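
The same event-wiring idea can be sketched in Java rather than Visual Basic; here a code segment is attached to a button so that it runs only when the click event fires:

    import javax.swing.*;
    import java.awt.FlowLayout;

    public class EventDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Event language sketch");
                frame.setLayout(new FlowLayout());

                JButton button = new JButton("Save");
                JLabel status = new JLabel("Waiting for input");

                // The code segment below is connected to the button's
                // click event; it runs only when that event occurs.
                button.addActionListener(e -> status.setText("Saved"));

                frame.add(button);
                frame.add(status);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }

The lambda passed to addActionListener is exactly the 'code segment' described above; nothing else in the program needs to poll for input.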

IBM's VisualAge took the graphical programming paradigm a step further, letting developers use direct manipulation as the programming style itself. GUI programming tools rely heavily on reuse and on the existence of externally available components and widgets: menus, dialogs, push buttons, entry fields, combo boxes, list boxes and many others.

Components that can be used dynamically at runtime, like Microsoft's OLE and ActiveX, Apple's OpenDoc, and Sun's JavaBeans, allow developers to assemble applications from a palette of parts, much as hardware is assembled.

Such components can also give applications very sophisticated integration capabilities, such as embedding a spreadsheet, a Word document or a Visio drawing. Application builders are advanced integrated development environments (IDEs) that let developers construct applications from a collection of parts.
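
As a hedged sketch of the conventions that make such assembly possible, the hypothetical bean below follows the JavaBeans rules (a no-argument constructor, get/set property pairs and bound-property change notification), which is what lets a builder tool discover its properties and wire it to other parts:

    import java.beans.PropertyChangeListener;
    import java.beans.PropertyChangeSupport;

    // A hypothetical component following the JavaBeans conventions,
    // so builder tools can discover its properties by reflection.
    public class TemperatureBean {
        private final PropertyChangeSupport changes = new PropertyChangeSupport(this);
        private double celsius;

        public double getCelsius() {
            return celsius;
        }

        public void setCelsius(double value) {
            double old = this.celsius;
            this.celsius = value;
            // Notify any connected component that the bound property changed.
            changes.firePropertyChange("celsius", old, value);
        }

        public void addPropertyChangeListener(PropertyChangeListener listener) {
            changes.addPropertyChangeListener(listener);
        }

        public void removePropertyChangeListener(PropertyChangeListener listener) {
            changes.removePropertyChangeListener(listener);
        }
    }

A builder tool, or any other part on the palette, can then call addPropertyChangeListener and react whenever setCelsius changes the value.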

Business Process Management Suites (BPMS) provide direct-manipulation editors with which analysts define business processes. These definitions are graphical representations of transitions and activities that correspond to real business processes.

Direct-manipulation interfaces are also used to drive remote systems. These include space missions, remotely piloted drone aircraft and remote medical procedures, as well as simpler applications such as remotely controlled domestic appliances like security and HVAC systems.

New types of devices are finding their way into all kinds of uses and designers are continuously challenged to provide generic services that can run on them all.

These devices, which range from large wall-mounted LCD, plasma and projection displays to small portable devices like cell phones and PDAs, need designs with plasticity in order to cater for variations in display size and to support internationalisation as well as accessibility for disabled users.
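
One way to picture plasticity is an interface that re-arranges the same widgets to suit whatever display it finds itself on. The following minimal Java sketch (an illustration only, with an arbitrary 480-pixel threshold) lays its panels side by side on a wide display and stacks them on a narrow one:

    import javax.swing.*;
    import java.awt.event.ComponentAdapter;
    import java.awt.event.ComponentEvent;

    public class PlasticityDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Plasticity sketch");
                JPanel content = new JPanel();
                content.add(new JScrollPane(new JTree()));          // navigation
                content.add(new JScrollPane(new JTextArea(8, 30))); // detail view

                // Re-arrange the same widgets to suit the display: side by
                // side on a wide screen, stacked vertically on a narrow one.
                frame.addComponentListener(new ComponentAdapter() {
                    @Override
                    public void componentResized(ComponentEvent e) {
                        int axis = frame.getWidth() > 480
                                ? BoxLayout.X_AXIS : BoxLayout.Y_AXIS;
                        content.setLayout(new BoxLayout(content, axis));
                        content.revalidate();
                    }
                });

                frame.add(content);
                frame.setSize(640, 360);
                frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }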

Devices that are wearable or even implantable are forcing designers to investigate new interfaces that are context-aware and attentive, sensing users' needs and providing feedback through ambient displays (Shneiderman and Plaisant, 2005). Such devices will communicate with each other using wireless technologies like Bluetooth.

The creation of the World Wide Web by Berners-Lee in 1990, and of the hypertext-based Mosaic web browser, greatly influenced the direction and development of collaboration technology and software (Myers, 1996).

The evolution of email and web browsers built on HTML, FTP, AJAX and related technologies has given users a very powerful way to communicate, share and browse information with minimal learning.

Successful interface designs go beyond user friendliness. A system that is easy to use but does not support the users' needs is a failure. Usability means 'usability in the large' - that is, ease of use plus usefulness.

When a system interface is properly designed against explicit goals, it generates positive feelings of success, competence, mastery and clarity in the user community (Shneiderman and Plaisant, 2005).

New taxonomies, explanatory theories, predictive models and prescriptive guidance from academic and industrial researchers and experimenters, together with a clearer understanding of the cognitive, social, economic and ethical impact of interfaces, are changing the way they are designed.

New products have a greater chance of being delivered within budget, on time and meeting their design objectives when the needs of the user have been properly identified, defined, understood and made central to the design process.

Users are demanding that applications be more usable. Ben Shneiderman, in his book Leonardo's Laptop, tells users to get angry and complain when products fail.

Organisations are realising that efficient and effective user interfaces decrease employee work errors, increase employee satisfaction, and reduce training costs, which translates to increased profitability.

The focus on usability has now become a competitive necessity for the commercial success of software (Butler, 1996), and attention to usability by designers and developers no longer requires justification from management.

HCI is a diverse field that is having a huge impact on the computer industry. It is driving usability for every citizen, which has become crucial for the success of a national information infrastructure.

Its future will be challenging: moving beyond GUIs, developing interfaces for all kinds of new devices, and continuing to find ways to promote usability.