The Visi On™ Experience – From Concept to Marketplace

Reprinted from Proceedings of INTERACT ’84, pp. 871-875.

The Visi On™ system [1, 2] is a personal computer software operating environment for business-oriented application programs. It was developed to increase the effectiveness of personal computers in the office by providing office professionals with a problem-solving tool that is easy to learn, use, and remember. This paper describes its development from the perspective of interface engineering in a small, market-driven company where competitive time pressures substantially shape the development process.

1. The project

1.1. Overview

Work began on Visi On in early 1981. From a modest beginning involving only a handful of people, the project grew over three years to include the efforts of many. When it shipped in late 1983, total costs had exceeded ten million dollars, total size exceeded 350,000 source lines of code, and over 70 software development, publications, marketing and sales people had left their mark on the product.

During its three year development, the project passed through four major stages: specification, prototyping, analysis and respecification, and implementation.

The first two stages occupied roughly the first half of the project and dealt primarily with the Visi On operating environment rather than with the applications. This work was accomplished through informal interactions among a small, highly focused, tightly knit team.

During the latter half of the project, four applications (a spreadsheet, business graphics package, data base, and a word processor) were designed and implemented. The former two were shipped with the initial Visi On release. Applications development required that application specialists, people from the core Visi On system, marketing, and publications all work together. Results were accomplished through more formal and complex interactions necessitated by the ever increasing number of participants.

1.2. The Starting Point

Visi On began as a fairly general product concept targeted at the office market. As the first major step on the road to a concrete product, three questions had to be answered in detail: What, exactly, was the product to do? Who was to use it? And what constraints would shape its design and development?

What would it do?

The core of the Visi On system was viewed as an easy to learn and use problem solving aid which would integrate individual applications. This led to three initial requirements:
Provide a consistent, intuitive, and streamlined interface across all applications to facilitate learning and use.
Enable a number of applications to run and interact concurrently under user control.
Provide convenient transfer of information between applications under user control.

Who would use it?

Visi On is intended for office professionals. While a detailed characterization of this diverse group is beyond the scope of this paper, several important characteristics should be mentioned.

Most importantly, the office professional is a discretionary user, usually having alternative means of getting the job done and thus the option of not using the system. This person must find system interaction a beneficial, enjoyable, and nonconfrontational experience.

The office professional is an occasional user who probably spends the majority of his time interacting with people rather than with machines. He is not computer-fluent and would probably find unacceptable a system steeped in jargon or requiring time-consuming manual reading or training.

What constraints would affect its design?

A number of constraints shaped the final product. Among the most significant were: the nature of the software market for personal computers, competitive time pressure requiring rapid development, and the requirement to operate on off-the-shelf, popular, relatively inexpensive hardware.

These factors, together with the limited testing resources available, precluded an extensive external testing program during the early and mid stages of the project. Most of the early testing was informal and performed with VisiCorp employees as users. Many of the significant interface design ideas for the applications were, of necessity, “solved on paper” and the design was frozen without benefit of detailed prototyping.

VisiCorp was not in the hardware business. Visi On was a software solution to the requirements, subject to the constraints imposed by the host hardware. Initially, the product was intended for a generic high-performance third-generation personal computer. This was defined as one having a 16-bit CPU, 256K of primary memory, a bit-mapped display, one floppy disk and a Winchester disk with at least a 5 MB capacity. High display bandwidth was a critical requirement for rapid re-display of changed data during interaction with the user. Although the intended host was “high performance”, the designers often found that they were working at the outer edge of the machine’s capabilities.

1.3. The Designer’s Conceptual Model Introduction

Much of our early thinking about human-computer interaction was influenced by work done at Xerox PARC/OSD [3, 4]. Several alternatives were considered in detail. The initial Visi On design attempted to combine the best interface technologies known to us at the time with the unique requirements of the office professional market and the limitations imposed by the host hardware. This led to a prototype of the core system and an initial design specification.

During this period, the small design team shared a common design philosophy. However, we knew that as the project grew and diversified, some mechanism would be required to transfer this design philosophy to new members of the development team if the end product was to present a consistent and effective interface. We needed a designer’s conceptual model of the system which, once assimilated, would guide designers to make consistently good design choices. This was especially important when, at 4 a.m., they discovered the inevitable gaps in the design specification.

The designer’s conceptual model began as a list of fourteen overlapping, loosely defined principles and ultimately was expressed in an internal, proprietary, 60 page document titled “The Designer’s Guide to Well Behaved Products”. This document culminated several months of effort in refining and extending the original principles, and continued to grow even late in the project. It familiarized new project members with our intended user and extensively discussed the overall effect we were attempting to achieve and the methods for achieving that effect.

The fourteen principles are:
Single/Multiple Activation
Display Inertia
System Information Access
Progressive Disclosure
User Feedback
Cognitive Load
Consistency
Operation Optimization
Product Structuring
What You See Is What You Get
Least Astonishment

They formalize the intuitions, common wisdom, and ideas present in the literature which seemed to be valuable to the members of the original design team.

Putting these principles on paper helped in two areas. The initial designers were sensitized to critical interface issues, and each statement provided a framework for thinking about the design problem. The principles were a start at instilling a design philosophy in software developers who would soon appear on the scene.

User Feedback provides a typical example of these principles:

User Feedback

Immediate feedback should be provided for immediate operations. For extended operations, processing feedback should be provided to reassure the user that processing is occurring. The nature of the processing feedback should depend upon the duration and context of the processing. Based on the human factors literature, the feedback should make clear whether the processing will take less than 3 seconds, from 3 to 15 seconds, or more than 15 seconds.
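As a present-day sketch (ours, not VisiCorp’s) of how the three duration tiers above might drive the choice of feedback, with invented function and style names:

```python
def feedback_style(expected_seconds):
    """Map an operation's expected duration to a feedback style, following
    the 3-second and 15-second tiers cited in the principle above."""
    if expected_seconds < 3:
        return "none"                # immediate: the result itself is feedback
    elif expected_seconds <= 15:
        return "busy-signal"         # reassure the user that work is in progress
    else:
        return "progress-indicator"  # long operation: show how much remains

print(feedback_style(1))    # -> none
print(feedback_style(10))   # -> busy-signal
print(feedback_style(60))   # -> progress-indicator
```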

User feedback may be used to illustrate how the principles were developed, interpreted and applied.

As stated, it has a relatively limited scope and requires further specification and extension. Analysis of the ideas behind user feedback led us to the conclusion that it was part of the more general problem of helping the user to feel and be in control. This line of thought eventually led to a number of interface “solutions” which were then described in “The Designer’s Guide to Well Behaved Products”.

They include:
uniform commands
basic interaction techniques (the same thing is always done the same way)
a single method of initiating actions
visually and behaviorally reinforced contexts
engendering a feeling of familiarity through the use of physical metaphors
what you see is what you get (WYSIWYG)
direct manipulation

Basic interaction techniques (BITs) are particularly noteworthy and are discussed further in section 2.2 below.

Predictable behavior was probably one of the most important ideas underlying the feeling of being in control. The intended user was people-oriented, not machine-oriented. He had no idea what was going on behind the screen. Had the machine heard him? Was it doing the right thing? Should he abandon it and use more familiar, tried-and-true methods?

Figure 1
The interface had to capture the user’s trust by helping him to feel in control at all times. This requirement led to a reactive interface philosophy in which the system waited for the user to initiate an action, let him know what it needed to complete the action, told him what was happening as the action was performed, and then signaled completion of the action or stated why the action couldn’t be performed. This behavior was formalized in the “user interaction model” illustrated in figure 1.
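The reactive loop described above can be sketched in present-day code (our reconstruction; the `Action` class and all names are hypothetical, not VisiCorp’s):

```python
class Action:
    """Toy stand-in for a user-initiated action (illustrative only)."""
    def __init__(self, name, needed=(), fails=None):
        self.name = name
        self.needed = list(needed)  # inputs the action still requires
        self.fails = fails          # reason for failure, if any
        self.inputs = {}

    def perform(self):
        if self.fails:
            raise RuntimeError(self.fails)

def run_action(action, ask, show):
    """One pass through the reactive model: the user has initiated `action`;
    the system says what it needs, reports progress, and signals the outcome."""
    for name in action.needed:
        action.inputs[name] = ask(name)          # let the user know what is needed
    show(action.name + "...")                    # tell the user what is happening
    try:
        action.perform()
        show(action.name + " done.")             # signal completion
    except RuntimeError as reason:
        show(f"{action.name} failed: {reason}")  # or state why it couldn't be done

log = []
run_action(Action("Print", needed=["printer"]),
           ask=lambda name: "LPT1", show=log.append)
print(log)  # -> ['Print...', 'Print done.']
```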

2. Reduction to practice

2.1. Introduction

The core Visi On system interface is based on a desk top metaphor similar to that found in the Xerox® Star [3]. The screen displays overlapping rectangular areas, called windows, in which applications run and display results. These are akin to pieces of paper on the user’s desk.

All application windows have the same spatial layout, including a menu line which provides application-dependent courses of action. At the bottom of the screen is a fixed, application-independent menu for controlling the desk top. This menu allows the user to move and change the size of the windows, set windows aside, transfer data between applications, solicit help, and open other windows which provide application-specific options.
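A minimal data model for this layout might look as follows (a present-day sketch; the menu labels are paraphrased from the description above, not the product’s actual command names):

```python
from dataclasses import dataclass, field

@dataclass
class Window:
    """An overlapping rectangular screen area in which an application runs."""
    title: str
    x: int
    y: int
    width: int
    height: int
    menu_line: list = field(default_factory=list)  # application-dependent commands

# The fixed, application-independent desk-top menu at the bottom of the
# screen; labels here are paraphrased from the text, not verbatim.
DESKTOP_MENU = ["MOVE", "SIZE", "SET ASIDE", "TRANSFER", "HELP", "OPTIONS"]

# Hypothetical application window with an invented menu line.
sheet = Window("Spreadsheet", x=20, y=16, width=400, height=240,
               menu_line=["Entry", "Edit", "Storage"])
```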

The core system interface and several demonstration applications were prototyped early in the project. Informal testing with VisiCorp staff and limited external testing with representative users were performed. While the utility of the initial test results was limited because no production applications were available, they indicated the need for a small number of changes. Appropriate parts of the prototype were accordingly modified and reevaluated. The interface developed smoothly.

Application interfaces, on the other hand, posed a real organizational and educational challenge. While the core system team was small and in reasonable accord, the applications involved a steadily increasing number of people with diverse backgrounds and their own ideas of what constituted a good interface. Such opinions were not limited to software developers. Almost everyone had one – “I’m not an artist but I know what I like”. Forging a cohesive team whose members didn’t think in terms of “we” versus “they” was a major undertaking, and keeping everyone headed in the same direction was initially a problem.

2.2. Application Interface Development

A number of techniques were used to promote uniformity. In addition to the designer’s guide already mentioned, we held weekly integration meetings which dealt with interface issues relevant to all applications, and periodic interface design reviews for individual applications. These reviews were primarily paper design evaluations with some accompanying blackboard scenario simulation. The supporting interface mockups were static and only told part of the story.

A small number of people were utilized as interface technique resources and participants in interface “gedanken” experiments. In this capacity they functioned more as advisors and teachers than as legislators. When differences of opinion arose, as was inevitable, they were resolved in one of several ways: an appeal was made to supporting results in the literature, if they existed (rarely); an appeal was made to the designer’s guide (usually requiring interpretation); one faction attempted to convince the other of the correctness of its position through appeals to logic or interpretation of (sparse) market data; or a central authority simply made a decision. Later in the project, differences of opinion were resolved, by necessity, on the basis of their projected effect on the schedule.

Where mechanisms for standardizing aspects of the interface were possible, they were put into place. A small group of technical writers was initially given responsibility for producing error messages, prompts and help frames in accord with their interpretation of the designer’s guide. This required close coordination among the writers, software developers, and interface resource people.

Another method of standardization involved designing a set of fifteen basic interaction techniques (BITs) which the application designers were required to use when performing specified interactions with the user. They are:

prompting the user
invoking actions via an application’s command menu
choosing from multiple choices
line edited input
unedited keystroke input
mouse input
list input
form input
option sheet interaction
multi-media input (mouse, keyboard, ...)
delay feedback (machine is busy)
error message presentation
sound response

BITs help the user to feel in control, as discussed in section 1.3, by guaranteeing that the same things are always done in the same way. In all cases the BITs supply a standard method of interaction. They encapsulate all user-machine activity necessary to carry out a specific task, including providing selection feedback and error messages.
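The encapsulation idea can be illustrated with a present-day sketch of one BIT (hypothetical code; VisiCorp’s actual implementation is not described in this paper). One routine owns the whole exchange, including prompt, validation, and the standard error message, so every application presents a multiple choice identically:

```python
def choose_from_multiple(prompt, choices, read, write):
    """Multiple-choice BIT sketch: keeps asking until a valid choice is made."""
    while True:
        write(f"{prompt} [{'/'.join(choices)}]: ")
        answer = read()
        if answer in choices:
            return answer                                  # valid selection
        write(f"'{answer}' is not one of the choices.\n")  # standard error message

# Simulated user session: one invalid reply, then a valid one.
replies = iter(["maybe", "yes"])
screen = []
result = choose_from_multiple("Save changes?", ["yes", "no"],
                              read=lambda: next(replies), write=screen.append)
print(result)  # -> yes
```

Because every application calls the same routine, the prompt format and the error wording cannot drift from one application to the next.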

The Development Process

An application’s development proceeded as follows:

The designers produced a product specification and mockup (clarification of the specification) based on the known product requirements, user characteristics, hardware constraints and designer’s conceptual model. The specification was reviewed, modified and approved.

As the product developed, the uniformity techniques mentioned above were applied. Evolving application interfaces were monitored periodically by interface techniques resource people and marketing.

Once the software was reasonably complete and stable, its interface was evaluated. The earliest testing involved VisiCorp employees as subjects, followed somewhat later by external subjects. Relatively late in the development cycle, tests were performed with cooperative corporate customers willing to sign a confidentiality agreement. Finally, some pre-shipment user feedback was obtained during several weeks of extensive dealer demonstrations and training.

3. Lessons learned

3.1. Introduction

The Visi On operating environment made heavy use of prototyping early in its development cycle. Thus there were relatively few interface surprises. In contrast, the application interfaces were developed using a paper-based evaluation approach.

As described above, the paper-based approach involved evaluation and approval of interfaces based primarily on their paper specifications. These evaluations were augmented by scenario walk-throughs and mockups, but they ultimately required the reviewer to visualize something that didn’t exist, based on an inevitably incomplete description.

Because a reasonably complete working interface became available relatively late in the development cycle, the process was somewhat open-loop. The risk that there would be late problems which only heroic efforts could correct was very real.

3.2. A Paper-Based Methodology is Fine But...

The paper-based design specification is necessary and has several things to recommend it. Most importantly, it provides a record of design decisions and their rationale. Of somewhat lesser importance, it can be widely disseminated and read, or carried about and studied at leisure in almost any location. But as a method of evaluating an interface, it is sadly deficient.

There are just too many important details for it to be really complete or to work effectively. The design gaps, which always appear under time pressure, produce a vacuum which the software engineer obligingly fills with a personalized design which is exactly what he would want if he were a representative user – but he isn’t.

Paper specifications, even if they were complete, would not tell the whole story. They fail to portray the dynamics and synergy of the interface and are consequently imperfect mechanisms, at best, for review. The reviewer tends to use the specification to simulate the interface in his head. To do so, he must interpret liberally. This process is tedious, not enjoyable, and leads to incomplete reviewing. Further, the interpretation inherent in the simulation process leads to nasty surprises – “But that’s not how I thought it worked!”

The solution to these problems is extensive early rapid prototyping and testing. This is discussed briefly under conclusions below.

3.3. Product Evaluation Context

Another trap is overreacting to interface test results obtained out of context. One gets quite different results depending on whether the subject is asked to “just play with the system”, to perform a set of artificial mini-tasks, or to perform useful and familiar work. Mini-tasks are useful for studying specific interface problem areas, but they may give results which are at odds with tests performed within a problem solving context. An example is comparing function activation times via mouse-pick and keystroke without regard to context.

3.4. The Novice/Expert Design Point

We began with the goal of accommodating a totally naive user, an expert, or a user anywhere in between. We were able to produce sample interface designs for certain tasks which appeared to be so guided that almost any user who could understand English could correctly perform the task. This was accomplished by breaking the task into many primitive subtasks and liberally supplying prompts.

These designs were reminiscent of some mainframe interfaces which used teletypes. They did not make reasonable use of the output bandwidth of the system and were excruciatingly painful for anyone other than a novice. The Visi On interface philosophy was essentially visual, rather than symbolic. To accommodate both the truly novice user and the expert, and still properly utilize the display bandwidth, we would require two essentially different interfaces. We eventually biased the interface in favor of the more experienced user. A totally naive user has to acquire a certain amount of knowledge before he can solo. The requirement is not great, but neither is it zero.

3.5. Conclusions

The reality of compressed schedules in a competitive market, the frequent dearth of published human factors material relevant to our work, and the inadequacies of paper evaluation methodologies all underscore the need for rapid prototyping tools applied as a formal part of the early design process.

A prototype implemented early, refined and tested may serve as the nucleus of the interface design specification supporting the conventional paper functional specification. The prototype would provide an unambiguous review mechanism, provide a concrete gauge for measuring the production software, and could be evaluated early enough in the development cycle to allow end user reactions to substantially influence the final product.

George H. Woodmansee
San Jose, California. U.S.A.


  1. Woodmansee, G. H. “Visi On’s Interface Design.” BYTE, July 1983; Volume 8, Number 7, pages 166-182.
  2. Lemmons, P. “A Guided Tour of Visi On.” BYTE, June 1983; Volume 8, Number 6, pages 256-278.
  3. Smith, D. C.; Harslem, E.; Irby, C.; Kimball, R. “The Star User Interface: An Overview.” Proceedings of the National Computer Conference; June 7-10, 1982; Houston, pages 515-528.
  4. Goldberg, A., et al. “Smalltalk.” BYTE, August 1981; Volume 6, Number 8 (Smalltalk Theme Issue).

Star® is a registered trademark of Xerox Corporation. Visi On™ is a trademark of VisiCorp.
