Apple's human interface research - video tools
Written by David Tebbutt, MacUser 11/91
In the fourth part of our exclusive insight into Apple's research projects, we take a look at the Human Interface Group's work on the incorporation and manipulation of video on the Mac.
So far in this series we have looked at Apple's research in the fields of hand-held computers, the placement of 3D objects using 2D controllers, animation at the desktop, and using speech for delegating tasks to the Mac. In this issue, we look at the Human Interface Group's (HIG) research into the management of dynamic information, with particular emphasis on video. Some of the HIG work will appear in QuickTime when it ships later this year.
Researchers in the media applications team are trying to find ways of helping ordinary users, not just media specialists, to handle analogue and digital video information. Typically, a user would want to take an analogue recording, digitise it so that it can be manipulated, and then either place it in a document, such as a MacWrite memo or Persuasion presentation, or write it back to an analogue medium.
These processes need to be made simple and feel natural. The intention is to make dynamic data available to every application, not simply to replicate television on the Mac screen.
Once the user decides to put video sequences into a document, certain questions arise: if a movie is stopped, how will the user know it's a movie and not a PICT file? If there's no sound, how will the user know there's no sound? Another challenge is to help users deal with memory-intensive movie files - a 20-second, 120 by 160 pixel, compressed movie might require 1M of storage. These time and memory factors may mean that new principles are required when building interfaces for dynamic objects. Mike Mills, media applications team leader, says: "We cannot just depend on well worked out guidelines from the world of static text and graphics."
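That 1M figure follows from simple arithmetic. The C sketch below works it through under assumptions the article doesn't state - a playback rate of about 10 frames per second, 8-bit colour (one byte per pixel) and roughly 4:1 compression:

    #include <stdio.h>

    int main(void)
    {
        long width  = 160;   /* frame width in pixels              */
        long height = 120;   /* frame height in pixels             */
        long depth  = 1;     /* assumed 8-bit colour: 1 byte/pixel */
        long fps    = 10;    /* assumed playback rate              */
        long secs   = 20;    /* movie length                       */
        long ratio  = 4;     /* assumed compression ratio          */

        long raw = width * height * depth * fps * secs;  /* 3,840,000 bytes */
        printf("uncompressed: about %.1fM\n", raw / 1048576.0);
        printf("compressed:   about %.1fM\n", raw / (double)ratio / 1048576.0);
        return 0;
    }

Under these assumptions the uncompressed sequence comes to about 3.7M, and 4:1 compression brings it to roughly 0.9M - close to the 1M the article quotes.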
A video sequence, unlike a static picture, needs to be played back while the user views the document it is part of. Among the challenges HIG faces is what to do about the on-screen playbar - a device with controls similar to a videocassette recorder's. If the playbar appears on top of the digital movie, it masks the view. If it appears outside, it can obscure underlying text or cause formatting problems in parts of the host document. Decisions also need to be made about what to do with the slider when the video window is resized.
Once the media applications team decides to do something, Mike Mills or Michael Arent, the group's visual designer, develops rapid interactive prototypes in Director and HyperCard. This enables the team to demonstrate its ideas and to get feedback on them.
Jonathan Cohen, a programmer and designer on the media applications team, points out that it is very difficult to imagine all of the issues in advance when you are creating a prototype with products such as Director and HyperCard. For example, you may have a great idea for an interface, but you might not find out that the performance is terrible, or that it feels awful to use, until you actually try to write it in C.
Cohen also says that once you have developed libraries of routines that can be thrown together to make prototypes (a process which might take several months), writing real code is no slower than using one of the prototyping applications. At the moment, Cohen does development work in C++ and MacApp, but he is seriously considering using MacLISP as a prototyping language, having seen it running on the more powerful Macs.
Targettable Controller
The targettable controller is like an on-screen video Control Panel. It allows the usual operations such as fast forward, play and step, but these work both forwards and backwards. The jog-shuttle slider underneath this row of functions treats the speeds as a continuum. When you release the mouse button, the movie stops playing, which is very handy when you're searching for a particular point in a movie or when you notice something of interest in the unfolding video stream. By choosing a point between Step and Play, you can run a movie in slow motion.
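As an illustration of how such a continuum might be implemented - the article doesn't give Apple's actual mapping, so the names and scaling below are invented - a slider position between -1 (full reverse) and +1 (full forward) can be turned into a signed playback rate:

    #define NORMAL_FPS 10.0            /* assumed full-speed frame rate */

    /* Map a jog-shuttle position in [-1.0, +1.0] to frames per second.
       Squaring the position gives finer control near the stop point,
       so positions between Step and Play yield slow motion; the sign
       preserves direction. */
    double ShuttleToRate(double pos)
    {
        double speed = pos * pos * NORMAL_FPS;
        return (pos < 0.0) ? -speed : speed;
    }

    /* On mouse-up the controller would set the rate back to zero,
       so the movie stops, as described above. */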
A zoom control allows you to choose a portion of the movie frame to be used. The real movie, showing the effect of your actions, runs in a separate window, while a miniature still frame from the movie to the right of the slider shows the full frame with the zoom area outlined.
A pair of controls lets you find a point in the movie using a slider or by providing precise timings. Another lets you run the movie forwards, continuously in a loop, or repeatedly forwards and backwards. The final control lets you adjust the sound volume. Because this is a targettable controller, you can use it to control the movie you've selected in the currently active window, which might contain several separate movies.
Movie Grabber
Users need an easy way to build video sequences, complete with special effects, and the media team has developed a number of such tools. A year ago, it came out with a movie grabber, which allows the user to grab a portion of a movie and store its details in a table view. The user can elect to grab an entire frame or a sequence of frames, and can choose to digitise the whole video image, crop it, or create virtual camera effects such as zooming or panning across the incoming digital sequence.
Each movie fragment is represented as three columns in the table - start frame, end frame and movie. Each cell, called a proxy, is a 60 by 80 pixel representation of the original frame or part-frame. Behind the scenes, the table contains the controller data for the video device, so that it can get to the right video segment again. The movie cell, when activated, plays the fragment from start to end, performing the necessary interpolations if zooming was used. It is also possible to make film clips run backwards by transposing the start and end frames. Each row in the table represents a different movie fragment.
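A hypothetical C rendering of one table row might look like the following; the field names are invented for illustration and are not Apple's:

    typedef struct {
        long          startFrame;    /* first frame of the fragment   */
        long          endFrame;      /* last frame; transposing the
                                        two plays the clip backwards  */
        void         *deviceState;   /* controller data for the video
                                        device, so the same segment
                                        can be located again          */
        unsigned char startProxy[60][80];  /* 60 by 80 pixel cells    */
        unsigned char endProxy[60][80];    /* for start and end       */
        unsigned char movieProxy[60][80];  /* plays start-to-end when
                                              activated               */
    } MovieFragment;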
Since the development of QuickTime, two more columns - Poster and Preview - have been added to the view. Poster is a selected frame from the movie and Preview a short film clip. The purpose is to give the user a taste of what the movie fragment contains. By default, Poster is the first frame of a movie.
Transition Factory
The media applications team has developed another tool to help users make transitions from one movie sequence to another, such as the fades and wipes used to smooth out changes of scene. A row of three boxes contains proxies of the first movie, the transition effect and the second movie. When the user is ready, the resulting movie appears in a fourth box.
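At the pixel level, the simplest such transition is a cross-dissolve: each output frame is a weighted blend of the outgoing and incoming frames. A minimal C sketch, assuming 8-bit greyscale frames and using invented names:

    /* Blend one frame of a dissolve. t runs from 0.0 (all first
       movie) to 1.0 (all second movie) across the transition. */
    void DissolveFrame(const unsigned char *from, const unsigned char *to,
                       unsigned char *out, long pixels, double t)
    {
        long i;
        for (i = 0; i < pixels; i++)
            out[i] = (unsigned char)((1.0 - t) * from[i] + t * to[i] + 0.5);
    }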
When transition options are shown as static icons, they don't give a good enough idea of the effect they are intended to provide, so each transition is now stored as a movie. The user can actually play the icon to get an idea of what will happen before applying the effect.
The Transition Factory also gives the user the ability to overlay titles and credits, as well as apply the more obvious transition effects.
Movie Logger
HIG is wrestling with the challenge of making it easy for the user to move through a long movie source - a videodisc of a film, for example - without losing a sense of context. One answer is the Movie Logger, which lets the user plunge from an overall view (a coarse temporal sampling, in HIG terms) to a frame-by-frame view (fine temporal sampling) in just a few steps. It does this by displaying frames or key events, sampled at given intervals from the video, in a row across the top of the screen.
At the moment, these are selected by time, although it should be possible to refine this so that each frame is, for example, the start of a scene. The user then chooses a frame from this row, and the Logger generates a new row of frames around the chosen one. This time they are much closer together as the temporal sampling is much finer. This process can be repeated to any number of levels until the movie is viewed at the desired degree of temporal resolution. The spread (degree of temporal magnification) of each level can be adjusted by moving a slider at the next higher level.
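In outline - the sampling rule here is a guess at the mechanism, with invented names - each level of the Logger just divides its span of the movie evenly among the cells on screen, and drilling into a cell narrows the span:

    #define CELLS 10   /* assumed number of frames shown per row */

    /* Fill times[] with evenly spaced sample points in [begin, end). */
    void SampleLevel(long begin, long end, long times[CELLS])
    {
        int i;
        for (i = 0; i < CELLS; i++)
            times[i] = begin + ((end - begin) * i) / CELLS;
    }

    /* Choosing cell k narrows the span to that cell's interval, ready
       for resampling at the next, finer level. */
    void DrillDown(long *begin, long *end, int k)
    {
        long span = *end - *begin;
        *begin += (span * k) / CELLS;
        *end    = *begin + span / CELLS;
    }

With ten cells per row, each level multiplies the temporal resolution tenfold, so even a two-hour film reaches individual frames within five or six steps.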
The team will soon conduct trials to see if users are comfortable with this hierarchical approach. In the meantime, work is continuing on other ways of stepping through large amounts of video data. According to Cohen and Mills, the implementation of scene change detection will make quite a difference to the logger's acceptability.
The Future
Once QuickTime 1.0 is released, the media applications team will start looking at ways to drive the technology's evolution. It will examine how to let the author of a dynamic document customise the behaviour of a QuickTime movie. In QuickTime 1.0, a user will be limited to playing back a movie within an application such as WordPerfect or Persuasion, but authors will soon want to offer richer kinds of interaction. For example, they might want a movie to stop on a given frame, play back at an accelerated speed, reposition itself on the screen, or trigger a particular sound track when the user clicks on a given frame. To provide these interactions, a movie will have to be scripted, or put under program control. One of the next projects for the media applications team is to discover how to design an easy-to-use, end-user scripting environment for movies.
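Purely as speculation about what such scripting might reduce to underneath - QuickTime 1.0 offers nothing of the kind, and every name below is invented - frame-triggered behaviours could amount to a table of (frame, action) pairs consulted during playback:

    #include <stdio.h>

    typedef void (*FrameAction)(long frame);

    static void StopHere(long frame)
    {
        printf("frame %ld: stop playback\n", frame);
    }

    static void PlaySound(long frame)
    {
        printf("frame %ld: trigger sound track\n", frame);
    }

    /* One scripted behaviour: run an action when a frame is reached. */
    typedef struct { long frame; FrameAction act; } Trigger;

    int main(void)
    {
        Trigger script[] = { { 100, PlaySound }, { 240, StopHere } };
        long    frame;
        int     i;

        /* Simulated playback loop: fire each scripted action as its
           frame comes up. */
        for (frame = 0; frame <= 240; frame++)
            for (i = 0; i < 2; i++)
                if (script[i].frame == frame)
                    script[i].act(frame);
        return 0;
    }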
MacUser would like to thank Mike Mills, Jonathan Cohen and S Joy Mountford for their help in researching this article.