Workpackage 4 Final Report
DRAFT TABLE OF CONTENTS
v. 1.1 - 27.06.1997
1. Introduction
[Angela Sasse & Daniel Pilon]
2. User Requirements
2.1 General requirements
(For this chapter, we have listed sources of input - structure
to be decided later.)
- framework USER-SYSTEM-TASK-ENVIRONMENT
[Daniel & Angela to turn this into subsection/paper from WP4 meeting Nice]
- literature
- existing guidelines
RACE, LUSI, INUSE
survey
[UCL: Louise - with help from Anna]
- Websearch (Daniel)
- other papers (UCL: Anna has a lot)
2.2 Specific Requirements
- ReLaTe user feedback (Louise & Jane)
- MANICORAL (Heuristic Eval, Roy to find out if there is any more)
- COBROW (Roy to investigate)
- Virtual Classroom (UiO)
- Stanford/KTH trials (Jon Guice)
- Shell, Remote Stats Consultant
- Remote Tutorials & Peer Interaction [David Hearnshaw]
- other
- TELES
- RUS
- HP
3. User Interface Design
3.1 New interfaces for tools
3.1.1 sdr (Louise's paper - not)
3.1.2 rat (Anna Bouch's M.Sc.)
3.2 Tool online tutorials
3.2.1 rat (UCL, Rhun Jones' project)
3.2.2 sdr (UCL, Jason Rainbird's project)
3.3 Integrated User Interfaces
3.3.1 General requirements
3.3.2 Specific Solutions
- ReLaTe
- Collabone
3.4 Implementation Issues
- Managing tools
- Tcl/Tk, JAVA
4. Tool Evaluation
4.1 Conference management
4.1.1 sdr (Louise M.Sc. project)
4.2 Audio
4.2.1 rat (UCL, Anna Bouch)
4.2.2 Freephone (INRIA, Walid Dabbous)
4.3 Video
4.3.1 Vic
(UCL & CRC: Louise, Jane, Daniel)
4.4 A/V Synchronisation
(UCL: Louise on trials of ReLaTe synch)
4.5 Shared workspace tools
4.5.1 wb (UCL: Louise, Jane)
4.5.2 nt (UCL & CRC: Jane, Louise, Daniel)
4.5.3 Teledraw (RUS: Andreas)
4.6 Recording and playback
4.6.1 mmcr (UCL: Louise & Jane)
4.7 MPoll
(CRC - Andrew)
4.8 Security & encryption tools
(GMD - Elfriede)
4.9 Network monitoring tools
4.9.1 mtrace (UCL & CRC: Panos, John R.?)
*** If work is completed on time & partners provide sufficient input
4.10 Using ported tools on PCs
- selecting cards & other hardware (UCL: Jane, Isidor)
- installation (Jane)
- other usability issues (ditto)
****
5. Other factors impacting usability/user perception
5.1 Networks
5.1.1 Stability (UCL, Panos but input from all required)
5.1.2 Transparency (Angela from EMMANATE proposal, Panos, Colin to help)
Diagnostics & interpreting info (cross-ref to mtrace)
5.2 Hardware
5.2.1 Selection (cross-ref to 4.10 if available)
5.2.2 Installation (cross-ref to 4.10 if available)
5.2.3 Workstations
Performance issues: CPU, BUS, memory
(UCL: Isidor)
5.2.4 Input and Output devices
- headsets
- microphones
- speakers
- cameras
- video projection
- keyboards (language support)
- graphics tablets & pens
- displays (real estate problem)
5.2.5 Video servers
(UCL, Lambros)
5.3 Physical environment
5.3.1 Space
(private notetaking)
5.3.2 Auditory Environment
5.3.3 Lighting
6. Methods for Assessment
6.1 Technical feedback
(bug reports, rat-trap, FAQs, MICE-NSC work etc.)
6.2 Usability questionnaires
plus data bank of questions (UCL: Anna, Angela, David Hearnshaw; Daniel)
David Hearnshaw's contribution (should possibly be moved to evaluation chapter)
6.3 Online capture (Daniel)
6.4 Interview methods
6.4.1 Interviews
Face-to-face
Remote interviews
(ReLaTe & Shell - UCL)
6.4.2 Focus groups
Local
Remote
(UCL: Angela, Louise, Jane, Anna Bouch)
6.4.3 Situated Roleplaying
6.4.4 Brainstorming sessions
6.5 A/V quality assessment
6.5.1 Audio (Anna - incl. MOS)
6.5.2 A/V synch (Anna)
6.5.3 Video assessment HIGHVIEW (Angela, Anna)
6.6 Interaction/sociological analysis (KTH: Jon Guice)
6.7 Observation
6.7.1 Usability observation
- real-time scoring
6.7.2 Performance evaluation (TASK-based)
- expert assessment
- recordings
- transcripts
7. Conclusions
7.1 Issues in methodology
7.1.1 Problems with current methodology
7.1.2 New approaches to usability assessment
7.2 Usability of multicast conferencing
7.2.1 Current state
7.2.2 Proposals for improvement
7.3 Future directions
7.3.1 Technical developments
- tools
- conferencing environments
7.3.2 Accompanying measures
- training
- guidelines
References