The total cost of hardware and software for the system was approximately $50,000, versus approximately $300,000 to outfit six nodes using an acceptable equivalent commercial alternative. The majority of the cost involved outfitting the microbiology station; a typical conference room with a PC, LCD projector, echo-canceling microphone, and camera can be outfitted for as little as $5,000 to $8,000. The system was adopted as a production service immediately upon installation. Despite our initial fears that user-interface complexity would pose a problem, the system was accepted and used enthusiastically. It is currently in use 6–12 hours per week for 10 distinct scheduled and ad hoc meetings involving 60–80 individuals. Of these meetings, 75% involve two sites and 25% involve three or more; 10% are devoted to clinical care, 60% to teaching, and 30% to research or administration. To date, 24% of departmental personnel have used the system, including 25/39 faculty, 60/360 technical or administrative staff, and 15/15 trainees. The system helped satisfy an accreditation requirement of the Accreditation Council for Graduate Medical Education (ACGME) by enabling infectious disease plate rounds.
Five months after deployment, department users were asked to complete an anonymous web-based survey. Responses were received from 17 faculty, 17 laboratory and administrative staff, and 9 trainees. Trainees had attended the largest number of videoconferences (range 1–50, median 25), followed by staff (range 2–26, median 6) and faculty (range 1–25, median 5). The median estimated number of virtual meetings per month was 5 for trainees (range 1–8) and 1 for both faculty (range 1–5) and staff (range 0.5–4). Average usage was 1–2 h per week for faculty, 1–4 h for technical or administrative staff, and 4–6 h for trainees. The 30 respondents to the question on travel time saved estimated a total of 103 man-hours saved per month (on average, 3.4 h per individual). Responding "yes" to the question "has use of this system allowed you to attend meetings that you would have otherwise missed?" were 10/17 faculty, 14/17 staff, and 4/7 trainees. Those with prior exposure to other videoconferencing systems included 3/17 faculty, 3/17 staff, and 2/9 trainees; of the six offering a comparison, four found our system comparable to the other systems they had seen, one found it better, and one found it much better.
Asked to rank their overall level of satisfaction with their virtual meeting experiences, faculty reported the highest level of satisfaction and trainees the lowest (see ). When asked "ignoring all other factors associated with the use of remote conferencing (such as time or money saved, the ability to record and archive, etc.), and focusing purely on comparing the experience of a virtual meeting to that of a face-to-face meeting, how would you say that virtual meetings compare to face-to-face meetings?" on a scale of 1 ("strongly prefer virtual meetings to face-to-face meetings") to 5 ("strongly prefer face-to-face meetings to virtual meetings"), responses indicated an (unsurprising) preference for face-to-face meetings, as shown in . When asked to compare the two forms of meeting while taking the ancillary factors into consideration, respondents shifted in favor of virtual meetings by a median of 1 ranking point (range: −1 to 3 for faculty and staff, 0 to 4 for trainees).
Effect of Ancillary Factors on Perceived Value of Face-to-Face vs. Virtual Meetings (numbers in parentheses are ratings when considering ancillary benefits).
In separate survey questions, we solicited both positive and negative written comments about the system. Positive comments (with the number of times each was mentioned) included: reduced need to travel and/or time saved (16), the ability to record and archive (3), the ability to attend meetings that would otherwise be missed (3), the enabling of meetings that otherwise would not be held (2), high-quality audio and/or video and/or low latency (2), low cost (2), the ability to present to a larger group than would otherwise be possible (2), expandability and flexibility (2), satisfaction of training accreditation requirements (see above), and ease of use (1). Negative comments included: audio drop-outs (10), other audio problems that may reflect speaker location or microphone placement (5), problems due to insufficient camera resolution or poor camera placement (4), excessive time or complexity in setting up for a meeting (3), the need for larger projection screens (2), complexity of use (2), the need for better written operating instructions (1), inability to get into a conference room due to scheduling conflicts (1), and excessive shyness of participants (1). Other comments noted the inability to safely show mycology/AFB specimens on the microbiology laboratory video equipment in its current location, and various limitations of the AG user interface, including the inability to resize vic video windows, the difficulty of following the mouse as a pointer, occasional asynchrony of audio and video, and the lack of an "intuitive" user interface.