Competitive Analysis & Usability Study

Clemson Online

Opportunity

In Summer and Fall 2014, Clemson Online issued a number of RFPs to vendors of both synchronous and asynchronous conferencing systems, seeking potential alternatives to the tools the University currently uses to design and deliver online courses, Adobe Connect and Blackboard Learn. As part of Dr. Howard’s Usability Testing Methods seminar, Clemson Online commissioned some classmates and me to explore a few of the synchronous conferencing systems the University was considering and to provide a recommendation to aid its decision. My group was assigned WebEx and GoToMeeting and asked to evaluate the two systems both individually and comparatively, on each system’s ease of use and on the functionality it provided faculty and staff.

Client

Vice-Provost Witt Salley and Clemson Online

Role

Usability Researcher and Consultant

Responsibilities

  • User Interviews
  • Experiment Design
  • Usability Study
  • Task Analysis
  • Competitive Analysis

Tools

  • Camtasia Studio
  • Think-aloud Protocol
  • TechSmith Morae
  • GoToMeeting & WebEx
  • Adobe Premiere CC

User Testing & Research

After kickoff meetings with our client, and after gathering more information about the faculty who typically design and teach courses online, my team and I conducted active-intervention think-aloud protocols with a number of faculty who had previous experience teaching online. We devised and tested scenarios for use during the think-aloud sessions, developed from the data we gathered from our clients and meant to simulate typical interactions faculty might have with each system. We used TechSmith Morae to collect data and capture our participants' screens, and we alternated between running and facilitating each test and coding the data and taking notes, so that each of us gained experience with every part of the study.

Data-Driven Analysis

Using Morae, we compiled graphs and metrics for average time on task and error rate for both systems, a more effective way of visualizing and presenting our results. In total, we tested five participants, all Clemson faculty with varying levels of experience teaching online and with each system, across twelve tasks. The scenarios specifically addressed the functionality our interviews had identified as most critical for faculty and staff: document and screen sharing, audio and video recording, the ability to use and interact with whiteboards, and the ability to divide classes into managed breakout groups. For the full set of tasks, along with the corresponding results and findings from our study, you can see the video below, or access the final recommendation report we delivered to our clients here.
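The two summary metrics above are straightforward to compute from raw session records. The sketch below is a hypothetical illustration only, using made-up session data and helper names (not the study's actual Morae output or results), showing how average time on task and error rate per system might be derived:

```python
# Hypothetical sketch: computing average time on task and error rate
# per system from raw session records. All data and names below are
# invented for illustration; they are not the study's actual figures.

from statistics import mean

# Each record: (system, task_id, seconds_to_complete, error_count)
sessions = [
    ("WebEx", "share_screen", 42.0, 0),
    ("WebEx", "share_screen", 55.5, 1),
    ("GoToMeeting", "share_screen", 38.0, 0),
    ("GoToMeeting", "share_screen", 61.0, 2),
]

def summarize(records, system):
    """Return (average time on task, errors per attempt) for one system."""
    rows = [r for r in records if r[0] == system]
    avg_time = mean(r[2] for r in rows)
    error_rate = sum(r[3] for r in rows) / len(rows)
    return avg_time, error_rate

for system in ("WebEx", "GoToMeeting"):
    avg_time, error_rate = summarize(sessions, system)
    print(f"{system}: avg time {avg_time:.1f}s, error rate {error_rate:.2f}")
```

In practice Morae produced these aggregates for us; a script like this is only useful when working from exported raw logs.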