The Journal of Extension

June 2020 // Volume 58 // Number 3 // Tools of the Trade // v58-3tt1

Sharing Feedback, Sharing Screens: Videoconferencing as a Tool for Stakeholder-Driven Web Design

Videoconferencing is a low-cost, high-reward tool for engaging stakeholders in participatory design and usability testing for online Extension products. This article details how we used the videoconference program BlueJeans to facilitate the redesign of the Great Lakes Aquatic Nonindigenous Species Information System database and highlights the benefits of this technique along with the methods we used for stakeholder interviews in the project.

El Lower
Research Associate, Great Lakes Aquatic Nonindigenous Species Information System
Michigan Sea Grant
Ann Arbor, Michigan

Rochelle Sturtevant
Program Manager, Great Lakes Aquatic Nonindigenous Species Information System
Michigan Sea Grant
Ann Arbor, Michigan

Devin Gill
Stakeholder Engagement Specialist
Cooperative Institute for Great Lakes Research
Ann Arbor, Michigan


Involving stakeholders in participatory design can be a daunting task. This is particularly true when stakeholders are spread across a broad geographic region and the time and resources needed to conduct in-person usability testing for an Extension product are limited. Each method of acquiring participant feedback has its own benefits and drawbacks: Online surveys often have low response rates and limited potential for capturing nuanced user feedback, whereas in-person interviews or focus group sessions provide highly detailed information but can be time-consuming and expensive to conduct (Duffy, Smith, Terhanian, & Bremer, 2005). However, a growing body of literature suggests that the use of videoconferencing through Voice over Internet Protocol (VoIP) technologies may be an attractive solution, either on its own or as part of mixed-methods research (Archibald, Ambagtsheer, Casey, & Lawless, 2019; Hesse-Biber & Griffin, 2012; Scanga, Deen, Smith, & Wright, 2018).

By using the BlueJeans videoconference program to conduct usability testing, we implemented an intensive, user-driven redesign process for the Great Lakes Aquatic Nonindigenous Species Information System (GLANSIS) database with minimal expense. During virtual interviews, we led aquatic invasive species managers through a semistructured interview process while using the screen-sharing tool to allow participants to demonstrate in real time how they used the database. This multimedia interview technique resulted in rich, detailed feedback from our stakeholder group on how to improve navigability, highlight important features, clarify instructions for tool use, and address FAQs, directly shaping the site's new design.

Project Background

The GLANSIS database is a "one-stop shop" for information about aquatic nonindigenous species in the Great Lakes region and is coadministered by Sea Grant, the National Oceanic and Atmospheric Administration, the U.S. Geological Survey, and other regional partners. GLANSIS is nearly two decades old, however, and many new tools were added on an ad hoc basis over time, leading to a convoluted web layout where key functionality was difficult to navigate or even locate in the first place.

To improve navigation and functionality of the GLANSIS database, our team decided to undertake a usability evaluation of the web product. Within the study, we defined usability as the effectiveness, efficiency, and satisfaction with which target users can achieve goals while using GLANSIS (Hornbaek, 2006). In 2018, we conducted an initial online survey to identify recommendations for improving website usability. However, text-based survey methods did not provide the in-depth information we needed to develop targeted recommendations for improvement. With a user group scattered across the entire Great Lakes basin, traditional in-person focus group sessions for usability testing were not particularly feasible without extensive—and expensive—travel involved. Additionally, phone interviews, while time- and cost-efficient, presented barriers to discussing the visual aspects and user experience of the database in detail.

Videoconferencing ultimately proved to be the ideal solution to our user engagement challenges. We received detailed, interactive feedback through simultaneous voice and text chat and observed the users' experiences of the website in real time through screen sharing. This mode of interaction highlighted a number of key factors essential to improving usability of GLANSIS, including addressing difficulties with navigation, promoting more intuitive tool use, and addressing information gaps to support user decision making.


Methods

We conducted 10 individual interviews with aquatic invasive species managers about their experiences with GLANSIS using the BlueJeans online meeting platform. BlueJeans is a subscription-based virtual meeting platform that aggregates video, audio, chat, screen sharing, and other web conferencing tools into a single application (BlueJeans, n.d.). Participants were emailed an invitation to join the virtual meeting through the BlueJeans application, with instructions for accessing the meeting through either a hyperlink or a conference dial-in number. We were able to quickly troubleshoot any issues with accessing the BlueJeans meeting, and most participants reported that the BlueJeans platform was easily accessible and intuitive.

Once connected through BlueJeans, we coached participants in using the screen-sharing function so that they could navigate through the database in real time while participating in a semistructured interview. Interviews ranged from 30 to 90 min in length and included questions on which areas of the site participants regularly used, how easy or difficult various tools were to find or operate, and what areas of the site could use improvement (see appendix).

The recording function built into BlueJeans allowed us to review and transcribe each interview with permission of the participant. These recordings included webcam video of participants as they gestured at the application as well as a recording of their computer desktop as seen through the shared screen as they highlighted or clicked elements of the database. We coded transcriptions of the interviews and interviewer memos using NVivo qualitative analysis software to identify emerging themes and recommendations among participants' responses. Coding is a method of qualitative content analysis in which text is organized into categories of information that are labeled and analyzed for frequency and comparative occurrence across source documents (Hsieh & Shannon, 2005). Analysis of these codes generated key themes that informed our recommendations for redevelopment of the database interface. These organized stakeholder responses led to a new landing page with clearly marked tools, the addition of an FAQ section, and an accessible new layout with streamlined navigation. We also added a user contribution portal where we continue to solicit feedback on site design and content, and we will continue making iterative improvements based on user feedback.
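The frequency-and-spread step of this kind of content analysis can be sketched in a few lines of Python. The interview IDs and code labels below are hypothetical placeholders, not drawn from our data, and the sketch only illustrates the counting logic that software such as NVivo performs.

```python
from collections import Counter, defaultdict

# Hypothetical coded segments exported as (interview ID, code label) pairs.
# All labels and IDs here are illustrative, not actual study data.
coded_segments = [
    ("interview_01", "navigation difficulty"),
    ("interview_01", "navigation difficulty"),
    ("interview_01", "information gap"),
    ("interview_02", "navigation difficulty"),
    ("interview_02", "tool discoverability"),
    ("interview_03", "information gap"),
    ("interview_03", "navigation difficulty"),
]

def code_frequencies(segments):
    """Total number of coded segments assigned to each code label."""
    return Counter(code for _, code in segments)

def code_spread(segments):
    """Number of distinct interviews in which each code occurs."""
    docs_per_code = defaultdict(set)
    for doc, code in segments:
        docs_per_code[code].add(doc)
    return {code: len(docs) for code, docs in docs_per_code.items()}

freq = code_frequencies(coded_segments)
spread = code_spread(coded_segments)
# Codes that are both frequent overall and spread across many interviews
# (here, "navigation difficulty") are candidates for key themes.
```

A code mentioned many times by a single participant differs from one raised independently by most participants; tracking frequency and spread separately, as above, helps distinguish the two when identifying themes.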

What We Learned

Videoconference-based usability testing is a convenient, cost-effective, and highly productive interview method, particularly when both time and funding are limited and stakeholders are in disparate geographic locations (Hewson, 2008). This type of Internet-mediated research is still at an early stage of development, but it represents a particularly promising new frontier for engagement (Archibald et al., 2019; Kenny, 2005). BlueJeans is just one of the many video chat applications that can be used to facilitate this kind of project; other popular platforms include Zoom, Google Hangouts, GoToMeeting, Join.Me, and more, many of which are available for free online or through university software bundles. When Extension professionals are tasked with creating or improving online tools and resources, this simple strategy may effectively remove many of the barriers to involving stakeholders in participatory design.


Acknowledgments

Funding was awarded to the Cooperative Institute for Great Lakes Research (CIGLR) through the National Oceanic and Atmospheric Administration Cooperative Agreement with the University of Michigan (NA17OAR4320152). The CIGLR contribution number is 1158.


References

Archibald, M. M., Ambagtsheer, R. C., Casey, M. G., & Lawless, M. (2019). Using Zoom videoconferencing for qualitative data collection: Perceptions and experiences of researchers and participants. International Journal of Qualitative Methods, 18, 1–8.

BlueJeans. (n.d.). How our video conferencing works [Web page]. Retrieved from

Duffy, B., Smith, K., Terhanian, G., & Bremer, J. (2005). Comparing data from online and face-to-face surveys. International Journal of Market Research, 47(6), 615–639.

Hesse-Biber, S., & Griffin, A. J. (2012). Internet-mediated technologies and mixed methods research: Problems and prospects. Journal of Mixed Methods Research, 7(1), 43–61.

Hewson, C. (2008). Internet-mediated research as an emergent method and its potential role in facilitating mixed-method research. In S. N. Hesse-Biber & P. Leavy (Eds.), The handbook of emergent technologies in social research (pp. 525–541). New York, NY: Guilford Press.

Hornbaek, K. (2006). Current practice in measuring usability: Challenges to usability studies and research. International Journal of Human-Computer Studies, 64(2), 79–102.

Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288.

Kenny, A. J. (2005). Interaction in cyberspace: An online focus group. Journal of Advanced Nursing, 49, 414–422.

Scanga, L. H., Deen, M. K. Y., Smith, S. R., & Wright, K. (2018). Zoom around the world: Using videoconferencing technology for international trainings. Journal of Extension, 56(5), Article v56-5iw1. Available at:


Appendix

GLANSIS Usability Interview Guide

Thank you for agreeing to be interviewed in our study to evaluate the usability of the GLANSIS website! We would like to begin by asking you a few questions about the work that you do.

  • Can you please state your name, the organization that you work for, and the title of your position?
  • What do you consider the geographic scope of your work area to be? For example, Western Lake Erie, all of Lake Ontario, etc.
  • What type of aquatic nonindigenous species (ANS) information do you use in your job?
  • Where do you access the ANS information that you need?
    • When do you go online for ANS information?
    • Which websites do you use to access ANS information?
  • How frequently do you use the GLANSIS database?

Usability Evaluation: Open the following webpage using the BlueJeans screen-sharing function. Allow the interviewee to have control over the screen:

  • Can you show us through the shared screen which areas of the GLANSIS website you use?
    • How do you use this information?
      • Is there information that GLANSIS doesn't provide that would be helpful to you in accomplishing your work?
  • How quickly would you say that you can find the information that you need?
  • Are there areas of the site that are confusing or difficult to navigate? Which? Can you show us?
  • In your opinion, how reliable is the information in GLANSIS?
  • What might prompt you to use GLANSIS instead of another source of ANS information? (What do you think is the competitive advantage of using GLANSIS?)
  • Overall, how do you think we're doing? If you were to give the GLANSIS website a grade, what would it be? Why?
    • Which areas of the website do you think we could improve?
    • Are there new types of information or functions that you would like to see included in future website updates?
  • Do you have any other thoughts that you would like to share about GLANSIS?