Can we talk? Usability testing of Canvas Discussions Redesign with instructors and students

englu061
Community Participant


Day 2: July 10, 2024 at 10:00am

Session Overview

The session outlined findings from a usability study of the Canvas Discussions tool, conducted in collaboration with Instructure to improve the tool's functionality. The study aimed to understand user experiences in order to implement changes and/or improve training, and it produced four key findings.

Usability partnership & study process

Since 2015, the University of Minnesota academic technology teams have been conducting usability studies to understand how instructors and students use different learning technologies. This work is carried out with internal and external stakeholders such as University of Minnesota academic technology leadership, the academic technology community, and our learning technology vendors.

For this study, the Canvas Discussions tool was selected for usability evaluation for two reasons: Instructure was redesigning the tool, and it’s used in approximately 25% of University of Minnesota academic courses. Discussions are heavily used in those courses: as often as 14-15 times during a 15-week semester-long course.

In conjunction with the University of Minnesota Usability Services team, our project team developed a focus question to guide the study: What prevents people from engaging with one another using the new Discussions tool in a satisfying way? We then determined the criteria for participant recruitment and the tasks participants would complete during the usability testing sessions. During each session, project team members took notes capturing what participants did and said, along with any points of confusion. After all sessions were complete, the project team met to debrief and review the issues the observers had captured in their notes.

Summary of Findings

  1. Mentions Feature:
    • The "Mentions" feature allows users to tag others in discussion posts using the "@" symbol, similar to social media platforms.
    • Students immediately recognized this feature, while instructors were unsure of its functionality and notification settings.
    • Based on these findings, UMN plans to educate both user groups about the feature and ensure they know how to turn on notifications through Instructure's knowledge articles.
  2. Anonymous Discussions:
    • New settings include options for partial or full anonymity in discussions.
    • Instructors found setting up anonymous discussions problematic, as options were only available during initial setup.
    • Instructure will modify the tool to allow editing these settings post-creation, aligning with the functionality of Assignments and Quizzes.
  3. SpeedGrader Confusion:
    • Instructors confused discussion replies with feedback comments in SpeedGrader, leading to miscommunication.
    • The plan to help mitigate this confusion includes extra communications to clarify this distinction and training new instructors and TAs about proper feedback channels.
  4. Group Discussions Feedback:
    • Instructors expressed dissatisfaction with the Groups feature, describing it as difficult and cumbersome.
    • Although not the primary focus, this feedback was documented, and Instructure is beginning to explore improvements.

Future Directions

The collaboration between UMN and Instructure in UX studies is still developing. Plans are to invite Instructure to future studies and use the findings to inform communication and training for new features.

Resources

Presentation slides

2 Comments
Jeff_F
Community Coach

I am a little surprised that there is no note or mention by the faculty respondents nor presenter of the sort order debacle (now resolved), pagination, and how the default view is set to collapsed vs. expanded. Instead, the summary speaks to mentions and anonymous discussions as the first two items.

TanyaUponAvon
Community Participant

Hi @englu061

I was just wondering whether the subject of discussion rubrics came up at all in this study. I was hoping the discussion redesign would make it easier for students to access the rubrics; sadly, it appears not. 

I think discussion rubrics should be shown prominently on the discussion page, as is done with assignments. (Currently, of course, students have to go to the three dots and choose Rubric from the dropdown menu. This byzantine access process obviously minimizes the utility and relevance of discussion rubrics.)

I'm surprised this isn't a frequently cited issue with Canvas discussions. If we want to encourage students to communicate with and learn from each other in asynchronous courses, we need to use discussions effectively (as you know). Lack of peer-to-peer interaction is often cited as one of the biggest drawbacks of online learning--yet discussions as a tool for developing student exchanges (particularly by including reply posts) are underappreciated and underutilized. I have heard the complaint from students and faculty alike that discussions are useless time wasters. They think discussions are just boring writing exercises. My response to that is simply, "Then you are not using them correctly." One key element that is typically missing is transparency for the student on why they are participating in the discussion and what they are expected to get out of it.

Students need structure and feedback on their performance in discussions. They need to know what the instructor is looking for in a discussion post/response as much as they need that information with assignments--perhaps even more so, since discussion expectations may vary from one instructor to the next. How many students are looking for (much less reading) the evaluation criteria for their discussions when we make it so difficult/counterintuitive to even find the rubric?

Discussions can't take the place of face-to-face peer interaction, but they can go a long way toward bridging that gap--when used effectively--and an essential part of an effective activity is transparent evaluation criteria.

TL;DR: Does anyone care about how discussion rubrics are inexplicably hidden in Canvas?