Canvas Rubric Redesign: Progress, Challenges, and Next Steps

The content in this blog is over six months old, and the comments are closed. For the most recent product updates and discussions, you're encouraged to explore newer posts from Instructure's Product Managers.

RaviKoll
Instructure


We know you have all been waiting in anticipation for the all-new rubric redesign. First and foremost, we want to express our gratitude to each and every one of you who has shared your thoughts, suggestions, and feedback with us. We're here to share an update on our progress and some adjustments to our timeline.

A Complex Redesign: As we dive deeper into the redesign process, we've encountered some unforeseen complexities. Redesigning the Rubrics functionality is no small feat – it's a multifaceted endeavor that requires careful planning, testing, and iteration. While we've been working tirelessly to bring you a revamped Rubrics experience, we've hit some hurdles along the way that have led to a slight delay in our timeline.

An Extended Timeline: We understand that delays can be frustrating, especially when you're eagerly anticipating new features and improvements. That's why we want to be transparent about our timeline adjustments. Phase 1 of the Rubrics redesign will be released in the beta environment in Q2 '24, with a production rollout following in the subsequent release cycle. The release will be implemented under a Feature Flag, giving you the flexibility to enable or disable it according to your preference.

Changes in the Pipeline: In addition to the revised timeline, we also have some updates regarding the features that will be included in the June production release. While the majority of the enhancements will still be part of Phase 1, we've made the decision to push the implementation of Student Self-Assessment to Phase 2, which we understand isn't ideal. However, the complexity involved in its implementation, including modular inter-dependencies and architectural considerations, has led to these delays. Despite this, we didn't want to hinder the overall release progress. Our goal remains to get the redesign into your hands as early as possible, even if it means a slight delay for Self-Assessments. 

We've received numerous requests for Rich Content Editor (RCE) integration into the Speedgrader experience, and we want to assure you that it's a key part of our roadmap. While we don't have a specific implementation timeline just yet, we're actively working on it. Rest assured, we'll keep you informed and updated as we make progress in this direction. Stay tuned for further updates!

Numerous enhancement requests have been received, and we want you to know that we're actively evaluating each one. Our team is committed to carefully considering all your valuable feedback during the redesign process to ensure that Canvas Rubrics aligns with your needs and exceeds your expectations. However, we may need to prioritize certain enhancements over others, and some suggestions may require further evaluation before implementation. 

Despite the challenges we've faced, we're more excited than ever about the future of Rubrics within Canvas LMS. We're confident that the changes we're making will streamline your workflow, enhance the assessment process, and ultimately empower both educators and students.

Before we sign off, we want to extend a heartfelt thank you to each and every member of the Canvas Community.

We're excited to offer you an exclusive preview of our ongoing progress with the rubric redesign initiative. Check out this brief video for a sneak peek at what's in store!

 

 


75 Comments
JessicaDeanSVC
Community Contributor

Exciting to see! Regarding the video, when can we expect to see captions on there? The music in the background gets rather loud at various points, which makes it hard to understand what is being said.

Additionally, are all the items in the video expected to be a part of the Phase 1 rollout at this time?

RaviKoll
Instructure
Author

Everything in the video will be part of the Phase 1 release; this is a real demo from our development environment. Captions have been added.

paul_fynn
Community Contributor

I have noted issues in supporting Rubric and Outcomes use that I would be interested to see development on, which really relate to the instructor/Instructure interface.

Essentially we seem to be missing a unified approach to the interfaces in different subsets of tools across Canvas, and particularly when it comes to Rubrics, Outcomes and Mastery outcomes.

Hence navigation and capabilities seem to differ in different places, for example in the ability to search for and change the order of items, or insert items, the availability of RCE editing, etc. It would be good if Canvas were to develop more commonality.

For us it's quite important that we can download assessments and grades for specific students for audit purposes. The grades page and PDF print facilitate this, but I have to download the submitted work separately from the grades and feedback applied to that work.

Further, the coloured bars which appear on screen to indicate the grade against each assessment criterion disappear in the print version.

Is there a specific location for others to contribute feedback on rubrics and outcomes?

hesspe
Community Champion

I liked the background music, though I agree it was too loud at points.  Credits?

The voice sounded computer generated.  I hope that was the case and my apologies to the narrator if not!

valentinesking
Community Coach

Howdy @RaviKoll 

Love the option to drag/drop criteria.

Are the horizontal/vertical view options only for teachers/elevated users, or do students have the option to view them as well?

Also, nice "The Office" reference with the student name 🙂

Val

cvalle
Community Participant

Thanks for this update and video! The main reason that I use speedgrader is when grading an “Essay” type question in new quizzes, but there does not appear to be a way to assign a rubric to a specific question in new quizzes. Please consider adding that ability. Thanks! 

JamesSekcienski
Community Coach

@RaviKoll,

Thank you for this update! There are a lot of exciting new features coming to the design that look promising.  I'm also thankful for the peace of mind to know that it will be released as a Feature Flag to start. 

I have a few follow-up questions based on the details and demo video:

  • The demo shows this re-design within a course.  Is this re-design also applying to the Rubrics area within sub-accounts?
  • Are there any updates to how Outcomes are added to a Rubric in the re-design?  I saw the button for it, but wanted to know how they appear when added.  Will they still be in a read-only state for the descriptions or would users potentially be able to edit them after adding them?
  • Are there any updates coming to the API to support these new designs?
    • Being able to set a rubric to default to Scored is valuable in the interface and would be useful with the API too.
    • It would also help to have API support to create Rubrics at the account level and not just the course level (a rough sketch of the current course-vs-account API shape follows this list).
  • When viewing the Rubric in traditional view in SpeedGrader, is it possible to resize the amount of space it uses to prevent overlapping the student submission?
  • Will there be any changes to the design for editing a rubric and/or its settings when on an Assignment?
  • Since there are now draft and archived states for rubrics, will there be any changes to how you search for Rubrics to link to an assignment?
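
Since API coverage came up in the list above, here is a minimal, hedged sketch of the current shape of the public Canvas REST endpoints for rubrics: listing is available in both course and account contexts, while the documented "Create a single rubric" endpoint is course-scoped, which is the gap mentioned in the bullet. The domain, token, IDs, and the form-encoded criteria layout are placeholder assumptions to verify against the current API docs and your own instance; this is not an official example.

```python
import requests

BASE_URL = "https://<your-canvas-domain>/api/v1"  # placeholder instance URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}  # placeholder API token

# Listing rubrics works in both the course and the account context today.
course_rubrics = requests.get(f"{BASE_URL}/courses/123/rubrics", headers=HEADERS).json()
account_rubrics = requests.get(f"{BASE_URL}/accounts/1/rubrics", headers=HEADERS).json()

# Creation, however, is only documented against a course context.
# The nested form-encoded criteria shape below is my reading of the
# "Create a single rubric" docs; verify it against your instance.
payload = {
    "rubric[title]": "Example rubric",
    "rubric[criteria][0][description]": "Clarity",
    "rubric[criteria][0][points]": 5,
    "rubric[criteria][0][ratings][0][description]": "Full marks",
    "rubric[criteria][0][ratings][0][points]": 5,
    "rubric[criteria][0][ratings][1][description]": "No marks",
    "rubric[criteria][0][ratings][1][points]": 0,
}
created = requests.post(f"{BASE_URL}/courses/123/rubrics", headers=HEADERS, data=payload)
print(len(course_rubrics), len(account_rubrics), created.status_code)
```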

Finally, I'm glad to see that the RCE is still something you plan to work on with rubrics and look forward to when you are able to make that available, even if it is only a simplified version.  I know you mentioned that other ideas are also being considered.  Is developing an import feature on your roadmap too (without a known date yet) or is that one of the features that is still only being considered?

audra_agnelly
Community Champion

Will users be able to set their default view, or will Canvas remember the user's last preference? The traditional view occupies a lot of screen space. I can see that a grader who uses a simple rubric, or one they are very familiar with, would want to default to the more condensed horizontal and vertical views, and it would be a better experience for the rubric to default to the user's preferred view.

MBKurilko
Community Member

Please repost with a human voice and with different or no music. These two distract from the content and make it difficult to understand. The music is gloomy and too loud. 

 

 

 

venitk
Community Champion

Small request: when you make these videos in the future, it would be helpful if there was no background music. It really makes a huge difference for viewer comprehension. 

Glad to see reorder in there for the criterion. 

I use the import rubric canvancement all the time; it's such a time saver. Will that be integrated as an actual feature? Will we still be able to use the canvancement? 

James
Community Champion

@venitk 

I'm glad to hear someone is still using my rubric importer.

I don't think Instructure can answer whether the Canvancement will still work, as it's not their product. Likewise, I cannot definitively speak to what they plan on doing -- just what I've read in the captions (couldn't hear the words) of their video.

I didn't see anything in the video that would prevent the rubric importer from working. It might take some tweaks to make sure the menu items show up. Most of what was going on in the video was cosmetic (changes to the UI), not the underlying rubric or its API calls. That's what gives me hope that it will continue to work. I also hope that the other rubric Canvancements I've written (and there are several, because Rubrics needed a lot of love) will continue to function. The one I see failing quickly is the one that allows you to rearrange criterion order, but that one is being addressed by Canvas. Others, like sorting the rubrics so the current context appears first, aren't related to rubric creation, but would be a good thing for them to fix while they're thinking about rubrics.

I also didn't see anything in the video that makes me think they're going to duplicate the functionality of the importer script. I'm not sure that copy/paste from a spreadsheet would meet their accessibility demands. They do have an API call that allows for rubric creation, so they leave that functionality up to others.

paul_fynn
Community Contributor

Thanks Ravi for the video - great to get an early look... and this is always useful to get the understanding and thinking going at a higher level, so for what it's worth please see below.

So having found time to review the video more fully, and reflecting on existing experience with Rubrics (which we are keen to get all staff to adopt) and Outcomes (less experience and still finding our way), here's a few thoughts in no particular order.

  • Drag and drop functionality is very welcome - how about multi-select and move, and/or multi-delete?
  • Will we have a bulk import/export capability? This would be essential when dealing with national or institutional standards, which may number in the hundreds or more...
  • It would be useful to hear ideas on how rubrics might be summarised for different levels of users (students, staff, admin)

Columns would ideally include who created or modified a rubric and on what date, with sortable columns - as a reference, see the course files area in any course. This assumes that we want rubrics to be available for sharing across the sub-account (which in our case we do - we'd also like some control over sub-account level rubrics or rubric templates).

Name | Date created | Date modified | Modified by | Size | Accessibility
Sample.doc | 14 Feb 2024 | 15 Mar 2024 | Paul Fynn | 5 MB |
 
  • This also raises the question of version history/version control
  • Is there scope for a comments box that is not part of the rubric as displayed when deployed?
  • How does the architecture compare with the fields and construction of Mastery Outcomes - can we be consistent between the two - are they essentially the same thing at different levels?
  • Outcomes allow for long and short descriptions and titles - is there a need for this in Rubrics - thinking particularly of UK PSRB standards that may be incorporated in grading, or IfATE Apprenticeship KSBs where there may be a need for a concatenated version of the criteria, but with access to a longer version
  • Should criteria be able to carry a reference to an external standard/criterion?
  • Will it continue to be the case that we can choose whether an outcome is used for grading or tracked separately in the outcomes system?

SpeedGrader experience - do instructors lose sight of the work that they are marking when they 'toggle out'?

Is there consideration as to how this will work when marking group work? Will it be possible to choose to clone the completed rubric across all members of a group, to make it individual, or to clone some aspects and individualise others?

Will the colour bars that indicate which grade box has been selected appear when printed?

Can we offer a print button for both students and staff - for either the completed or raw rubric?

In terms of blue-sky thinking:

  • could we associate specific criteria levels with specific resources/feedback?
  • can we summarise the levels achieved across all students in the cohort?
  • is it possible to have a sub-account reporting feature on
    • courses/assignments that are using rubrics?

Finally, the conceptual relationship between grades, rubrics and outcomes is generally not an entirely intuitive one within Higher Education, where we have vocationally very expert practitioners developing familiarity with academic philosophies, processes and systems.

Is there scope to think about how the presentation within Canvas across SpeedGrader, Rubrics and Outcomes might more graphically support colleagues' understanding of this configuration/structure and how it might be used?

This could be as simple as the choice of layout, shading and colouring, and perhaps greater clarity on the relationship between Rubric Criteria and Outcomes (are these in fact capable of being managed as if they were the same thing, and the difference is in how they are deployed rather than in the criterion/outcome itself?).

Rubrics and outcomes are too important to be left to the 'teacher-bricoleur', although the latter are important to get things moving and off the ground. How can Instructure present these in such a way as to make obvious to organisations at a strategic level what the possibilities and benefits of a carefully managed/curated organisational approach would be...

 

simone_laughton
Community Participant

Thank you for this update; this looks very promising.

I have provided some feedback below regarding features that could be helpful for our instructors:

  • Bulk/batch import/export of rubric criteria and scores (separate scored items and aggregate score, not just the aggregate score). One of the issues, especially with the Outcomes system, is that it is difficult to determine patterns of competence and performance when only an aggregate score is provided and available to the instructor.
  • Clearer sense of how the outcomes will work and whether it is possible to import/export outcomes, as we have some instructors who would like to map their assignments/quizzes/activities to externally developed learning outcomes (e.g., the Common European Framework of Reference (CEFR) scale - https://www.languagetesting.com/cefr-scale).
    • As a sidenote - an issue with the Learning Outcomes system and the Mastery Learning Outcomes from both instructor and student perspective was the need for all students to have completed an item before any data could be displayed.  Is this still the case?  Would it be possible to have the ability to see current status based on graded items to date (even if this could be done with a 24-hour delay, it could be very helpful to inform instructor activities in the classroom).
  • In terms of version history / control - we have had some instructors who have experienced issues with Moderated Grading - will there be a process where the instructor can revert to a previous rubric score or add a new one and decide which one will be displayed to the student and used by the system?
  • I agree with Paul's suggestion regarding Rubrics - it would be helpful to have the option for a short label as well as a longer description for rubric criteria.  If this is exportable, then it possibly could be used to support linked data approaches.
  • Criteria should also be able to carry a reference to an external standard/criterion - and this should be a repeatable field, as the criterion may refer to multiple different competency frameworks (for example, we have previously done this to support the development of information literacy and digital literacy skills within the same criterion using a different LMS).
  • Ideally, it would be great to have the choice in terms of whether the rubric criterion is used for grading or tracked separately in the outcomes system.

Looking forward to testing out the new rubrics system!

 

 

hollands
Community Contributor

Hello,

 

Seeing some movement on rubrics is awesome. I am hopeful this will make my institution's use of Rubrics a little easier and more widespread.

 

I'm sure people are asking 10,000 questions, and perhaps it is my own rubric ignorance, but I am wondering if the following enhancements are possible:

 

1. Linking rubrics directly into a module. One of our issues currently is that rubrics are a little hidden, and often students (and faculty) don't even realize one is attached. I would love a revamp of visibility.

2. Often we have faculty using rubrics repeatedly (for discussions, for example) that are used for grading. This may be my own lack of knowledge, but currently when we (the designers) import the discussion rubric into the discussion and set it to be used for grading, it forces us to make a copy. I've always found this understandable but also odd, since we go from one Discussion Rubric to up to 15-30 copies. Again, I may just be utilizing the tool in an oddball way, but I was curious if there's a better route for this or, if not, whether a better route is planned.

 

~Shaun

ldavenport4015
Community Participant

@simone_laughton 

I love your comments on expandable criteria (including on Outcomes criteria)! Also, regarding Outcomes and Rubrics working together, we have begun to use this at our Higher Ed institution to track student competency across a program. I am excited for the rubric updates and hope that Instructure continues to develop the Outcomes side of assessment as well.

Here is what we are doing and how we are getting the information out of Canvas for reporting (my dream would be that Canvas develops some Admin Analytics around account-level Outcomes to get snapshots of use and competency without having to export and create the reports for the department):

  • We are currently using account-level Outcomes and Rubrics to measure competency and competency trends. I would love it if the instructor could export a complete list of Outcomes that were assessed in their course. Currently, they only get a gradebook-like CSV for the students with their scoring-method score for each Outcome. There is an Admin report (Student Competency) that exports all the Outcome results for each assignment the Outcome was aligned to. That provides every assignment, assignment score (total), Outcome score, and whether it was mastered. The report is also helpful for finding gaps in assessment. To aid our reporting and Outcome integrity, we have a sub-account created for each program that needs to track competency across a program. Basically, we didn't want instructors using account-level Outcomes in courses that were not a part of the program we are trying to measure.
  • I'm not sure what you mean by this statement: "an issue with the Learning Outcomes system and the Mastery Learning Outcomes from both instructor and student perspective was the need for all students to have completed an item before any data could be displayed." What are the instances in which you would have an Outcome assessment before you have a graded assignment? I may not understand the question. The Learning Mastery Gradebook shows the Outcome for each student based on the Outcome assessed by the instructor and the scoring method you chose for the Outcome (Highest Score, Weighted Average, Decaying Average, etc.). The Student Learning Mastery Gradebook (student side) could use some redesign, but it does show every alignment for the Outcomes measured in a course. I would love the print feature for this to be improved AND for the Outcome criteria instructor comments to show on the Mastery gradebook side for the student. We have a need from the accrediting body for the student to be able to track their progress over time. We have currently cobbled together a Google site and PDF printouts for each student to keep track of their progress.
  • "Ideally, it would be great to have the choice in terms of whether the rubric criterion is used for grading or tracked separately in the outcomes system." I believe you could accomplish this by removing the rubric criterion and only using the aligned Outcome you imported for that assignment AND uncheck the "Use this criterion for scoring" when you align the Outcome.  AND then check the "Remove points from rubric" box on the assignment "rubric". ← This UI and order of operations is tricky and needs some redesign, in my opinion. 

I'm still learning how Rubrics and Outcomes work together and hoping to see continued development around these items, too! 

Lynn

hesspe
Community Champion

@RaviKoll  I apologize if this is a naive question, but is there any prospect of unifying the Canvas Credentials rubric with the Canvas rubric? The fact that this hasn't come up so far suggests to me that either the answer is obvious, or very few people are using Canvas Credentials, or both. 🙂

Mary_W_W
Community Explorer

@RaviKoll Could you comment on whether or not the new rubrics will allow instructors to delete saved rubric comments?

See: Solved: Re: Deleting saved comments from rubrics - Instructure Community - 74654 (canvaslms.com)

I have an instructor teaching a large enrollment course multiple times a year with multiple TAs and she needs to be able to both save comments and then subsequently delete them so they don't become unwieldy. 

Thank you.

dave_barber
Community Participant

This looks like a great improvement...just redacted this as I had got the wrong end of the stick.  Still looks better from a presentational point of view.

 

paul_fynn
Community Contributor

@RaviKoll sorry to raise this here, but it seems timely and related.

From time to time we get reports that grades and feedback comments have been lost from completed graded rubrics, and this seems to relate to Assignments with more than one grader.

  • I have a (perhaps faulty) recollection of a discussion some time ago which suggested that it was problematic if two instructors tried to grade the same work on the same rubric on the same day (perhaps latency/lag related?)

Are you aware of any current issues, what might be causing these, and how they can be minimised from the user end, please?

embleton
Community Participant

I like the summary screen and clear labeling of whether a rubric is scored or not.
Curious if this setting can be adjusted for different uses of the same rubric.

Can this rubric have negative values?

Many actual usage details are still unclear.

Production comment: drop the music - very distracting and very unnecessary. The voice was much too quiet. The combination was difficult.

Sylvia_Ami
Community Contributor

In the video, the Rubrics page showed the list of rubrics (saved/archived). One of the columns is Location Used (see screenshot below). It looks like this is just general information, such as that the rubric is used in a course or in an assignment.

Are there plans to get more specific, such as the title of the assignment or discussion?

rubric-pg-location.jpg

abbyrosensweig
Community Participant

Are there any updates to how rubrics will display to students? There are some display issues currently (specifically with viewing the "long description" field of a rubric, which takes a click for each criterion) that I'd like to see addressed as part of this redesign.

RaviKoll
Instructure
Author

Thank you all for your valuable responses! We're sorry for the delay in addressing your concerns, but rest assured, we're carefully reviewing each comment posted here.

We've updated the video to remove the interfering music, acknowledging that it was an experiment that didn't quite work as intended.

For those eager to experience the redesigned rubrics, we'll be rolling out access to a limited user group this month to gather early feedback. If you'd like to be included in this group, please let us know, and we'll enable access for you. The beta release for all users is scheduled for June, followed by production release in July.

Please reach out to your CSMs if you would like to have early access to redesigned rubrics in the beta environment. 

Throughout the year, we'll continue to deliver frequent updates and enhancements, with many additional features in the pipeline. These include real-time auto-save functionality in the speedgrader, student self-assessments using rubrics, multiple rubric mapping to assignments, outcome alignment, import/export capabilities, and a lot more. 

Addressing some of the questions raised:

  • You'll be able to edit points and descriptions after creating a criterion from an outcome, with outcome alignment coming soon. Alignment will allow referencing the criterion to an external outcome.
  • Improved navigational experience is just around the corner, with testing opportunities opening up shortly.
  • RCE integration is on our roadmap, although we can't provide a specific timeline yet.
  • Import/export functionalities will be available from the speedgrader for each student, with bulk import/export under consideration.
  • The redesign will be accessible at all account levels and contexts.
  • API enhancements will be included in future releases.
  • Rubric management within the assignment section will undergo significant overhauls in upcoming releases.
  • Default rating scales during rubric creation are slated for release later this year.
  • Version history and audit tracking are under consideration, and we'll keep you updated on our progress.

Your feedback is immensely helpful in improving the product, and we're carefully considering each comment. While the video and comments may leave some aspects vague, you'll have the opportunity to test the new experience soon. We appreciate your patience and continued engagement as we work to enhance your Canvas LMS experience.

GideonWilliams
Community Champion

Would like to be included in the early release, thanks.

JarrenCollins
Community Member

Hi Ravi,

I'd like to be included in the group for early access.

danaleeling
Community Participant

"You'll be able to edit points and descriptions after creating a criteria from an outcome, with outcome alignment coming soon. Alignment will allow referencing the criterion to an external outcome. "

Will this allow instructors to edit the point values and descriptions for outcomes pulled from the Account Standards? I am referring to the ones the institution maintains in the root admin account. I trust that instructors will not be able to do this. While faculty are free to add their own outcomes and scales at the course level, the admin account outcomes share a common scale used for institutional reporting of learning achievement. Altering these ratings scales would lead to an institutional Outcomes Result report with many different scales, rendering statistical analyses either meaningless or tremendously complex.

I also hope that there will be an option to opt out of the production release until the "Default rating scales during rubric creation" feature, slated for release later this year, is released. While instructors can alter default rating scales, starting from an institutionally recommended default rating scale is a useful place to start. At present the institution is not using the language of "meets expectations" or "meets mastery" but is instead using "Optimal, Sufficient, and Suboptimal" demonstration of a learning outcome. The points are redistributed from the Canvas default to better correspond to grades when outcomes are used to directly grade assignments.

JamesSekcienski
Community Coach

@RaviKoll 

I would like to be included in the group for early access.

slwilso3
Community Explorer

@RaviKoll 

We would like to be included in the beta testing.

Stephanie

MBKurilko
Community Member

Thanks, Ravi. Please include UC Santa Barbara in the beta testing.

MB Kurilko

mbkurilko@ucsb.edu

 

TrinaAltman
Community Participant

Hi @RaviKoll:

Regarding your statement: "For those eager to experience the redesigned rubrics, we'll be rolling out access to a limited user group this month to gather early feedback. If you'd like to be included in this group, please let us know, and we'll enable access for you."

Some of our course designers may be interested in this. Can you please tell us more about how to sign up and where the access would be enabled, e.g., on one of our instances (if so, which one?), in a separate Instructure instance you'd give us access to, etc.?

Thank you,
Trina

llomax
Community Participant

@RaviKoll

I would love to be included in the early release this month for this redesign. We use rubrics and outcomes in our district and have been eagerly awaiting updates in this area. Thanks!

Lisa Lomax

(Davis School District)

vanzandt
Community Champion

I have the same question and same concern as @danaleeling who stated:

  • "You'll be able to edit points and descriptions after creating a criteria from an outcome, with outcome alignment coming soon. Alignment will allow referencing the criterion to an external outcome. "

    Will this allow instructors to edit the point values and descriptions for outcomes pulled from the Account Standards? I am referring to the ones the institution maintains in the root admin account. I trust that instructors will not be able to do this. While faculty are free to add their own outcomes and scales at the course level, the admin account outcomes share a common scale used for institutional reporting of learning achievement. Altering these ratings scales would lead to an institutional Outcomes Result report with many different scales, rendering statistical analyses either meaningless or tremendously complex.

We need account level Outcomes to remain consistent, and to NOT be editable by each instructor.

lgarmire
Community Participant

I would love to be included in the early release this month for this redesign. We use rubrics for all assignments, discussions, and essay questions in quizzes.

Laura Garmire

Indiana Online

RaviKoll
Instructure
Author

Hello everyone! Thank you for the incredible response to our beta testing invitation. If you need access, please contact your Customer Success Managers (CSMs). We'll enable access to your beta environment for testing purposes later this month. 

JarrenCollins
Community Member

G'day @RaviKoll,

 

Hope you're doing well! I have a question regarding how the criteria sheet/marking guide calculates marks.

 

Is there a way to add a 'best fit approach' mode to the marking guide? QCAA (Queensland Curriculum and Assessment Authority) uses this approach in the vast majority of their senior subjects to calculate marks.

 

Happy to share more information if you are not familiar with this method.

 

Thanks again for all of your hard work!

 

 

RaviKoll
Instructure
Author

Hello Community!

To clarify how Outcomes can be edited when added to Rubrics, there are two key elements to consider: the points possible and the rating scale. The rating scale cannot be edited, but the points possible for each criterion can be adjusted. Additionally, the Outcome descriptions are not editable.

kkhan
Community Member

I would like to see what the new outcomes page would look like, since the rubric now has 3 different views. I believe the outcomes page should be intertwined with these changes. Also, I noticed a download feature on the rubrics page; can you please elaborate on this? The robot voice was very distracting.

lezonl2
Community Contributor

@RaviKoll thank you so much for these valuable updates regarding rubrics.  We are planning a major update to our outcomes this summer and it would be great to couple that with the new rubric tool.  As part of the summer release, will faculty be able to add outcomes to a rubric?

SusanOrr78
Community Member

@RaviKoll Sorry if I'm missing something but having tried out the new rubrics in our Beta environment, it looks like points ranges do not display in any of the views, other than the Edit Criterion view. When I save the rubric and attach it to an assignment, I can see the points ranges in the preview there, but when I open the rubric in SpeedGrader, again there are no ranges displayed, just the max possible points for each criterion rating?

rnb024
Community Explorer

I just did some initial testing today. This feels very far away from being ready. 

  1. I ran into issues importing Outcomes, such as receiving errors when adding outcomes (in this image I was adding Outcome B, which is not in the rubric): Screenshot 2024-06-10 113338.png
  2. I can't use the Traditional View within SpeedGrader when using a newly created rubric: Screenshot 2024-06-10 110345.png

Overall, the experience seems to be early in development and/or full of bugs. I hope this can all be improved before this redesign is enforced. 

 

RaviKoll
Instructure
Author

Thank you for your input. We are currently addressing some of these issues and will review the others. We acknowledge that there may be bugs at this early stage, but we are actively working to resolve them as they arise.

Regarding your specific concerns:

  • The issues with point ranges not being displayed on SpeedGrader and the bug when outcomes are added are already being fixed, and the updates will be released shortly.
  • The traditional view not being available for criteria with more than five ratings is intended functionality, not a bug.
SusanOrr78
Community Member

Further to my initial comment above (which I note has now been addressed in Beta, thanks very much), here are some further thoughts.

1. In the new Rubric builder, in the Create Criterion interface, the default to 4 Points Possible (1) is confusing, especially because these numbers are the same as the default number of ratings (2). Could the default points be set to 0 in the Points column (3)? 

SusanOrr78_0-1719496312594.png

It would also be useful to have a text column between (2) and (3) so that letter grades (A, B, C etc) could be entered as well.  

2. Unlike with the current interface, there is no way to enter the max number of points for a criterion and then have Canvas work out the division into equal points ranges in one move. Instead, the user has to enter the maximum points for the criterion into any one of the points range boxes (1) (with Enable Range selected) and then work out the top points for each rating manually and enter those into the subsequent boxes for each rating to create the points ranges. This is not intuitive and doesn’t represent an improvement on the current user experience. Also, if you change the max possible points, you then have to change each rating's range – Canvas won’t automatically adjust it for you. I may be missing something here, but if I am, then I would guess other users are too:

SusanOrr78_1-1719496312597.png

Also, the use of .1 for the bottom end of each points range isn't useful. For example, a student would be given a mark of either 5 or 6, not 5.1. The numbers would be better displayed as whole numbers within the ranges in this view. They do appear as whole numbers when viewing the rubric in the assignment, but the .1 format appears again when viewing the rubric in SpeedGrader. It would be useful to have the format consistent across all views.
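
For concreteness, here is a tiny sketch of the even-split arithmetic being asked for in point 2. It is illustrative only (the function name, the rounding, and the reading of the .1 lower bounds as a display choice are my assumptions), not a description of how Canvas actually computes ranges.

```python
# Illustrative only: split a criterion's maximum points into equal ranges,
# one per rating, returning (lower, upper) bounds from highest to lowest.
def equal_ranges(max_points: float, num_ratings: int):
    step = max_points / num_ratings
    ranges = []
    upper = max_points
    for _ in range(num_ratings):
        lower = upper - step
        ranges.append((round(lower, 2), round(upper, 2)))
        upper = lower
    return ranges

# A 20-point criterion with 4 ratings splits into
# (15, 20), (10, 15), (5, 10), (0, 5); the ".1" lower bounds shown in the
# current UI (e.g. 15.1-20) look like a display choice layered on top of the
# same boundaries rather than different maths.
print(equal_ranges(20, 4))
```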

3. In the Edit Rubric view, the expanded view for a criterion isn't ideal, with the criterion ratings shown vertically instead of horizontally. Also, the use of 'Rating Scale' is confusing because when you first look at this view, it looks like the numbers 0-4 in the first column (1) represent the marks for the individual ratings, rather than simply the number of ratings within the criterion. The display of the actual points ranges over to the right, but with no header and with the lowest range showing not as 0-5 but just '5' in this example, is also confusing (2):

SusanOrr78_2-1719496312599.png

4. In the new Horizontal and Vertical views, it's not apparent that selecting a rating will reveal the title, description and points range for the associated criteria. Also, in the Horizontal view, the rating display number 0 drops off the edge of the page if there are 5 ratings, and if you add more than the default number of 5 ratings, the horizontal view doesn't work at all; it defaults to the vertical view:

SusanOrr78_3-1719496312602.png

 

 

873179959
Community Contributor

As @SusanOrr78 mentioned above: In the old rubric builder, you could change the total number of points at the end of a criterion row and it would dynamically shift all the rating points, maintaining your percentage distribution. So, if you wanted the highest category to always be 90%-100% and the second category 80%-90%, you could set that up, duplicate a criterion a certain number of times, and then later shift around how many points each criterion was worth. In the new rubrics, you have to do the math on your own and change each rating individually (a small sketch of this rescaling follows below).

Also, @RaviKoll said that Outcomes ratings could not be changed, but the total points possible for each criterion was editable. I'd like the sub-account level outcomes to be able to pull in but scale with the assignment: on a 300-point assignment an outcome contributes 30 points toward the assignment grade, and on a 10-point assignment it contributes only 3 points, for instance. I am not seeing a way to edit how many points the imported outcome is worth in that particular rubric.
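
To make the requested behaviour concrete, here is a minimal sketch of the proportional rescaling described in the two paragraphs above (keep each rating's share of the criterion total when the total changes). The function and the example numbers are illustrative assumptions, not Canvas code.

```python
# Illustrative only: rescale rating points to a new criterion total while
# preserving each rating's percentage of the old total.
def rescale_ratings(rating_points, old_total, new_total):
    return [round(p / old_total * new_total, 2) for p in rating_points]

# Ratings worth 10, 9, 8 and 0 on a 10-point criterion keep their
# 100% / 90% / 80% / 0% shares when the criterion is rescaled to 30 points.
print(rescale_ratings([10, 9, 8, 0], old_total=10, new_total=30))  # [30.0, 27.0, 24.0, 0.0]
```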


 

 

rpsloan
Community Participant

@RaviKoll In your May post, it was mentioned that the prod release is in July. Is there a date already set?

TrinaAltman
Community Participant

@rpsloan In case it's helpful, Phase 1 of the Rubric Redesign is listed in the 7/20/24 Canvas Release Notes:

Rubrics Redesign Phase 1

lsorroche
Community Explorer

I noticed that if I try to create the rubric from inside the assignment, I see the traditional design. When will the new rubrics design be available in the assignments section? Thanks!

lezonl2
Community Contributor

Hi, I apologize in advance if this has already been suggested and I missed it.  Can we please continue to display the bullseye icon to identify an outcome in the SpeedGrader view? We use outcomes widely, and being able to differentiate them when grading is very helpful!

Thank you,

Lisa

James_Kocher_UF
Community Champion

Why was the rubric redesign changed from an unlockable feature to just enable/disable? This appears to have been changed about a week before release.

We also just found out that if there are more than 5 ratings in a criterion, traditional view is not available. The issue some instructors are having so far with the horizontal/vertical view is that it doesn't show the point value for each criterion at first (see below).  It will show after you select a criterion, however.  (Please note this is NOT a real criterion! 😂)

 

criteron.png 

 

vs

 

criterion2.png

Is this something on the roadmap, or can it be easily updated?

 
 

 

James_Kocher_UF
Community Champion

We've also discovered students cannot see rating descriptions with the enhancement on. This issue has been reported and is with L2.

TrinaAltman
Community Participant

@James_Kocher_UF @RaviKoll  We have the same question about why the rubric redesign was changed from an unlockable feature to just enable/disable. We need to give our instructors the option to try the redesigned version. Per the Change Log at the bottom of the release notes page, this was changed on 7/19/24, the day before the release. Will a lock/unlock option be added, and if so, when? Thanks.