Input Opportunity: Classic to New Quiz Outcome Alignment Migration

The content in this blog is over six months old, and the comments are closed. For the most recent product updates and discussions, you're encouraged to explore newer posts from Instructure's Product Managers.

BenFriedman
Instructure Alumni

Quiz Outcome alignments provide an important opportunity for assessing student mastery of standards or competencies within a Canvas LMS course. They offer teachers and instructors a data-driven way to facilitate differentiation, and they give administrators the ability to track student performance in key areas as part of continuous improvement efforts. While both Classic and New Quizzes let you create Outcome alignments, they do so in different ways: in Classic Quizzes, Outcome alignments are created for a Question Bank, while in New Quizzes they are created for individual items or for the assessment as a whole.

We’re still in the discovery phase for handling this migration, so we want to take the opportunity to get your input. Does your institution or district have Outcome alignments that need to be migrated? Click here for a 2-minute survey to help us identify the best way of handling the migration and transition. We appreciate your time and perspective!

 


18 Comments
MattHanes
Community Champion

Howdy @BenFriedman, it appears this survey is not accessible because it is only shared within your organization.

BenFriedman
Instructure Alumni
Author

Thanks, @MattHanes! Updated the survey and it is now open for all. 

leward
Community Contributor

Hi @BenFriedman. At IU, we discouraged schools and departments from aligning outcomes to quiz banks because it only worked properly if the total points for each question equaled the max rating scale. I really like the idea of being able to align outcomes with item banks. If a student answers just one aligned quiz question correctly, that should probably NOT be considered evidence that they have mastered or not mastered an outcome. But if an item bank has been designed to comprehensively exercise a student's knowledge or skills on an outcome, then setting up a random-draw section with many aligned questions might be considered a reasonable reflection of the student's abilities. For quiz- and random-draw-section alignments, here's how I would like to see the scoring work.

For all aligned questions in the random-draw section or the entire quiz:

  • Divide the total points the student earned on the questions aligned to the outcome by the total possible points for those questions. For example, if there are 5 questions worth 10 pts each and the student gets 4 correct, that would be 40/50, or 80%.
  • Multiply the percentage earned for the outcome by the maximum rating for that outcome. Let's say the max rating is 3. The student's rating would be 0.8 × 3, or 2.4. This is the score that would be recorded in the account-level Outcomes Results and Student Competency reports.
  • If the rating earned meets the mastery rating, the student has mastered the outcome.

 

The same approach could be used for question-level alignments, but for fixed-answer questions, unless partial credit is awarded, the student will either earn the maximum possible rating (10/10 = 100%; 100% × 3 = 3) or 0 for each question. At least correct answers would then result in the scores reported in the mastery gradebook and account-level reports being consistent with the rating scale.
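The conversion proposed above can be sketched in a few lines. This is only an illustration of the suggested rule, not current Canvas behavior, and all names are made up for the example:

```python
def outcome_rating(earned_points, possible_points, max_rating):
    """Convert quiz points on questions aligned to one outcome into a
    rating on that outcome's scale, per the rule proposed above."""
    if possible_points <= 0:
        raise ValueError("no points possible for aligned questions")
    pct = earned_points / possible_points   # e.g. 40 / 50 -> 0.8
    return pct * max_rating                 # e.g. 0.8 * 3  -> 2.4

def has_mastered(rating, mastery_rating):
    """True if the converted rating meets or exceeds the mastery cutoff
    (treated here as inclusive)."""
    return rating >= mastery_rating

# The worked example from the bullets: 5 questions at 10 pts, 4 correct,
# on an outcome whose rating scale tops out at 3.
rating = outcome_rating(earned_points=40, possible_points=50, max_rating=3)
print(round(rating, 2))                        # 2.4
print(has_mastered(rating, mastery_rating=2))  # True
```

The same function covers the question-level case: a single 10-pt question answered correctly gives 10/10 × 3 = 3, and answered incorrectly gives 0, matching the fixed-answer behavior described above.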

MARISSASCHRADER
Community Participant

@BenFriedman While we have used outcome alignment in the past and some instructors still use it, our district has built our standards-based coursework around specific rubrics (with the standards/outcomes added to those rubrics) attached to the quiz assessment (a New Quiz). This past spring (2021), many instructors experienced a problem using the rubric to override the New Quiz score in SpeedGrader. They could do it, but the score would revert back to the original New Quiz score and they would have to go back in and regrade it a second time. The second score seemed to hold, but it made for double the work and time for grading. We have been asking for this issue to be resolved so that rubrics can be used with New Quizzes correctly and consistently, but we continue to get the response that this is not a functionality of New Quizzes... yet we have been doing this for two years! We were then told this is a "bug" in New Quizzes and should not be happening. We need this to happen, so can the "bug" be left alone, allowing us to use a rubric to override the New Quiz score on the first grading pass... please? I appreciate your time and hope you understand how much we need this (we really cannot be the only district using rubrics to override scores in New Quizzes). Thank you.

darren_ponman1
Community Participant

If Canvas quizzes are to work effectively with the Canvas Learning Mastery Gradebook, the following must occur:

  • Teachers need to be able to align specific course outcomes with individual questions.
  • Correct answers in a self-marking multi-choice quiz must contribute to student mastery levels for that outcome in the Learning Mastery Gradebook.
  • Teachers should be able to 'mark' written quiz responses against a rubric. This mark should then contribute to the student's course mastery level.
    • i.e., having a rubric for each question (likely with only one criterion) would really speed up the ability to provide feedback on quiz text responses. Rather than writing feedback for each student and question, the teacher just selects the appropriate performance descriptor from the rubric for that question. This would contribute to mastery accreditation in the Learning Mastery Gradebook.

This would address a massive deficit in Canvas quiz and Learning Mastery functionality, ensure the LMS remains aligned with future directions in educational best practice, and increase its usefulness exponentially.

 

 

danaleeling
Community Participant
At IU, we discouraged schools and departments from aligning outcomes to quiz banks because it only worked properly if the total points for each question equaled the max rating scale.

I ran into this but found that I could use the following function to rescale all scores flowing from rubrics and quizzes in the Outcome Results CSV report available to the Canvas sysadmin.

export outcome score/learning outcome points possible*5

In this case I am rescaling everything to a five point scale to match the rating scale in use by course level learning outcomes in the institutional bank of outcomes. This means faculty do not have to set every quiz and test question to be five points to match the scale used by rubrics. Because the Outcome Results report includes the type of assessment as a column, reports can disaggregate by whether the source was a rubric on an assignment or a quiz/test if necessary.
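As a sketch, the same rescaling can be applied directly to rows of the Outcome Results export. The column names below are assumptions for illustration and should be checked against the headers in your instance's actual report:

```python
import csv
import io

TARGET_SCALE = 5  # institutional rating scale mentioned above

# Stand-in for the Outcome Results report; real exports have more columns,
# and these header names are assumptions for the sketch.
SAMPLE_REPORT = """assessment type,outcome score,learning outcome points possible
quiz,8,10
rubric,3,3
"""

def rescaled(row, target=TARGET_SCALE):
    """Rescale one report row: outcome score / points possible * target."""
    score = float(row["outcome score"])
    possible = float(row["learning outcome points possible"])
    return score / possible * target

rows = list(csv.DictReader(io.StringIO(SAMPLE_REPORT)))
for row in rows:
    print(row["assessment type"], rescaled(row))  # quiz 4.0, rubric 5.0
```

Because the assessment type travels with each row, the rescaled values can still be disaggregated by whether the source was a rubric or a quiz, as described above.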

leward
Community Contributor

Just noticed that I left out a critical word in my post above. Here's the original:

 If a student answers one aligned quiz question correctly, that should be considered evidence that they have mastered or not mastered an outcome. 

And here's what I meant to say:

If a student answers just one aligned quiz question correctly (especially something like a multiple choice as opposed to an essay question), that should probably NOT be considered evidence that they have mastered or not mastered an outcome.

leward
Community Contributor

That's a great tip, @danaleeling. You are essentially using the method that I suggested for scoring. But ideally Canvas should handle the translation from question point value to outcome rating.

BenFriedman
Instructure Alumni
Author

@leward and @danaleeling There is a possibility that we could consider this an additional calculation method for Outcomes. If we were able to move it forward, what name would you give it? 

@MARISSASCHRADER I'll look into this further and connect with the Canvas Evaluate team that oversees the Speedgrader. 

@darren_ponman1 Currently you are able to align an outcome to an individual item in New Quizzes. Unfortunately, right now the results do not go to the Learning Mastery Gradebook. We are in the discovery phase for this area right now trying to determine the best solution. Stay tuned!

MARISSASCHRADER
Community Participant

@BenFriedman Thank you! Much appreciated. 

darren_ponman1
Community Participant

@BenFriedman 

That's exciting! However, if it were an additional calculation method, it would add to the setup time and process and restrict flexibility. I may have misunderstood, but I would think the preference is for the alignment to follow the existing rules of the outcome in the course. E.g., if an outcome is set to decaying average at the course level, then that is the calculation method used for the outcome in the quiz. This way the option is there for the same outcome to be easily and formatively assessed using a variety of assessment methods.

If you didn't want the quiz to feed mastery learning, I can't see why you would align a question with an outcome. I.e., don't want a quiz question to influence mastery? Don't align the question with an outcome.

Alternatively, four extra outcome calculation methods? E.g., 'decaying average_includes new quizzes', 'highest score_includes new quizzes', etc.
leward
Community Contributor

@BenFriedman .  I'm not sure another calculation method would address this issue for a couple of reasons.

  1. The same outcome can be aligned with both rubrics and quizzes.  The approach we described above should only be used to convert points earned for quiz questions, banks, and entire quizzes, not rubrics.
  2. Calculation methods are primarily for calculating the results of multiple ratings for the same outcomes.  The issue with quizzes is converting a single rating (currently expressed as a point value) to a corresponding value on a rating scale.

That said, if you ever get to a point where it's possible to align an outcome's standard rubric criteria with rating scales different from the outcome's rating scale, this same approach could be used--that is, calculate the percentage of total possible points earned on the aligned criterion and multiply that value by the highest value in the rating scale.

danaleeling
Community Participant
There is a possibility that we could consider this an additional calculation method for Outcomes. If we were able to move it forward, what name would you give it?

I do not have enough experience with LMS variables to know whether there might be a name for a "rescaled" outcome score variable. Given that different institutions would be using different scales, my thinking would be to use something like

outcome score/learning outcome points possible

which would produce a number between 0 and 1. This is called "normalization" in some fields. The value produced is a "normalized" value. I am not certain how useful this would be as most analysis packages allow one to construct this function whether one is working in R, Data Studio, Tableau, or other packages.

Jul_Cro
Community Participant

I will tell you that I am super frustrated by the fact that we cannot run an admin report any longer to gather our outcome results from New Quizzes. If this is not fixed, we will not be able to have instructors align account level outcomes to the quiz questions for institutional level general education student learning outcome assessment which is a requirement for HLC. The old quizzes allowed our chemistry faculty to quickly align outcomes with Question Banks and I feel like the scoring was great using the decaying average. 

Jul_Cro
Community Participant

Hi @BenFriedman we have not heard anything from you lately on this thread and I was told this is where I could get questions answered about outcomes and New Quizzes. Thank you so much!

SandraONeil
Community Explorer

We need to report program outcomes (admin/account level) across courses.  This is not available in New Quizzes and is desperately needed!  Being able to create and align institutional/program level outcomes to questions in an item bank is not valuable if there is no way to actually access the data it produces!

amyslack
Community Participant

So our institution is still working on migrating our assessments into Canvas, and since we recognize that New Quizzes is the future, we are migrating directly into New Quizzes. Our goal is to be able to look at a course and see the mastery of each student learning outcome across the course. We are disappointed that New Quizzes doesn't work with the Learning Mastery Gradebook - this is critical. For accreditation purposes, we need to be able to state how many students achieved mastery of each learning outcome. Since we are aligning each question to each outcome, I would think it would be calculated as the number correct divided by the number possible. This would give a percentage for each student on each outcome, and that could be seen easily in the Learning Mastery Gradebook for the whole course, by student and by outcome. Maybe these calculations need to be more complicated than this, but we do need a way to see, at a broader course level, student achievement on program-level outcomes that can then be reported back for accreditation purposes and also help inform instruction so we can address deficits in student learning on outcomes that show a need. It would then be super helpful if reports could be pulled and aggregated at the account level (or subaccount) so that we could see this not just for one course but for all the courses within a program.

 

I'll admit that the calculation methods in general are a bit confusing, and I can see that we might want to represent student achievement over the span of the course or over multiple attempts, but at the core we need to see how a student performed overall on all items related to that outcome.

BenFriedman
Instructure Alumni
Author

@amyslack Glad you are seeing the value of the LMGB! I can report that we have started the work to bring New Quiz scores to the LMGB and the Account Outcome Results .csv export. Stay tuned to the Canvas Roadmap to see when we think this will be available.