How do I get the answers and scores for individual questions on a quiz submission? The Quiz Submission Question category claims to do this, but all I get is a slight variation of the Quiz Question object. I know that it is all still in beta, but is there another way to get the individual question answers?
This API (Quiz Submission Events - Canvas LMS REST API Documentation) is the one to call in your case. Then just loop through all the quiz_submission_ids and you should get what you want :-).
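Here is a minimal sketch of that approach in Python with the requests library. The base URL, token, course id, and quiz id are placeholders; it assumes the standard quiz submissions and submission events endpoints.

import requests

BASE_URL = "https://yourschool.instructure.com/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}
COURSE_ID = 123   # placeholder
QUIZ_ID = 456     # placeholder

# List the quiz submissions for this quiz.
subs = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/submissions",
    headers=HEADERS,
).json()["quiz_submissions"]

# For each submission id, pull its events and keep the answer events.
for sub in subs:
    events = requests.get(
        f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/submissions/{sub['id']}/events",
        headers=HEADERS,
    ).json()["quiz_submission_events"]
    for event in events:
        if event["event_type"] == "question_answered":
            print(sub["id"], event["event_data"])

Note that quiz log auditing has to be enabled for the course (see further down in this thread), or the events call returns a bad_request error.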
Thanks, somehow I missed that one.
Benjamin, were you able to get this working? Each time I call the quiz submission event API, I see three question_answered events for each submission I look at. The first event has null or blank arrays for all the answers. The second and third events have the answer provided by the student for a specific question. Problem is, the quiz has 30+ questions. If I download the quiz response report inside Canvas and open it in Excel, I see that the students have answered the questions. I must be doing something wrong with the API but I'm not sure what to correct.
As @sam_mcknight suggests, it appears that this returns null and empty arrays instead of the expected values. Is this API broken? Here is an example of the returned results; I can see the result just fine in Canvas.
{'created_at': '2017-10-15T01:10:43Z',
'event_data': [{'answer': [{'answer_id': '',
'match_id': None},
{'answer_id': '',
'match_id': None},
{'answer_id': '',
'match_id': None},
{'answer_id': '',
'match_id': None},
{'answer_id': '',
'match_id': None},
{'answer_id': '',
'match_id': None}],
'quiz_question_id': '983744'},
{'answer': None,
'quiz_question_id': '983745'},
{'answer': None,
'quiz_question_id': '983749'},
{'answer': [],
'quiz_question_id': '983751'},
{'answer': {'': None},
'quiz_question_id': '983752'}],
'event_type': 'question_answered',
'id': '94746173'},
Not exactly broken. The API is marked as being in beta.
I got around this by writing a script that downloads the CSV quiz item analytics for me. I then combine those CSV files and do my analysis.
The URL to use is /api/v1/courses/{CourseId}/quizzes/{QuizId}/reports. POST to it with the form parameter quiz_report[report_type]=student_analysis. You will get back a JSON object, and two of its members are a filename and a report id. Use the report id to call /api/v1/courses/{CourseId}/quizzes/{QuizId}/reports/{ReportId}, which will get you the generated CSV file.
You can do this in any language that allows HTTP calls. In Python, it's really easy. You'll need to append your access_token as a URL parameter to each of these calls.
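For example, here is a rough Python sketch of those two calls with the requests library. All ids are placeholders, and the report usually takes a moment to generate, so the sketch polls until the file is attached.

import time
import requests

BASE_URL = "https://yourschool.instructure.com/api/v1"
PARAMS = {"access_token": "YOUR_ACCESS_TOKEN"}
COURSE_ID = 123   # placeholder
QUIZ_ID = 456     # placeholder
reports_url = f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/reports"

# Ask Canvas to generate the student_analysis report.
report = requests.post(
    reports_url,
    params=PARAMS,
    data={"quiz_report[report_type]": "student_analysis"},
).json()

# Poll the report until its CSV file is ready, then download it.
while True:
    status = requests.get(
        f"{reports_url}/{report['id']}",
        params={**PARAMS, "include[]": "file"},
    ).json()
    if status.get("file") and status["file"].get("url"):
        csv_bytes = requests.get(status["file"]["url"], params=PARAMS).content
        with open("student_analysis.csv", "wb") as f:
            f.write(csv_bytes)
        break
    time.sleep(2)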
I have tried that, but it did not work. Can you explain it in more detail?
When I take the quiz as the test student, there are no results in the reports file. Is the test student automatically filtered out of the results?
Note that the test student's answers can be seen under "Show Student Survey Results", but not in the exported file.
I am getting a bad_request response: "quiz log auditing must be enabled".
Quiz log auditing can be enabled by an admin via Settings - Feature Options. It's in the Course block.
After enabling Quiz Log Auditing, I am finally getting quiz_submission_events and their quiz_data, but unfortunately the answers field is just an empty array.
I haven't tried to put all the pieces together yet to see if it's usable, but when I was trying to download submission data to include with Starfish, I inadvertently found quiz submission data. I didn't want it (it really slows things down), but there doesn't appear to be a way to avoid getting it with the call I'm using.
If you fetch the submissions for an assignment and include[]=submission_history, then there is a property called submission_data that is included for quizzes.
It is an array with one element for each question. Here's what it looks like for one of my recent quizzes with 4 questions composed of multiple dropdowns (a sketch of fetching this via the API follows the example).
"submission_data": [{
"correct": "partial",
"points": 1.5,
"question_id": 90432031,
"text": "",
"answer_for_tail": 25196,
"answer_id_for_tail": 25196,
"answer_for_distribution": 61563,
"answer_id_for_distribution": 61563,
"answer_for_method": 19986,
"answer_id_for_method": 19986,
"answer_for_conclusion": 10295,
"answer_id_for_conclusion": 10295,
"answer_for_claim": 13208,
"answer_id_for_claim": 13208,
"answer_for_groups": 28251,
"answer_id_for_groups": 28251,
"answer_for_type": 87115,
"answer_id_for_type": 87115,
"answer_for_ci": 15394,
"answer_id_for_ci": 15394
},
{
"correct": "partial",
"points": 1.5,
"question_id": 90432027,
"text": "",
"answer_for_evidence": 24718,
"answer_id_for_evidence": 24718,
"answer_for_groups": 4707,
"answer_id_for_groups": 4707,
"answer_for_action": 5303,
"answer_id_for_action": 5303,
"answer_for_method": 86834,
"answer_id_for_method": 86834,
"answer_for_mu": 44212,
"answer_id_for_mu": 44212,
"answer_for_distribution": 98118,
"answer_id_for_distribution": 98118,
"answer_for_result": 68307,
"answer_id_for_result": 68307,
"answer_for_pvalue": 64708,
"answer_id_for_pvalue": 64708,
"answer_for_type": 475,
"answer_id_for_type": 475,
"answer_for_tail": 5003,
"answer_id_for_tail": 5003
},
{
"correct": "partial",
"points": 1.5,
"question_id": 90432040,
"text": "",
"answer_for_groups": 55434,
"answer_id_for_groups": 55434,
"answer_for_method": 69146,
"answer_id_for_method": 69146,
"answer_for_mu": 29235,
"answer_id_for_mu": 29235,
"answer_for_distribution": 67257,
"answer_id_for_distribution": 67257,
"answer_for_conclusion": 99253,
"answer_id_for_conclusion": 99253,
"answer_for_pvalue": 55570,
"answer_id_for_pvalue": 55570,
"answer_for_tail": 36563,
"answer_id_for_tail": 36563,
"answer_for_type": 61334,
"answer_id_for_type": 61334
},
{
"correct": "partial",
"points": 2.25,
"question_id": 90432039,
"text": "",
"answer_for_method": 34332,
"answer_id_for_method": 34332,
"answer_for_type": 69963,
"answer_id_for_type": 69963,
"answer_for_conclusion": 43630,
"answer_id_for_conclusion": 43630,
"answer_for_alternative": 65351,
"answer_id_for_alternative": 65351,
"answer_for_tail": 69714,
"answer_id_for_tail": 69714,
"answer_for_df": 49660,
"answer_id_for_df": 49660,
"answer_for_groups": 74851,
"answer_id_for_groups": 74851,
"answer_for_distribution": 62939,
"answer_id_for_distribution": 62939
}
]
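Here is a short Python sketch of pulling that out, assuming the Submissions API with include[]=submission_history. The course and assignment ids are placeholders; note that you pass the assignment id backing the quiz, not the quiz id, since these are assignment submissions.

import requests

BASE_URL = "https://yourschool.instructure.com/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}
COURSE_ID = 123        # placeholder
ASSIGNMENT_ID = 789    # placeholder: the assignment id backing the quiz

subs = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/submissions",
    headers=HEADERS,
    params={"include[]": "submission_history", "per_page": 100},
).json()

# Each submission carries its attempts; quiz attempts include submission_data.
for sub in subs:
    for attempt in sub.get("submission_history", []):
        for question in attempt.get("submission_data", []) or []:
            print(sub["user_id"], question["question_id"],
                  question.get("correct"), question.get("points"))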
This is awesome! Just tested it, and it works for text entry quizzes. Thanks, James, for sharing this info.