I've replied to this question in other places, but missed seeing it here. Sorry for the delay; to help make up for it, I'll try to include a more complete response than I've given elsewhere.
For classic quizzes, all of the responses for each quiz, across multiple students, can be obtained using the Submissions API. The trick is to add the query parameter include[]=submission_history and to use the assignment_id, not the quiz_id.
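For reference, here is a minimal sketch of that call in Python with the requests library; the host name, token, course ID, and assignment ID below are placeholders you would replace with your own values.

import requests

# Placeholders -- use your own Canvas host, access token, and IDs.
BASE_URL = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}
COURSE_ID = 1234
ASSIGNMENT_ID = 5678

# List assignment submissions, asking Canvas to include the submission history.
r = requests.get(
    f"{BASE_URL}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/submissions",
    headers=HEADERS,
    params={"include[]": "submission_history", "per_page": 100},
)
r.raise_for_status()
submissions = r.json()

for sub in submissions:
    print(sub["user_id"], "has", len(sub.get("submission_history", [])), "attempt(s)")

Canvas paginates the results, so for a larger class you would follow the Link response headers to fetch the remaining pages.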
The answers are coded: a multiple-choice response doesn't give the actual response, it gives a code (the answer_id) for the response, which you then compare against the answers defined for the question to identify what the student chose.
When you include the submission history, you get an array called submission_history with one element for each submission made (multiple attempts means multiple submissions). Each of those elements includes a property called submission_data, which is an array with one entry per question. The entries appear to be in the order the questions were asked on the quiz, but I won't state that definitively because in the example I'm looking at I answered the questions in the order they were presented.
For example, here's what a response looks like:
{
  "submission_data": [
    {
      "correct": true,
      "points": 2,
      "question_id": 61125947,
      "answer_id": "5172",
      "text": "red",
      "more_comments": ""
    },
    {
      "correct": "partial",
      "points": 2,
      "question_id": 61125950,
      "text": "",
      "answer_for_verb": "saw",
      "answer_id_for_verb": "1423",
      "answer_for_noun": "texas",
      "answer_id_for_noun": null,
      "more_comments": ""
    },
    {
      "correct": "partial",
      "points": 2.4,
      "question_id": 61125956,
      "text": "",
      "answer_3288": "1",
      "answer_6891": "0",
      "answer_549": "1",
      "answer_5821": "0",
      "answer_5900": "0",
      "more_comments": ""
    },
    {
      "correct": false,
      "points": 0,
      "question_id": 61125957,
      "text": "",
      "answer_for_noun": 1572,
      "answer_id_for_noun": 1572,
      "answer_for_food": 6061,
      "answer_id_for_food": 6061,
      "more_comments": ""
    },
    {
      "correct": true,
      "points": 3,
      "question_id": 61125983,
      "answer_id": 4705,
      "text": "0.1250",
      "more_comments": ""
    },
    {
      "correct": true,
      "points": 2,
      "question_id": 61125986,
      "answer_id": 8243,
      "text": "11.0000",
      "more_comments": ""
    },
    {
      "correct": "defined",
      "points": 0,
      "question_id": 61125987,
      "text": "<p>peace</p>",
      "more_comments": ""
    },
    {
      "correct": "defined",
      "points": 0,
      "question_id": 61125988,
      "attachment_ids": null,
      "more_comments": ""
    }
  ]
}
Here is what was in the quiz:
- Fill in the blank question where the student typed "red".
- Fill in multiple blanks question that had two blanks called "verb" and "noun". The student's responses were "saw" and "texas". Since answer_id_for_verb has a value, "saw" was one of the accepted responses; answer_id_for_noun is null, meaning "texas" didn't match any accepted response.
- Multiple answers question where the student selected answer_3288 and answer_549. The student did not select answer_6891 (which was correct) or answer_5821 or answer_5900 (both of which were incorrect). There is no way to know from just this data which answers were correct, but the answers appear in the order they were presented to the student.
- Multiple dropdowns question. This works like the fill in multiple blanks question where there were two items called "noun" and "food". The existence of an id for the answer_id_for_xxx means they got xxx correct.
- Numeric question response where the student responded with 0.1250. Technically, the student typed 0.125 and Canvas added the fourth decimal place.
- Formula question where the student answered 11 and Canvas padded it with four zeros.
- Essay question where the student typed "peace". It shows 0 points because the question has to be manually graded.
- File upload question that the student didn't answer; it also has to be manually graded. You can see from the data that it would include the attachment_ids for any uploaded files.
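Putting that together, here is a rough sketch (my own helper, not anything from the API) of how you could walk the submission_data entries and tell those answer shapes apart:

def summarize_submission_data(submission):
    # Expects one submission from the Submissions API fetched with
    # include[]=submission_history.
    for attempt in submission.get("submission_history", []):
        for entry in attempt.get("submission_data", []):
            qid = entry["question_id"]
            if "answer_id" in entry:
                # Multiple choice, fill in the blank, numeric, formula, etc.
                print(qid, "answer_id:", entry["answer_id"], "text:", entry.get("text"))
            elif any(k.startswith("answer_id_for_") for k in entry):
                # Fill in multiple blanks / multiple dropdowns: one pair of
                # answer_for_xxx / answer_id_for_xxx keys per blank.
                blanks = {k[len("answer_for_"):]: v
                          for k, v in entry.items() if k.startswith("answer_for_")}
                print(qid, "blanks:", blanks)
            elif any(k.startswith("answer_") for k in entry):
                # Multiple answers: answer_<id> keys, "1" selected, "0" not selected.
                selected = [k for k, v in entry.items()
                            if k.startswith("answer_") and v == "1"]
                print(qid, "selected:", selected)
            else:
                # Essay or file upload: text or attachment_ids, manually graded.
                print(qid, "text:", entry.get("text"),
                      "attachments:", entry.get("attachment_ids"))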
I just realized that my quiz, which was designed to test all the different question types, doesn't have any multiple choice questions on it. Can you tell I teach math and not a social science? Anyway, a multiple choice response includes an answer_id property whose value matches one of the possible answers defined for the question.
All of what I wrote is explained in the documentation, but in the appendix at the bottom of the Quiz Submission Questions API page. The placement of the information is odd since you can't actually get the student responses from that API. You can get the ID of the quiz submission from the Quiz Submissions API (use the id field) and then use that ID with the Quiz Submission Questions API.
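Here is a rough sketch of that two-step lookup, again with placeholder values; the endpoint paths come from the Quiz Submissions and Quiz Submission Questions documentation:

import requests

BASE_URL = "https://canvas.example.edu/api/v1"            # placeholder
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}   # placeholder
COURSE_ID, QUIZ_ID = 1234, 91011                          # placeholders

# Step 1: Quiz Submissions API -- grab the id field of each quiz submission.
r = requests.get(f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/submissions",
                 headers=HEADERS)
r.raise_for_status()
quiz_submissions = r.json()["quiz_submissions"]

# Step 2: Quiz Submission Questions API -- use that id to get the questions.
for qs in quiz_submissions:
    r = requests.get(f"{BASE_URL}/quiz_submissions/{qs['id']}/questions",
                     headers=HEADERS)
    r.raise_for_status()
    questions = r.json()["quiz_submission_questions"]
    print("quiz submission", qs["id"], "has", len(questions), "questions")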
A much easier way to get the information about each question is to use the Quiz Questions API and its "list questions in a quiz or a submission" endpoint. It returns the questions for the whole quiz rather than for individual students, so it involves fewer API calls. This uses the quiz_id, not the assignment_id. It works most of the time, but it has issues showing information for questions that are linked to a question bank. It may also have problems if the responses were modified (a mistake was corrected), but in those cases it does allow you to specify a quiz_submission_id and quiz_submission_attempt.
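As a sketch of that approach (same placeholder setup as the earlier snippets), you can pull the question definitions once per quiz and build a lookup that decodes the answer_id values from the submission data:

import requests

BASE_URL = "https://canvas.example.edu/api/v1"            # placeholder
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}   # placeholder
COURSE_ID, QUIZ_ID = 1234, 91011                          # placeholders

# "List questions in a quiz or a submission" -- note the quiz_id in the path.
# If a response was regraded, you can also pass quiz_submission_id and
# quiz_submission_attempt in the params to get that attempt's questions.
r = requests.get(f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}/questions",
                 headers=HEADERS, params={"per_page": 100})
r.raise_for_status()

# Map (question_id, answer_id) to the answer text. answer_id shows up as a
# string for some question types and an integer for others, so normalize to str.
answer_text = {}
for q in r.json():
    for ans in q.get("answers") or []:
        answer_text[(q["id"], str(ans["id"]))] = ans.get("text")

# With a submission_data entry in hand, this turns the code back into the answer:
# answer_text.get((entry["question_id"], str(entry["answer_id"])))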