Canvas Data 2 DAP Client initdb issues (MySQL)
@Edina_Tipter and @LeventeHunyadi Since there is no Git issue tracking for this project, I will post my issue here.
Below are descriptions of the errors that occur when running the `initdb` command over a MySQL connection, and how I resolved them, for the following tables:
assessment_questions
Error message:
(pymysql.err.OperationalError) (1629, "Comment for field 'question_data' is too long (max = 1024)")
[SQL:
CREATE TABLE canvas_data2.assessment_questions (
id BIGINT NOT NULL COMMENT 'The unique identifier for the Assessment Question.' AUTO_INCREMENT,
name TEXT COMMENT 'Name of the question.',
deleted_at DATETIME COMMENT 'The time the question was deleted. If the question has not been deleted the value will be NULL.',
created_at DATETIME COMMENT 'Time when the quiz question was created.',
updated_at DATETIME COMMENT 'Time when the quiz question was last updated.',
workflow_state ENUM('active','independently_edited','deleted','__dap_unspecified__') COMMENT 'Life-cycle state for the quiz question.',
context_id BIGINT COMMENT 'The unique identifier for the question bank''s context (account or course). No longer used as Canvas delegates to `context_id` of the associated AssessmentQuestion.',
context_type VARCHAR(255) COMMENT 'The type of context the question bank is associated with. No longer used as Canvas delegates to the context of the associated AssessmentQuestion.',
question_data TEXT COMMENT 'A variety of question data related points such as: `question_type` (denotes the type of the question. Possible values are `calculated_question`, `essay_question`, `file_upload_question`, `fill_in_multiple_blanks_question`, `matching_question`, `multiple_answers_question`, `multiple_choice_question`, `multiple_dropdowns_question`, `numerical_question`, `short_answer_question`, `text_only_question` and `true_false_question`); `question_text` (text of the question); `regrade_option` (denotes if regrading is available for the question. Possible values are `available` and `unavailable` for question types `multiple_answers_question`, `multiple_choice_question`, `true_false_question` and NULL for others. Defaults to `available` for the allowed question types and NULL for the rest); `correct_comments` (comments to be displayed if the student answers the question correctly); `incorrect_comments` (comments to be displayed if the student answers the question incorrectly); `neutral_comments` (comments to be displayed regardless of how the student answers the question); answers (JSON array of all possible answers).',
assessment_question_bank_id BIGINT COMMENT 'The unique identifier for the assessment question bank this question is associated with.',
migration_id VARCHAR(255) COMMENT 'The unique identifier of the migration that imported this assessment question.',
position INTEGER COMMENT 'The position of the question.',
PRIMARY KEY (id)
)
]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
Resolution:
After debugging the DAP client source code, I resolved the issue by truncating any column comment longer than 1024 characters with the following patch:
# .venv/lib/python3.11/site-packages/dap/plugins/mysql/init_processor.py
def _create_tables(conn: Connection, table_def: Table) -> None:
inspector: Inspector = inspect(conn)
# New code start
    for col in table_def.columns:
# Truncate the column comment to 1024 characters
if col.comment is not None and len(col.comment) > 1024:
col.comment = col.comment[:1024]
# New code end
if table_def.schema is not None and not inspector.has_schema(table_def.schema):
conn.execute(CreateSchema(table_def.schema)) # type: ignore
table_def.metadata.create_all(conn)
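For anyone who wants to verify the truncation behavior outside the DAP client, here is a minimal, self-contained sketch. The helper name `truncate_long_comments` is mine, not part of the DAP client; the 1024-character cap is MySQL's documented limit for column comments.

```python
# Sketch: clip SQLAlchemy column comments that exceed MySQL's
# 1024-character limit for column comments. The helper name
# `truncate_long_comments` is illustrative, not part of the DAP client.
from sqlalchemy import Column, MetaData, Table, Text

MYSQL_MAX_COLUMN_COMMENT = 1024  # documented MySQL limit


def truncate_long_comments(table: Table, limit: int = MYSQL_MAX_COLUMN_COMMENT) -> None:
    """Clip any column comment longer than `limit` characters, in place."""
    for col in table.columns:
        if col.comment is not None and len(col.comment) > limit:
            col.comment = col.comment[:limit]


# Usage: build a table with an oversized comment, then clip it.
metadata = MetaData()
table = Table(
    "assessment_questions",
    metadata,
    Column("question_data", Text, comment="x" * 2000),
)
truncate_long_comments(table)
print(len(table.c.question_data.comment))  # 1024
```

Mutating the comment before `metadata.create_all(conn)` is enough, because SQLAlchemy only reads the comment when it emits the `CREATE TABLE` DDL.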
web_conferences
Error message:
(pymysql.err.OperationalError) (1291, "Column 'conference_type' has duplicated value 'DimDim' in ENUM")
[SQL:
CREATE TABLE canvas_data2.web_conferences (
id BIGINT NOT NULL COMMENT 'The ID of the conference.' AUTO_INCREMENT,
user_id BIGINT NOT NULL COMMENT 'The ID of the user that created the conference.',
created_at DATETIME NOT NULL COMMENT 'Timestamp when record was created.',
updated_at DATETIME NOT NULL COMMENT 'Timestamp when record was updated.',
context_id BIGINT NOT NULL COMMENT 'The ID of this conference''s context.',
context_type ENUM('Account','Course','Group','__dap_unspecified__') NOT NULL COMMENT 'The type of this conference''s context.',
start_at DATETIME COMMENT 'The date the conference started at, NULL if it hasn''t started.',
end_at DATETIME COMMENT 'The date that the conference ended at, NULL if it hasn''t ended.',
context_code VARCHAR(255) COMMENT 'A string identifying the context.',
started_at DATETIME COMMENT 'The time at which this conference actually started at, NULL if it hasn''t started.',
user_ids VARCHAR(255) COMMENT 'Array of user ids that are participants in the conference.',
ended_at DATETIME COMMENT 'The time at which this conference actually ended, NULL if it hasn''t ended.',
recording_ready BOOL COMMENT 'Whether the conference''s recording has been processed.',
conference_type ENUM('AdobeConnect','BigBlueButton','CiscoWebex','DimDim','Wimba','Wiziq','dimdim','__dap_unspecified__') NOT NULL COMMENT 'The type of conference.',
conference_key VARCHAR(255) COMMENT 'The 3rd party''s ID for the conference.',
description TEXT COMMENT 'The description for the conference.',
duration DOUBLE COMMENT 'The expected duration the conference is supposed to last.',
settings TEXT COMMENT 'Settings for the given conference.',
title VARCHAR(255) NOT NULL COMMENT 'The title of the conference.',
uuid VARCHAR(255) COMMENT 'A unique ID used for integrations.',
PRIMARY KEY (id)
)
]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
Resolution:
I changed the schema's default collation to a case-sensitive one so that the `DimDim` and `dimdim` ENUM values no longer collide. Here is the statement I ran for my schema (canvas_data2):
alter schema canvas_data2 default collate = 'utf8mb4_0900_as_cs';
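For context, MySQL's default `utf8mb4_0900_ai_ci` collation is case-insensitive, so `DimDim` and `dimdim` compare as equal and the ENUM definition is rejected. A small sketch of that collision check (pure Python, approximating case-insensitive comparison with `str.casefold()`; the function name is mine):

```python
# Sketch: find ENUM members that would be treated as duplicates under a
# case-insensitive collation, approximated here with str.casefold().
# With a case-sensitive collation (e.g. utf8mb4_0900_as_cs), 'DimDim'
# and 'dimdim' stay distinct and the CREATE TABLE succeeds.
from collections import defaultdict


def case_insensitive_collisions(values: list[str]) -> list[tuple[str, ...]]:
    """Group ENUM values that collide when compared case-insensitively."""
    groups: dict[str, list[str]] = defaultdict(list)
    for v in values:
        groups[v.casefold()].append(v)
    return [tuple(g) for g in groups.values() if len(g) > 1]


conference_types = [
    "AdobeConnect", "BigBlueButton", "CiscoWebex",
    "DimDim", "Wimba", "Wiziq", "dimdim", "__dap_unspecified__",
]
print(case_insensitive_collisions(conference_types))  # [('DimDim', 'dimdim')]
```

A check like this could let the client detect colliding ENUM values up front and either deduplicate them or warn that a case-sensitive collation is required.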
It would be awesome to have both of these issues resolved in a future DAP client release so that I do not need to manually patch code or change the schema's default collation. Thank you for producing this client library; I look forward to future updates!