Problems with content_tags table (CD2)

seba-ortiz
Community Member

Hi everyone. I connect to and initialize/synchronize CD2 tables using the instructure-dap-client library in Python. Everything was going well until I tried to initialize the 'content_tags' table. I've been getting the error below for several days and can't find its cause. Has anyone experienced this before?

UTC [3538] ERROR: COPY from stdin failed: 'content_id'

UTC [3538] STATEMENT: COPY "canvas"."content_tags"("id", "created_at", "updated_at", "workflow_state", "context_id", "context_type", "context_code", "comments", "migration_id", "content_id", "tag_type", "context_module_id", "learning_outcome_id", "mastery_score", "rubric_association_id", "associated_asset_id", "associated_asset_type", "link_settings", "new_tab", "position", "content_type", "url", "title") FROM STDIN (FORMAT binary)
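
For reference, the synchronization is set up roughly like this (a minimal sketch following the instructure-dap-client documentation; the connection string and environment variable names are placeholders):

import asyncio
import os

from dap.api import DAPClient
from dap.dap_types import Credentials
from dap.integration.database import DatabaseConnection
from dap.replicator.sql import SQLReplicator

async def main() -> None:
    # Credentials come from a DAP API key pair (placeholder env vars).
    credentials = Credentials.create(
        client_id=os.environ["DAP_CLIENT_ID"],
        client_secret=os.environ["DAP_CLIENT_SECRET"],
    )
    # Placeholder Postgres connection string.
    db_connection = DatabaseConnection("postgresql://user:password@localhost/canvas_data")
    async with DAPClient(base_url=os.environ["DAP_API_URL"], credentials=credentials) as session:
        # Same call chain as in the traceback below: SQLReplicator.initialize
        # fetches the table snapshot and COPYs it into Postgres.
        await SQLReplicator(session, db_connection).initialize("canvas", "content_tags")

asyncio.run(main())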

 

UPDATE: I tried initializing the table from the command line (dap initdb) instead, and it gives me the following log and error:

2023-10-27 14:19:39,439 - INFO - Query started with job ID: c5c1fa04-d330-4862-b4ee-7a98a51987f4
2023-10-27 14:19:39,440 - INFO - Query job still in status: waiting. Checking again in 5 seconds...
2023-10-27 14:19:45,207 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:19:51,145 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:19:57,177 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:20:02,996 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:20:08,626 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:20:14,358 - INFO - Query job still in status: running. Checking again in 5 seconds...
2023-10-27 14:20:21,048 - INFO - Data has been successfully retrieved:
{"id": "c5c1fa04-d330-4862-b4ee-7a98a51987f4", "status": "complete", "expires_at": "2023-10-28T17:19:38Z", "objects": [{"id": "c5c1fa04-d330-4862-b4ee-7a98a51987f4/part-00000-99592007-88c4-4526-a993-85da3b68e704-c000.json.gz"}, {"id": "c5c1fa04-d330-4862-b4ee-7a98a51987f4/part-00005-99592007-88c4-4526-a993-85da3b68e704-c000.json.gz"}], "schema_version": 1, "at": "2023-10-27T16:22:06Z"}
2023-10-27 14:20:22,986 - INFO - Downloading [object 2/2 - job c5c1fa04-d330-4862-b4ee-7a98a51987f4]
2023-10-27 14:20:23,269 - INFO - Downloading [object 1/2 - job c5c1fa04-d330-4862-b4ee-7a98a51987f4]
2023-10-27 14:20:23,886 - ERROR - 'content_id'
Traceback (most recent call last):
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\__main__.py", line 133, in console_entry
    main()
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\__main__.py", line 125, in main
    asyncio.run(dapCommand.execute(args))
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\asyncio\base_events.py", line 642, in run_until_complete
    return future.result()
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\commands\commands.py", line 31, in execute
    executed = await super().execute(args)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\commands\base.py", line 49, in execute
    if await subcommand.execute(args):
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\commands\base.py", line 45, in execute
    await self._execute_impl(args)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\commands\initdb_command.py", line 31, in _execute_impl
    await init_db(
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\actions\init_db.py", line 16, in init_db
    await SQLReplicator(session, db_connection).initialize(
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\replicator\sql.py", line 67, in initialize
    await client.download(
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\downloader.py", line 83, in download
    await wait_n(
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\concurrency.py", line 49, in wait_n
    raise exc
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\downloader.py", line 81, in logged_download_and_save
    await self._download(db_lock, context_aware_object, processor=processor)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\downloader.py", line 113, in _download
    await processor.process(obj, records)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\integration\base_processor.py", line 19, in process
    await self.process_impl(obj, self._convert(records))
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\plugins\postgres\init_processor.py", line 63, in process_impl
    await self._db_connection.execute(
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\integration\base_connection.py", line 41, in execute
    return await query(self._raw_connection)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\plugins\postgres\queries.py", line 26, in __call__
    return await self._query_func(asyncpg_conn)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\asyncpg\connection.py", line 983, in copy_records_to_table
    return await self._protocol.copy_in(
  File "asyncpg\protocol\protocol.pyx", line 525, in copy_in
  File "asyncpg\protocol\protocol.pyx", line 442, in asyncpg.protocol.protocol.BaseProtocol.copy_in
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\plugins\postgres\init_processor.py", line 81, in _convert_records
    yield tuple(converter(record) for converter in self._converters)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\plugins\postgres\init_processor.py", line 81, in <genexpr>
    yield tuple(converter(record) for converter in self._converters)
  File "c:\users\usuario\appdata\local\programs\python\python39\lib\site-packages\dap\conversion_common_json.py", line 82, in <lambda>
    return lambda record_json: record_json["value"][column_name]
KeyError: 'content_id'
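
The final frame shows what happens: the per-column converter indexes each record's "value" object directly, so a record whose JSON omits content_id raises KeyError before anything reaches Postgres. A minimal illustration (the record shapes are hypothetical; only the lambda mirrors the library code):

# Mirrors the converter built in dap/conversion_common_json.py: it indexes
# the record's "value" object directly, so a missing field raises KeyError.
converter = lambda record_json: record_json["value"]["content_id"]

# Hypothetical records in the key/value layout seen in the traceback.
ok_record = {"key": {"id": 1}, "value": {"content_id": 42, "tag_type": "context_module"}}
bad_record = {"key": {"id": 2}, "value": {"tag_type": "context_module"}}  # no content_id

print(converter(ok_record))  # 42
try:
    converter(bad_record)
except KeyError as exc:
    print(f"KeyError: {exc}")  # KeyError: 'content_id'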

Solution
LeventeHunyadi
Instructure

The column content_id in the table canvas.content_tags is marked as required (i.e. it cannot have a NULL value), but the query output is likely missing this field in the corresponding JSON object (i.e. the field is effectively NULL). You should get in touch with support and include your root account UUID and the job ID so that the team can look into the issue.
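
Before filing the ticket, one way to confirm the diagnosis is to scan a downloaded part file for records that lack the field (a sketch; the file name is taken from the job output above, and the key/value record layout is assumed from the traceback):

import gzip
import json

# Hypothetical local copy of one of the job's output objects.
part_file = "part-00000-99592007-88c4-4526-a993-85da3b68e704-c000.json.gz"

# Each line is one JSON record; the converter in the traceback reads
# record["value"][column_name], so look for records missing "content_id".
with gzip.open(part_file, "rt", encoding="utf-8") as f:
    for line_number, line in enumerate(f, start=1):
        record = json.loads(line)
        if "content_id" not in record.get("value", {}):
            print(f"line {line_number}: missing content_id -> {record.get('key')}")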

