Hello,
We are new to Canvas LMS at the University of San Diego. I have successfully configured the DAP CLI and created and updated the tables in Postgres by running the dap initdb and dap syncdb commands manually. Now I would like to schedule a batch file to refresh the tables nightly. Can you please guide me on how to achieve this?
Thanks in advance.
Satish
You will need to trigger the syncdb command on a regular schedule (at most once every 2-4 hours). (Never call initdb on a regular basis.) The list command will return all table names for a given namespace (e.g. canvas). There are three namespaces in total: canvas, canvas_logs and catalog. The client library documentation has some examples that may help.
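If it helps, the scheduling rule above can be captured in a small helper that your scheduled job calls once per table. The three namespace names come from the post; the function name and argv layout are just an illustrative sketch, not an official API:

```python
# Sketch: build the argv for one scheduled `dap syncdb` run.
# Only syncdb is ever scheduled; initdb is a one-time setup step.
NAMESPACES = ("canvas", "canvas_logs", "catalog")

def build_syncdb_argv(namespace, table):
    """Return the command for one syncdb invocation (never schedule initdb)."""
    if namespace not in NAMESPACES:
        raise ValueError("unknown namespace: " + namespace)
    return ["dap", "syncdb", "--namespace", namespace, "--table", table]
```

Passing the command as a list (rather than one shell string) avoids quoting problems when a scheduler runs it.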
Thank you for your response. How do I trigger syncdb? I tried the sample Python scripts and I am getting all sorts of errors. Is it possible to call the syncdb commands from a single PowerShell script (.ps1), like multiple lines of syncdb commands, one for each table? If there is any documentation, can you share it?
Hello,
We are using Python to refresh our tables nightly via a cron job.
https://github.com/cedarville-university/canvas-auto-shifter
Nice work on the synchronizer @ByambaBird .
We need a more out-of-the-box automation. I "crafted" the script below, but it keeps slamming the server with dap commands and spiking our CPU every time it runs. Any advice on improving our script (run by a cron job every 4 hours)?
Each table is initialized manually before adding it to this script to sync.
import subprocess

cd2_tables = ['account_users','accounts','assignments'...]

for x in cd2_tables:
    p1 = subprocess.run(
        'dap --loglevel info --logfile /home/canvas/dap_syncdb/dap_syncdb.log syncdb --namespace canvas --table ' + str(x),
        shell=True, text=True)
I use shell and text to view the output when I run the process manually. Otherwise, it runs every 4 hours as a cron job.
Hello @ptart
What is included in your subprocess.run()?
If your tables are already built, are you running a dap.replicator.sql.synchronize()?
We use that for syncing our already existing tables.
But because we run these in bulk asynchronously, it does take a lot of CPU. What are the specs of the server you are running this on?
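For the bulk-async CPU issue mentioned above, one mitigation is to bound the concurrency with a semaphore instead of launching every table at once. In this sketch, `sync_one` stands in for whatever coroutine you use per table (for example, a wrapper around the client library's synchronize call); all names here are illustrative:

```python
import asyncio

async def sync_all(tables, sync_one, max_concurrent=2):
    """Run sync_one(table) for every table, but at most `max_concurrent`
    at a time, so a bulk async refresh does not saturate the CPU."""
    semaphore = asyncio.Semaphore(max_concurrent)

    async def guarded(table):
        async with semaphore:
            await sync_one(table)

    await asyncio.gather(*(guarded(t) for t in tables))
```

Raising `max_concurrent` trades CPU headroom for wall-clock time, so it can be tuned to the server's specs.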
Amazing, thank you for sharing this with me.
Hey @ByambaBird - nice job! Are you running this within Azure as a timer-triggered Function, or on a VM? I'd like to create an Azure PostgreSQL Flexible Server and run the refresh from an automation... but I keep going around in circles!
Hello @Pete5484
Theoretically it is possible to run the refresh from an automation, since it's all just Python.
We have not tried the Azure PostgreSQL Flexible Server or the Azure timer trigger Function, so I do not know for sure.
It's something that would be possible, just not what we are doing.
Looking at the timer trigger Function in Azure, it looks like you can include code, and since our code is Python you could adjust it to run within Azure. With the correct credentials and destination, you could have it point to the Azure PostgreSQL server.
I just created a simple PowerShell script and scheduled it with the Windows Task Scheduler. It is working well so far, but I will look at the Python script shared with me. Thank you.
Hello everyone, I would like to let you know that we have released DAP CLI 1.4.0 where the main feature is that you can define all or multiple tables for one DAP CLI run:
dap syncdb --namespace canvas --table all OR dap syncdb --namespace canvas --table content_tags,courses,submissions
I hope this helps reduce the complexity of your codebase and makes the DAP CLI easier to use.
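With 1.4.0, the per-table loop in the earlier scripts collapses into a single invocation. A minimal sketch of building that command from Python; the helper name is mine, only the CLI flags come from the announcement above:

```python
import subprocess

def build_bulk_syncdb(tables):
    """Build a single dap invocation covering many tables (DAP CLI >= 1.4.0).
    Pass ["all"] to sync every table in the namespace."""
    return ["dap", "syncdb", "--namespace", "canvas",
            "--table", ",".join(tables)]

# Example (not executed here):
# subprocess.run(build_bulk_syncdb(["content_tags", "courses", "submissions"]),
#                check=True)
```

One process instead of one per table should also ease the CPU-spike problem discussed earlier in the thread.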