I think you're convincing me not to update to the latest version of the CLI tool. I'm running version 0.2.2 from May 2016 that doesn't delete my requests file after downloading.
As far as the incremental nature goes, I only load the incremental files added since the last time I loaded anything. I generally load only about a week's worth of the requests table, since I want something small to develop against rather than the entire table. Sometimes I pre-process the requests files to strip out the ID column, and sometimes the user_agent string as well, depending on what I'm working on at the time.
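The pre-processing is nothing fancy. Something like this is the general idea (a rough sketch, not my actual script; the column positions, especially user_agent, are guesses you would need to check against the requests schema):

```
# A rough sketch of the pre-processing, not my actual script.
# Assumes the flat files are gzipped tab-delimited text with the id
# in column 1; the user_agent position (18 here) is just a guess, so
# check the requests schema for your schema version first.
for f in *_requests-*.gz; do
  gunzip -c "$f" | cut -f 2-17,19- | gzip > "trimmed_$f"
done
```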
I don't use the CLI tool for anything other than downloading the files. I've got my own BASH script that queries the MySQL installation and only loads files newer than the last version recorded there. The MySQL script I use to create the tables also creates a version table to hold the last version loaded.
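In case it helps to picture it, the version check boils down to something like this (just a sketch; the database, table, and column names are made up for illustration and are not the real import.sh):

```
# Sketch of the "only load files newer than the last version" check.
# canvas_data, table_versions, table_name, and version are placeholder
# names, not the actual schema.
LAST=$(mysql -N -B canvas_data \
  -e "SELECT version FROM table_versions WHERE table_name='requests'")
for f in *_requests-*.gz; do
  ver=${f%%_*}                      # the leading number in the filename
  if [ "$ver" -gt "$LAST" ]; then
    echo "would load $f (version $ver > last loaded $LAST)"
  fi
done
```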
My last download went up to version 482 of the requests table (filename: 482_requests-00000-2a7b048c.gz), so if I wanted to load just a week's worth I would set the last version loaded to an appropriate value before that. If I want to skip the requests table completely and just update everything else (useful after schema changes, when I'm not using the requests data at that particular time), I set the version to 1000 (something bigger than anything I have) so the script skips it. For the requests table, the script doesn't build one huge flat file first; it loads the files one at a time, which makes incremental updates a lot easier and avoids wasting the space needed to keep a big uncompressed flat file around.
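The one-at-a-time load is basically this pattern (again a sketch with placeholder database and table names, and it assumes local_infile is enabled on the server):

```
# Sketch of loading each gzipped file individually instead of building
# one big flat file first; only one uncompressed temp file exists at a
# time. Database/table names are placeholders.
for f in *_requests-*.gz; do
  tmp=$(mktemp)
  gunzip -c "$f" > "$tmp"
  mysql --local-infile=1 canvas_data -e \
    "LOAD DATA LOCAL INFILE '$tmp' INTO TABLE requests
     FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';"
  rm -f "$tmp"
done
```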
I'm not saying that's the best way to do it, just what I've worked out. The MySQL schema and import.sh files are available at canvancement/canvas-data/mysql on GitHub. I'd consider them starter scripts for getting your Canvas Data into something usable quickly so you can start playing around, but at some point you'll want to customize them for your installation. I have found that adding some extra indexes can really speed things up.
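By extra indexes I mean something along these lines (the column is just an example; index whatever your queries actually filter on):

```
# The exact indexes depend on your queries; this one on user_id is just
# an example of the kind of thing I mean, not a recommendation.
mysql canvas_data -e "ALTER TABLE requests ADD INDEX idx_user_id (user_id);"
```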
I would not even consider doing a full reload of the requests table every day -- we were given an old retired server with only 4GB of RAM to work with and added some new hard drives. A full update of everything except requests, plus an incremental update of a week's worth of the requests table, just took 38:15 to run. 2:49 of that was for 173MB of requests table files, which is only about 0.87% of the total requests data I would have to import on a full load. Assuming a similar rate for the rest, a full import of just the requests table would take about 5.4 hours, and it would only grow each day. For reference, we have 20.8 GB of Canvas Data flat files, 19.8 GB of which is the requests table. We're not a large school, but we have been using Canvas since Fall 2012.