Just now I was thinking it sure would be nice if CoPilot could analyze my Access database and give me tips on things. So I tried an experiment.
I asked CoPilot if it could create databases or if I could upload my database for it to analyze.
It thought it would be able to do that if I could upload an ACCDB or MDB file. It said it couldn’t create an Access database for me directly, but that it could analyze my upload and help me modify it or create a new ACCDB file by walking me through the steps.
Ok, so far so good. I tried uploading an ACCDB file directly, but it wouldn’t let me. That file type was not accepted.
Next I thought maybe I could send it my folder of exported Access object files I use for versioning. When I use OASIS-SVN I export all the objects to a source folder. It said I could zip up all the files in that folder and send them that way. But alas, that was incorrect. You cannot upload zip files.
At this point, I figured maybe I could just select all the files and upload them, but unfortunately there were two issues: I was not allowed to upload more than three files at once, and the file extensions were not acceptable for upload, even though they are just text files.
Not to be stymied, I continued my quest. How about if I merged all the text together into one file, with each file’s content preceded by its file name? I figured that way I’d be able to upload a single file with a txt extension that had everything in it. CoPilot thought that was a great idea and helpfully provided me with a script to consolidate all the file contents into a single txt file. Ok, now we’re getting somewhere.
Next I dutifully saved the script in the folder above the source folder because I didn’t want it to include itself in the final output. I had to ask it to update the script to use the source subfolder. CoPilot provided a modified script. Woo-hoo.
I opened PowerShell, navigated to the folder I wanted to run the script in, and attempted to execute it. But my machine can’t execute scripts! I’ve dealt with this before, and I think a recent update may have reset the execution policy. CoPilot gave me instructions on how to either temporarily bypass the restriction or remove it entirely, since I had admin rights.
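For anyone who hits the same wall, the options boil down to something like the following (the exact commands CoPilot gave me may have been worded a little differently, and the script name here is just a placeholder):

```powershell
# Option 1: bypass the execution policy for this one run only
powershell.exe -ExecutionPolicy Bypass -File .\CombineSourceFiles.ps1

# Option 2: relax the machine-wide policy for good (needs an elevated/admin prompt)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine
```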
Ok, so now I was able to run the script. But it errored out and couldn’t find the content of the files. It turned out the script didn’t account for special characters or spaces in the file names. CoPilot gave me an updated copy that did, and finally, it worked!
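I didn’t keep the exact script CoPilot produced, but the final working version was roughly along these lines (the folder names, output file name, and header format are placeholders, not necessarily what it generated):

```powershell
# Combine every exported Access object file under .\Source into one .txt file,
# writing a header line with the file name ahead of each file's contents.
$sourceFolder = Join-Path (Get-Location) 'Source'              # my OASIS-SVN export folder
$outputFile   = Join-Path (Get-Location) 'CombinedSource.txt'  # the single file to upload

# Start fresh so a rerun doesn't append to an old copy
if (Test-Path -LiteralPath $outputFile) { Remove-Item -LiteralPath $outputFile }

Get-ChildItem -LiteralPath $sourceFolder -File -Recurse | ForEach-Object {
    # Header so CoPilot can tell where each object's export begins
    Add-Content -LiteralPath $outputFile -Value ("===== FILE: {0} =====" -f $_.Name)

    # -LiteralPath is what fixed the special-character problem in the file names
    Get-Content -LiteralPath $_.FullName -Raw | Add-Content -LiteralPath $outputFile
}
```

The key fix from the last round was using -LiteralPath everywhere, so special characters in the exported object names aren’t treated as wildcards.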
It created a huge 11 MB file that combined all my source exports into one file, with a header ahead of each file’s content giving the file name.
And now I was finally able to upload that single file (yes, it uploaded, even at 11 MB, although it transparently put the file on my OneDrive and actually pulled it from there). CoPilot was able to analyze the file and answer my question about improving efficiency. Check out the final results after all my hard work:
My prompt:
Yay, it’s working. Um… let’s see, I forgot my question. Oh yeah, so when I open the db it automatically runs a routine to import files. This routine takes a long time. Can you give me some tips on how I could speed it up?
CoPilot’s answer:
Thanks for your patience! Based on the analysis of your uploaded file, here’s what I found regarding the routine that runs automatically on database open to import files:
🔍 What’s Happening on Startup
The routine that runs on startup appears to be:
Public Sub LoadGentranFilesToDb()
This subroutine is defined in the module `M_EDI_GentranFileLoader` and is responsible for:
- Opening the `GentranFiles` table.
- Iterating through file paths defined in `APPopt.ImportFolders`.
- Checking for new files not already in the database.
- Calling `ProcessFile` for each new file, which:
  - Validates the file.
  - Updates file properties and hash.
  - Stages the file.
  - Updates or inserts records in the database.
🐢 Why It Might Be Slow
Here are some common performance bottlenecks in routines like this:
- Large Number of Files: If the import folder contains many files, especially large ones, it will take time to process each.
- Repeated Hashing and Property Updates: `UpdateFilePropertiesAndHash` and `StageFile` might be doing intensive I/O or string parsing.
- Database Lookups: `rsGentran.FindFirst` is called for each file; this can be slow if the table is large and not indexed properly.
- No Parallelism: Everything is done sequentially.
- Unnecessary Reprocessing: If files are rechecked even when unchanged, that adds overhead.
🚀 Tips to Speed It Up
Here are some actionable ideas:
✅ 1. Index the `GentranFiles` Table
Ensure that `FileName` and `FilePath` are indexed. This will speed up the `FindFirst` operation.
✅ 2. Skip Already Processed Files Earlier
Instead of using `FindFirst` on a recordset, consider using a dictionary or hash set to store known file names and paths for faster lookup.
✅ 3. Avoid Rehashing Unchanged Files
If the file’s `LastModified` timestamp hasn’t changed, skip reprocessing it.
✅ 4. Batch File Processing
If possible, process files in batches or use `CurrentDb.Execute` with `INSERT INTO ... SELECT` for bulk operations.
✅ 5. Log and Profile
Add logging around:
- File count
- Time per file
- Time per major step (e.g., hashing, staging, DB update)
This will help identify the slowest parts.
Would you like help rewriting or optimizing part of the `LoadGentranFilesToDb` or `ProcessFile` routine? I can suggest a refactored version or help you implement some of these improvements.