Rescan updated directories
- missing images show up as 'skulls'
- images that were added to a folder after it was imported into DT are not shown
Currently this can only be solved by deleting the directory from DT and re-adding it manually.
It should be possible to rescan a folder (and its subfolders) from DT's UI.
A good place for the rescan option would be the folder's context menu (in the /folders/ view of the /collect images/ module).
A script to purge non-existing images can already be found here:
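For reference, the core logic of such a purge script can be sketched in Python (the shell script does the equivalent with the sqlite3 command-line tool). The database location and the `images`/`film_rolls` schema below are assumptions based on darktable's `library.db` and may differ between versions — treat this as an illustration, not a drop-in tool:

```python
import os
import sqlite3

# Assumed default location of darktable's library database.
DB_PATH = os.path.expanduser("~/.config/darktable/library.db")

def purge_missing(db_path):
    """Delete database rows whose image file no longer exists on disk.

    Returns the list of purged image ids. Assumes an 'images' table
    (id, film_id, filename) and a 'film_rolls' table (id, folder).
    """
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    # Join each image to its film roll to rebuild the full on-disk path.
    cur.execute(
        "SELECT images.id, film_rolls.folder, images.filename "
        "FROM images JOIN film_rolls ON images.film_id = film_rolls.id"
    )
    missing = [img_id for img_id, folder, name in cur.fetchall()
               if not os.path.exists(os.path.join(folder, name))]
    cur.executemany("DELETE FROM images WHERE id = ?",
                    ((i,) for i in missing))
    con.commit()
    con.close()
    return missing
```

As with the shell script, darktable must not be running while the database is modified.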
#1 Updated by Jesko N about 4 years ago
Additionally: after rescanning a directory, all empty film rolls in that directory (or its subdirectories) should be removed from the DB as well. (Currently, empty film rolls are displayed crossed out in the folders view, and it is not even possible to remove them manually via the "remove" option in a folder's context menu.)
#4 Updated by Joel T over 3 years ago
Could the behavior of purge_non_existing_images.sh be incorporated into re-importing a folder (but only for that folder)?
I use darktable on two machines, both of which access the same images from a common nfs share. In order to refresh changes from the other machine, I have to:
1. Purge missing images
a. Quit darktable
b. Run purge_non_existing_images.sh (open a terminal, etc.)
c. Re-open darktable
2. Refresh changes, add new images
a. Re-import folder
Instead, the desired behavior should be:
1. Refresh changes, add new images, purge missing images
a. Re-import folder
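The combined step above essentially boils down to diffing the folder's current contents against what the library already knows. A minimal sketch, where `known_files` stands in for a query against darktable's library and the extension filter is a simplified assumption, not darktable's actual import logic:

```python
import os

# Illustrative subset of raw/image extensions; darktable supports many more.
EXTENSIONS = (".raw", ".nef", ".cr2", ".dng", ".jpg")

def rescan(folder, known_files):
    """Compare files on disk with the set the database knows about.

    known_files: set of absolute paths already in the library.
    Returns (to_add, to_purge): new files to import and vanished
    files to remove from the database.
    """
    on_disk = set()
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith(EXTENSIONS):
                on_disk.add(os.path.join(root, name))
    to_add = on_disk - known_files
    to_purge = known_files - on_disk
    return to_add, to_purge
```

Doing both directions in one pass is what makes a single "re-import folder" action sufficient.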
#5 Updated by Tobias Ellinghaus over 3 years ago
- System set to all
- bitness set to 64-bit
- Affected Version set to git development version
- Category set to Lighttable
I guess that could be added to the crawler we have in master. It already looks for updated XMP files, checking the raw files should be possible, too. That way it would give you a list with all images that got removed (*) on startup, no need to reimport anything.
Would that help?
(*) It will however not look for newly added images.
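The crawler behaviour described here can be sketched as follows; `entries` stands in for the (path, stored mtime) pairs the real crawler would read from the library, and is an assumption for illustration:

```python
import os

def crawl(entries):
    """Classify library entries at startup.

    entries: iterable of (path, last_known_mtime) pairs.
    Returns (updated, removed): files whose sidecar/raw changed on
    disk, and files that have vanished. Note it cannot discover
    images that were newly added to a folder.
    """
    updated, removed = [], []
    for path, known_mtime in entries:
        if not os.path.exists(path):
            removed.append(path)
        elif os.path.getmtime(path) > known_mtime:
            updated.append(path)
    return updated, removed
```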
#6 Updated by Joel T over 3 years ago
Yes, that would be perfect!
Actually, one thought: would that delay program startup? Hopefully the master crawler runs in parallel and therefore would not increase the start time.
I have a relatively small collection, but the purge script already takes ~10 seconds...
[joel@manjaro ~]$ time ./purge_non_existing_images.sh
#7 Updated by Tobias Ellinghaus over 3 years ago
- Target version set to Future
- % Done changed from 0 to 20
- Status changed from New to Triaged
- Assignee set to Tobias Ellinghaus
No, it doesn't run in parallel, since subsequent parts of initializing darktable have to run after the crawler is done. If you are on git master you can just enable it in preferences and see for yourself how long it takes.
#8 Updated by Joel T over 3 years ago
Sorry, I haven't had a chance to test this out since I am not currently on git master.
I appreciate your efforts on this, and I think the best place for it would be on Import -> Folder instead of on startup.
What if a user's drive is temporarily disconnected (network share down, USB unplugged, etc.)? In that case, simply opening darktable could unexpectedly purge quite a lot of images.
However, if a user tries to (re-)import a folder, then it would be clear if the folder was not available, and it would also be expected that the folder contents should be refreshed.