
Bug #9849

Problems with handling BIG files

Added by Nicolas Houdelot over 6 years ago. Updated over 6 years ago.

Status:
Triaged
Priority:
Low
Assignee:
-
Category:
General
Target version:
Start date:
03/13/2014
Due date:
% Done:

20%

Estimated time:
Affected Version:
1.4.1
System:
all
bitness:
64-bit
hardware architecture:
amd64/x86

Description

Hello everyone,
I'm a new user of darktable; I downloaded v1.4.1 two days ago from the PPA on Ubuntu 12.04.

darktable was very interesting until I chose to add a 3.1 GB folder of TIFFs.
Those pictures are 3200 DPI scans from my flatbed scanner: old 6×3 pictures from the '30s that can be ~50 megapixels.
Each negative is a 215 MB 16-bit .tif file.

Usually, creating thumbnails from those files doesn't cause any problems; it just takes longer.

In darktable, thumbnail creation seems to be multithreaded. That's a good thing, except that on a Core i7 darktable tries to load 8 files at a time, making the computer run out of memory (and hang for a long time; I did 4 hard resets before I lost patience trying to figure out what was happening).
Of course, I can tune this down to a single thread, but then I lose all the benefit of my 8 threads just to make the lighttable work.

There should be a way for darktable to avoid filling memory completely; something like dropping a thumbnail-creation thread when memory usage reaches a certain level.
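Something like this, maybe (just a rough sketch of the idea; the function and names here are hypothetical, not darktable's actual code):

  #include <stddef.h>
  #include <sys/sysinfo.h>  /* Linux-specific, used only to illustrate the idea */

  /* Hypothetical guard: allow another thumbnail thread only if enough
     free RAM remains for one full-sized image buffer plus headroom. */
  static int can_start_thumbnail_job(size_t image_buffer_bytes)
  {
    struct sysinfo si;
    if(sysinfo(&si) != 0) return 1;  /* can't measure -> stay optimistic */
    const size_t free_bytes = (size_t)si.freeram * si.mem_unit;
    return free_bytes > 2 * image_buffer_bytes;  /* keep 2x headroom */
  }

  /* The dispatcher would then either start the job or hold it back:
     if(can_start_thumbnail_job(needed)) start_job(); else wait_for_a_slot(); */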

Thanks.

History

#1 Updated by Tobias Ellinghaus over 6 years ago

  • System changed from Ubuntu to all
  • % Done changed from 0 to 20
  • Status changed from New to Triaged

#2 Updated by Ulrich Pegelow over 6 years ago

Sounds like darktable starts to allocate more memory than is available as physical RAM. If that happens, your system will start swapping, which is so slow that the system appears to hang.

Question: does this also happen when you import big JPEG files? I'm asking because of the following report, http://sourceforge.net/p/darktable/mailman/message/32077903/, which indicates that there might be memory leaks in our TIFF reading code.
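For reference, the classic leak pattern in libtiff-based readers looks like this (an illustrative sketch, not darktable's actual reader): every early-return path has to release the raster buffer and close the handle, and it's easy to miss one.

  #include <tiffio.h>

  /* Illustrative only -- not darktable's code. */
  int read_tiff_rgba(const char *path)
  {
    TIFF *tif = TIFFOpen(path, "r");
    if(!tif) return -1;

    uint32 w = 0, h = 0;
    TIFFGetField(tif, TIFFTAG_IMAGEWIDTH, &w);
    TIFFGetField(tif, TIFFTAG_IMAGELENGTH, &h);

    uint32 *raster = (uint32 *)_TIFFmalloc(w * h * sizeof(uint32));
    if(!raster)
    {
      TIFFClose(tif);  /* forgetting this on the error path leaks the handle */
      return -1;
    }

    if(!TIFFReadRGBAImage(tif, w, h, raster, 0))
    {
      _TIFFfree(raster);  /* forgetting this on failure leaks the whole image */
      TIFFClose(tif);
      return -1;
    }

    /* ... hand the pixels over to the caller ... */

    _TIFFfree(raster);
    TIFFClose(tif);
    return 0;
  }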

#3 Updated by Nicolas Houdelot over 6 years ago

Hmm, no.
I converted a picture to JPEG, duplicated it 156 times, and loaded the folder.
4.8 GB were in use before any thumbnails; during thumbnail creation it went up to 8 GB, but afterwards it dropped back to 4.8 GB.

With TIFF and EXR (even small files), memory usage started at 4.3 GB and ended at 5.7 GB after thumbnail creation.

#4 Updated by Ulrich Pegelow over 6 years ago

Looks like the problem is related to the TIFF/EXR reading code (a memory leak). I hope to find some time to take a look.

#5 Updated by Ulrich Pegelow over 6 years ago

I did a few checks with TIFF and JPEG film rolls. According to my tests there is no difference in memory consumption between the two: in both cases memory usage goes up to something like 8 GB with 6 background threads. This contradicts my assumption of a memory leak in the TIFF reading code.

#6 Updated by Nicolas Houdelot over 6 years ago

I thought the problem was specific to TIFF, but it also occurs with JPEG.

Here is a short video showing the problem with a slightly different but simple procedure: http://youtu.be/KT48zuyFGPY

On the right you can see (in HD) the memory usage increasing.

The picture I was using: https://dl.dropboxusercontent.com/u/15417023/Scan-110918-0011_.jpg

#7 Updated by Ulrich Pegelow over 6 years ago

  • % Done changed from 20 to 10
  • Priority changed from Low to High
  • Status changed from Triaged to Confirmed

I can confirm your findings. With each re-import of the file the system loses about 900 MB of memory, which brings my 16 GB system down after about 10 cycles. This looks like a major memory leak.

An important point to note: this happens for an image that has no history stack, and as it's a JPEG only a bare minimum of modules is enabled.

I have CC'd some other devs in the hope that someone has an idea where to start.

#8 Updated by Ulrich Pegelow over 6 years ago

  • % Done changed from 10 to 20
  • Target version set to Future
  • Priority changed from High to Low
  • Status changed from Confirmed to Triaged

It's been some time since this topic was discussed. I can't offer a perfect solution, but there is at least a workaround. For each image that darktable loads, it needs to allocate at least one full-sized image buffer. With huge images and multiple parallel background threads you can easily exceed even a well-equipped system. The workaround is to set the number of parallel background threads low enough that you don't run out of memory. The right value depends on the image sizes you typically work with and needs some experimentation; if in doubt, go down to a single background thread.
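The setting can be changed in the preferences dialog, or directly in darktablerc while darktable is closed. The key name below is from memory and may differ between versions, so treat it as an assumption and verify against your installation:

  # ~/.config/darktable/darktablerc (edit only while darktable is not running)
  # key name from memory -- check your version's preferences
  worker_threads=1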

A low number of parallel threads sounds worse than it is. darktable makes intensive use of multi-threading in its internal pixel pipeline, so even with only one background thread all of your CPU cores are kept busy. You certainly lose some performance, but that's mainly due to disk I/O; in a few tests on my system, thumbnail generation slowed down by at most a factor of 2.
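As a simplified sketch of why this works (illustrative code, not darktable's actual pipeline): each module's per-pixel loop is parallelized with OpenMP across the whole image, so even a single thumbnail job already spreads its work over all cores.

  #include <omp.h>

  /* Simplified sketch: one background job, but the inner pixel loop
     fans out across all available cores. */
  void apply_module(float *out, const float *in, int width, int height)
  {
  #pragma omp parallel for schedule(static)
    for(int row = 0; row < height; row++)
    {
      for(int col = 0; col < width; col++)
      {
        const int i = row * width + col;
        out[i] = in[i] * 0.9f;  /* stand-in for a real module's per-pixel math */
      }
    }
  }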
