Bug #8767

Memory usage with jpg

Added by Timothé Bronkhorst over 7 years ago. Updated about 5 years ago.

Status:
Fixed
Priority:
Critical
Assignee:
-
Category:
General
Start date:
06/11/2012
Due date:
% Done:

100%

Estimated time:
Affected Version:
git development version
System:
Ubuntu
bitness:
64-bit
hardware architecture:
amd64/x86

Description

My PC characteristics: 64-bit Intel processor, 4 GB RAM, 4 GB swap, NVidia graphics card, Ubuntu 12.04

The problem: since version 1.0.0, if I work with JPEGs, DT saturates the RAM and uses about 50% of my swap, making the PC really slow... It happens when I'm using the darkroom or when I'm exporting JPEGs.
If I'm working with RAW files, there's no problem. DT uses about 90% of my RAM but never touches the swap, so the PC remains usable even while I'm exporting RAW to JPG or working with RAW files in the darkroom.
I took 2 screenshots of the monitor, one while exporting JPEG, the other while exporting RAW.

Export JPG.png (344 KB) - Exporting JPG to disk - Timothé Bronkhorst, 06/11/2012 01:34 AM
Export Raw.png (357 KB) - Exporting RAW to disk - Timothé Bronkhorst, 06/11/2012 01:34 AM
valgrind.txt (25.9 KB) - Jesper Pedersen, 07/04/2012 12:00 PM
valgrind-jpg.txt.gz (108 KB) - Jesper Pedersen, 07/04/2012 01:02 PM
dtcli_valgrind.txt.bz2 (32 KB) - Tobias Ellinghaus, 07/04/2012 03:00 PM
008.jpg (853 KB) - Patrick Gui, 04/20/2014 01:19 AM
009.jpg (862 KB) - Patrick Gui, 04/20/2014 01:19 AM
010.jpg (812 KB) - Patrick Gui, 04/20/2014 01:19 AM
011.jpg (765 KB) - Patrick Gui, 04/20/2014 01:19 AM
darktable.log (730 KB) - Patrick Gui, 04/20/2014 10:36 PM
Capture du 2014-04-20 21_24_41.png (69.9 KB) - Patrick Gui, 04/20/2014 10:36 PM
jpegs.csv (7.93 KB) - Patrick Gui, 04/20/2014 10:47 PM
create_noise_jpegs.py (144 Bytes) - Torsten Bronger, 04/23/2014 10:19 PM
create_noise_jpegs.py (235 Bytes) - parallelised - Torsten Bronger, 04/23/2014 10:25 PM

Related issues

Has duplicate darktable - Bug #8811: Memory leak when loading images (Duplicate, 07/03/2012)

Associated revisions

Revision bc4b6cf5 (diff)
Added by Roman Lebedev about 5 years ago

Let libjpeg release memory. Fixes #8767

History

#1 Updated by Simon Spannagel over 7 years ago

  • Priority changed from Critical to Medium
  • Category set to General

Adding information from the website comments:

Wangtim on March 17, 2012 at 16:30 said:
This looks great, but unfortunately the new release can't load my previous thumbnails and crashes; it shows skulls instead of the pictures and tells me to buy new RAM (I have 4 GB…). I didn't manage to make it work.
Any ideas how to solve this?
By the way, thank you for your hard work, I love your software!

Wangtim on March 24, 2012 at 16:57 said:
I managed to make DT work (a little bit). I increased my swap memory and, even though it was really slow, I was able to re-import my old pictures.
Now DT is working fine for the first pictures I work on, but becomes really slow after a few pictures.
The problem comes from RAM management: DT does not free my RAM after I open a new picture. I took a screenshot of the Ubuntu Monitor while opening around 6 pictures. You can clearly see the RAM use increasing each time I open a picture; after reaching the RAM limit (4 GB), it uses the swap and begins lagging…
Here is the screenshot: http://my.opera.com/Wangtim/albums/showpic.dml?album=9739912&picture=148132402
Should I file a bug on sourceforge? Is there a way to solve this problem?
Thank you!

Wangtim on March 27, 2012 at 21:44 said:
It's Ubuntu 64-bit, 4 GB RAM, 4 GB swap.
All of them are JPEGs.
I tested the latest unstable version but it did not solve anything. It's quite annoying since it makes the software almost unusable…
Thank you for your help!

Wangtim on March 28, 2012 at 23:49 said:
darktable -d memory at startup:
[memory] at startup
[memory] max address space (vmpeak): 164536 kB
[memory] cur address space (vmsize): 164536 kB
[memory] max used memory (vmhwm ): 5744 kB
[memory] cur used memory (vmrss ): 5744 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 390528 kB
[memory] cur address space (vmsize): 377172 kB
[memory] max used memory (vmhwm ): 42344 kB
[memory] cur used memory (vmrss ): 42344 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 509316 kB
[memory] cur address space (vmsize): 495960 kB
[memory] max used memory (vmhwm ): 161380 kB
[memory] cur used memory (vmrss ): 161380 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 589688 kB
[memory] cur address space (vmsize): 576332 kB
[memory] max used memory (vmhwm ): 161400 kB
[memory] cur used memory (vmrss ): 161400 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 670048 kB
[memory] cur address space (vmsize): 656692 kB
[memory] max used memory (vmhwm ): 161408 kB
[memory] cur used memory (vmrss ): 161408 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 750404 kB
[memory] cur address space (vmsize): 737048 kB
[memory] max used memory (vmhwm ): 161412 kB
[memory] cur used memory (vmrss ): 161412 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 830760 kB
[memory] cur address space (vmsize): 817404 kB
[memory] max used memory (vmhwm ): 161416 kB
[memory] cur used memory (vmrss ): 161416 kB
[memory] after cache initialization
[memory] max address space (vmpeak): 911116 kB
[memory] cur address space (vmsize): 897760 kB
[memory] max used memory (vmhwm ): 161420 kB
[memory] cur used memory (vmrss ): 161420 kB
[memory] after successful startup
[memory] max address space (vmpeak): 1516016 kB
[memory] cur address space (vmsize): 1502660 kB
[memory] max used memory (vmhwm ): 241556 kB
[memory] cur used memory (vmrss ): 241556 kB
[memory] before pixelpipe process
[memory] max address space (vmpeak): 2515512 kB
[memory] cur address space (vmsize): 2502156 kB
[memory] max used memory (vmhwm ): 773196 kB
[memory] cur used memory (vmrss ): 709380 kB
darktable -d memory after the RAM is full:
[memory] before pixelpipe process
[memory] max address space (vmpeak): 4716372 kB
[memory] cur address space (vmsize): 4067788 kB
[memory] max used memory (vmhwm ): 3201696 kB
[memory] cur used memory (vmrss ): 2858916 kB
darktable -d cache at startup:
[image_cache] has 262144 entries
[mipmap_cache_init] cache has 1024 entries for mip 0 ( 78.48 MB).
[mipmap_cache_init] cache has 256 entries for mip 1 ( 78.47 MB).
[mipmap_cache_init] cache has 64 entries for mip 2 ( 78.47 MB).
[mipmap_cache_init] cache has 16 entries for mip 3 ( 78.47 MB).
[mipmap_cache_init] cache has 4 entries for mip 4 ( 78.47 MB).
darktable -d cache after opening a few pictures in the lighttable:
[mipmap_cache] level 0 fill 32.34/70.64 MB (45.79% in 422/1024 buffers)
[mipmap_cache] level 1 fill 0.00/70.63 MB (0.00% in 0/256 buffers)
[mipmap_cache] level 2 fill 0.00/70.62 MB (0.00% in 0/64 buffers)
[mipmap_cache] level 3 fill 0.00/70.62 MB (0.00% in 0/16 buffers)
[mipmap_cache] level 4 fill 0.00/70.62 MB (0.00% in 0/4 buffers)
[mipmap_cache] level [full] fill 1/3 slots (33.33% in 1/16 buffers)
Is this helping? I'm not that familiar with the console…
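The vmrss snapshots above are the key signal: resident memory climbs with each step and never drops. A minimal sketch for extracting that series from a saved log (the helper names are mine; it assumes log lines in the `-d memory` format shown above):

```python
import re

# Matches lines like: "[memory] cur used memory (vmrss ): 5744 kB"
VMRSS_RE = re.compile(r"\[memory\] cur used memory \(vmrss\s*\):\s*(\d+) kB")

def vmrss_series(log_text):
    """Return the sequence of current-RSS values (in kB) found in
    `darktable -d memory` output, in the order they appear."""
    return [int(m.group(1)) for m in VMRSS_RE.finditer(log_text)]

def never_shrinks(series):
    """True if RSS never drops between snapshots -- a rough hint that
    buffers are being retained rather than released."""
    return all(b >= a for a, b in zip(series, series[1:]))
```

Run against the log above, the extracted series grows monotonically, which is what a leak on the JPEG path would look like.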

Wangtim on April 4, 2012 at 03:01 said:
I checked the "width/height of image drawing area" and it's not that big: 1300 * 1000 (my screen resolution is 1920*1200).
I made a few tests and it seems the problem only happens with JPEGs, not with raws. I shot a lot of raws and the RAM use was quite stable; but as soon as I work with JPEGs, the memory increases each time I open a new picture and never comes down (not until I close DT).
Hope it helps solving the bug :)

Wangtim on June 6, 2012 at 02:22 said:
I still have this nasty memory bug with JPGs (I reported it in the comments of the 1.0 and 1.0.3 versions). I have 4 GB of RAM and 4 GB of swap.
I made a few more tests: when I export JPGs to disk, the memory use rises until it reaches about 90% of the 4 GB, then uses the swap until it reaches around 1.7 GB. DT, and the computer in general, are painfully slow when the swap is used.
A screenshot of the Ubuntu Monitor: http://my.opera.com/Wangtim/albums/showpic.dml?album=9739912&picture=155824102#bigimg
If I export raws to disk, the memory use rises until it reaches 90% of the 4 GB, but then doesn't use the swap. The computer works fine while exporting the pictures…
Here's the screenshot: http://my.opera.com/Wangtim/albums/showpic.dml?album=9739912&picture=155824112#bigimg
Is there something to be done to solve this bug? It could be a problem with my configuration, but the fact that it works nicely with raws makes me think it's related to DT…
Thank you for your help and for your nice software!

Adding Johannes to the watchers list.
Setting severity to "Medium" since it "only" affects the JPEG path. ;)

#2 Updated by Richard Wonka over 7 years ago

One more affected version: git master is also sluggish when exporting from JPEGs.

#3 Updated by Simon Spannagel over 7 years ago

Okay, thanks for the note. I'll leave the affected version at 1.0.4 since that's more severe.

#4 Updated by Simon Spannagel over 7 years ago

  • % Done changed from 0 to 20
  • Target version set to Candidate for next patch release
  • Priority changed from Medium to High
  • Status changed from New to Triaged

#5 Updated by Jesper Pedersen over 7 years ago

Valgrind report of a simple darktable session (lighttable, import of 1 image, darkroom, enable/disable a couple of different modules, lighttable, export). This was for .CR2 -> .JPG.

I removed the non-darktable stuff... SQLite has some serious issues.

Command: valgrind --tool=memcheck --leak-check=yes darktable
Version: 1.0.4+27~g34af8b4

#6 Updated by Jesper Pedersen over 7 years ago

Similar to the above, just import and export of 2 .JPGs. It contains more leak information.

Allocations between 16 bytes and 7840 bytes were not cleaned up.

#7 Updated by Tobias Ellinghaus over 7 years ago

Attached is the output of valgrind when run against the CLI tool, exporting from a JPEG file to a JPEG file with an XMP file containing a non-empty history stack (IIRC just monochrome enabled).

#8 Updated by Torsten Bronger over 7 years ago

I cannot reproduce this bug anymore with git-master. Any news on this?

#9 Updated by Tobias Ellinghaus over 7 years ago

  • Status changed from Triaged to Incomplete

#10 Updated by Timothé Bronkhorst over 7 years ago

The bug is still not solved in version 1.0.5 (tested with Ubuntu 12.10 and 10.04).

#11 Updated by Simon Spannagel about 7 years ago

  • System set to unknown
  • Affected Version changed from 1.0.4 to 1.0.5
  • Priority changed from High to Critical
  • Status changed from Incomplete to Triaged

#12 Updated by Pascal Obry about 6 years ago

  • bitness set to 64-bit

Does this still reproduce? Nobody else has reported this issue, which is 1.5 years old! Is it worth keeping it open?

#13 Updated by Torsten Bronger about 6 years ago

I can reproduce it now by importing a directory of 245 JPEGs, each between 5 and 10 MB in file size and 24 MP in image size. DT allocates 14 GB+ of virtual memory, the computer (8 GB RAM) swaps heavily, and I have to kill DT before the OS does.

#14 Updated by Patrick Gui almost 6 years ago

I reproduce this problem with the latest versions (ppa unstable + git).
Not only during an import, but also in lighttable mode when I select an existing collection with JPEG files.

Maybe a clue: these are JPEGs produced with the Ubuntu tool "Simple Scan".
Tell me if you want more details.

#15 Updated by Tobias Ellinghaus almost 6 years ago

  • System changed from unknown to Ubuntu
  • Affected Version changed from 1.0.5 to git development version

Could you provide a sample JPEG that triggers this (possibly by copying it dozens of times under different names into the same folder)? I'd also like to know if you are on a 32-bit or a 64-bit system.

#16 Updated by Patrick Gui almost 6 years ago

Here come the files!

My conf: Ubuntu 13.10 64-bit + 4 GB RAM + 8 GB swap

I have 2 darktable versions installed: ppa unstable 1.4.1.xxxx + git 1.5.780.
The first one can't launch anymore:
jupalian@ubuntu3:~$ darktable
[init] error: can't open new database schema
ERROR : cannot open database

Because of a new database schema in the git version?

With the second one, I have to reboot the computer.
The default collection loaded at darktable startup is full of JPEG files.

#17 Updated by Tobias Ellinghaus almost 6 years ago

Even with 100 copies of each of your JPEGs (i.e., 400 images in total) I don't see any unbounded growth of memory use. I tried with 1 background thread and with 2. I guess someone else has to track this down.

#18 Updated by Patrick Gui almost 6 years ago

I'm sending you the traces produced by the command:
./darktable -d all | tee ~/darktable.log
and also a screen capture with the memory usage.

I launched dt and then selected the film roll with these damned JPEGs.
I can't use the computer for a while; after that it's OK, but the physical memory and the swap are almost completely used (see the screen capture).

#19 Updated by Patrick Gui almost 6 years ago

And a CSV extract from the SQLite table "images" with the JPEG files.
I hope it will be helpful.

#20 Updated by Tobias Ellinghaus almost 6 years ago

It's not that I don't believe you. I have heard reports about memory issues with JPEGs several times in the past. The only problem is that I can't trigger it on my local machine, so there is no way I can personally fix it. I hope that someone else with insight into our code can reproduce the bug and have a look. I hope that it's not a matter of external libraries having a memory leak in some version (libjpeg and libexiv2 are the most likely candidates, or libraries they depend on).

Did you compile darktable as Release, RelWithDebInfo, Debug or something else?

#21 Updated by Patrick Gui almost 6 years ago

With "build.sh".

Some libs were missing on my system:

-- Looking for external programs
-- Found perl
-- Found intltool-merge
-- Found xsltproc
-- Found xmllint
-- All external programs found
-- Found Gettext
-- Found msgfmt to convert language file. Translation enabled
-- Found Glib
-- Found LibXml2: /usr/lib/x86_64-linux-gnu/libxml2.so (found suitable version "2.9.1", minimum required is "2.6")
-- checking for module 'libwebp'
-- package 'libwebp' not found
-- Could NOT find WEBP (missing: WEBP_LIBRARY WEBP_INCLUDE_DIR) (Required is at least version "0.3.0")
-- Found GIO
-- Found Cairo
-- Found GDK-PixBuf
-- Found LibXml2: /usr/lib/x86_64-linux-gnu/libxml2.so (found version "2.9.1")
-- Internationalization: Enabled
-- checking for module 'json-glib-1.0'
-- package 'json-glib-1.0' not found
-- checking for module 'libopenjpeg1'
-- package 'libopenjpeg1' not found
-- Could NOT find OpenJPEG (missing: OPENJPEG_LIBRARY OPENJPEG_INCLUDE_DIR)
-- checking for module 'GraphicsMagick'
-- package 'GraphicsMagick' not found
-- checking for one of the modules 'lua5.2;lua-5.2;lua'
-- Lua support: System library not found (to use darktable's version use DDONT_USE_INTERNAL_LUA=Off)
-- checking for one of the modules 'libsoup-2.4;libsoup2'
-- checking for one of the modules 'libsoup-2.2;libsoup2'
-- Map mode: disabled, please install libsoup2
-- checking for module 'colord'
-- package 'colord' not found
-- Could NOT find SDL (missing: SDL_LIBRARY SDL_INCLUDE_DIR)
-- Could NOT find OpenGL (missing: OPENGL_gl_LIBRARY)
--
-- Could NOT find Java (missing: Java_JAR_EXECUTABLE Java_JAVAC_EXECUTABLE Java_JAVAH_EXECUTABLE Java_JAVADOC_EXECUTABLE) (found version "1.7.0.51")
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jupalian/darktable/build

Now I got this :

-- Looking for external programs
-- Found perl
-- Found intltool-merge
-- Found xsltproc
-- Found xmllint
-- All external programs found
-- Found Gettext
-- Found msgfmt to convert language file. Translation enabled
-- Found Glib
-- Found LibXml2: /usr/lib/x86_64-linux-gnu/libxml2.so (found suitable version "2.9.1", minimum required is "2.6")
-- Found GIO
-- Found Cairo
-- Found GDK-PixBuf
-- Found LibXml2: /usr/lib/x86_64-linux-gnu/libxml2.so (found version "2.9.1")
-- Internationalization: Enabled
-- Found JsonGlib
-- checking for module 'libopenjpeg1'
-- package 'libopenjpeg1' not found
-- OpenJPEG version 1.3.0 found. Only 1.5 and newer support reading of icc profiles.
-- Found GraphicsMagick
-- Lua support: Enabled
-- checking for one of the modules 'libsoup-2.2;libsoup2'
-- Map mode: enabled
-- Could NOT find SDL (missing: SDL_LIBRARY SDL_INCLUDE_DIR)
-- Could NOT find OpenGL (missing: OPENGL_gl_LIBRARY OPENGL_INCLUDE_DIR)
--
-- Could NOT find Java (missing: Java_JAR_EXECUTABLE Java_JAVAC_EXECUTABLE Java_JAVAH_EXECUTABLE Java_JAVADOC_EXECUTABLE) (found version "1.7.0.51")
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jupalian/darktable/build

#22 Updated by Torsten Bronger almost 6 years ago

Possibly one can reproduce this problem with the attached script. It calls ImageMagick to create 400 6000x4000 JPEGs which contain only noise.
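The attached script is not reproduced here, but a sketch of the approach (assuming ImageMagick's `convert` is on the PATH; the function names are mine, not the attachment's) could look like:

```python
import subprocess

def noise_jpeg_command(path, width=6000, height=4000):
    """Build the ImageMagick command line that writes a JPEG of pure
    per-pixel random noise to `path` (gray canvas + random noise)."""
    return ["convert", "-size", f"{width}x{height}",
            "xc:gray", "+noise", "Random", path]

def create_noise_jpegs(count=400, prefix="noise"):
    """Generate `count` noise JPEGs by invoking convert once per file."""
    for i in range(count):
        subprocess.run(noise_jpeg_command(f"{prefix}_{i:03d}.jpg"),
                       check=True)
```

Importing a folder of such files gives a deterministic, large-image test set without needing any real photos.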

#23 Updated by Torsten Bronger almost 6 years ago

I attached a parallelised version of the script.

#24 Updated by Tobias Ellinghaus almost 6 years ago

Even with these files (I didn't wait for convert to create 400 files but duplicated 1 file 400 times) darktable's memory consumption stays at about 7 GB VIRT and 2 GB RES. Once the images are imported and thumbnails are created it goes back down to 2 GB VIRT and 340 MB RES.

#25 Updated by Torsten Bronger almost 6 years ago

How many threads do you have?

ps -eLf | grep darktable | wc

during the import phase results in 71 for me.

#26 Updated by Tobias Ellinghaus almost 6 years ago

The import itself is over quite fast, but during thumbnail generation it goes up to something like 50 or 60.

#27 Updated by Torsten Bronger almost 6 years ago

Then there is much more going on (or something completely different) than a memory leak. On my computer, four i7 cores are fully occupied for 5 minutes (real time) during the import of the 400 JPEGs.

Peak RES is 7.5 GB and peak VIRT is 14 GB. But the values are stable; memory use does not grow constantly during the import.

I don't have an SSD, but that isn't the bottleneck anyway. Rather, it's that I don't have OpenCL support for my graphics card. Maybe this is the problem? But then, why does mere thumbnail generation for 400 JPEGs need so much computing power?!

#28 Updated by Tobias Ellinghaus almost 6 years ago

I don't have OpenCL either, so that's not the issue. Do these images have XMP files already when you import them? Maybe there is some expensive processing applied? Or a default preset that gets applied?

#29 Updated by Torsten Bronger almost 6 years ago

No XMP files, and I call DT with

#!/bin/bash
rm -Rf /tmp/darktable_tmp
mkdir -p /tmp/darktable_tmp
darktable --library /tmp/darktable_tmp/library.db \
  --configdir /tmp/darktable_tmp --cachedir /tmp/darktable_tmp \
  --tmpdir /tmp/darktable_tmp --conf write_sidecar_files=FALSE "$@"

to make sure that it uses a pure default configuration.

#30 Updated by Patrick Gui over 5 years ago

Hello,
I still have the memory problem during import of JPEG files:

1 file import:

[memory] before pixelpipe process
[memory] max address space (vmpeak): 5039964 kB
[memory] cur address space (vmsize): 4316312 kB
[memory] max used memory (vmhwm ): 1277220 kB
[memory] cur used memory (vmrss ): 1043704 kB

then a second file:

[memory] before pixelpipe process
[memory] max address space (vmpeak): 5039964 kB
[memory] cur address space (vmsize): 4732588 kB
[memory] max used memory (vmhwm ): 1323416 kB
[memory] cur used memory (vmrss ): 1267364 kB

a third file:

[memory] before pixelpipe process
[memory] max address space (vmpeak): 5039964 kB
[memory] cur address space (vmsize): 4973212 kB
[memory] max used memory (vmhwm ): 1578988 kB
[memory] cur used memory (vmrss ): 1522828 kB

... and so on

With darktable 1.5+1628~g119536e

#31 Updated by Patrick Gui over 5 years ago

Solved after:
deleting ~/.cache/darktable + ~/.config/darktable and doing a global import.
I have kept the standard memory presets and have not configured Lua.
It's OK now with JPEGs.

#32 Updated by Roman Lebedev about 5 years ago

  • % Done changed from 20 to 100
  • Status changed from Triaged to Fixed
