Bug #9479

Saturated blues are reproduced as black in Nikon NEF files when identity transformations are applied

Added by Jozef Sivek almost 7 years ago. Updated over 6 years ago.

Status: Fixed
Priority: Medium
Assignee: -
Category: Darkroom
Target version: -
Start date: 06/23/2013
Due date:
% Done: 100%
Estimated time:
Affected Version: 1.4
System: all
bitness: 64-bit
hardware architecture: amd64/x86

Description

When the following modules are enabled (in their default state), saturated blues are turned into complete black:

  • shadows and highlights
  • levels
  • monochrome (even when blue is included in the selection)

This issue was observed with a NEF file from a Nikon D600, but it will probably also be present in NEF files from other cameras, e.g. the Nikon D7000, D800, D7100, etc. The attached PNG image illustrates the issue, and for the complete record the original NEF file and its sidecar file are attached as well.

The base curve used, i.e. "nikon like" or "nikon like alternate", does not alter the effect.

example_preview.png (227 KB) example_preview.png Jozef Sivek, 06/23/2013 11:23 PM
DSC_1985.NEF (27.4 MB) DSC_1985.NEF Jozef Sivek, 06/23/2013 11:23 PM
DSC_1985.NEF.xmp (7.97 KB) DSC_1985.NEF.xmp Jozef Sivek, 06/23/2013 11:23 PM
screenshot_levels_01.png (1.39 MB) screenshot_levels_01.png Markus Kanet, 06/30/2013 11:14 PM
d600.c (3.29 KB) d600.c Ulrich Pegelow, 01/19/2014 06:00 PM
d600.c (41.7 KB) d600.c Ulrich Pegelow, 01/20/2014 09:30 PM

Related issues

Has duplicate darktable - Bug #10619: 'difficult' lighting causes completely black file export (Incomplete, 08/26/2015)

History

#1 Updated by Ulrich Pegelow almost 7 years ago

  • % Done changed from 0 to 10
  • Status changed from New to Confirmed

Thanks for reporting. The issue is known but tends to be a complicated one. As you can see from the histogram, the image is overexposed in the circular region, although the colors can be reproduced. These out-of-bound pixels - typically in terms of Lab lightness as well as Lab color - need to be dealt with. Many modules clamp their input or output in order to allow only values within a reasonable range; otherwise they tend to produce artifacts. That's what you see when dark saturated pixels go from blue to black. Removing the clamping and allowing "insane" values might introduce other artifacts, unfortunately.
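For illustration, the kind of clamping many of these modules apply to their Lab input or output boils down to something like this (illustrative helper names, not actual darktable functions):

/* illustrative helpers, not actual darktable functions:
 * pin Lab values to their nominal range */
static inline float clampf(const float x, const float lo, const float hi)
{
  return x < lo ? lo : (x > hi ? hi : x);
}

static inline void clamp_lab(float lab[3])
{
  lab[0] = clampf(lab[0],    0.0f, 100.0f); /* L: lightness */
  lab[1] = clampf(lab[1], -128.0f, 128.0f); /* a */
  lab[2] = clampf(lab[2], -128.0f, 128.0f); /* b */
}

A pixel that arrives with a negative L is pinned to L = 0 here - that's the clamping step where the blue-to-black jump described above originates.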

As a short-term remedy: in our current development branch we introduced some new blend modes, among others "Lab lightness". In this mode only the lightness of a pixel is affected (for modules acting in Lab); a and b are left untouched. At least for the shadows&highlights and levels modules this should work with your specific image.
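Sketched out, that blend mode amounts to something like the following (simplified; real blending also applies an opacity, which is left out here):

/* simplified "Lab lightness" blend: lightness from the module's output,
 * color from the module's input (opacity handling omitted) */
static inline void blend_lab_lightness(const float in[3], const float out[3],
                                       float result[3])
{
  result[0] = out[0]; /* L from the processed pixel */
  result[1] = in[1];  /* a passes through unchanged */
  result[2] = in[2];  /* b passes through unchanged */
}

That way a module can still shape the tonality while the problematic a/b values are passed through untouched.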

Step by step we might find better ways to deal with these out-of-bound pixels in our modules...

#2 Updated by Markus Kanet almost 7 years ago

Jozef Sivek wrote:

When the following modules are enabled (in their default state), saturated blues are turned into complete black:

Had the same problems with a Canon EOS 7D and EOS 6D for a while now (more than one year) with photos taken on stage with various blue lights and background colors. The new Lab blend modes seem to fix that.

Maybe there should be presets or an option for L-only on the levels module.

Attached is a screenshot with a levels-module snapshot: default settings on top (i.e. levels just activated, not modified) and the levels module disabled in the lower area of the snapshot.

With Pegelow's comment I now at least know what the problem is and how to fix it temporarily. Thx.

#3 Updated by Tobias Ellinghaus over 6 years ago

  • System changed from other GNU/Linux to all
  • % Done changed from 10 to 20
  • Status changed from Confirmed to Triaged

#4 Updated by Ulrich Pegelow over 6 years ago

  • bitness set to 64-bit
  • Affected Version changed from 1.2.1 to 1.4

I had a closer look at the issue once again as it is still present in the current stable version. In addition we had some discussion about it on IRC.

The analysis shows that some intense blue areas of the photo are transferred into nonsensical Lab values like (-2.0, 50.0, -100.0), with close to zero or even negative lightness, although the affected areas are far from dark. It's not too surprising that several darktable modules have problems with these values, especially when it comes to automatic scaling of color contrast.

Further investigation shows that the origin of the problem seems to be the color matrix of that camera:

{ "NIKON D600", 0, 0x3e07,
{ 8178,-2245,-609,-4857,12394,2776,-1207,2086,7298 } },

When I tentatively replaced it with some other matrix (e.g. from a Nikon D700), the artifacts were gone. So it seems that this matrix is somehow broken, or at least not usable for our purpose. The source of that matrix is Adobe's DNG converter - in fact, more or less all RAW converters use that matrix.

I can't fix that issue right now as I don't have a better matrix. One option would be to ask someone with that camera to generate one for us - darktable already comes with enhanced matrices for several camera models, unfortunately not for the D600. On the user side one could try to find camera-specific ICC profiles (vendor-supplied raw converters typically ship them) and avoid the use of the standard matrix in critical cases.

#5 Updated by Pascal de Bruijn over 6 years ago

I doubt the matrix is the actual root cause of the problem. I'm guessing that matrix just exposes an issue somewhere in our codebase.

#6 Updated by Ulrich Pegelow over 6 years ago

Attached is a small program that simulates the RGB->Lab conversion as done in darktable. As an example I took an RGB value of (1, 5, 40) - out of [0; 255] - which is a typical pixel value in the problematic area of that image: the outermost part of the blue light source. I picked this value using the color picker within the parametric mask of the "denoise (bilateral)" module, which comes directly before "colorin".

The result of the conversion is a Lab value of (-2.252634, 44.309505, -78.312416), which is fully bogus IMHO.

#7 Updated by Ulrich Pegelow over 6 years ago

OK, forget about the previous version of my little test program: mat3inv does not work in-place. I attached a hopefully fixed version. However, the issues remain, and from my example I get Lab values like (-23.665804, 105.045662, -126.400986).

I am far from an expert in these color conversions, but an L value of -23 does not look right to me.

BTW, all calculations here are done by lcms2. Therefore I assume that the problem is not darktable specific - unless it lies in the way we generate a profile from the Nikon matrix.
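For anyone who wants to play with this without the attachments, here is a rough stand-alone sketch along the same lines (hypothetical code, not the attached d600.c: it feeds the plain D600 matrix straight through without white-balance pre-scaling, normalization or chromatic adaptation, so the absolute numbers differ somewhat from the ones above, but L still comes out clearly negative):

/* Hypothetical stand-alone sketch, not the attached d600.c: convert one camera
 * RGB pixel to Lab via the D600 matrix only - no white-balance pre-scaling,
 * no normalization, no chromatic adaptation - just to show how a negative
 * lightness can fall out of the unclamped math. */
#include <stdio.h>
#include <math.h>

/* XYZ -> camera matrix for the NIKON D600 as shipped by Adobe's DNG
 * converter / dcraw, divided by 10000 */
static const double xyz_to_cam[3][3] = {
  {  0.8178, -0.2245, -0.0609 },
  { -0.4857,  1.2394,  0.2776 },
  { -0.1207,  0.2086,  0.7298 },
};

/* invert a 3x3 matrix; note that out must not alias in */
static int mat3inv(double out[3][3], const double in[3][3])
{
  const double det =
      in[0][0] * (in[1][1] * in[2][2] - in[1][2] * in[2][1])
    - in[0][1] * (in[1][0] * in[2][2] - in[1][2] * in[2][0])
    + in[0][2] * (in[1][0] * in[2][1] - in[1][1] * in[2][0]);
  if (fabs(det) < 1e-12) return 1;
  out[0][0] =  (in[1][1] * in[2][2] - in[1][2] * in[2][1]) / det;
  out[0][1] = -(in[0][1] * in[2][2] - in[0][2] * in[2][1]) / det;
  out[0][2] =  (in[0][1] * in[1][2] - in[0][2] * in[1][1]) / det;
  out[1][0] = -(in[1][0] * in[2][2] - in[1][2] * in[2][0]) / det;
  out[1][1] =  (in[0][0] * in[2][2] - in[0][2] * in[2][0]) / det;
  out[1][2] = -(in[0][0] * in[1][2] - in[0][2] * in[1][0]) / det;
  out[2][0] =  (in[1][0] * in[2][1] - in[1][1] * in[2][0]) / det;
  out[2][1] = -(in[0][0] * in[2][1] - in[0][1] * in[2][0]) / det;
  out[2][2] =  (in[0][0] * in[1][1] - in[0][1] * in[1][0]) / det;
  return 0;
}

/* CIE f() without clamping: negative inputs simply take the linear branch */
static double lab_f(double t)
{
  return (t > 216.0 / 24389.0) ? cbrt(t) : (24389.0 / 27.0 * t + 16.0) / 116.0;
}

int main(void)
{
  double cam_to_xyz[3][3];
  if (mat3inv(cam_to_xyz, xyz_to_cam)) return 1;

  /* the problematic pixel: RGB (1, 5, 40) out of [0; 255] */
  const double rgb[3] = { 1.0 / 255.0, 5.0 / 255.0, 40.0 / 255.0 };

  double xyz[3] = { 0.0, 0.0, 0.0 };
  for (int i = 0; i < 3; i++)
    for (int j = 0; j < 3; j++)
      xyz[i] += cam_to_xyz[i][j] * rgb[j];

  /* XYZ -> Lab relative to a D50 white point, again without any clamping */
  const double wp[3] = { 0.96422, 1.00000, 0.82521 };
  const double fx = lab_f(xyz[0] / wp[0]);
  const double fy = lab_f(xyz[1] / wp[1]);
  const double fz = lab_f(xyz[2] / wp[2]);
  printf("Lab = %f %f %f\n",
         116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz));
  return 0;
}

Compile with e.g. gcc d600_sketch.c -lm. The essence is the same as with the attached program: nothing in the unclamped matrix math keeps Y from going negative for such a pixel, and once Y is negative the Lab lightness follows.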

#8 Updated by Ulrich Pegelow over 6 years ago

OK, one step further. According to this article [[http://www.littlecms.com/CIC18_UnboundedCMM.pdf]], producing virtual Lab values seems to be quite common for camera profiles, and handling them in an unbounded way of processing, i.e. without clamping, seems to be the generally preferred approach. You can read more about that here: [[http://ninedegreesbelow.com/photography/lcms2-unbounded-mode.html]]. darktable uses lcms2 only in some cases, but the general principle of our codepath is the same: we try to avoid clamping.

I mostly agree with that approach, but we have to admit that some modules are not able to handle these cases properly. One example is the levels module, which needs to rely on the fact that L denotes the lightness and a/b only represent color. In unbounded mode, with L potentially negative and high a/b values at the same time (resulting in saturated blues), there is no way for levels to deal with that correctly. At least I found no solution after trying many different ideas.
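To make that concrete, here is a toy version of such a lightness remap (just the principle, nothing like the real levels code):

/* toy illustration only, not darktable's levels code: a linear remap of the
 * lightness between a black and a white point, followed by the usual clamp */
#include <stdio.h>

static float toy_levels_L(const float L, const float black, const float white)
{
  float out = 100.0f * (L - black) / (white - black);
  if (out < 0.0f) out = 0.0f;     /* anything below the black point -> black */
  if (out > 100.0f) out = 100.0f;
  return out;
}

int main(void)
{
  /* the virtual pixel from comment #7: L is negative although the spot is bright */
  printf("%f\n", toy_levels_L(-23.665804f, 0.0f, 100.0f)); /* prints 0.000000 */
  return 0;
}

A pixel that arrives with a negative L lands below any sensible black point and is mapped straight to 0, while its a and b still describe a saturated blue - which is exactly the situation levels cannot resolve.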

More on the positive side: I managed to tweak shadows&highlights so that the artifacts should be gone there.

I see two options to go from here:

a) we tell users to avoid certain modules (like levels) if they see artifacts. We could warn them that this mostly happens with saturated blue highlights, e.g. in stage photography. All of this to be written in the user manual.

b) we offer an option in the colorin module to "normalize", i.e. confine values to non-virtual Lab. I implemented a first idea in branch "normalize". It's still very slow, as I only offer the lcms2 codepath when the normalize option is on.

Comments?

#9 Updated by Ulrich Pegelow over 6 years ago

  • % Done changed from 20 to 100
  • Status changed from Triaged to Fixed

No further comments, so I implemented the gamut clipping option in the colorin module.

In case of artifacts - the levels and monochrome modules are the ones most likely affected - you should clip the input color range to one of the offered RGB color spaces. Select the one with the widest gamut that does not generate artifacts.
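For a rough idea of what such a clipping step does, here is a simplified sketch (not the actual colorin implementation: it uses linear sRGB with Bradford-adapted D50 matrices as the target gamut and skips the lcms2 machinery):

/* Simplified sketch of what a gamut clipping step does (not darktable's
 * colorin code): Lab -> XYZ(D50) -> linear sRGB, clamp to [0;1], and back,
 * so lightness and color end up in a plausible range again. */
#include <stdio.h>
#include <math.h>

static const double d50[3] = { 0.96422, 1.00000, 0.82521 };

/* Bradford-adapted linear sRGB <-> XYZ(D50) matrices (Lindbloom) */
static const double xyz_to_srgb[3][3] = {
  {  3.1338561, -1.6168667, -0.4906146 },
  { -0.9787684,  1.9161415,  0.0334540 },
  {  0.0719453, -0.2289914,  1.4052427 },
};
static const double srgb_to_xyz[3][3] = {
  { 0.4360747, 0.3850649, 0.1430804 },
  { 0.2225045, 0.7168786, 0.0606169 },
  { 0.0139322, 0.0971045, 0.7141733 },
};

static void mat3mulv(double out[3], const double m[3][3], const double v[3])
{
  for (int i = 0; i < 3; i++)
    out[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
}

/* CIE f() and its inverse, both without clamping */
static double lab_f(double t)
{
  return (t > 216.0 / 24389.0) ? cbrt(t) : (24389.0 / 27.0 * t + 16.0) / 116.0;
}

static double lab_finv(double f)
{
  const double t = f * f * f;
  return (t > 216.0 / 24389.0) ? t : (116.0 * f - 16.0) * 27.0 / 24389.0;
}

static void lab_to_xyz(const double lab[3], double xyz[3])
{
  const double fy = (lab[0] + 16.0) / 116.0;
  xyz[0] = d50[0] * lab_finv(fy + lab[1] / 500.0);
  xyz[1] = d50[1] * lab_finv(fy);
  xyz[2] = d50[2] * lab_finv(fy - lab[2] / 200.0);
}

static void xyz_to_lab(const double xyz[3], double lab[3])
{
  const double fx = lab_f(xyz[0] / d50[0]);
  const double fy = lab_f(xyz[1] / d50[1]);
  const double fz = lab_f(xyz[2] / d50[2]);
  lab[0] = 116.0 * fy - 16.0;
  lab[1] = 500.0 * (fx - fy);
  lab[2] = 200.0 * (fy - fz);
}

static void clip_to_srgb_gamut(double lab[3])
{
  double xyz[3], rgb[3];
  lab_to_xyz(lab, xyz);
  mat3mulv(rgb, xyz_to_srgb, xyz);
  for (int c = 0; c < 3; c++) /* this is the actual clipping step */
    rgb[c] = rgb[c] < 0.0 ? 0.0 : (rgb[c] > 1.0 ? 1.0 : rgb[c]);
  mat3mulv(xyz, srgb_to_xyz, rgb);
  xyz_to_lab(xyz, lab);
}

int main(void)
{
  double lab[3] = { -23.665804, 105.045662, -126.400986 }; /* value from #7 */
  clip_to_srgb_gamut(lab);
  printf("clipped Lab = %f %f %f\n", lab[0], lab[1], lab[2]);
  return 0;
}

With these assumptions the hopelessly virtual value from comment #7 comes back as a dark but valid saturated blue instead of something with negative lightness, which is the kind of input the downstream modules can handle.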

#10 Updated by Roman Lebedev over 4 years ago

  • Has duplicate Bug #10619: 'difficult' lighting causes completely black file export added
