Feature #8784

SCALING: Cubic B Spline with Prefilter

Added by Troy James Sobotka about 8 years ago. Updated almost 6 years ago.

Affected Version:
git development version

As requested via Twitter, this is a proposal to implement a best-of-breed scaling algorithm.

As is well known, not all scaling algorithms are equal: cubic splines tend to over-blur, sinc variants tend to over-sharpen, and others leave pixelated residue.

Cubic B-spline with prefilter is a bicubic scale applied after a prefiltering pass. The prefilter is a recursive (infinite impulse response) pass across the image that complements the cubic kernel.
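For reference, this kind of prefilter is usually implemented as a causal plus anticausal recursive pass in the style of Unser's B-spline interpolation work. A minimal 1-D sketch (illustrative Python, not darktable code; the truncated causal initialization is a simplification):

```python
import numpy as np

Z = np.sqrt(3.0) - 2.0  # pole of the cubic B-spline prefilter, |Z| < 1

def bspline3(t):
    """Cubic B-spline kernel (support of 4 samples)."""
    t = abs(t)
    if t < 1.0:
        return 2.0 / 3.0 - t * t + 0.5 * t ** 3
    if t < 2.0:
        return (2.0 - t) ** 3 / 6.0
    return 0.0

def prefilter(s):
    """Causal + anticausal recursive pass converting samples into
    B-spline coefficients (truncated causal initialization)."""
    c = np.asarray(s, dtype=float) * 6.0  # gain (1 - Z) * (1 - 1/Z) = 6
    n = len(c)
    c[0] = sum(c[k] * Z ** k for k in range(n))  # causal init
    for k in range(1, n):                        # causal (left-to-right) pass
        c[k] += Z * c[k - 1]
    c[-1] = (Z / (Z * Z - 1.0)) * (Z * c[-2] + c[-1])  # anticausal init
    for k in range(n - 2, -1, -1):               # anticausal (right-to-left) pass
        c[k] = Z * (c[k + 1] - c[k])
    return c

def interpolate(c, x):
    """Evaluate the spline defined by coefficients c at position x."""
    i0 = int(np.floor(x))
    val = 0.0
    for i in range(i0 - 1, i0 + 3):
        j = min(max(i, 0), len(c) - 1)  # clamp at the borders
        val += c[j] * bspline3(x - i)
    return val
```

The point of the prefilter is visible at the integer sample positions: without it, convolving with `bspline3` blurs the data; with it, the spline passes through the original samples exactly (away from the boundaries), which is why the combination neither blurs nor ringing-sharpens.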

The result is a remarkably high-quality scale with optimal source-image sharpness and an extremely organic feel.

More information can be found at

If someone is able to cite the correct files, I can likely patch it myself.

It should be quite clear to any artist that the result is remarkable.


#1 Updated by Simon Spannagel about 8 years ago

  • Target version set to Candidate for next minor release
  • Category set to General

#2 Updated by Simon Spannagel about 8 years ago

  • Status changed from New to Triaged


Not to leave you without a comment:
I had hoped to see Edouard around these last days to point him to this issue, but he hasn't shown up so far. So:

Sounds cool, having another scaling algorithm won't hurt anyone - so I'd guess this is something we would accept for inclusion in darktable.

The best thing for you to do would be to talk to Edouard (GomGom on IRC), since he's the guy who implemented the Lanczos etc. scaling lately. He should be able to give you some information to get started.


#3 Updated by Johannes Hanika about 8 years ago

or, if you can't wait to get started:

git log --author=Edouard --oneline | grep interpolation

should be a very good starting point to find the files involved in writing a new interpolation function.

#4 Updated by Edouard Gomez about 8 years ago

  • % Done changed from 0 to 20

I've read the description at

The patch that got reviewed does not show enough for me to simply port the code.

Can you provide me with more mathematical/DSP articles about that interpolation type? The frequency response for both interpolation and downsampling would be great too, so that I know what I can expect from it.

Btw I can begin guiding you on how interpolation is done in dt. The most important file is src/common/interpolation.c. At the moment, the code only supports using/computing kernels independent of the data actually being filtered (kernel coeffs depend only on the sample position). If the filtered data needs to be used to alter the kernel coefficients, the current framework may require refactoring.
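To illustrate the "position-only" point: a kernel like Lanczos computes its weights purely from the distance between the output position and each input pixel, never from the pixel values. A sketch (illustrative Python, not the actual src/common/interpolation.c code):

```python
import math

def lanczos(a, t):
    """Lanczos-a kernel: a function of the distance t between the output
    position and an input pixel only, never of the pixel values."""
    if t == 0.0:
        return 1.0
    if abs(t) >= a:
        return 0.0
    return (a * math.sin(math.pi * t) * math.sin(math.pi * t / a)
            / (math.pi * math.pi * t * t))

def kernel_taps(a, x):
    """Normalized weights for the 2*a input pixels around position x."""
    i0 = math.floor(x) - a + 1
    w = [lanczos(a, x - (i0 + k)) for k in range(2 * a)]
    norm = sum(w)
    return [v / norm for v in w]
```

A B-spline prefilter breaks this assumption: the coefficients fed to the kernel depend on the whole scanline of input data, which is exactly why the framework might need refactoring.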

#5 Updated by Troy James Sobotka over 7 years ago

Sorry folks, been busy here.

The basic outline is:

1) Roll the image through the recursive prefilter pass.
2) Roll the image through a standard cubic B-spline interpolation.

It is nothing more than the complement to the standard cubic B-spline interpolation.

It has some remarkable attributes, however, and from the perspective of an imager, it is unparalleled in its capabilities.

I could probably code this up myself if you feel there is a way to pass a prefilter to the various computing kernels. The prefilter requires a minimum number of pixels to sample for each horizontal line pass, then for each vertical pass, so it would likely require full-image access prior to passing the data to the standard cubic B-spline. Does this sound feasible with the current darktable implementation?
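The row-then-column structure described above could be sketched like this (illustrative; `prefilter_1d` stands in for whichever 1-D recursive prefilter is used):

```python
import numpy as np

def prefilter_2d(img, prefilter_1d):
    """Separable prefilter: run a 1-D pass over every row, then over
    every column, before any per-pixel interpolation lookups happen."""
    out = np.asarray(img, dtype=float).copy()
    for r in range(out.shape[0]):   # horizontal line passes
        out[r, :] = prefilter_1d(out[r, :])
    for c in range(out.shape[1]):   # vertical line passes
        out[:, c] = prefilter_1d(out[:, c])
    return out
```

Because each 1-D pass is recursive across the full scanline, the whole buffer has to exist before the first interpolated pixel can be read, which is the full-image-access requirement mentioned above.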

Proof of principle created with the Blender patch. It was designed for HD evaluation, so please load the HD versions:

Full credit goes to Matthias Fauconneau, who authored the following implementation:

EDIT: The actual research paper is located at:

Author's sample code:

CUDA implementation:

A GPU implementation:

#6 Updated by Johannes Hanika over 7 years ago

wait, the paper is about interpolation (so it's a magnification filter), right? dt mostly downsamples large input resolutions.

so where do you see the use case for this? interpolation is required in crop/rotate/keystone and lens corrections. unfortunately those need point-wise lookups (random access), which probably makes the ingestion of full-buffer code a bit harder (may still be doable).

#7 Updated by Troy James Sobotka over 7 years ago

Johannes Hanika wrote:

so where do you see the use case for this? interpolation is required in crop/rotate/keystone and lens corrections. unfortunately those need point-wise lookups (random access), which probably makes the ingestion of full-buffer code a bit harder (may still be doable).

If you peek at the paper linked above, you'll see just how dramatic an effect the prefilter has on cubic B-spline transforms. The prefilter effectively creates the perfect other half of a cubic B-spline scale.

This means that rotations, scales (both down and up), and other such transforms have a tremendous increase in quality.

As a general rule, scaling algorithms tend toward being either A) pixelated, with the linear filters, B) blurry, with splines / cubic B-spline without prefilter, or C) over-sharpened, with Lanczos / sinc for example.

Cubic B-spline with prefilter, being a 'complement', rests balanced. It preserves overall contrast while trending toward neither sharpening nor blurring.

There is not a doubt in my mind that it is a best in breed scale, having tested it on many different imaging needs. It simply destroys other scales.

So in short, if the lens corrections, for example, use a cubic B-spline interpolation, they too would see a huge increase in quality using the prefiltered approach.

It would require a cached prefiltered image, however, especially if threaded, to allow for individual per-pixel sample reads.

#8 Updated by Pascal de Bruijn almost 6 years ago

  • bitness set to 64-bit
  • System set to all
  • Affected Version set to git development version
  • Target version deleted (Candidate for next minor release)
  • Priority changed from Medium to Low
