Rawtherapee: Support Fuji SN EXR mode RAFs

Created on 23 Sep 2018  ·  20 comments  ·  Source: Beep6581/RawTherapee

Hi,

Ingo suggested opening an issue for supporting Fuji SN EXR RAFs, which contain two frames with the same exposure that could be combined by RT for less noise. If needed, I could provide such a shot, but it would have to be taken first, as I didn't find one in my archive (because even with SN, the images are noisy and often not worth keeping).

Thanks,
Flössie

file format enhancement

All 20 comments

@Floessie I have such shots from other exr cameras.

Here's a first quick and dirty hack. It gives the combination of both frames when Sub-image 1 is selected and the single second frame when Sub-image 2 is selected.

diff --git a/rtengine/rawimagesource.cc b/rtengine/rawimagesource.cc
index 1396ae245..7ff7dbb69 100644
--- a/rtengine/rawimagesource.cc
+++ b/rtengine/rawimagesource.cc
@@ -1784,6 +1784,19 @@ void RawImageSource::preprocess  (const RAWParams &raw, const LensProfParams &le
                 copyOriginalPixels(raw, riFrames[i], rid, rif, *rawDataFrames[i]);
             }
         }
+    } else if (numFrames == 2) {
+        if(!rawDataBuffer[0]) {
+            rawDataBuffer[0] = new array2D<float>;
+        }
+        rawDataFrames[1] = rawDataBuffer[0];
+        copyOriginalPixels(raw, riFrames[1], rid, rif, *rawDataFrames[1]);
+        copyOriginalPixels(raw, ri, rid, rif, rawData);
+
+        for (int i = 0; i < H; ++i) {
+            for (int j = 0; j < W; ++j) {
+                rawData[i][j] = (rawData[i][j] + (*rawDataFrames[1])[i][j]) * 0.5f;
+            }
+        }
     } else {
         copyOriginalPixels(raw, ri, rid, rif, rawData);
     }
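Averaging two equally exposed frames halves the noise variance (the standard deviation drops by a factor of √2), which is why the combined result is less noisy than either frame alone. A minimal standalone sketch of the averaging step above, using flat `std::vector` buffers instead of RT's `array2D` (the function name `averageFrames` is hypothetical, not part of RT):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Average two equally exposed raw frames pixel by pixel, with the same
// 0.5f weighting as the patch above. Flat buffers stand in for array2D.
std::vector<float> averageFrames(const std::vector<float>& frame0,
                                 const std::vector<float>& frame1)
{
    assert(frame0.size() == frame1.size());
    std::vector<float> out(frame0.size());
    for (std::size_t i = 0; i < frame0.size(); ++i) {
        out[i] = (frame0[i] + frame1[i]) * 0.5f;
    }
    return out;
}
```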

Of course we need a better solution for the gui. My hack is just to allow comparisons.

I have a Fuji F770EXR and came across this issue in pictures taken with SP > Party mode for low-light situations.
The JPG generated by the camera looks fine, but the (first sub-)image I see in RawTherapee is very noisy (somewhat similar to the examples on this page, though the reason might be different), and even with denoise applied to the raw file, the camera's JPG looks much better.

I can provide sample RAF files if you need them.

Fuji F770EXR
(+ other EXR cameras)

Please contribute a normal raw, PLUS such a two-image raw (of the same scene) to https://raw.pixls.us/

Fuji F770EXR
(+ other EXR cameras)

Please contribute a normal raw, PLUS such a two-image raw (of the same scene) to https://raw.pixls.us/

I don't think I can make a normal raw plus a two-image raw of the exact same scene: this requires shooting two photos very quickly while holding the camera steady, all while changing the camera's settings between the two photos.

Would a camera-generated JPG along with the RAF file be sufficient?

Fuji F770EXR
(+ other EXR cameras)

Please contribute a normal raw, PLUS such a two-image raw (of the same scene) to https://raw.pixls.us/

I don't think I can make a normal raw plus a two-image raw of the exact same scene: this requires shooting two photos very quickly while holding the camera steady, all while changing the camera's settings between the two photos.

It doesn't have to be pixel-for-pixel identical, just visually identical.
Practically all the other sets (from cameras that can produce more than one raw mode)
meet this "requirement". Just put the camera on a tripod and take pictures of a scene
that doesn't change too quickly; a daylight nature landscape, for example, works well
(just no people).

Would a camera-generated JPG along with the RAF file be sufficient?

No.

@heckflosse I tried your patch on some RAFs that contain two subimages with the same exposure. I'm quite sure they were taken in DR mode with an auto dynamic range of 100%. Subimage 2 then contains more detail than subimage 1.

@Floessie The patch is not for DR mode. It's for the SN mode.

@heckflosse I know. I just wanted to let you know about the effect on a DR, as I don't have an SN at hand. But an "Average" entry in the subimage selection would be cool...

@Floessie

diff --git a/rtengine/rawimagesource.cc b/rtengine/rawimagesource.cc
index f551eb0dc..79922adbd 100644
--- a/rtengine/rawimagesource.cc
+++ b/rtengine/rawimagesource.cc
@@ -1793,6 +1793,19 @@ void RawImageSource::preprocess  (const RAWParams &raw, const LensProfParams &le
                 copyOriginalPixels(raw, riFrames[i], rid, rif, *rawDataFrames[i]);
             }
         }
+    } else if (numFrames == 2 && currFrame == 2) { // average the frames
+        if(!rawDataBuffer[0]) {
+            rawDataBuffer[0] = new array2D<float>;
+        }
+        rawDataFrames[1] = rawDataBuffer[0];
+        copyOriginalPixels(raw, riFrames[1], rid, rif, *rawDataFrames[1]);
+        copyOriginalPixels(raw, ri, rid, rif, rawData);
+
+        for (int i = 0; i < H; ++i) {
+            for (int j = 0; j < W; ++j) {
+                rawData[i][j] = (rawData[i][j] + (*rawDataFrames[1])[i][j]) * 0.5f;
+            }
+        }
     } else {
         copyOriginalPixels(raw, ri, rid, rif, rawData);
     }
diff --git a/rtengine/rawimagesource.h b/rtengine/rawimagesource.h
index a23e9c3cb..2c83b9bd3 100644
--- a/rtengine/rawimagesource.h
+++ b/rtengine/rawimagesource.h
@@ -201,8 +201,13 @@ public:
     static void init ();
     static void cleanup ();
     void setCurrentFrame(unsigned int frameNum) override {
-        currFrame = std::min(numFrames - 1, frameNum);
-        ri = riFrames[currFrame];
+        if (numFrames == 2 && frameNum == 2) { // special case for averaging of two frames
+            currFrame = frameNum;
+            ri = riFrames[0];
+        } else  {
+            currFrame = std::min(numFrames - 1, frameNum);
+            ri = riFrames[currFrame];
+        }
     }
     int getFrameCount() override {return numFrames;}
     int getFlatFieldAutoClipValue() override {return flatFieldAutoClipValue;}
diff --git a/rtgui/bayerprocess.cc b/rtgui/bayerprocess.cc
index 74cf27dde..10f42856b 100644
--- a/rtgui/bayerprocess.cc
+++ b/rtgui/bayerprocess.cc
@@ -730,7 +730,10 @@ void BayerProcess::FrameCountChanged(int n, int frameNum)
                 entry << i;
                 imageNumber->append(entry.str());
             }
-            imageNumber->set_active(std::min(frameNum, n - 1));
+            if (n == 2) {
+                imageNumber->append("1.5");
+            }
+            imageNumber->set_active(std::min(frameNum, n == 2 ? n : n - 1));
             if (n == 1) {
                 imageNumberBox->hide();
             } else {
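The GUI hack above works by appending a pseudo entry "1.5" to the sub-image combobox when a file has exactly two frames, so index 2 requests the averaged result while real indices are still clamped to the last frame. A standalone sketch of that index mapping (the function name `resolveFrame` is hypothetical, not RT code):

```cpp
#include <algorithm>

// Mirror the frame-selection logic from the patch: for two-frame files, a
// third entry (index 2, shown as "1.5") requests the averaged pseudo-frame;
// every other index is clamped to the last real frame, as before.
unsigned int resolveFrame(unsigned int numFrames, unsigned int frameNum)
{
    if (numFrames == 2 && frameNum == 2) {
        return 2; // pseudo-frame: average of frames 0 and 1
    }
    return std::min(numFrames - 1, frameNum);
}
```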


You can test with the bottom left raf from here
https://www.photographyblog.com/reviews/fujifilm_finepix_f800exr_review/sample_images

@Beep6581 Thanks for testing :+1: It's expected that averaging frames with different exposure does not work correctly. This issue is about SN mode EXR files, which have equal exposure for both frames.

In all cases, the info panel in RT shows "Exif data not available" when viewing sub-image 2 or the blend.

Same as before the patch. I will try to solve that.

@heckflosse FramesData::hasExif() returns false for the second frame, because there is only one frame of FramesData. One could of course return the data for the first frame if the requested frame isn't available...

One could of course return the data for the first frame if the requested frame isn't available...

Here's a patch for that:

diff --git a/rtengine/imagedata.cc b/rtengine/imagedata.cc
index 892a9efed..9b2de3e3b 100644
--- a/rtengine/imagedata.cc
+++ b/rtengine/imagedata.cc
@@ -16,6 +16,7 @@
  *  You should have received a copy of the GNU General Public License
  *  along with RawTherapee.  If not, see <http://www.gnu.org/licenses/>.
  */
+#include <functional>
 #include <strings.h>
 #include <glib/gstdio.h>
 #include <tiff.h>
@@ -43,6 +44,22 @@ Glib::ustring to_utf8 (const std::string& str)
     }
 }

+template<typename T>
+T getFromFrame(
+    const std::vector<std::unique_ptr<FrameData>>& frames,
+    std::size_t frame,
+    const std::function<T (const FrameData&)>& function
+)
+{
+    if (frame < frames.size()) {
+        return function(*frames[frame]);
+    }
+    if (!frames.empty()) {
+        return function(*frames[0]);
+    }
+    return {};
+}
+
 }

 FramesMetaData* FramesMetaData::fromFile (const Glib::ustring& fname, std::unique_ptr<RawMetaDataLocation> rml, bool firstFrameOnly)
@@ -900,74 +917,196 @@ procparams::IPTCPairs FramesData::getIPTCData (unsigned int frame) const
     }
 }

-bool FramesData::hasExif (unsigned int frame) const
+bool FramesData::hasExif(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? false : frames.at(frame)->hasExif ();
+    return getFromFrame<bool>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.hasExif();
+        }
+    );
 }
-bool FramesData::hasIPTC (unsigned int frame) const
+
+bool FramesData::hasIPTC(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ?  false : frames.at(frame)->hasIPTC ();
+    return getFromFrame<bool>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.hasIPTC();
+        }
+    );
 }

-tm FramesData::getDateTime (unsigned int frame) const
+tm FramesData::getDateTime(unsigned int frame) const
 {
-    if (frames.empty() || frame >= frames.size() ) {
-        return {};
-    } else {
-        return frames.at(frame)->getDateTime ();
-    }
+    return getFromFrame<tm>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getDateTime();
+        }
+    );
 }
+
 time_t FramesData::getDateTimeAsTS(unsigned int frame) const
 {
-     return frames.empty() || frame >= frames.size()  ? 0 : frames.at(frame)->getDateTimeAsTS ();
+    return getFromFrame<time_t>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getDateTimeAsTS();
+        }
+    );
 }
-int FramesData::getISOSpeed (unsigned int frame) const
+
+int FramesData::getISOSpeed(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0 : frames.at(frame)->getISOSpeed ();
+    return getFromFrame<int>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getISOSpeed();
+        }
+    );
 }
-double FramesData::getFNumber  (unsigned int frame) const
+
+double FramesData::getFNumber(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0. : frames.at(frame)->getFNumber ();
+    return getFromFrame<double>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getFNumber();
+        }
+    );
 }
-double FramesData::getFocalLen (unsigned int frame) const
+
+double FramesData::getFocalLen(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0. : frames.at(frame)->getFocalLen ();
+    return getFromFrame<double>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getFocalLen();
+        }
+    );
 }
-double FramesData::getFocalLen35mm (unsigned int frame) const
+
+double FramesData::getFocalLen35mm(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0. : frames.at(frame)->getFocalLen35mm ();
+    return getFromFrame<double>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getFocalLen35mm();
+        }
+    );
 }
-float FramesData::getFocusDist (unsigned int frame) const
+
+float FramesData::getFocusDist(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0.f : frames.at(frame)->getFocusDist ();
+    return getFromFrame<float>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getFocusDist();
+        }
+    );
 }
-double FramesData::getShutterSpeed (unsigned int frame) const
+
+double FramesData::getShutterSpeed(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0. : frames.at(frame)->getShutterSpeed ();
+    return getFromFrame<double>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getShutterSpeed();
+        }
+    );
 }
-double FramesData::getExpComp  (unsigned int frame) const
+
+double FramesData::getExpComp(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? 0. : frames.at(frame)->getExpComp ();
+    return getFromFrame<double>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getExpComp();
+        }
+    );
 }
-std::string FramesData::getMake     (unsigned int frame) const
+
+std::string FramesData::getMake(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? std::string() : frames.at(frame)->getMake ();
+    return getFromFrame<std::string>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getMake();
+        }
+    );
 }
-std::string FramesData::getModel    (unsigned int frame) const
+
+std::string FramesData::getModel(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? std::string() : frames.at(frame)->getModel ();
+    return getFromFrame<std::string>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getModel();
+        }
+    );
 }
-std::string FramesData::getLens     (unsigned int frame) const
+
+std::string FramesData::getLens(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? std::string() : frames.at(frame)->getLens ();
+    return getFromFrame<std::string>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getLens();
+        }
+    );
 }
-std::string FramesData::getSerialNumber (unsigned int frame) const
+
+std::string FramesData::getSerialNumber(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? std::string() : frames.at(frame)->getSerialNumber ();
+    return getFromFrame<std::string>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getSerialNumber();
+        }
+    );
 }
-std::string FramesData::getOrientation (unsigned int frame) const
+
+std::string FramesData::getOrientation(unsigned int frame) const
 {
-    return frames.empty() || frame >= frames.size()  ? std::string() : frames.at(frame)->getOrientation ();
+    return getFromFrame<std::string>(
+        frames,
+        frame,
+        [](const FrameData& frame_data)
+        {
+            return frame_data.getOrientation();
+        }
+    );
 }


@Floessie I pushed my changes. Please push yours as well. They work fine

Done and close?

Confirmed, done and close.

@Floessie @heckflosse which camera models support SN mode?

I'd guess that most, if not all, EXR cameras support this mode, but I can only say for sure that the F600EXR and the X-S1 have SN mode. Not all EXR cameras support raw, though.

The FinePix S200EXR does not save a dual-frame RAF in SN mode, only a single, already averaged raw frame. For this camera, support for frame averaging is more or less superfluous. But if there are other models which explicitly save both raw frames, this feature will be useful for their owners.
