React Native Environment Info:
System:
OS: macOS 10.14
CPU: x64 Intel(R) Core(TM) i5-5257U CPU @ 2.70GHz
Memory: 47.52 MB / 8.00 GB
Shell: 3.2.57 - /bin/bash
Binaries:
Node: 8.11.1 - /usr/local/bin/node
Yarn: 1.10.1 - /usr/local/bin/yarn
npm: 6.4.1 - /usr/local/bin/npm
Watchman: 4.9.0 - /usr/local/bin/watchman
SDKs:
iOS SDK:
Platforms: iOS 12.1, macOS 10.14, tvOS 12.1, watchOS 5.1
IDEs:
Android Studio: 3.1 AI-173.4907809
Xcode: 10.1/10B61 - /usr/bin/xcodebuild
npmPackages:
react: 16.5.0 => 16.5.0
react-native: 0.57.1 => 0.57.1
npmGlobalPackages:
react-native-cli: 2.0.1
react-native-create-library: 3.1.2
react-native-git-upgrade: 0.2.7
Image.getSize does not return the correct width and height of the image. See the example below:
// uses RNCamera from 'react-native-camera'
takePicture = async (camera: any) => {
  const options = {
    quality: 0.5,
    base64: false,
  }
  const metadata = await camera.takePictureAsync(options)
  const { uri, width, height } = metadata
  console.log({ width, height })
  if (isAndroid) {
    Image.getSize(uri, (w, h) => {
      console.log({ width: w, height: h })
    })
  }
}
console.log results:
metadata from takePictureAsync:
{ width: 2592, height: 1944 }
result from Image.getSize:
{ width: 1296, height: 972 }
I confirmed after exporting the picture that the real dimensions are the ones from takePictureAsync.
Could this be related to density? Image.getSize is often used to specify a view's size in the layout, which is in dp rather than raw pixels.
You can test this by using PixelRatio to convert the value returned by Image.getSize back into physical pixels. If calling PixelRatio.getPixelSizeForLayoutSize on the result of getSize yields the correct pixel dimensions, then you are looking at dp values instead of px.
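A minimal sketch of that check, with the PixelRatio math inlined so it runs outside React Native (getPixelSizeForLayoutSize is documented as rounding layoutSize * PixelRatio.get(); here `ratio` stands in for PixelRatio.get() on the device):

```javascript
// Inlined equivalent of PixelRatio.getPixelSizeForLayoutSize(layoutSize)
// on a device whose PixelRatio.get() === ratio.
const getPixelSizeForLayoutSize = (layoutSize, ratio) =>
  Math.round(layoutSize * ratio);

// If Image.getSize really returned dp, scaling back up by the device's
// pixel ratio should reproduce the raw camera dimensions.
const looksLikeDp = (getSizeValue, cameraValue, ratio) =>
  getPixelSizeForLayoutSize(getSizeValue, ratio) === cameraValue;

console.log(looksLikeDp(1296, 2592, 2)); // true on a ratio-2 device
```

If this comes back false for the device's actual pixel ratio, density is not the explanation.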
If it is the case that you actually get dp from Image.getSize, then either the [docs](https://facebook.github.io/react-native/docs/image#getsize) are wrong or the underlying implementation is. Fixing the implementation might break people's apps, though, as I know a few apps that lean on the current behavior of Image.getSize.
Have you also tested this on iOS? For some manual math, could you post the output of PixelRatio.get() for this specific Android device?
Hi @bartolkaruza, thanks for your reply. That also came to my mind, but according to the results below, I see no relationship between the PixelRatio and the difference in the two dimensions.
With an Android 6.0.1 - One plus One:
With an Android 7.0 - Huawei MediaPad T3 10
On iOS, both takePictureAsync and Image.getSize return the same width and height values. I just tested it with an iPhone 5s and an iPad Air 2.
Ok, it looks really inconsistent on Android! The Image.getSize functionality in React Native just calls into the Fresco library. The relevant code in React Native is here.
Would you be so kind as to raise this issue on the Fresco repo and link back here? I think we'll get this cleared up there much faster.
I can verify this inconsistency on Android phones: LG G6 (H870), Huawei P10 (VTR-L29) and Huawei P8 Lite (ALE-L21). The workaround was to multiply width and height by the PixelRatio for users with those phones.
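That workaround can be sketched as a small helper; this is only a sketch, with `ratio` standing in for PixelRatio.get(), and whether it recovers the true dimensions depends on how far the image was actually downsampled:

```javascript
// Sketch of the workaround above: scale the Image.getSize result back up
// by the device pixel ratio on affected devices. `ratio` stands in for
// PixelRatio.get(); this only matches the real dimensions when the
// downsampling factor happens to equal the pixel ratio.
const upscaleSize = ({ width, height }, ratio) => ({
  width: Math.round(width * ratio),
  height: Math.round(height * ratio),
});

console.log(upscaleSize({ width: 1296, height: 972 }, 2));
// { width: 2592, height: 1944 }
```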
@Fsarmento can you check if this comment is true? The images you are using seem larger than the 2048px threshold for 'huge'. It looks to me like basic OOM protection by Fresco. I don't think we should tinker with that value on the React Native side, as large bitmaps are the most common source of crashes on Android.
The value returned by Image.getSize is the actual image size your app will effectively deal with, because the image is force-downsampled by Fresco when you use it in an Image component. This means the size corresponds to the bitmap as used by RN. Although confusing in your particular case of reading the camera image size directly, I think this behavior is what it should be for most apps.
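A rough model of the behavior described above; the ~2048px limit and the power-of-two halving are assumptions for illustration, not Fresco's exact algorithm:

```javascript
// Rough model of Fresco-style downsampling: halve both dimensions until
// they fit under an assumed ~2048px limit. This illustrates why getSize
// can report 1/2 or 1/4 of the camera resolution; it is not Fresco's
// actual implementation.
const downsample = (width, height, limit = 2048) => {
  while (width > limit || height > limit) {
    width = Math.round(width / 2);
    height = Math.round(height / 2);
  }
  return { width, height };
};

console.log(downsample(2592, 1944)); // { width: 1296, height: 972 }
console.log(downsample(3136, 4224)); // { width: 784, height: 1056 }
```

Both outputs match the numbers reported in this thread: the 2592x1944 camera image comes back halved, and the 3136x4224 one quartered.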
Yes, it only happens with huge images.
I could not dig deeper to confirm if that was it, but most likely it is.
In my use case, I need to crop the image using ImageEditor.cropImage. This method expects a size: { width: number, height: number } parameter, which defines the number of px to crop. These values are based on the real dimensions, not the ones given by Image.getSize in these cases.
Right now we are using the picture dimensions given by the takePictureAsync metadata, although there are a few bugs related to delays while capturing pictures on Android.
Using Image.getSize was a workaround to avoid blocking the screen for 2 to 3 seconds after capturing a picture.
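For illustration, a hypothetical helper (`centeredSquareCrop` is my name, not an RN API) that builds the { offset, size } cropData shape ImageEditor.cropImage takes, from the real camera-metadata dimensions:

```javascript
// Hypothetical helper for the use case above: compute a centered square
// crop from the real (camera metadata) dimensions. Values are physical
// pixels, so the downsampled Image.getSize result must not be fed in here.
const centeredSquareCrop = ({ width, height }) => {
  const side = Math.min(width, height);
  return {
    offset: {
      x: Math.floor((width - side) / 2),
      y: Math.floor((height - side) / 2),
    },
    size: { width: side, height: side },
  };
};

console.log(centeredSquareCrop({ width: 2592, height: 1944 }));
// { offset: { x: 324, y: 0 }, size: { width: 1944, height: 1944 } }
```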
Is this fixed in the latest versions?
0.57.5
I am also using this for cropping. Is there any workaround?
So the use case is that you are cropping image A to the size of image B, and you need the exact dimensions of image B for this? If the cropped image is going to be displayed inside your app, it is not a problem to use the Image.getSize dimensions, because that is the actual size the image would be if displayed in an <Image /> component. Image.getSize corresponds to <Image />, which makes sense if you think about it.
If the goal is to do image processing for use outside of your app (send to a server or something like that), I recommend writing the native code using the Android Bitmap class. There are lots of examples around for getting the size of a Bitmap on Android and the process for making a small native module wrapping your custom getSize is not too complicated either. When you are dealing with large images and constrained memory space, dropping down to native level becomes a natural choice for your functionality, because you have much more control over bitmap allocation and disposal.
I don't think this will ever be 'fixed' in RN. Fixing it would cause memory issues for many apps, with no benefit outside of your use case, which can be solved in other ways.
@kelset close
I also encountered the same problem in 0.57.5. How was it resolved?
This seems to be a regression in 0.57.
I've only tested on Android so far, but for an image from the camera on an HTC Desire 650, getSize on 0.56 returns the correct image dimensions of 3136x4224, while on 0.57 it returns 784x1056 (a PixelRatio of 2, and downsampled by Fresco?).
I can't see mention of a breaking change in the release notes.
@leighman I understand your point of view, but this is a change that was introduced (I suspect) here: https://github.com/facebook/react-native/commit/b6f2aad9c0119d11e52978ff3fa9c6f269f04a14
From the React Native perspective it was a minor version bump of the Fresco library, from 1.9.0 to 1.10.0. Even if that version bump had been mentioned in the release notes, would anyone have been able to draw this conclusion from it? I can't see this change in the Fresco release notes either: https://github.com/facebook/fresco/releases
I don't think Image.getSize should be considered a general purpose image utility, but a utility belonging to the Image component. If that assumption breaks, things go wrong as in this issue. If anyone feels like implementing a general purpose getSize, the logical place for that would be on the ImageEditor component.
I don't agree that it is a regression, as Image.getSize works as expected in the context of <Image /> in 0.57.x on both platforms, because on Android both use Fresco.
It is a big problem.
Please add a new method that behaves like the old Image.getSize.
So I had to do it.
react-native-image-size:
Thank you @eXist-FraGGer. Your library works perfectly for my use case!