I've tried
Image.getSize('./myimage.png', (width, height) => {
console.log(width)
});
But it doesn't work; it seems to only work with network images. It always fails with: Failed to get size for image: ./myimage.png
Hey manask88, thanks for reporting this issue!
React Native, as you've probably heard, is getting really popular and truth is we're getting a bit overwhelmed by the activity surrounding it. There are just too many issues for us to manage properly.
If you have a question, it is better to ask it on Stack Overflow with the tag react-native or, for more real-time interactions, ask on Discord in the #react-native channel. Related to #2180.
I think it was only implemented for network images since you should know the size of the image if it is bundled with your app. If you'd like to have that feature, I suggest you submit a PR.
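For context, the network case behaves as documented: a remote URI passed to getSize resolves fine. A minimal sketch, with a placeholder URL:

import {Image} from 'react-native';

// Works: remote images are fetched and measured asynchronously.
Image.getSize(
  'https://example.com/photo.png',
  (width, height) => console.log(width, height),
  (error) => console.warn(error)
);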
@facebook-github-bot feature
Hey @manask88! Thanks for opening the issue, however it looks like a feature request. As noted in the issue template, we'd like to use GitHub issues to track bugs only. Can you implement the feature as a standalone npm module? If not, consider sending a pull request or creating an entry on Product Pains. It has a voting system, and if the feature gets upvoted enough it might get implemented. Will close this issue.
It seems that no one created a feature request, so if you also want that feature in RN, vote here: https://react-native.canny.io/feature-requests/p/support-static-images-in-imagegetsize
[...] only implemented for network images since you should know the size of the image if it is bundled with your app.
This would be a nightmare to keep in sync on a medium to large project. I agree that calculating it asynchronously at runtime is not ideal. What I really want is a way to calculate this at BUILD time:
import image, {width, height} from './images/goat.png';
Does anything like this exist in the ecosystem? Is there any way to plug into the build system to get this info?
[Edit] I now realize the undocumented resolveAssetSource built into React Native basically does exactly this:
import resolveAssetSource from 'resolveAssetSource';
import image from './images/goat.png';
const {width, height} = resolveAssetSource(image);
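For anyone reading later: the same lookup is also exposed as a static method on the Image component, so it can be done without the Haste-style import. A rough equivalent of the snippet above (the asset path is just an example):

import {Image} from 'react-native';

// Bundled assets have their dimensions recorded at build time,
// so this is synchronous — no getSize round trip needed.
const source = require('./images/goat.png');
const {width, height} = Image.resolveAssetSource(source);

// e.g. <Image source={source} style={{width, height}} />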
[...] only implemented for network images since you should know the size of the image if it is bundled with your app.
This would be a nightmare to keep in sync on a medium to large project.
Yup, it really would be a nightmare. I doubt the answer is to ask framework users to submit a PR; not to say that none of us would, but this deserves attention from those who work on RN full time.
To add another scenario to this situation: I have images that are not network images, and I can't reasonably be expected to know their sizes. They're being retrieved via an API where I need to authenticate (so I can't just plumb the URL straight into getSize), so they're being pulled down to the device. getSize works on iOS, but not Android, funnily enough. If there's something I'm missing here, it certainly doesn't seem to be documented.
Will give the resolveAssetSource approach a try :/
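One thing worth double-checking in that scenario: Android tends to require an explicit file:// scheme on local paths, while iOS is more forgiving, which could explain the platform difference. A minimal sketch, assuming the authenticated download has already been written to a known local path (the path here is made up):

import {Image} from 'react-native';

// Hypothetical location of the image downloaded through the authenticated API.
const localPath = '/data/user/0/com.example.app/files/picture.jpg';

// Prefix bare paths with the file:// scheme so Android's image pipeline accepts them.
const uri = localPath.startsWith('file://') ? localPath : 'file://' + localPath;

Image.getSize(
  uri,
  (width, height) => console.log('size:', width, height),
  (error) => console.warn('getSize failed:', error)
);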
This should be re-opened, as there's still no documented way of displaying a static image that's been bundled with the app (e.g. in the Android drawable folder), like so: <Image ... source={{ uri: this.state.selectedQuestion.image }} />, without knowing its size in advance.
Not setting the width and height properties leads to the image not being rendered at all; also, Image.getSize() throws a warning that the image was not found. Setting the width and height to some values allows the image to be displayed, but it is either cropped or surrounded by white space.
For example, in my app I have lots of (3000+) multiple-choice questions stored in an SQLite db, each linked to an image via a column in the db. The images differ in size and are stored (on Android) in the drawable folder.
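Since require() can't take a dynamic string, the usual workaround for this kind of setup is a static lookup table that maps the db column value to a bundled asset, after which resolveAssetSource (mentioned above) gives the dimensions at runtime. A rough sketch — the keys and file names are invented for illustration:

import {Image} from 'react-native';

// Metro needs static require() calls, so this map has to be written out
// (or generated by a build-time script) rather than built from strings.
const questionImages = {
  q001: require('./images/q001.png'),
  q002: require('./images/q002.png'),
  // ...one entry per image referenced from the db
};

function getQuestionImage(imageKey) {
  const source = questionImages[imageKey];
  const {width, height} = Image.resolveAssetSource(source);
  return {source, width, height};
}

// Usage: size the <Image> from the resolved dimensions.
// const {source, width, height} = getQuestionImage(this.state.selectedQuestion.image);
// <Image source={source} style={{width, height}} />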
I've been trying to crop images taken from react-native-camera basically forever... It has never worked. I'm just trying to get a square image out of the camera. None of the approaches have worked, and it's disappointing that the latest attempt involves Image.getSize and ImageEditor. My images are on the device, having just been taken by the camera; they aren't static images and they aren't coming down from the network. Anyone have any idea how to do this after all this time? Some uniformity in image processing across react-native would be awesome...
To be clear, I don't want an interface; I just want to take a JPG, crop off the top and bottom, and return a square result.
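For reference, the pieces being combined here are Image.getSize plus ImageEditor.cropImage. A rough sketch of a centered square crop, assuming photoUri is whatever react-native-camera handed back (the callback-based cropImage shown here is the older core API; the community image-editor package returns a Promise instead):

import {Image, ImageEditor} from 'react-native';

// Crop a camera photo to a centered square by trimming the longer dimension.
function cropToSquare(photoUri, onDone) {
  Image.getSize(
    photoUri,
    (width, height) => {
      const side = Math.min(width, height);
      const cropData = {
        offset: {x: Math.floor((width - side) / 2), y: Math.floor((height - side) / 2)},
        size: {width: side, height: side},
      };
      ImageEditor.cropImage(
        photoUri,
        cropData,
        (croppedUri) => onDone(croppedUri),
        (error) => console.warn('crop failed:', error)
      );
    },
    (error) => console.warn('getSize failed:', error)
  );
}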