Sorry, this isn't an issue - I just have some questions.
And two requests — I'd be happy to submit a PR if you're open to the ideas.

Requests:
"could the original be added to the response also?" - I didn't mean base64 data, just a reference to the original image; basically the data that the camera roll would have returned had you used it directly. The reason I ask is that the original image could be used to grab the metadata.
Hmm, then maybe we could do it in a way where we add a `metadata` key to the response containing all the metadata we discovered?
So maybe add a block to the response called `original` or something, and in it include the image data and all the EXIF metadata?
I still don't get what you mean by "original". Currently on iOS it is hard or impossible to get an original image URL that will work later on upload. That is the reason I decided to save the NSData to a tmp location and return the tmp path to the user. My idea is to attach a `metadata` key alongside the existing response keys, where `metadata` would be key/value pairs of the extracted EXIF data.
By "original" I just mean the data that would have been returned if the user had picked an image straight from the Camera Roll. This is the image that has all the metadata intact, and on iOS it includes some GPS data that isn't encoded into the underlying image object as EXIF data.
Looks like this on Android:
```js
{
  node: {
    timestamp: 1474222489,
    group_name: 'Pictures',
    type: 'image/jpeg',
    image: {
      height: 3264,
      width: 2448,
      uri: 'content://media/external/images/media/417'
    }
  }
}
```
and this on iOS:
```json
{
  "node": {
    "timestamp": 1474208243.123169,
    "group_name": "All Photos",
    "type": "ALAssetTypePhoto",
    "image": {
      "isStored": true,
      "height": 4032,
      "uri": "assets-library://asset/asset.JPG?id=F6FB6066-1E4D-4505-82EB-B8AC9F32C816&ext=JPG",
      "width": 3024
    },
    "location": {
      "speed": 0,
      "latitude": 51.49944666666666,
      "longitude": -0.0811195,
      "heading": 0,
      "altitude": 3.892108508014796
    }
  }
}
```
As you can see, we are already returning a lot of this data to the user. It would be great if you could create a key/value `metadata` object in the response and fill it with the EXIF and missing location data (if available). That would actually be very cool.
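A rough sketch of what that could look like in plain JavaScript. The helper name `attachMetadata` and all sample values are hypothetical, not part of the current API — this just illustrates the proposed shape of a `metadata` key merged into the existing response:

```javascript
// Hypothetical helper: merge extracted EXIF pairs (and location data, when
// the native side found any) into the existing picker response under a
// single "metadata" key. Nothing here is part of the current API.
function attachMetadata(response, exif, location) {
  return {
    ...response,
    metadata: {
      ...exif,
      ...(location ? { location } : {}),
    },
  };
}

// Example with made-up values shaped like the responses above:
const result = attachMetadata(
  { path: '/tmp/abc.jpg', width: 3024, height: 4032 },
  { Orientation: 1, DateTimeOriginal: '2016:09:18 14:57:23' },
  { latitude: 51.49944666666666, longitude: -0.0811195 }
);
// result keeps the existing keys and gains result.metadata.Orientation,
// result.metadata.location.latitude, etc.
```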
Adding the original asset URI to the existing response would be enough, I think. It could be used to look up the original object in the camera roll, and from there you could inspect the original for its metadata.
Maybe that's a good start?
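For illustration, matching a picker result back to a camera-roll node could be as simple as comparing URIs. This is a sketch over data shaped like the responses above; `findOriginalNode` is a hypothetical name, not an existing API:

```javascript
// Hypothetical: given camera-roll edges shaped like the responses shown
// earlier in this thread, find the node whose image.uri matches the
// original URI the picker would return.
function findOriginalNode(edges, originalUri) {
  const edge = edges.find((e) => e.node.image.uri === originalUri);
  return edge ? edge.node : null;
}

// Made-up edges for demonstration:
const edges = [
  { node: { image: { uri: 'assets-library://asset/asset.JPG?id=AAA&ext=JPG' } } },
  { node: { image: { uri: 'assets-library://asset/asset.JPG?id=BBB&ext=JPG' } } },
];
const node = findOriginalNode(edges, 'assets-library://asset/asset.JPG?id=BBB&ext=JPG');
```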
Hm, in my opinion it would be much easier for users if we extracted the data automatically, without this extra step. How can we look at the original image's metadata from React Native (given we have the URL)?
There are APIs for getting the metadata out, but you need the original URL first. Can you add this in? I'm not able to get the example app working, so I don't feel comfortable submitting a PR that I can't even test.
You could add metadata support to this project, but after some thought I'm not sure it belongs here. Either way, without the original URL you can't get anything.
This code looks very similar to react-native-image-picker, which already returns the original URL in its response. You could just borrow the code from them; I'm sure it's just a couple of lines.
As far as I can see they include `origURL` just for iOS. Can you please give me a link to the React Native API that lets you access metadata via the original URL? Just curious.
It's the `uri` field in react-native-image-picker:
I'm facing the same issue: I can't access metadata with the iOS version. As with @npomfret, a property with the original URI would likely fit my needs.
> Can you please give me some link to the react-native api which allows you to access metadata via original url? Just curious
react-native-exif works for me on Android, but I couldn't test it on iOS since I have no access to the original URIs. The react-native-image-picker library also extracts EXIF data, but it looks like they only copy the GPS data.
+1
I hope you don't mind me asking another question here. I didn't want to create another issue.
The readme states:

> Module is creating tmp images which are going to be cleaned up automatically somewhere in the future

I hope you can elaborate on what this means, as I can interpret it in two ways:

1. the module itself deletes the tmp images at some later point, or
2. the system cleans up the tmp directory at its own discretion.

I currently suspect it's (2). In that case, can you spare some additional words on *when* this cleaning is bound to happen? How long after obtaining the images can I expect them to still be present to operate on from other screens?
Thank you in advance.
@ixje the current code does not delete the files; it stores them inside the app's temp directory. The temp dir can be purged by the system at will (see https://stackoverflow.com/questions/25062375/when-does-ios-clean-the-local-app-tmp-directories).
Thank you @nico1510. The answers in that Stack Overflow thread give useful information, e.g. that according to the documentation the app should do the housekeeping itself, regardless of whether the system will purge the directory at some point.
Was any solution ever found for getting the EXIF/geolocation data out of images on iOS? This works on Android but seems to be impossible on iOS. @npomfret @ivpusic @nico1510 @markhaasjes