So far I've seen AsyncBatchAnnotateFilesRequest for batching images stored in GCS, but when you have to batch images from a local directory, there doesn't seem to be any option in the Google Python client.
The snippet below is set up for GCS URLs; how do we send batch requests with images that are read as bytes in memory using the google-cloud-vision Python client?
input_config = {
    "gcs_source": <url>
}
output_config = {
    "gcs_destination": <url>,
    "batch_size": int
}
What would be the right way to do batch uploading of images with the client?
Any annotation request contains an instance of google.cloud.vision_v1.types.Image, which can have either its source property set (containing the URL of the image, e.g. in GCS) or its content property (containing the actual bytes of the image). So, for instance:
from google.cloud import vision
from google.cloud.vision import enums
from google.cloud.vision import types

client = vision.ImageAnnotatorClient()

features = [
    types.Feature(type=enums.Feature.Type.TEXT_DETECTION),
    types.Feature(type=enums.Feature.Type.FACE_DETECTION),
]

requests = []
for filename in ['foo.png', 'bar.jpg', 'baz.gif']:
    with open(filename, 'rb') as image_file:
        # The Image message takes raw bytes via `content` (not `contents`).
        image = types.Image(content=image_file.read())
    request = types.AnnotateImageRequest(image=image, features=features)
    requests.append(request)

response = client.batch_annotate_images(requests)
for annotation_response in response.responses:
    do_something_with(annotation_response)
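One caveat: batch_annotate_images accepts only a limited number of images per call (the documented limit has been 16 per request; check the current quotas), so for a larger local directory you would split the request list into chunks. A minimal sketch, where `chunked` is a hypothetical helper and `requests`/`client`/`do_something_with` are assumed from the snippet above:

```python
def chunked(items, size):
    """Yield successive lists of at most `size` items from `items`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical usage with the `requests` list built above:
# for batch in chunked(requests, 16):
#     response = client.batch_annotate_images(batch)
#     for annotation_response in response.responses:
#         do_something_with(annotation_response)

# The helper itself is plain Python:
print(list(chunked([1, 2, 3, 4, 5], 2)))  # → [[1, 2], [3, 4], [5]]
```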