Support batched conditional generation from GPT-2
Currently, the method that generates text from GPT-2 conditioned on an input sequence only supports either 1) a single input at a time, or 2) a batch of inputs whose conditioning sequences are all the same length. For efficiency, it would be great if this method were updated to support a batch of conditioning inputs of varying length, by ignoring padding tokens in the input_ids.
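One common way to handle this (a sketch, not the library's implementation) is to left-pad the shorter sequences so every prompt's last real token sits at the end of the row, build an attention mask that zeroes out the pad positions, and derive position ids that count only real tokens. The helper names and the pad id below are illustrative, not part of the transformers API:

```python
def pad_batch_left(sequences, pad_id):
    """Left-pad variable-length token-id sequences to a common length.

    Left padding keeps each prompt's last real token adjacent to the
    generation position; the attention mask marks pad ids to ignore.
    """
    max_len = max(len(seq) for seq in sequences)
    input_ids, attention_mask = [], []
    for seq in sequences:
        n_pad = max_len - len(seq)
        input_ids.append([pad_id] * n_pad + list(seq))
        attention_mask.append([0] * n_pad + [1] * len(seq))
    return input_ids, attention_mask


def position_ids_from_mask(attention_mask):
    """Position ids that start at 0 on the first *real* token.

    Pad positions are clamped to 0 so left padding does not shift the
    positional embeddings of the actual prompt tokens.
    """
    out = []
    for row in attention_mask:
        pos, cum = [], 0
        for m in row:
            pos.append(max(cum + m - 1, 0))
            cum += m
        out.append(pos)
    return out


# Example: two prompts of different length; 50256 (GPT-2's eos token)
# is often reused as the pad id since GPT-2 has no dedicated pad token.
ids, mask = pad_batch_left([[10, 11, 12], [20, 21]], pad_id=50256)
positions = position_ids_from_mask(mask)
```

The padded `input_ids`, `attention_mask`, and `position_ids` could then be passed to the model as tensors, so the pad tokens neither receive attention nor distort the position embeddings.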
Unlikely to have time to code this, but will submit a PR if I do.
This is known to not work at the moment with generate(). I have to think a bit about the cleanest way to implement it :-) Code suggestions are very welcome!
Very interested in this! Came here from #3021 (many hours after wondering why my batch generation was not working...)
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.