Transformers: [Feature request] Support batched conditional generation from GPT-2

Created on 2 Jun 2020 · 4 comments · Source: huggingface/transformers

🚀 Feature request

Support batched conditional generation from GPT-2

Motivation

Currently, the method for generating text from GPT-2 conditioned on an input sequence only supports either 1) a single input at a time, or 2) a batch of inputs whose conditioning sequences are all the same length. It would be great (for efficiency) if this method could be updated to support a batch of conditioning inputs of varying length, by ignoring the padding in input_ids.
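
For concreteness, here is a minimal sketch of the kind of usage this request is asking for, assuming the batch is left-padded and an attention_mask is passed to generate() so that padding is ignored. This mirrors what later transformers releases support, not the behaviour available when this issue was filed:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 ships without a pad token
tokenizer.padding_side = "left"             # left-pad so generation continues from real tokens

model = GPT2LMHeadModel.from_pretrained("gpt2")
model.config.pad_token_id = tokenizer.eos_token_id

# Conditioning inputs of different lengths in one batch.
prompts = ["Hello, my name is", "The weather in Paris today"]
batch = tokenizer(prompts, return_tensors="pt", padding=True)

with torch.no_grad():
    output_ids = model.generate(
        batch["input_ids"],
        attention_mask=batch["attention_mask"],
        max_length=batch["input_ids"].shape[1] + 20,
        do_sample=False,
    )

print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```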

Your contribution

I'm unlikely to have time to code this, but I'll submit a PR if I do.

wontfix

All 4 comments

This is known not to work with generate() at the moment. I have to think a bit about the cleanest way to implement it :-) Code suggestions are very welcome!
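
As a rough code suggestion along those lines (a sketch only, assuming a recent transformers version whose forward pass returns an output object with a logits attribute, not the implementation that eventually landed in generate()), one can left-pad the batch manually and mask the padding out of the attention and position computations in a greedy decoding loop:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompts = ["Hello, my name is", "The weather in Paris today"]
encoded = [tokenizer.encode(p) for p in prompts]
max_len = max(len(ids) for ids in encoded)
pad_id = tokenizer.eos_token_id  # reuse EOS as the padding token

# Left-pad so each newly generated token is appended at the right edge of the batch.
input_ids = torch.tensor([[pad_id] * (max_len - len(ids)) + ids for ids in encoded])
attention_mask = torch.tensor([[0] * (max_len - len(ids)) + [1] * len(ids) for ids in encoded])

# Position ids count only the real tokens; padding positions are clamped to 0.
position_ids = (attention_mask.cumsum(-1) - 1).clamp(min=0)

for _ in range(20):  # greedy decoding of 20 new tokens
    with torch.no_grad():
        logits = model(input_ids, attention_mask=attention_mask, position_ids=position_ids).logits
    next_tokens = logits[:, -1, :].argmax(dim=-1, keepdim=True)
    input_ids = torch.cat([input_ids, next_tokens], dim=-1)
    attention_mask = torch.cat([attention_mask, torch.ones_like(next_tokens)], dim=-1)
    position_ids = torch.cat([position_ids, position_ids[:, -1:] + 1], dim=-1)

print(tokenizer.batch_decode(input_ids, skip_special_tokens=True))
```

This recomputes the full forward pass at every step for clarity; a real implementation would cache past key/values.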

Very interested in this! Came here from #3021 (after many hours of wondering why my batch generation was not working...)

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
