PyTorch: Can we use a dynamic MaxPool1d over the max length of the input?

Created on 14 Dec 2017  ·  1 Comment  ·  Source: pytorch/pytorch

_MaxPool1d_ has a required parameter named _kernel_size_. But when our inputs have different lengths in different batches, we must pad every batch to a fixed length in order to set a static _kernel_size_ before feeding the data to the model!

```python
conv = nn.Sequential(
    nn.Conv1d(in_channels=self.embedding_dim,
              out_channels=self.content_dim,
              kernel_size=self.kernel_size),
    nn.ReLU(),
    nn.MaxPool1d(kernel_size=(self.max_seq_len - self.kernel_size + 1))
)
```

Instead, we could use an alternative approach if we had a dynamic MaxPool1d that did not require a pre-set max_seq_len:

```python
conv = nn.Sequential(
    nn.Conv1d(in_channels=self.embedding_dim,
              out_channels=self.content_dim,
              kernel_size=self.kernel_size),
    nn.ReLU(),
    nn.MaxPool1d()
)
```

We would then not need any padding operation on the input, which would be more efficient than before!
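For reference, something close to this can already be expressed by reading the kernel size off the input at forward time. The sketch below is one possible workaround, not an official API: the module name `DynamicMaxPool1d` and the concrete channel sizes are made up for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicMaxPool1d(nn.Module):
    """Global max pooling over the last (time) dimension.

    Hypothetical helper: the kernel size is taken from the input's
    actual length at forward time, so no max_seq_len is needed.
    """
    def forward(self, x):
        # x has shape (batch, channels, seq_len); pool over all of seq_len
        return F.max_pool1d(x, kernel_size=x.size(2))

# Illustrative sizes; in the original snippet these come from self.*
embedding_dim, content_dim, kernel_size = 300, 100, 3

conv = nn.Sequential(
    nn.Conv1d(in_channels=embedding_dim,
              out_channels=content_dim,
              kernel_size=kernel_size),
    nn.ReLU(),
    DynamicMaxPool1d()
)

x = torch.randn(8, embedding_dim, 57)  # any sequence length works
out = conv(x)                          # shape: (8, content_dim, 1)
```

Note that `nn.AdaptiveMaxPool1d(output_size=1)` achieves the same global pooling without a custom module; the difference is that adaptive pooling fixes the output length rather than the kernel size.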
