Transformers: 🌟 BigBird

Created on 29 Jul 2020 · 16 comments · Source: huggingface/transformers

🌟 New model addition

Model description

Paper: https://arxiv.org/pdf/2007.14062.pdf

Abstract:

Transformers-based models, such as BERT, have been one of the most successful deep learning
models for NLP. Unfortunately, one of their core limitations is the quadratic dependency
(mainly in terms of memory) on the sequence length due to their full attention mechanism.
To remedy this, we propose, BigBird, a sparse attention mechanism that reduces this
quadratic dependency to linear. We show that BigBird is a universal approximator of
sequence functions and is Turing complete, thereby preserving these properties of the
quadratic, full attention model. Along the way, our theoretical analysis reveals some of the
benefits of having O(1) global tokens (such as CLS), that attend to the entire sequence
as part of the sparse attention mechanism. The proposed sparse attention can handle
sequences of length up to 8x of what was previously possible using similar hardware. As
a consequence of the capability to handle longer context, BigBird drastically improves
performance on various NLP tasks such as question answering and summarization. We also
propose novel applications to genomics data.
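
For intuition, here is a minimal, token-level sketch of the sparse attention pattern the abstract describes (sliding window, a handful of global tokens such as [CLS], plus a few random connections). This is not the block-sparse implementation from the paper; the parameter names window_size, num_global and num_random are illustrative placeholders only.

# Minimal, illustrative sketch of a BigBird-style sparse attention mask
# (token-level, not the block-level formulation used in the paper).
import numpy as np

def sparse_attention_mask(seq_len, window_size=3, num_global=2, num_random=2, seed=0):
    rng = np.random.default_rng(seed)
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # 1) Sliding-window attention: each token attends to its local neighbourhood.
    for i in range(seq_len):
        lo, hi = max(0, i - window_size), min(seq_len, i + window_size + 1)
        mask[i, lo:hi] = True

    # 2) Global tokens (e.g. [CLS]) attend to everything and are attended to by all.
    mask[:num_global, :] = True
    mask[:, :num_global] = True

    # 3) A few random connections per query token.
    for i in range(seq_len):
        mask[i, rng.choice(seq_len, size=num_random, replace=False)] = True

    return mask

# Each row has at most (2 * window_size + 1) + num_global + num_random True entries,
# so memory grows linearly with seq_len instead of quadratically.
print(sparse_attention_mask(16).sum(axis=1))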

Open source status

  • [ ] the model implementation is available: No
  • [ ] the model weights are available: No
  • [ ] who are the authors: ?
New model

Most helpful comment

I am planning to start a small, tight-knit group of individuals who will work on implementing research papers for proper business use cases.
Please let me know if anyone is interested in joining.
Project 1: BigBird for Genomics Research

All 16 comments

When will we be getting this model?

Until the weights and code are published, I don't think we'll focus too much on adding the model.

I am planning to start a small, tight-knit group of individuals who will work on implementing research papers for proper business use cases.
Please let me know if anyone is interested in joining.
Project 1: BigBird for Genomics Research

I'll be up for this project

I'll be up for this project too. I have a slightly different use case idea, though. :)

@sathvikask0
I am super interested in BigBird for Genomics Research. Are you planning to release the fixed-length embedding part as well?

I'm also doing some research on using Google BigBird for genomics research. There's a competition going on right now and we can definitely leverage BigBird for genomics sequencing.

@sathvikask0 @nikhilbyte @seduerr91
What if we could meet and talk about the BigBird implementation for Genomics Research?

Sure, do you want to set up a Google Meet?

I'm in.

Hello @nikhilbyte @seduerr91 @ptynecki, are we still doing this? I want to be a part of it!

I'm up for this. Let me know how to connect with you.

@patrickvonplaten actually you can read in the paper (Appendix E, Section E.4) that for summarization, "For the large size model, we lift weight from the state-of-the-art Pegasus model [107], which is pretrained using an objective designed for summarization task". Do you think it would be possible to include the new architecture using the already available weights of google/pegasus-large?
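
For what it's worth, here is a rough sketch of that warm-start idea, under a clearly labelled assumption: BigBirdSeq2Seq is a hypothetical placeholder class (no such model exists in transformers at this point); only the Pegasus checkpoint loading is real API. The idea is to copy every parameter whose name and shape match, leaving the sparse-attention-specific weights randomly initialised.

# Sketch: warm-start a (hypothetical) BigBird encoder-decoder from google/pegasus-large.
import torch
from transformers import PegasusForConditionalGeneration

pegasus = PegasusForConditionalGeneration.from_pretrained("google/pegasus-large")
pegasus_state = pegasus.state_dict()

def warm_start(model: torch.nn.Module, donor_state: dict) -> None:
    """Copy parameters with matching names and shapes; leave the rest untouched."""
    own_state = model.state_dict()
    compatible = {
        name: tensor
        for name, tensor in donor_state.items()
        if name in own_state and own_state[name].shape == tensor.shape
    }
    model.load_state_dict(compatible, strict=False)

# warm_start(BigBirdSeq2Seq(config), pegasus_state)  # once such a model exists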

Is there an official code base by now?

As soon as the weights and codebase are out, we'll integrate! But it does not make much sense IMO to do it before that.

I would like to join the effort as well
