Transformers: TaBERT

Created on 5 Jul 2020 · 4 comments · Source: huggingface/transformers

🌟 New model addition

Model description

TaBERT is a pre-trained language model for learning joint representations of natural language utterances and (semi-)structured tables for semantic parsing. It is pre-trained on a massive corpus of 26M web tables and their associated natural language context, and can be used as a drop-in replacement for a semantic parser's original encoder to compute representations for utterances and table schemas (columns).
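For context on what such an encoder consumes: per the TaBERT paper, the utterance is paired with a linearized row of the table (a "content snapshot"), with each cell rendered as "column name | column type | cell value". The sketch below only illustrates that input format; the `linearize` helper is hypothetical and is not TaBERT's actual API (the real implementation lives in facebookresearch/TaBERT).

```python
def linearize(utterance, columns, row):
    """Illustrative sketch of TaBERT's linearized input, NOT the real API.

    Each cell is rendered as "column name | column type | cell value",
    following the format described in the TaBERT paper.
    """
    cells = [f"{name} | {ctype} | {value}"
             for (name, ctype), value in zip(columns, row)]
    return "[CLS] " + utterance + " [SEP] " + " [SEP] ".join(cells) + " [SEP]"

# Hypothetical schema and row for illustration:
columns = [("Nation", "text"), ("Gold Medals", "real")]
row = ["USA", "46"]
print(linearize("which nation won the most gold medals", columns, row))
# -> [CLS] which nation won the most gold medals [SEP] Nation | text | USA [SEP] Gold Medals | real | 46 [SEP]
```

The resulting token sequence is what a BERT-style encoder would consume; TaBERT then pools these token representations into utterance and per-column encodings.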

Open source status

Labels: New model, wontfix

All 4 comments

In their README they say that the implementation for this project is still a WIP, but they said this at release.
It should be fairly easy to add it here, since they already use this library in their implementation.
Does the Hugging Face team have more information about this? Can the community open a PR for it, or should we wait for the original authors?

Looking forward to this new model.

Hello all!

I looked at the list of models on the transformers site, and TaBERT is still not listed. Does anyone know when it is going to be ready?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
