Transformers: Add new PET Model

Created on 24 Sep 2020 · 2 Comments · Source: huggingface/transformers

🌟 New model addition

Model description

A new paper just landed on arXiv: https://arxiv.org/pdf/2009.07118.pdf
An implementation will eventually be available at https://github.com/timoschick/pet

Authors are @timoschick and Hinrich Schütze.

I didn't see any pre-trained models linked in the GitHub README, but the model is fairly small and easy to train.
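For context, the paper's core idea (PET, Pattern-Exploiting Training) is to reformulate a classification task as a cloze question that a masked language model fills in, with a "verbalizer" mapping each label to a word. The sketch below illustrates just that mapping in plain Python; the pattern, labels, and verbalizer words are made-up examples, not the authors' implementation.

```python
# Illustrative sketch of PET's pattern-verbalizer idea (hypothetical example,
# not the code from github.com/timoschick/pet).

MASK = "[MASK]"

def pattern(review: str) -> str:
    """Hypothetical pattern for binary sentiment: rewrite input as a cloze."""
    return f"{review} It was {MASK}."

# Verbalizer: label -> natural-language token a masked LM could predict.
verbalizer = {"positive": "great", "negative": "terrible"}

def label_from_prediction(predicted_token: str) -> str:
    """Invert the verbalizer to recover a label from the model's filled token."""
    inverse = {tok: lab for lab, tok in verbalizer.items()}
    return inverse[predicted_token]

cloze = pattern("Best pizza ever!")
print(cloze)                           # Best pizza ever! It was [MASK].

# In PET, a masked LM scores the verbalizer tokens for the [MASK] slot;
# here we simply assume it predicted "great".
print(label_from_prediction("great"))  # positive
```

A real run would hand `cloze` to a masked LM (e.g. via transformers) and compare the scores of the verbalizer tokens, but the label mapping works exactly as above.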

Open source status

  • [ ] the model implementation is available: (give details)
  • [ ] the model weights are available: (give details)
  • [x] who are the authors: (mention them, if possible by @gh-username)
New model

All 2 comments

The README in the repo still says this:

:rotating_light: This repository does not yet contain the modifications to PET introduced in "It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners" but will be updated soon.

Looks like the authors updated the repo and added the necessary model.
