Transformers: MPNet: Masked and Permuted Pre-training for Language Understanding

Created on 12 May 2020  ·  10 Comments  ·  Source: huggingface/transformers

🌟 New model addition

Model description

MPNet: Masked and Permuted Pre-training for Language Understanding (https://arxiv.org/pdf/2004.09297.pdf)
uses both a Masked Language Model and a Permuted Language Model objective
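The combined objective can be sketched in plain Python. This is a minimal illustration of the masked-and-permuted input construction the paper describes, not the official implementation; `mpnet_inputs` and all of its details (the mask token, the prediction ratio, the tuple layout) are hypothetical:

```python
import random

MASK = "[MASK]"

def mpnet_inputs(tokens, pred_ratio=0.15, seed=0):
    # Permute the positions, keep the first part as visible context,
    # and take the last `c` permuted positions as prediction targets.
    rng = random.Random(seed)
    positions = list(range(len(tokens)))
    rng.shuffle(positions)                      # the permutation z
    c = max(1, int(len(tokens) * pred_ratio))   # number of predicted tokens
    ctx, pred = positions[:-c], positions[-c:]
    # Input stream: real tokens (with position ids) for the context,
    # followed by [MASK] placeholders that keep their original position
    # ids, so the model always sees the full sentence length.
    inputs = [(tokens[i], i) for i in ctx] + [(MASK, i) for i in pred]
    targets = [(tokens[i], i) for i in pred]
    return inputs, targets

tokens = ["the", "task", "is", "sentence", "classification"]
inputs, targets = mpnet_inputs(tokens, pred_ratio=0.4, seed=0)
```

The point of the construction is that, unlike XLNet's permuted objective, the predicted positions still appear in the input (as masks with position information), so the model is not blind to the sentence length or the positions it must fill in.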

Open source status

  • [x] the model implementation is available: https://github.com/microsoft/MPNet
  • [x] the model weights are available: https://modelrelease.blob.core.windows.net/pre-training/MPNet/mpnet.example.pt
  • [x] the authors: @tobyoup @StillKeepTry
New model

Most helpful comment

We are actively preparing the integration code, and it will be ready in the coming weeks.

All 10 comments

Hi huggingface/transformers,

I am Xu Tan from Microsoft Research. Thanks for mentioning MPNet in the huggingface/transformers repo.

MPNet is an advanced pre-trained model for language understanding that inherits the advantages of BERT and XLNet while addressing their problems. We have MPNet pre-trained models, code (a version for huggingface/transformers), and downstream task results ready.

How can we proceed to incorporate MPNet into huggingface/transformers repo? Thanks very much!

Best,
Xu


Hi @tobyoup, it's very cool that you want to contribute your architecture!

You can follow this checklist to add your model. Once you open a PR, we'll be very happy to help you finish the integration!

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@tobyoup @LysandreJik Is there any update about MPNET in transformers?

We are actively preparing the integration code, and it will be ready in the coming weeks.

@tobyoup Is MPNet ready in transformers? Any updates?

@tobyoup @LysandreJik Is there any update about MPNET in transformers?

~The PR for the MPNET integration has not seen any progress yet.~ @tobyoup, don't hesitate to ping us if you need help or pointers towards the integration.

There is an ongoing effort to implement MPNet at https://github.com/huggingface/transformers/pull/5804 by @StillKeepTry

Thanks for the reminder! We are adapting the code these days and expect to open the first PR by this weekend.

Thanks
Xu

