Hi, I'm trying to train a model for the dependency parsing task. I found a way to encode the labels of a dependency parse (in CoNLL-U format) and use that representation to perform sequence labeling (https://github.com/mstrise/dep2label). The problem is that the number of labels is too high.
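For context, a minimal sketch of one common dep2label-style encoding (this toy example is my own, not taken from that repo): each word is labeled with the relative offset to its head combined with the dependency relation. Because the label set is the cross product of offsets and relations, it grows quickly, which is the explosion described above.

```python
# Toy CoNLL-U-style rows: (id, form, head, deprel) for "the cat sat".
sent = [(1, "the", 2, "det"), (2, "cat", 3, "nsubj"), (3, "sat", 0, "root")]

# Label each word with (relative offset to its head) @ (dependency relation).
labels = [f"{head - idx}@{rel}" for idx, _, head, rel in sent]
# labels == ["1@det", "1@nsubj", "-3@root"]
print(labels)
```

Every distinct (offset, relation) pair seen in training becomes its own class, so long-distance heads and rare relations inflate the tag set.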
Which approach did you follow?
Thank you
@louismartin (and Benjamin, can't find his GitHub)
@benjamin-mlr any ideas ?
Hi,
The way we did it is to append a graph predictor, following Dozat et al. (https://arxiv.org/abs/1611.01734), on top of CamemBERT.
Then, using greedy decoding, we predict the parse tree.
We will release the code soon.
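In case it helps while the code is unreleased, here is a minimal sketch of the biaffine scoring and greedy decoding described above, using NumPy and random vectors in place of real CamemBERT encoder states (all dimensions and parameter names here are illustrative assumptions, not the released implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n tokens with d-dimensional encoder states (stand-ins for
# CamemBERT outputs), plus an artificial ROOT token at index 0.
n, d = 5, 8
H = rng.normal(size=(n + 1, d))

# Biaffine arc scorer (Dozat & Manning 2017):
#   score[i, j] = h_dep[i]^T U h_head[j] + b^T h_head[j]
U = rng.normal(size=(d, d))
b = rng.normal(size=(d,))
scores = H[1:] @ U @ H.T + H @ b  # shape (n, n+1): each word vs. every head candidate

# Greedy decoding: each word independently picks its highest-scoring head.
heads = scores.argmax(axis=1)  # head index per word, 0 meaning ROOT
```

Greedy argmax decoding is simple but does not guarantee a well-formed tree; a spanning-tree algorithm (e.g. Chu-Liu/Edmonds) is the usual alternative when that matters.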
Benjamin
Hello @benjamin-mlr
Do you still plan to release the code?
Hi @GuillaumeEmvista,
We do plan to release the code.
This might take longer than we originally planned but I will keep you updated when we have a clear schedule for the release of the fine-tuned models and the code.
Hi @nlp-ensae,
Any update on this matter?
Thanks
Hi @nlp-ensae,
is there any news?
Thank you!
The code for fine-tuning has not been integrated with fairseq. It is based on a fork of the transformers library; here it is:
https://github.com/benjamin-mlr/camembert_finetune
--
Benjamin Muller
Ms in Data Science, specialised in Deep Learning applied to NLP
http://www.linkedin.com/in/benjamin-muller-19796191