Transformers: Differences between facebook/bart-base and facebook/bart-large?

Created on 23 Oct 2020  ·  12 Comments  ·  Source: huggingface/transformers

โ“ Questions & Help

Is there some more difference between facebook/bart-base and facebook/bart-large (other than dimensions, heads and layers)?

Who can help

@sshleifer @WiseDoge

Environment info

  • transformers version: 3.3.1
  • Python version: 3.6.12
  • PyTorch version (GPU?): 1.4.0 GPU-version

Command:

I'm using the seq2seq/finetune.py script to fine-tune both BART models:

python finetune.py \
--data_dir=${DATA_DIR} \
--learning_rate=3e-5 \
--num_train_epochs 5 \
--task summarization \
--model_name_or_path=${MODEL} \
--train_batch_size=4 \
--eval_batch_size=4 \
--gpus 1 \
--output_dir=$OUTPUT_DIR \
--max_source_length=256 \
--max_target_length=256 \
--val_max_target_length=256 \
--test_max_target_length=256 \
--eval_max_gen_length=256 \
--do_train --do_predict \
--eval_beams 5

where ${MODEL} is either facebook/bart-base or facebook/bart-large.

Details

When I finetune facebook/bart-base, it works well:

"input_ids": " <s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad>",
        "labels": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "</s><s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s><s> Russian troops reported to be stationed in the 3 Baltic Sea countries of Jalininggele, Simolingsike and Yelinia 300 kilometers (110 miles) from Moscow.</s><pad><pad><pad><pad><pad><pad><pad>"

When I finetune facebook/bart-large, it does not generate reasonable output:

"input_ids": "<s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad>",
        "labels": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "</s><s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s><s><s><s><s><s><s><s><s><s><s> ... <s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s></s>"

I'm using the same code, but only the facebook/bart-base model works. In a previous transformers version both models worked, but not in this one (3.3.1).

All 12 comments

If you look at

https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large/config.json
and
https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-base/config.json
(to do this for any model, go to the model hub and click "see raw config file")

you will see different task_specific_params. These are used for fine-tuning by default, so bart-large
is forced to generate at least 56 tokens.

There are several ways to fix this. The easiest is to comment out this line: https://github.com/huggingface/transformers/blob/master/examples/seq2seq/finetune.py#L66

More involved would be to make a local copy of the config and insert the generation parameters you want. You can pass it to finetune.py with --config_name.
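A minimal sketch of that second route, editing a local copy of config.json with only the standard library (the toy config below stands in for the real file downloaded from the hub; the directory path is illustrative):

```python
import json
import pathlib
import tempfile

# Toy stand-in for the config.json downloaded from the model hub.
local_dir = pathlib.Path(tempfile.mkdtemp())
config = {
    "min_length": 56,  # the forced minimum behind the degenerate generations
    "task_specific_params": {"summarization": {"min_length": 56}},
}

# Strip the generation defaults that finetune.py would otherwise inherit.
config["min_length"] = 0
config.pop("task_specific_params", None)

(local_dir / "config.json").write_text(json.dumps(config, indent=2))
# then: python finetune.py ... --config_name=<local_dir>
```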

I will think about how to update bart-base and bart-large to have more reasonable task_specific_params.

cc @patil-suraj @stas00 @patrickvonplaten for awareness of a very sneaky bug.

@sshleifer, thank you very much for your reply. Indeed, I had checked those configurations, so I changed the parameters of the generate method to use min_length=0:

        generated_ids = self.model.generate(
            batch["input_ids"],
            attention_mask=batch["attention_mask"],
            use_cache=True,
            decoder_start_token_id=self.decoder_start_token_id,
            num_beams=self.eval_beams,
            no_repeat_ngram_size=0,
            min_length=0,
            max_length=self.eval_max_length,
            length_penalty=1.0
        )

I used this code for both facebook/bart-base and facebook/bart-large, and the outputs for bart-large are as I mentioned. I have been trying to figure out the reason over the last few days without success. Maybe I'm doing something wrong, but I haven't discovered what it is yet.

Another point is that generation with bart-large is much slower than with bart-base. Maybe that is because the model keeps generating tokens until the limit (max_length).

How did you call generate to produce the outputs in your Issue Description?
Your change to finetune.py will not change the config.

This is my _generative_step method:

    def _generative_step(self, batch: dict) -> dict:
        t0 = time.time()

        generated_ids = self.model.generate(
            batch["input_ids"],
            attention_mask=batch["attention_mask"],
            use_cache=True,
            decoder_start_token_id=self.decoder_start_token_id,
            num_beams=self.eval_beams,
            no_repeat_ngram_size=0,
            min_length=0,
            max_length=self.eval_max_length,
            length_penalty=1.0
        )
        gen_time = (time.time() - t0) / batch["input_ids"].shape[0]
        preds: List[str] = self.ids_to_clean_text(generated_ids)
        target: List[str] = self.ids_to_clean_text(batch["labels"])

        a = self.tokenizer.batch_decode(batch["input_ids"].tolist())
        b = self.tokenizer.batch_decode(batch["labels"].tolist())
        c = self.tokenizer.batch_decode(generated_ids)
        pad_token_id = self.tokenizer.pad_token_id
        tgt_ids = batch["labels"]
        if isinstance(self.model, T5ForConditionalGeneration):
            decoder_input_ids = self.model._shift_right(tgt_ids)
        else:
            decoder_input_ids = shift_tokens_right(tgt_ids, pad_token_id)
        e = self.tokenizer.batch_decode(decoder_input_ids.tolist())

        loss_tensors = self._step(batch)
        base_metrics = {name: loss for name, loss in zip(self.loss_names, loss_tensors)}
        rouge: Dict = self.calc_generative_metrics(preds, target)
        summ_len = np.mean(lmap(len, generated_ids))
        base_metrics.update(gen_time=gen_time, gen_len=summ_len, preds=preds, target=target, a=a, b=b, c=c, e=e, **rouge)
        return base_metrics

_step method:

    def _step(self, batch: dict) -> Tuple:
        pad_token_id = self.tokenizer.pad_token_id
        src_ids, src_mask = batch["input_ids"], batch["attention_mask"]
        tgt_ids = batch["labels"]
        if isinstance(self.model, T5ForConditionalGeneration):
            decoder_input_ids = self.model._shift_right(tgt_ids)
        else:
            decoder_input_ids = shift_tokens_right(tgt_ids, pad_token_id)
        if not self.already_saved_batch:  # This would be slightly better if it only happened on rank zero
            batch["decoder_input_ids"] = decoder_input_ids
            self.save_readable_batch(batch)

        outputs = self(src_ids, attention_mask=src_mask, decoder_input_ids=decoder_input_ids, use_cache=False)
        lm_logits = outputs[0]
        if self.hparams.label_smoothing == 0:
            # Same behavior as modeling_bart.py, besides ignoring pad_token_id
            ce_loss_fct = torch.nn.CrossEntropyLoss(ignore_index=pad_token_id)

            assert lm_logits.shape[-1] == self.vocab_size
            loss = ce_loss_fct(lm_logits.view(-1, lm_logits.shape[-1]), tgt_ids.view(-1))
        else:
            lprobs = torch.nn.functional.log_softmax(lm_logits, dim=-1)
            loss, nll_loss = label_smoothed_nll_loss(
                lprobs, tgt_ids, self.hparams.label_smoothing, ignore_index=pad_token_id
            )
        return (loss,)

This is my validation_epoch_end:

    def validation_epoch_end(self, outputs, prefix="val") -> Dict:
        self.step_count += 1
        losses = {k: torch.stack([x[k] for x in outputs]).mean() for k in self.loss_names}
        loss = losses["loss"]
        generative_metrics = {
            k: np.array([x[k] for x in outputs]).mean() for k in self.metric_names + ["gen_time", "gen_len"]
        }
        metric_val = (
            generative_metrics[self.val_metric] if self.val_metric in generative_metrics else losses[self.val_metric]
        )
        metric_tensor: torch.FloatTensor = torch.tensor(metric_val).type_as(loss)
        generative_metrics.update({k: v.item() for k, v in losses.items()})
        losses.update(generative_metrics)
        all_metrics = {f"{prefix}_avg_{k}": x for k, x in losses.items()}
        all_metrics["step_count"] = self.step_count
        self.metrics[prefix].append(all_metrics)  # callback writes this to self.metrics_save_path
        preds = flatten_list([x["preds"] for x in outputs])

        val_outputs_folder = "val_outputs"
        os.system("mkdir -p " + os.path.join(self.hparams.output_dir, val_outputs_folder))

        if "preds" in outputs[0]:
            tb_all = {}
            idx_tb = 0
            for output_batch in outputs:
                a, b, c, e = output_batch["a"], output_batch["b"], output_batch["c"], output_batch["e"]

                for aa, bb, ee, cc in zip(a, b, e, c):
                    tb_all[idx_tb] = {}
                    tb_all[idx_tb]['input_ids'] = aa
                    tb_all[idx_tb]['labels'] = bb
                    tb_all[idx_tb]['decoder_input_ids'] = ee
                    tb_all[idx_tb]['generated_ids'] = cc
                    idx_tb += 1

            file_debug = os.path.join(self.hparams.output_dir, val_outputs_folder,
                                      "debug_" +
                                      str(self.step_count) + ".json")
            save_json(tb_all, file_debug)

        return {
            "log": all_metrics,
            "preds": preds,
            f"{prefix}_loss": loss,
            f"{prefix}_{self.val_metric}": metric_tensor,
        }

So I use the debug_k.json file to check the outputs. Sorry for the variable names.

One example for bart-base:

    "1366": {
        "input_ids": "<s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad>",
        "labels": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "</s><s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s><s> Russian troops withdrawing from 3 Baltic Sea countries are reported to have respectively been stationed in the Baltic Sea states of Jalininggele,Simolingsike and Yelinia 300 kilometers away from Moscow.</s>"
    },

one example for bart-large:

    "1366": {
        "input_ids": "<s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad>",
        "labels": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "</s><s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s><s></s>"
    },

@sshleifer I have changed the code in version 3.3.1 so that the model gets the same processed decoder input as in transformers version 2.11.0, and it worked for both BART models! Both facebook/bart-base and facebook/bart-large now give good BLEU scores and generate good outputs!

The changed code:

    def _step(self, batch: dict) -> Tuple:
        pad_token_id = self.tokenizer.pad_token_id
        src_ids, src_mask = batch["input_ids"], batch["attention_mask"]
        if isinstance(self.model, T5ForConditionalGeneration):
            tgt_ids = batch["labels"]
            decoder_input_ids = self.model._shift_right(tgt_ids)
        else:
            #decoder_input_ids = shift_tokens_right(tgt_ids, pad_token_id)
            y = batch["labels"]
            decoder_input_ids = y[:, :-1].contiguous()
            tgt_ids = y[:, 1:].clone()
        if not self.already_saved_batch:  # This would be slightly better if it only happened on rank zero
            batch["decoder_input_ids"] = decoder_input_ids
            self.save_readable_batch(batch)

        outputs = self(src_ids, attention_mask=src_mask, decoder_input_ids=decoder_input_ids, use_cache=False)
        lm_logits = outputs[0]
        if self.hparams.label_smoothing == 0:
            # Same behavior as modeling_bart.py, besides ignoring pad_token_id
            ce_loss_fct = torch.nn.CrossEntropyLoss(ignore_index=pad_token_id)

            assert lm_logits.shape[-1] == self.vocab_size
            loss = ce_loss_fct(lm_logits.view(-1, lm_logits.shape[-1]), tgt_ids.view(-1))
        else:
            lprobs = torch.nn.functional.log_softmax(lm_logits, dim=-1)
            loss, nll_loss = label_smoothed_nll_loss(
                lprobs, tgt_ids, self.hparams.label_smoothing, ignore_index=pad_token_id
            )
        return (loss,)

an example generated by facebook/bart-base using the new code:

    "1366": {
        "input_ids": "<s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad>",
        "labels": " It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s> Russian troops withdrawing from 3 Baltic Sea countries have been reported to be stationed respectively in Jalininggele, Simolingsike and Yelinia 300 kilometers (200 miles) from Moscow.</s><pad><pad>"
    },

an example generated by facebook/bart-large using the new code:

    "1366": {
        "input_ids": "<s> ( report :ARG1 ( station :ARG1 ( troop :mod ( country :wiki Russia :name ( name :op1 Russia ) ) :ARG0-of ( withdraw :ARG2 ( country :quant 3 :location ( sea :wiki Baltic_Sea :name ( name :op1 Baltic :op2 Sea ) ) ) ) ) :ARG2 ( and :op1 ( state :wiki - :name ( name :op1 Jalininggele ) :location country ) :op2 ( state :wiki - :name ( name :op1 Simolingsike ) ) :op3 ( city :wiki - :name ( name :op1 Yelinia ) :location ( relative-position :op1 ( city :wiki Moscow :name ( name :op1 Moscow ) ) :quant ( distance-quantity :quant 300 :unit ( kilometer ) ) ) ) ) :mod ( respective ) ) )</s><pad><pad><pad>",
        "labels": " It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.</s>",
        "decoder_input_ids": "<s> It is reported that the Russian troops that withdrew from the three Baltic Sea countries will be stationed respectively in the Russian state of Jalininggele, the state of Simolingsike and Yelinia city which is 300 kilometers away from Moscow.",
        "generated_ids": "</s> The Russian troop stations were respectively located in Jalininggele, Simolingsike and Yelinia located 300 kilometers (250 miles) away from Moscow in 3 countries on the Baltic Sea.</s><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad><pad>"
    },

What I don't understand is why the previous approach only works for bart-base in my experiments. Another question is what the correct/better way to use the model is: shift_tokens_right or another approach?

Interesting.
shift_tokens_right has always done better on my datasets, but it's interesting that you have the opposite experience. The old code tgt_ids = y[:, 1:].clone() doesn't work well for tokenizers (Marian, Pegasus, T5) that don't add a <s> token to the beginning of the sequence, because it deletes a token.
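A toy sketch of the two shifting strategies, with made-up token ids (plain lists instead of tensors; this mirrors the pattern discussed above, not the exact library code):

```python
# Illustrative ids: 0 = <s>, 2 = </s>; the rest are word tokens.

def slice_shift(labels):
    """Old finetune.py behavior: inputs = labels[:-1], targets = labels[1:]."""
    return labels[:-1], labels[1:]

def shift_right(labels, start_token):
    """shift_tokens_right-style: prepend a start token, keep full labels as targets."""
    return [start_token] + labels[:-1], labels

bart_labels = [0, 100, 101, 2]  # BART's tokenizer adds <s> ... </s>
t5_labels = [100, 101, 2]       # T5/Marian/Pegasus-style: no leading <s>

# With BART labels, slicing merely consumes the <s> the tokenizer added:
print(slice_shift(bart_labels))  # ([0, 100, 101], [100, 101, 2])
# Without a leading <s>, slicing deletes the first content token (100 is
# never a prediction target):
print(slice_shift(t5_labels))    # ([100, 101], [101, 2])
```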

If you can replicate the results on a small/shareable dataset I would be happy to try to understand what's going on more deeply.

I can see a change in the behavior of bart-large between v3.0.2 and v3.1.0 that seems to be linked to your findings. Here's a minimal example for language generation:

import transformers

from transformers import (
    BartTokenizer,
    BartForConditionalGeneration,
)

print(f'** transformers v{transformers.__version__} **')

tokenizer = BartTokenizer.from_pretrained('facebook/bart-large')
model = BartForConditionalGeneration.from_pretrained('facebook/bart-large')

input_txt = 'This is <mask> sentence.'
print(f'Input: "{input_txt}"')

inputs = tokenizer.encode(input_txt, return_tensors='pt')
outputs = model.generate(inputs)
output_txt = tokenizer.decode(outputs[0], skip_special_tokens=True)

print(f'Output: "{output_txt}"')

For v3.0.2, it correctly produces

** transformers v3.0.2 **
Some weights of BartForConditionalGeneration were not initialized from the model checkpoint at facebook/bart-large and are newly initialized: ['final_logits_bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Input: "This is <mask> sentence."
Output: "This is a partial sentence."

while v3.1.0 repeats the first token:

** transformers v3.1.0 **
Input: "This is <mask> sentence."
Output: "ThisThis is a sentence."

Digging a bit deeper, I can trace the issue back to this line https://github.com/huggingface/transformers/blob/4b3ee9cbc53c6cf6cee6bfae86cc2c6ec0778ee5/src/transformers/modeling_bart.py#L1114
and, in turn, the default value of force_bos_token_to_be_generated:
https://github.com/huggingface/transformers/blob/4b3ee9cbc53c6cf6cee6bfae86cc2c6ec0778ee5/src/transformers/configuration_bart.py#L140

To restore the v3.0.2 behavior, we can change that value manually:

...
config = BartConfig.from_pretrained('facebook/bart-large')
config.force_bos_token_to_be_generated = True

tokenizer = BartTokenizer.from_pretrained('facebook/bart-large')
model = BartForConditionalGeneration.from_pretrained('facebook/bart-large', config=config)
...

which gives

** transformers v3.1.0 **
Input: "This is <mask> sentence."
Output: "This is a partial sentence."

and even

** transformers v3.4.0 **
Input: "This is <mask> sentence."
Output: "This is a partial sentence."

@sshleifer What's the best approach to fix this? Modify bart-large's config.json?

Your solution is awesome, great catch!

I think the right fix is to

  • update the docs
  • add task_specific_params: {"fill_mask": {"force_bos_token_to_be_generated": true}} to the bart-base and bart-large configs.

I am hesitant to change the default because force_bos_token_to_be_generated = False seems to be optimal for many fine-tuning tasks.
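For illustration, such a fill_mask entry could be applied at runtime the same way the existing summarization params are. This small helper is hypothetical, sketching the pattern rather than the library's actual function:

```python
from types import SimpleNamespace

def apply_task_params(config, task):
    """Copy the overrides for `task` from task_specific_params onto the config."""
    overrides = (getattr(config, "task_specific_params", None) or {}).get(task, {})
    for key, value in overrides.items():
        setattr(config, key, value)
    return config

# SimpleNamespace stands in for a real BartConfig here.
config = SimpleNamespace(
    force_bos_token_to_be_generated=False,
    task_specific_params={"fill_mask": {"force_bos_token_to_be_generated": True}},
)
apply_task_params(config, "fill_mask")
print(config.force_bos_token_to_be_generated)  # True
```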

Added a mask filling example to the docs in #8421 .

:+1: Brilliant, thanks a lot @sshleifer !
