In the Transformers library, what is the maximum input length in words and/or sentences for the Pegasus model? I read in the Pegasus research paper that the maximum is 512 tokens, but how many words and/or sentences is that? Also, can the maximum of 512 tokens be increased?
Best Answer
In the Transformers library, what is the maximum input length of words and/or sentences of the Pegasus model?
It actually depends on the pretraining. You can create a Pegasus model that supports a length of 100 tokens or 10,000 tokens. For example, the model google/pegasus-cnn_dailymail supports 1024 tokens, while google/pegasus-xsum supports 512:
from transformers import PegasusTokenizerFast

# Tokenizers of two Pegasus checkpoints pretrained with different input lengths
t = PegasusTokenizerFast.from_pretrained("google/pegasus-xsum")
t2 = PegasusTokenizerFast.from_pretrained("google/pegasus-cnn_dailymail")
# Maximum number of tokens for a single input sequence
print(t.max_len_single_sentence)
print(t2.max_len_single_sentence)
Output:
511
1023
The numbers are reduced by one because of the special token that is added to each sequence.
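The arithmetic behind those two numbers can be sketched without downloading anything; the values below are the ones reported above, and the single special token is Pegasus's end-of-sequence marker:

```python
# Relationship between the model limit and max_len_single_sentence
model_max_length = 512     # encoder limit of google/pegasus-xsum
num_special_tokens = 1     # Pegasus appends one EOS (</s>) token per sequence
max_len_single_sentence = model_max_length - num_special_tokens
print(max_len_single_sentence)  # 511
```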
I read in the Pegasus research paper that the max was 512 tokens, but how many words and/or sentences is that?
That depends on your vocabulary.
from transformers import PegasusTokenizerFast

t = PegasusTokenizerFast.from_pretrained("google/pegasus-xsum")
print(t.tokenize('This is a test sentence'))
# len(tokenizer) is the size of its vocabulary
print("I know {} tokens".format(len(t)))
Output:
['▁This', '▁is', '▁a', '▁test', '▁sentence']
I know 96103 tokens
A word can be a single token, or it can be split into several tokens:
print(t.tokenize('neuropsychiatric conditions'))
Output:
['▁neuro', 'psych', 'i', 'atric', '▁conditions']
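Because a single word may expand into several tokens, a word count is only a rough guide to the 512-token limit. For documents that exceed it, a common workaround is to split the token sequence into windows that each fit the encoder. A minimal sketch; `chunk_tokens` is a hypothetical helper, not part of the Transformers API:

```python
def chunk_tokens(tokens, max_len=511):
    """Split a token list into consecutive chunks of at most max_len tokens,
    leaving room for the EOS token the tokenizer adds to each chunk."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]

# With a real tokenizer you would pass t.tokenize(long_text) here;
# a list of 1200 dummy tokens stands in for a long document.
chunks = chunk_tokens(list(range(1200)), max_len=511)
print([len(c) for c in chunks])  # [511, 511, 178]
```

Each chunk can then be summarized separately, at the cost of losing cross-chunk context.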
Also, can you increase the maximum number of 512 tokens?
Yes, you can train a model with the Pegasus architecture for a different input length, but that is expensive.
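If you do pretrain or fine-tune for a longer context, the length is set in the model configuration. A sketch, assuming you build a new model from scratch; this only configures the architecture, and training the randomly initialized weights is where the real cost lies:

```python
from transformers import PegasusConfig

# A Pegasus architecture configured for a 2048-token input.
# The weights still have to be (pre)trained on inputs of that length.
config = PegasusConfig(max_position_embeddings=2048)
print(config.max_position_embeddings)  # 2048
```

Pegasus uses fixed sinusoidal position embeddings, so no positional parameters are learned; the cost of a longer context comes from training the attention and feed-forward weights on long inputs.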
A similar question about the maximum input length of words/sentences for the Pegasus model in the Transformers library can be found on Stack Overflow: https://stackoverflow.com/questions/66703229/