How do you break a document (e.g., a paragraph, a book, etc.) into sentences with spaCy?
For example, "The dog ran. The cat jumped"
should become ["The dog ran", "The cat jumped"].
Best answer
The up-to-date answer looks like this:
from spacy.lang.en import English

raw_text = 'Hello, world. Here are two sentences.'
nlp = English()  # blank English pipeline; no statistical model required
nlp.add_pipe('sentencizer')  # spaCy 3.x API; on spaCy 2.x use nlp.add_pipe(nlp.create_pipe('sentencizer'))
doc = nlp(raw_text)
sentences = [sent.text.strip() for sent in doc.sents]  # Span.string is gone; use Span.text
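Applied to the example from the question, the same rule-based pipeline works without downloading any model; this is a minimal sketch assuming spaCy 3.x:

```python
from spacy.lang.en import English

nlp = English()
nlp.add_pipe('sentencizer')  # rule-based splitter keyed on sentence-final punctuation

doc = nlp('The dog ran. The cat jumped')
sentences = [sent.text.strip() for sent in doc.sents]
print(sentences)
```

Note that the sentencizer keeps the trailing period attached to its sentence, so the first element is "The dog ran." rather than "The dog ran"; strip punctuation afterwards if you need the exact output shown in the question.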
For "python - How to break up document by sentences with Spacy", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/46290313/