I'm trying to follow the links from posts I've already scraped so that I can save their text. I'm partway there; I just need to tweak a few things, which is why I'm asking here. Instead of getting distinct posts, I'm getting the same post repeated. Not only that, the results come wrapped in brackets, like this:
[[<div class="article-body" id="image-description"><p>Kanye West premiered
the music video for "Famous" off his "The Life of Pablo" album to a
sold out audience in Los Angeles. The video features nude versions of George W. Bush.
Donald Trump. Anna Wintour. Rihanna. Chris Brown. Taylor Swift.
Kanye West. Kim Kardashian. Ray J. Amber Rose. Caitlyn Jenner.
Bill Cosby (in that order).</p></div>],
Here is my code:
def sprinkle():
    url_two = 'http://www.example.com'
    html = requests.get(url_two, headers=headers)
    soup = BeautifulSoup(html.text, 'html5lib')
    titles = soup.find_all('div', {'class': 'entry-pos-1'})

    def make_soup(url):
        the_comments_page = requests.get(url, headers=headers)
        soupdata = BeautifulSoup(the_comments_page.text, 'html5lib')
        comment = soupdata.find_all('div', {'class': 'article-body'})
        return comment

    comment_links = [url_two + link.a.get('href') for link in titles]
    soup = [make_soup(comments) for comments in comment_links]
    # soup = make_soup(comments)
    # print(soup)

    entries = [{'href': url_two + div.a.get('href'),
                'src': url_two + div.a.img.get('data-original'),
                'text': div.find('p', 'entry-title').text,
                'comments': soup
                } for div in titles][:6]
    return entries
I feel like I'm very close. This is all new to me. Any help would be great.
Best answer
I figured it out. The problem was that I built one list of comments over all the links and then attached that entire list to every entry; calling make_soup() once per post inside the comprehension fixes it:
def sprinkle():
    url_two = 'http://www.vladtv.com'
    html = requests.get(url_two, headers=headers)
    soup = BeautifulSoup(html.text, 'html5lib')
    titles = soup.find_all('div', {'class': 'entry-pos-1'})

    def make_soup(url):
        the_comments_page = requests.get(url, headers=headers)
        soupdata = BeautifulSoup(the_comments_page.text, 'html5lib')
        comment = soupdata.find('div', {'class': 'article-body'})
        para = comment.find_all('p')
        return para

    entries = [{'href': url_two + div.a.get('href'),
                'src': url_two + div.a.img.get('data-original'),
                'text': div.find('p', 'entry-title').text,
                'comments': make_soup(url_two + div.a.get('href'))
                } for div in titles][:6]
    return entries
I'm still trying to remove the brackets from the results.
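The brackets appear because find_all('p') returns a ResultSet (a list of Tag objects), and printing a list shows it wrapped in [ ... ]. One way to get plain text instead is to join each paragraph's .text. A minimal sketch, using a static HTML snippet as a stand-in for the real article body:

```python
from bs4 import BeautifulSoup

# Stand-in for the fetched article body; the real code parses a live page.
html = '''
<div class="article-body" id="image-description">
  <p>First paragraph.</p>
  <p>Second paragraph.</p>
</div>
'''

soup = BeautifulSoup(html, 'html.parser')
body = soup.find('div', {'class': 'article-body'})

para = body.find_all('p')          # a list of Tag objects
print(para)                        # prints with [ ... ] brackets

comments = ' '.join(p.text for p in para)  # plain string, no brackets
print(comments)                    # First paragraph. Second paragraph.
```

So in make_soup(), returning `' '.join(p.text for p in comment.find_all('p'))` instead of the raw list would store bracket-free text in each entry.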
Regarding "python - How do I follow a link of a specific post and scrape data from that post", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/38025466/