python - Scraping wsj.com

Tags: python, json, beautifulsoup

I want to scrape some data from wsj.com and print it out. The actual URL is https://www.wsj.com/market-data/stocks?mod=md_home_overview_stk_main. The data I need is the number of NYSE issues advancing and declining, as well as the NYSE share volume advancing and declining.

I tried using BeautifulSoup after watching a YouTube video, but I can't get any class inside the body to return a value.

Here is my code:

from bs4 import BeautifulSoup
import requests


source = requests.get('https://www.wsj.com/market-data/stocks?mod=md_home_overview_stk_main').text

soup = BeautifulSoup(source, 'lxml')

body = soup.find('body')

adv = body.find('td', class_='WSJTables--table__cell--2dzGiO7q WSJTheme--table__cell--1At-VGNg ')


print(adv)

Also, while inspecting the element in the Network tab, I noticed that this data is also available as JSON.

Here is the link: https://www.wsj.com/market-data/stocks?id=%7B%22application%22%3A%22WSJ%22%2C%22marketsDiaryType%22%3A%22overview%22%7D&type=mdc_marketsdiary

So I wrote another script to try to parse this data as JSON, but again it doesn't work.

Here is the code:

import json
import requests

url = 'https://www.wsj.com/market-data/stocks?id=%7B%22application%22%3A%22WSJ%22%2C%22marketsDiaryType%22%3A%22overview%22%7D&type=mdc_marketsdiary'

response = json.loads(requests.get(url).text)

print(response)

The error I get is:

 File "C:\Users\User\Anaconda3\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None

JSONDecodeError: Expecting value

I also tried several different approaches from this link, but none of them seem to work.

Could you point me to the right way to scrape this data?

Best Answer

import requests
import json

# Pass the query string as a params dict and let requests handle the
# URL encoding of the JSON-valued "id" parameter.
params = {
    'id': '{"application":"WSJ","marketsDiaryType":"overview"}',
    'type': 'mdc_marketsdiary'
}

# A browser-like User-Agent header helps avoid the site returning a
# non-JSON (blocked) response, which is the likely cause of the
# JSONDecodeError above.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:73.0) Gecko/20100101 Firefox/73.0"
}

r = requests.get(
    "https://www.wsj.com/market-data/stocks", params=params, headers=headers).json()

# Pretty-print the parsed response.
data = json.dumps(r, indent=4)
print(data)

Output:

{
    "id": "{\"application\":\"WSJ\",\"marketsDiaryType\":\"overview\"}",
    "type": "mdc_marketsdiary",
    "data": {
        "instrumentSets": [
            {
                "headerFields": [
                    {
                        "value": "name",
                        "label": "Issues"
                    }
                ],
                "instruments": [
                    {
                        "name": "Advancing",
                        "NASDAQ": "169",
                        "NYSE": "69"
                    },
                    {
                        "name": "Declining",
                        "NASDAQ": "3,190",
                        "NYSE": "2,973"
                    },
                    {
                        "name": "Unchanged",
                        "NASDAQ": "24",
                        "NYSE": "10"
                    },
                    {
                        "name": "Total",
                        "NASDAQ": "3,383",
                        "NYSE": "3,052"
                    }
                ]
            },
            {
                "headerFields": [
                    {
                        "value": "name",
                        "label": "Issues At"
                    }
                ],
                "instruments": [
                    {
                        "name": "New Highs",
                        "NASDAQ": "53",
                        "NYSE": "14"
                    },
                    {
                        "name": "New Lows",
                        "NASDAQ": "1,406",
                        "NYSE": "1,620"
                    }
                ]
            },
            {
                "headerFields": [
                    {
                        "value": "name",
                        "label": "Share Volume"
                    }
                ],
                "instruments": [
                    {
                        "name": "Total",
                        "NASDAQ": "4,454,691,895",
                        "NYSE": "7,790,947,818"
                    },
                    {
                        "name": "Advancing",
                        "NASDAQ": "506,192,012",
                        "NYSE": "219,412,232"
                    },
                    {
                        "name": "Declining",
                        "NASDAQ": "3,948,035,191",
                        "NYSE": "7,570,377,893"
                    },
                    {
                        "name": "Unchanged",
                        "NASDAQ": "464,692",
                        "NYSE": "1,157,693"
                    }
                ]
            }
        ],
        "timestamp": "4:00 PM EDT 3/09/20"
    },
    "hash": "{\"id\":\"{\\\"application\\\":\\\"WSJ\\\",\\\"marketsDiaryType\\\":\\\"overview\\\"}\",\"type\":\"mdc_marketsdiary\",\"data\":{\"instrumentSets\":[{\"headerFields\":[{\"value\":\"name\",\"label\":\"Issues\"}],\"instruments\":[{\"name\":\"Advancing\",\"NASDAQ\":\"169\",\"NYSE\":\"69\"},{\"name\":\"Declining\",\"NASDAQ\":\"3,190\",\"NYSE\":\"2,973\"},{\"name\":\"Unchanged\",\"NASDAQ\":\"24\",\"NYSE\":\"10\"},{\"name\":\"Total\",\"NASDAQ\":\"3,383\",\"NYSE\":\"3,052\"}]},{\"headerFields\":[{\"value\":\"name\",\"label\":\"Issues At\"}],\"instruments\":[{\"name\":\"New Highs\",\"NASDAQ\":\"53\",\"NYSE\":\"14\"},{\"name\":\"New Lows\",\"NASDAQ\":\"1,406\",\"NYSE\":\"1,620\"}]},{\"headerFields\":[{\"value\":\"name\",\"label\":\"Share Volume\"}],\"instruments\":[{\"name\":\"Total\",\"NASDAQ\":\"4,454,691,895\",\"NYSE\":\"7,790,947,818\"},{\"name\":\"Advancing\",\"NASDAQ\":\"506,192,012\",\"NYSE\":\"219,412,232\"},{\"name\":\"Declining\",\"NASDAQ\":\"3,948,035,191\",\"NYSE\":\"7,570,377,893\"},{\"name\":\"Unchanged\",\"NASDAQ\":\"464,692\",\"NYSE\":\"1,157,693\"}]}],\"timestamp\":\"4:00 PM EDT 3/09/20\"}}"
}

Note: the response is already parsed into a dict, so you can access it like one, e.g. print(r.keys()).
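If you only need the specific figures from the question (NYSE issues advancing/declining and NYSE share volume advancing/declining), you can walk the parsed dict directly. A minimal sketch, assuming the response keeps the structure shown in the output above (the key names are taken from that output):

for instrument_set in r["data"]["instrumentSets"]:
    # "Issues" holds the advancing/declining issue counts,
    # "Share Volume" holds the advancing/declining volumes.
    label = instrument_set["headerFields"][0]["label"]
    if label not in ("Issues", "Share Volume"):
        continue
    for row in instrument_set["instruments"]:
        if row["name"] in ("Advancing", "Declining"):
            print(label, row["name"], "NYSE:", row["NYSE"])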

For this question on scraping wsj.com with Python, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/60606633/
