python - requests stop when several requests are sent in a row

标签 python scrapy python-requests

I am parsing an online store with scrapy, and after all the item information has been collected I make one extra request through python-requests to get the available quantity. A few minutes in, the spider simply stops working and I can't tell what is causing the trouble. Any suggestions?

Scrapy log:

2014-05-08 15:27:57+0300 [scrapy] DEBUG: Start adding sku1270594 to a cart.
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): www.sds.com.au
DEBUG:requests.packages.urllib3.connectionpool:"GET /product/trefoil-tee-by-adidas-in-black-camo-grey HTTP/1.1" 200 20223
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): www.sds.com.au
DEBUG:requests.packages.urllib3.connectionpool:"POST /common/ajaxResponse.jsp;jsessionid=34E95C7662D0F5084FF971CC5693E6E8.store-node1?_DARGS=/browse/product.jsp.addToCartForm HTTP/1.1" 200 146
2014-05-08 15:27:59+0300 [scrapy] DEBUG: End adding sku1270594 to a cart.
2014-05-08 15:27:59+0300 [scrapy] DEBUG: Success. quantity of sku1270594 is 16.
2014-05-08 15:28:00+0300 [sds] DEBUG: Updating  product info sku1270594
2014-05-08 15:28:00+0300 [sds] DEBUG: Added new price sku1270594
2014-05-08 15:28:00+0300 [sds] DEBUG: Scraped from <200 http://www.sds.com.au/product/trefoil-tee-by-adidas-in-black-camo-grey>
2014-05-08 15:28:00+0300 [sds] DEBUG: Updating  product info sku901159
2014-05-08 15:28:00+0300 [sds] DEBUG: Added new price sku901159
2014-05-08 15:28:00+0300 [sds] DEBUG: Scraped from <200 http://www.sds.com.au/product/two-palm-tee-by-folke-in-chalk>
2014-05-08 15:28:00+0300 [sds] DEBUG: Updating  product info sku901163
2014-05-08 15:28:00+0300 [sds] DEBUG: Added new price sku901163
2014-05-08 15:28:00+0300 [sds] DEBUG: Scraped from <200 http://www.sds.com.au/product/two-palm-tee-by-folke-in-chalk>
2014-05-08 15:28:00+0300 [scrapy] DEBUG: Start adding sku1270591 to a cart.
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): www.sds.com.au
DEBUG:requests.packages.urllib3.connectionpool:"GET /product/trefoil-tee-by-adidas-in-black-camo-grey HTTP/1.1" 200 20225
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): www.sds.com.au

And that's it, nothing else happens in the console. Here is the function that fetches the quantity:

def get_qty(self, item):
    # assumed module-level imports: re, requests, scrapy's log, scrapy.selector.Selector, USER_AGENT
    r = requests.get(item['url'])
    cookie_cart_user = dict(r.cookies)
    sel = Selector(text=r.text, type="html")
    session = sel.xpath('//input[@name="_dynSessConf"]/@value').extract()[0]
    # print session
    # print cookie_cart_user
    add_to_cart_url = 'http://www.sds.com.au/common/ajaxResponse.jsp;jsessionid=%s?_DARGS=/browse/product.jsp.addToCartForm' % cookie_cart_user['JSESSIONID']
    # ok, so we're adding one item
    log.msg("Adding %s to a cart." % item['internal_id'], log.DEBUG)
    headers = {
        'User-Agent': USER_AGENT,
        'Accept': 'application/json, text/javascript, */*; q=0.01',
        'Connection': 'close',
    }
    s = requests.session()
    s.keep_alive = False
    r = requests.post(add_to_cart_url,
                      data=self.generate_form_data(item, 10000, session),
                      cookies=cookie_cart_user,
                      headers=headers,
                      timeout=10)
    response = r.json()
    r.close()
    try:
        quantity = int(re.findall(r'\d+', response['formErrors'][0]['errorMessage'])[0])
        log.msg("Success. quantity of %s is %s." % (item['internal_id'], quantity), log.DEBUG)
        return quantity
    except Exception as e:
        log.msg('Error getting data-cart-item on product %s. Error: %s' % (item['internal_id'], str(e)), log.ERROR)
        with open("log/%s.html" % item['internal_id'], "w") as myfile:
            myfile.write('%s' % r.text.encode('utf-8'))
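
A side note on the snippet above: only the POST has a timeout; the initial requests.get(item['url']) is issued without one, so a single stalled connection can block get_qty (and with it the whole spider) indefinitely. Below is a minimal sketch of a bounded GET, assuming the same imports as the function above (the helper name and the 10-second default are made up for illustration):

def fetch_with_timeout(url, timeout=10):
    # hypothetical helper: a dead connection now raises a Timeout instead of hanging forever
    try:
        return requests.get(url, timeout=timeout)
    except requests.exceptions.Timeout:
        log.msg('Timed out fetching %s' % url, log.ERROR)
        return None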

Best answer

Well, Jan Vlcinsky suggested logging the requests in more detail, and after some digging I decided to reorganize my code a little. That gave me the right answer, and everything works fine now.

def get_qty(self, item):
    # assumed module-level imports: logging, re, requests, requests.adapters.HTTPAdapter,
    # plus scrapy's log and scrapy.selector.Selector
    log.msg("Start adding %s to a cart." % item['internal_id'], log.DEBUG)
    logging.basicConfig(level=logging.DEBUG)  # enables the urllib3 connection log lines shown below
    sess = requests.Session()
    sess.keep_alive = False
    adapter = HTTPAdapter(max_retries=50)
    sess.mount('http://', adapter)
    r = sess.get(item['url'])
    cookie_cart_user = dict(r.cookies)
    sel = Selector(text=r.text, type="html")
    session = sel.xpath('//input[@name="_dynSessConf"]/@value').extract()[0]
    add_to_cart_url = 'http://www.sds.com.au/common/ajaxResponse.jsp;jsessionid=%s?_DARGS=/browse/product.jsp.addToCartForm' % cookie_cart_user['JSESSIONID']
    headers = {
        'User-Agent': USER_AGENT,
        'Accept': 'application/json, text/javascript, */*; q=0.01',
    }
    r = sess.post(add_to_cart_url,
                  data=self.generate_form_data(item, 10000, session),
                  cookies=cookie_cart_user,
                  headers=headers)
    log.msg("End adding %s to a cart." % item['internal_id'], log.DEBUG)
    try:
        response = r.json()
        r.close()
        quantity = int(re.findall(r'\d+', response['formErrors'][0]['errorMessage'])[0])
        log.msg("Success. quantity of %s is %s." % (item['internal_id'], quantity), log.DEBUG)
        return quantity
    except Exception as e:
        log.msg('Error getting data-cart-item on product %s. Error: %s' % (item['internal_id'], str(e)), log.ERROR)
        with open("log/%s.html" % item['internal_id'], "w") as myfile:
            myfile.write('%s' % r.text.encode('utf-8'))

Now, if an error occurs, the log shows:

2014-05-08 16:00:10+0300 [scrapy] DEBUG: Start adding sku1210352 to a cart.
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): www.sds.com.au
WARNING:requests.packages.urllib3.connectionpool:Retrying (50 attempts remain) after connection broken by 'error(60, 'Operation timed out')': /product/startlet-gilet-fleece-jacket-by-zoo-york-in-black
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (2): www.sds.com.au
DEBUG:requests.packages.urllib3.connectionpool:"GET /product/startlet-gilet-fleece-jacket-by-zoo-york-in-black HTTP/1.1" 200 20278
DEBUG:requests.packages.urllib3.connectionpool:"POST /common/ajaxResponse.jsp;jsessionid=EEA02CE768B288DD302896F6A8C4780F.store-node2?_DARGS=/browse/product.jsp.addToCartForm HTTP/1.1" 200 145
2014-05-08 16:01:14+0300 [scrapy] DEBUG: End adding sku1210352 to a cart.

After that it reconnects and then carries on as if nothing had happened.
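
If you would rather retry with a back-off instead of up to 50 immediate attempts, HTTPAdapter also accepts a urllib3 Retry object in place of a plain integer. A small sketch, assuming a requests/urllib3 combination recent enough to expose Retry (the concrete numbers are only examples):

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry

sess = requests.Session()
retries = Retry(total=5,                              # give up after 5 attempts
                backoff_factor=0.5,                   # exponential back-off between attempts
                status_forcelist=[500, 502, 503, 504])
sess.mount('http://', HTTPAdapter(max_retries=retries))
url = 'http://www.sds.com.au/product/trefoil-tee-by-adidas-in-black-camo-grey'
r = sess.get(url, timeout=10)                         # still keep an explicit timeout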

Regarding python - requests stop when several requests are sent in a row, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/23501545/
