python - How do I authenticate with a username/password in a scrapy crawler?

Tags: python mongodb scrapy pymongo

I want to authenticate my PyMongo client with a username and password from inside a scrapy crawler. I am using the approach below, but it raises an error:

    class MongoDBPipeline(object):
        def __init__(self):
            connection = pymongo.MongoClient(settings['MONGODB_HOST'], settings['MONGODB_PORT'])
            connection.the_database.authenticate(settings['MONGODB_USERNAME'], settings['MONGODB_PASSWORD'], source='$external', mechanism='PLAIN')
            db = connection[settings['MONGODB_DATABASE']]
            self.collection = db[settings['MONGODB_COLLECTION']]

Error:

 connection.the_database.authenticate(settings['MONGODB_USERNAME'],settings['MONGODB_PASSWORD'],source='$external', mechanism='PLAIN')
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/database.py", line 988, in authenticate
    connect=True)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/mongo_client.py", line 397, in _cache_credentials
    sock_info.authenticate(credentials)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/pool.py", line 287, in authenticate
    auth.authenticate(credentials, self)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/auth.py", line 407, in authenticate
    auth_func(credentials, sock_info)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/auth.py", line 329, in _authenticate_plain
    sock_info.command(source, cmd)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/pool.py", line 184, in command
    codec_options, check, allowable_errors)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/network.py", line 54, in command
    helpers._check_command_response(response_doc, msg, allowable_errors)
  File "/home/nikhil/.local/lib/python2.7/site-packages/pymongo/helpers.py", line 188, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
pymongo.errors.OperationFailure: command SON([('saslStart', 1), ('mechanism', 'PLAIN'), ('payload', Binary('\x00nikhil\x00password', 0)), ('autoAuthorize', 1)]) on namespace $external.$cmd failed: no such cmd: saslStart

I want to authenticate so that my code can write to MongoDB.

connection = pymongo.MongoClient("mongodb://localhost")

Where do the username and password go?

Best Answer

scrapy-mongodb would be the easy option.
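A rough sketch of what that configuration could look like in a project's settings.py. The setting names below (MONGODB_URI and friends) are assumptions based on the scrapy-mongodb README, not verified here, and the credentials are placeholders:

```python
# settings.py -- hypothetical scrapy-mongodb configuration.
# The pipeline path and setting names are assumptions from the library's docs.
ITEM_PIPELINES = {'scrapy_mongodb.MongoDBPipeline': 300}

# Credentials can go straight into the connection URI:
MONGODB_URI = 'mongodb://user:password@localhost:27017'
MONGODB_DATABASE = 'scrapy'
MONGODB_COLLECTION = 'items'
```

With that in place, the library's pipeline handles the connection and inserts, so the custom MongoDBPipeline from the question is no longer needed.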

In the general case, though, you should refer to the authentication examples in the PyMongo documentation.
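As a side note on the traceback: a `no such cmd: saslStart` failure typically means the server does not accept the SASL `PLAIN` mechanism the code requested; `mechanism='PLAIN'` with `source='$external'` is the LDAP proxy-authentication path, which needs server-side support (MongoDB Enterprise). For ordinary username/password authentication, the credentials go into the connection URI instead. A minimal sketch, assuming the same settings keys as in the question (all values here are placeholders):

```python
# Assumed settings values -- the question reads these from scrapy's settings dict.
settings = {
    'MONGODB_USERNAME': 'nikhil',
    'MONGODB_PASSWORD': 'password',
    'MONGODB_HOST': 'localhost',
    'MONGODB_PORT': 27017,
    'MONGODB_DATABASE': 'items_db',
    'MONGODB_COLLECTION': 'items',
}

# Credentials embedded in a standard MongoDB URI; pymongo authenticates
# against the 'admin' database by default (override with ?authSource=...).
uri = "mongodb://%s:%s@%s:%d" % (
    settings['MONGODB_USERNAME'],
    settings['MONGODB_PASSWORD'],
    settings['MONGODB_HOST'],
    settings['MONGODB_PORT'],
)
print(uri)  # -> mongodb://nikhil:password@localhost:27017

# With a running mongod, the pipeline would then do:
# connection = pymongo.MongoClient(uri)
# collection = connection[settings['MONGODB_DATABASE']][settings['MONGODB_COLLECTION']]
```

No explicit `authenticate()` call is needed with this form; the driver authenticates when the client connects.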

-- Quoted from the docs --

Special characters in usernames and passwords

If your username or password contains special characters (e.g. '/', '@', or other URI-reserved characters), you must %xx-escape them for use in a MongoDB URI; PyMongo decodes them with unquote_plus(). For example:

MongoClient('mongodb://user:' + password + '@127.0.0.1')
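The escaping step above can be done with quote_plus() from the standard library, as in this sketch (the username and password are made-up examples):

```python
from urllib.parse import quote_plus  # on Python 2.7: from urllib import quote_plus

user = "nikhil"
password = "p@ss/word"  # contains '@' and '/', which would break a raw URI

# Percent-escape both credentials before embedding them in the URI.
uri = "mongodb://%s:%s@127.0.0.1" % (quote_plus(user), quote_plus(password))
print(uri)  # -> mongodb://nikhil:p%40ss%2Fword@127.0.0.1

# client = pymongo.MongoClient(uri)  # requires a running mongod
```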

Regarding "python - How do I authenticate with a username/password in a scrapy crawler?", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/32251000/
