windows - Jupyter Notebook error when using the PySpark kernel: the code failed because of a fatal error: Error sending http request

Tags: windows apache-spark pyspark jupyter-notebook anaconda

I am using the PySpark kernel with Jupyter Notebook. I have successfully selected the PySpark kernel, but I keep getting the following error:

The code failed because of a fatal error: Error sending http request and maximum retry encountered.. Some things to try:

a) Make sure Spark has enough available resources for Jupyter to create a Spark context.

b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.

c) Restart the kernel.
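Suggestion (b) refers to the sparkmagic library that backs the PySpark kernel. As a rough pointer, assuming a default setup where sparkmagic reads its settings from ~/.sparkmagic/config.json and kernel_python_credentials.url names the Livy endpoint it calls, a minimal sketch for checking which URL the kernel is configured to use:

import json
from pathlib import Path

# Assumed default location of the sparkmagic user configuration
config_path = Path.home() / ".sparkmagic" / "config.json"
if config_path.exists():
    config = json.loads(config_path.read_text())
    # kernel_python_credentials.url is the Livy endpoint the PySpark kernel talks to
    print(config.get("kernel_python_credentials", {}).get("url"))
else:
    print("No sparkmagic config found at", config_path)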



Here is the log as well:
2019-10-10 13:37:43,741 DEBUG   SparkMagics Initialized spark magics.
2019-10-10 13:37:43,742 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookLoaded,Timestamp: 2019-10-10 10:37:43.742475
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Loaded magics.
2019-10-10 13:37:43,744 DEBUG   python_jupyter_kernel   Changed language.
2019-10-10 13:37:44,356 DEBUG   python_jupyter_kernel   Registered auto viz.
2019-10-10 13:37:45,440 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationStart,Timestamp: 2019-10-10 10:37:45.440323,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark
2019-10-10 13:37:49,591 ERROR   ReliableHttpClient  Request to 'http://localhost:8998/sessions' failed with 'HTTPConnectionPool(host='localhost', port=8998): Max retries exceeded with url: /sessions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000013184159808>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))'
2019-10-10 13:37:49,591 INFO    EventsHandler   InstanceId: 32a21583-6879-4ad5-88bf-e07af0b09387,EventName: notebookSessionCreationEnd,Timestamp: 2019-10-10 10:37:49.591650,SessionGuid: d230b1f3-6bb1-4a66-bde1-7a73a14d7939,LivyKind: pyspark,SessionId: -1,Status: not_started,Success: False,ExceptionType: HttpClientException,ExceptionMessage: Error sending http request and maximum retry encountered.
2019-10-10 13:37:49,591 ERROR   SparkMagics Error creating session: Error sending http request and maximum retry encountered.
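The failing request in the log goes to a Livy server at http://localhost:8998, and WinError 10061 means nothing is accepting connections on that port. A minimal sketch, assuming that endpoint, to confirm whether Livy is reachable:

import requests

# GET /sessions is part of Livy's REST API; a connection error here means
# no Livy server is listening on localhost:8998 (the endpoint from the log above)
try:
    response = requests.get("http://localhost:8998/sessions", timeout=5)
    print("Livy reachable, status:", response.status_code)
except requests.exceptions.ConnectionError as err:
    print("Livy is not reachable:", err)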

Note that I am trying to set this up on Windows.
Many thanks.

Best Answer

I ran into the same problem. You can work around it by not using the PySpark kernel (notebook) and instead using the Python 3 kernel (notebook). I use the following code to set up the Spark session:

import findspark
findspark.init()  # locate the local Spark installation before importing pyspark

import pyspark
from pyspark.sql import SparkSession

# Creating the session may take a while when running locally
spark = SparkSession.builder.appName("test").getOrCreate()
spark
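As a quick sanity check, assuming the session above was created successfully, a small job confirms it is usable:

# Build a tiny DataFrame on the new session and display it
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
df.show()
print("Spark version:", spark.version)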

Regarding "windows - Jupyter Notebook error when using the PySpark kernel: the code failed because of a fatal error: Error sending http request", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58321160/
