I have a Spark master running on Amazon EC2.
I am trying to connect to it with pyspark from another EC2 instance, as follows:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("MyApp") \
    .master("spark_url_as_obtained_in_web_ui") \
    .getOrCreate()
This is the error: To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-04-04 20:03:04 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
............
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
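For context, this BindException means the driver asked the OS to bind a listening socket to an IP address that the instance does not actually own. On EC2 this commonly happens when the hostname resolves to the public IP, which is NATed and is not assigned to any local network interface. The failure can be reproduced with a plain Python socket; the address 203.0.113.10 below is just a documentation-range placeholder standing in for an IP the host does not own:

```python
import socket

def can_bind(address: str) -> bool:
    """Try to bind a TCP socket to the given address on an ephemeral port."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        # Port 0 asks the OS for any free port, analogous to
        # Spark's "random free port" for the sparkDriver service.
        s.bind((address, 0))
        return True
    except OSError:
        # "Cannot assign requested address" — the host does not own this IP.
        return False
    finally:
        s.close()

print(can_bind("127.0.0.1"))     # loopback is always bindable
print(can_bind("203.0.113.10"))  # an address this host does not own
```

Spark retries the bind 16 times (on different random ports), but the port is not the problem; the address is, which is why every retry fails the same way.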
I tried all the solutions described here, to no avail:
What could be wrong?
Best answer
Set spark.driver.bindAddress to your local IP, e.g. 127.0.0.1:

pyspark -c spark.driver.bindAddress=127.0.0.1
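The same property can also be set programmatically on the builder from the question's code. A minimal sketch, assuming pyspark is installed and keeping the question's placeholder master URL:

```python
from pyspark.sql import SparkSession

# Bind the driver's listening services (sparkDriver, etc.) to loopback so the
# JVM does not attempt to bind the instance's public, non-local IP address.
spark = (
    SparkSession.builder
    .appName("MyApp")
    .master("spark_url_as_obtained_in_web_ui")  # placeholder from the question
    .config("spark.driver.bindAddress", "127.0.0.1")
    .getOrCreate()
)
```

Note that spark.driver.bindAddress controls only which local address the driver listens on; this sketch cannot run without a reachable Spark master at the placeholder URL.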
Regarding "python-3.x - Cannot bind on a random free port error while trying to connect to spark master", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49654050/