python - Trouble installing Pyspark

Tags: python, apache-spark

I want to run Spark on my local machine using pyspark. Following the instructions here, I used the commands:

sbt/sbt assembly
./bin/pyspark

The install completes, but pyspark fails to run, producing the following error (in full):

138:spark-0.9.1 comp_name$ ./bin/pyspark
Python 2.7.6 |Anaconda 1.9.2 (x86_64)| (default, Jan 10 2014, 11:23:15) 
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
    sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
    self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1466)
    at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:355)
    at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:348)
    at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:348)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:395)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:124)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:724)
Caused by: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:894)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1286)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1462)
    ... 22 more

Any idea what I am doing wrong? I don't know where the IP address 138.7.100.10 comes from, and I get this error whether or not MAMP is used to create a localhost. Thanks in advance!
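As a quick check (a sketch assuming macOS and a Bash shell, as in the trace above), the lookup that fails inside Spark can be reproduced on its own: Java's InetAddress.getLocalHost() asks the OS for the local hostname and then tries to resolve it.

hostname
# If the name printed above has no DNS or /etc/hosts entry, the resolver
# fails here the same way InetAddress.getLocalHost() fails in the trace:
ping -c 1 "$(hostname)"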

Best Answer

The correct solution is to set the SPARK_LOCAL_IP environment variable to localhost or whatever your hostname is. The name in the exception, 138.7.100.10.in-addr.arpa, is the reverse-DNS form of the machine's address 10.100.7.138 (the octets are listed in reverse order), which suggests the OS reported that reverse-DNS name as the local hostname. As the trace shows, Spark's Utils.findLocalIpAddress calls InetAddress.getLocalHost(), which fails to resolve that name and throws the UnknownHostException; setting SPARK_LOCAL_IP makes Spark skip the lookup entirely.
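As a minimal sketch of the fix (assuming a Bash shell and the spark-0.9.1 directory from the question; 127.0.0.1 stands in for whatever address or hostname suits your machine):

# Option 1: set the variable for the current shell session only
export SPARK_LOCAL_IP=127.0.0.1
./bin/pyspark

# Option 2: persist it for this Spark install; the launch scripts
# source conf/spark-env.sh if the file exists
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> conf/spark-env.sh

Alternatively, giving the machine's hostname an entry in /etc/hosts fixes the failing lookup itself instead of bypassing it.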

Regarding python - Trouble installing Pyspark, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/23353477/
