java - How to find Spark's installation directory?

Tags: java ubuntu apache-spark

I want to change spark-env.sh. How do I find the installation directory on Ubuntu?

I looked at the UI but didn't find anything.

whereis spark 

Output: spark:

Here is the output of the locate command, locate spark:

/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/11
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/13
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/files
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/jars
/home/sys6002/Desktop/diff spark hadoop.png
/home/sys6002/Desktop/sparkmain
/home/sys6002/Downloads/learning-spark-master.zip
/home/sys6002/Downloads/mongo-spark-master
/home/sys6002/Downloads/spark-1.5.1
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 (2)
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6.tgz
/home/sys6002/Downloads/spark-1.5.1-bin-without-hadoop
/home/sys6002/Downloads/spark-cassandra-connector-master
/home/sys6002/Downloads/spark-core_2.9.3-0.8.0-incubati
/home/sys6002/anaconda3/pkgs/odo-0.3.2-np19py34_0/lib/python3.4/site-packages/odo/backends/tests/__pycache__/test_sparksql.cpython-34.pyc
/home/sys6002/spark-example/a.txt
/home/sys6002/spark-example/a.txt~
/home/sys6002/spark-example/pom.xml
/home/sys6002/spark-example/pom.xml~
/home/sys6002/spark-example/src
/home/sys6002/spark-example/src/main
/home/sys6002/spark-example/src/test
/home/sys6002/spark-example/src/main/java
/home/sys6002/spark-example/src/main/java/com
/home/sys6002/spark-example/src/main/java/com/geekcap
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/App.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/WordCount.java~
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java~

/home/sys6002/spark-example/src/test/java/com/geekcap/javaworld/AppTest.java
/usr/share/app-install/desktop/lightspark:lightspark.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare-invite-opener.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare.desktop

Best answer

Run:

echo 'sc.getConf.get("spark.home")' | spark-shell

After a moment, your Spark home will be printed and you will see something like:

scala> sc.getConf.get("spark.home")
res0: String = /usr/local/lib/python3.7/site-packages/pyspark

So in this case, my Spark home is /usr/local/lib/python3.7/site-packages/pyspark
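If you don't want to start a Spark shell just to find the directory, a common alternative is to resolve the real path of the spark-shell launcher on your PATH and take its grandparent directory, which for a typical tarball install is the Spark home. The sketch below simulates this with a hypothetical fake install under /tmp so the commands are runnable anywhere; on a real system you would replace the symlink path with `$(which spark-shell)`.

```shell
#!/bin/sh
# Simulate a Spark install so the sketch is self-contained
# (on a real system, skip this and use "$(which spark-shell)" instead).
mkdir -p /tmp/fake-spark/bin
printf '#!/bin/sh\n' > /tmp/fake-spark/bin/spark-shell
chmod +x /tmp/fake-spark/bin/spark-shell
ln -sf /tmp/fake-spark/bin/spark-shell /tmp/spark-shell-link

# readlink -f follows symlinks to the actual launcher script;
# its parent's parent is typically SPARK_HOME for a tarball install.
real=$(readlink -f /tmp/spark-shell-link)
spark_home=$(dirname "$(dirname "$real")")
echo "$spark_home"
```

If you installed Spark via pip, `pip show pyspark` also reports the site-packages location, which matches the path printed by the spark-shell trick above.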

Regarding "java - How to find Spark's installation directory?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33806450/
