I am trying to submit a Spark job with "gcloud dataproc jobs submit spark". To connect to the ES cluster, I need to pass a truststore path.
The job succeeds if I copy the truststore file to all worker nodes and give an absolute path like this:
esSparkConf.put("es.net.ssl.truststore.location","file:///tmp/trust.jks");
But I don't want to do that: with many worker nodes, copying the file to each one is impractical.
Instead, I tried passing the truststore file with the --files option, like this:
gcloud dataproc jobs submit spark --cluster=sprk-prd1 --region=<> --files=trust.jks --class=ESDumpJob --jars=gs://randome/jars/ESDump-jar-with-dependencies.jar
Code snippet from ESDumpJob:
SparkConf sparkConf = new SparkConf(true).setAppName("My ES job");
sparkConf.set("spark.es.nodes.wan.only", "true")
    .set("spark.es.nodes", <es_nodes>)
    .set("spark.es.net.ssl", "true")
    .set("spark.es.net.ssl.truststore.location", "trust.jks")
    .set("spark.es.net.ssl.truststore.pass", "pass")
    .set("spark.es.net.http.auth.user", "test")
    .set("spark.es.net.http.auth.pass", "test");
sparkSession = SparkSession
.builder().master("local")
.config(sparkConf)
.config("spark.scheduler.mode", "FAIR")
.getOrCreate();
JavaRDD<MyData> data = //create rdd
JavaEsSpark.saveToEs(data, "my_index", ImmutableMap.of("es.mapping.id", "id"));
In this case, I get the following error:
17:15:42 Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Expected to find keystore file at [trust.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
17:15:42 at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyStore(SSLSocketFactory.java:195)
17:15:42 at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadTrustManagers(SSLSocketFactory.java:226)
17:15:42 at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:173)
Best Answer
You need to use org.apache.spark.SparkFiles.get(fileName) to obtain the actual local path of the distributed file, and add the file:// prefix:
sparkConf.set(
"spark.es.net.ssl.truststore.location",
"file://" + org.apache.spark.SparkFiles.get("trust.jks"))
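Combining this with the question's snippet, a corrected configuration might look like the sketch below. This is untested and assumes trust.jks was shipped via --files as in the original submit command; <es_nodes> and the credentials are placeholders carried over from the question:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkFiles;

// Sketch: resolve the --files-distributed truststore at runtime
// instead of passing the bare relative name "trust.jks".
SparkConf sparkConf = new SparkConf(true).setAppName("My ES job");
sparkConf.set("spark.es.nodes.wan.only", "true")
    .set("spark.es.nodes", <es_nodes>)
    .set("spark.es.net.ssl", "true")
    // SparkFiles.get returns the absolute local path where Spark
    // materialized the file submitted with --files=trust.jks.
    .set("spark.es.net.ssl.truststore.location",
         "file://" + SparkFiles.get("trust.jks"))
    .set("spark.es.net.ssl.truststore.pass", "pass")
    .set("spark.es.net.http.auth.user", "test")
    .set("spark.es.net.http.auth.pass", "test");
```

Note that SparkFiles.get must be called after the SparkContext/SparkSession is initialized, since it resolves against the job's runtime file directory.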
See SparkFiles.get and this related question.
Regarding java - Spark submit --files unable to copy truststore file to worker nodes in google dataproc, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/72043171/