I am trying to run a local jar file with spark-submit, and it works fine. Here is the command:
spark-submit --class "SimpleApp" --master local myProject/target/scala-2.11/simple-project_2.11-1.0.jar
But when I try to submit the same jar through Livy with curl:
curl -X POST --data '{
  "file": "file:///home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
-H "Content-Type: application/json" \
http://server:8998/batches
it throws this error:
"requirement failed: Local path /home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar cannot be added to user sessions."
Here is my livy.conf file, since some articles suggested changing a few settings:
# What host address to start the server on. By default, Livy will bind to all network interfaces.
livy.server.host = 0.0.0.0
# What port to start the server on.
livy.server.port = 8998
# What spark master Livy sessions should use.
livy.spark.master = local
# What spark deploy mode Livy sessions should use.
livy.spark.deploy-mode = client
# List of local directories from where files are allowed to be added to user sessions. By
# default it's empty, meaning users can only reference remote URIs when starting their
# sessions.
livy.file.local-dir-whitelist = /home/user/.livy-sessions/
Can you help me with this? Thanks in advance.
Best answer
I recently found the solution to reading local files with Apache Livy: I had been building the cURL request incorrectly. I simply replaced the 'file://' scheme in the file URI with 'local:/', and that worked for me.
curl -X POST --data '{
  "file": "local:/home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
  "className": "SimpleApp"
}' \
-H "Content-Type: application/json" \
http://server:8998/batches
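As an aside, generating the request body programmatically avoids hand-edited JSON mistakes (such as trailing commas, which Livy's parser rejects); a minimal Python sketch, using only the file path and className from this question:

```python
import json

# Batch request for Livy's POST /batches endpoint
# (sent with Content-Type: application/json).
# The 'local:/' scheme tells Spark the jar already
# exists at this path on the cluster node.
payload = {
    "file": "local:/home/user/myProject/target/scala-2.11/simple-project_2.11-1.0.jar",
    "className": "SimpleApp",
}

# json.dumps always emits strictly valid JSON, with no trailing commas
body = json.dumps(payload)
print(body)
```

The resulting string can then be passed to curl's --data flag or to an HTTP client directly.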
It was a tiny mistake, but I still cannot access my jar file from HDFS.
Thanks everyone for the help.
Related question on Stack Overflow about scala - Apache Livy not working with local jar files: https://stackoverflow.com/questions/51038328/