I am trying to set up a Hadoop cluster on Google Compute Engine, and I have been following these instructions. Everything seemed to be working fine until I ran:
./compute_cluster_for_hadoop.py setup <project ID> <bucket name>
with the project ID and bucket name I had created. The script apparently fails to access something and dies with a 403; here is the tail of the output, including the error messages:
Uploading ...kages/ca-certificates-java_20121112+nmu2_all.deb: 14.57 KB/14.57 KB
Uploading ...duce/tmp/deb_packages/libnspr4_4.9.2-1_amd64.deb: 316 B/316 B
Uploading ...e/tmp/deb_packages/libnss3-1d_3.14.3-1_amd64.deb: 318 B/318 B
Uploading ...dk-6-jre-headless_6b27-1.12.6-1~deb7u1_amd64.deb: 366 B/366 B
Uploading ...duce/tmp/deb_packages/libnss3_3.14.3-1_amd64.deb: 315 B/315 B
ResumableUploadAbortException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
AccessDeniedException: 403 Forbidden
ResumableUploadAbortException: 403 Forbidden
AccessDeniedException: 403 Forbidden
CommandException: 7 files/objects could not be transferred.
########## ERROR ##########
Failed to copy Hadoop and Java packages to Cloud Storage gs://<bucket name>/mapreduce/tmp/
###########################
Traceback (most recent call last):
File "./compute_cluster_for_hadoop.py", line 230, in <module>
main()
File "./compute_cluster_for_hadoop.py", line 226, in main
ComputeClusterForHadoop().ParseArgumentsAndExecute(sys.argv[1:])
File "./compute_cluster_for_hadoop.py", line 222, in ParseArgumentsAndExecute
params.handler(params)
File "./compute_cluster_for_hadoop.py", line 36, in SetUp
gce_cluster.GceCluster(flags).EnvironmentSetUp()
File "/Path/To/solutions-google-compute-engine-cluster-for-hadoop/gce_cluster.py", line 149, in EnvironmentSetUp
raise EnvironmentSetUpError('Environment set up failed.')
gce_cluster.EnvironmentSetUpError: Environment set up failed.
Best Answer
I would suggest you switch to the "bdutil" package instead, which is newer and more actively maintained by Google. You can find the details in the GCP Hadoop announcement forum.
If you go to the latest announcement, you will find a link to the latest "bdutil" package (currently 0.36.4). It will simplify your cluster deployment, and it supports both Hadoop and Spark clusters.
Additionally, I would recommend:
Deploying the cluster from a machine inside GCE. This makes the process much faster and more reliable.
In the file bdutil_env.sh, changing the parameter GCUTIL_SLEEP_TIME_BETWEEN_ASYNC_CALLS_SECONDS from 0.1 to 0.5 (for me, that fixed a recurring deployment error).
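A minimal sketch of that edit as a one-liner (the variable name is taken from the answer above; the stand-in file is created here only so the example is self-contained, since a real bdutil checkout already ships bdutil_env.sh):

```shell
# Stand-in for the real bdutil_env.sh, created only for illustration;
# in an actual bdutil checkout this file already exists.
echo 'GCUTIL_SLEEP_TIME_BETWEEN_ASYNC_CALLS_SECONDS=0.1' > bdutil_env.sh

# Raise the delay between asynchronous gcutil calls from 0.1 to 0.5 seconds,
# as suggested above. The -i.bak form works with both GNU and BSD sed.
sed -i.bak \
  's/^GCUTIL_SLEEP_TIME_BETWEEN_ASYNC_CALLS_SECONDS=.*/GCUTIL_SLEEP_TIME_BETWEEN_ASYNC_CALLS_SECONDS=0.5/' \
  bdutil_env.sh

cat bdutil_env.sh
```

You could of course make the same change in a text editor; the point is only that the value ends up at 0.5.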
Regarding "hadoop - Unable to copy Hadoop and Java packages to Google Cloud Storage", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/27183082/