On Ubuntu, using Couchbase 2.5.1, Cloudera CDH4, the Couchbase Hadoop plugin, and Oracle JDK 6. Everything installed fine (on the surface), and I can use Hadoop and Couchbase independently without problems, but when I try to use the plugin as follows:
sqoop import --connect http://127.0.0.1:8091/ --table DUMP
I get the following error:
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/04/11 11:44:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.6.0
14/04/11 11:44:08 INFO tool.CodeGenTool: Beginning code generation
14/04/11 11:44:08 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
Note: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/04/11 11:44:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.jar
14/04/11 11:44:09 INFO mapreduce.ImportJobBase: Beginning import of DUMP
14/04/11 11:44:09 WARN util.Jars: No such class couchbase doesn't use a jdbc driver available.
14/04/11 11:44:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:12 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:13 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Any idea where I went wrong, or what I can do to find out?
Best Answer
It turns out the syntax I was using was wrong. Suppose we want to import the beer-sample bucket from Couchbase into HDFS; the correct syntax is as follows, where the bucket name is actually passed as the username:
sqoop import --connect http://localhost:8091/pools --password password --username beer-sample --table DUMP
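As a general pattern, the Couchbase connector maps the bucket name onto Sqoop's --username argument and the bucket password onto --password. A minimal sketch of parameterizing this (the variable names are my own, and the endpoint and credentials are placeholder assumptions to adapt to your cluster):

```shell
# Hypothetical values -- substitute your own bucket, password, and Couchbase host.
BUCKET="beer-sample"                  # the Couchbase bucket, passed as --username
PASSWORD="password"                   # the bucket's password
CB_URL="http://localhost:8091/pools"  # Couchbase REST endpoint, note the /pools path

# Build the sqoop invocation; run it with: eval "$CMD"
CMD="sqoop import --connect $CB_URL --password $PASSWORD --username $BUCKET --table DUMP"
echo "$CMD"
```

Note that the connect URL ends in /pools (the Couchbase REST base path), not the bare http://127.0.0.1:8091/ used in the failing command above.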
Regarding hadoop - Sqoop import from couchbase to hadoop, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/23011782/