I tried to access an HDFS file through the Java API with the following code:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsTest {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        try {
            Path path = new Path("hdfs://mycluster/user/mock/test.txt");
            FileSystem fs = FileSystem.get(path.toUri(), conf);
            if (fs.exists(path)) {
                FSDataInputStream inputStream = fs.open(path);
                // Process input stream ...
            } else {
                System.out.println("File does not exist");
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
The call to FileSystem.get(path.toUri(), conf) throws an exception saying Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider, caused by java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.Credentials. I could not find any further information about this error. Is the problem caused by using the wrong API (org.apache.hadoop.hdfs instead of org.apache.hadoop.fs)?
Best answer
1) Do you have hadoop-hdfs-.jar on your classpath?
2) How do you manage your dependencies? Maven / manually / something else?
3) Could you please provide the stack trace?
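As a hint for point 2, if the project is built with Maven, depending on the aggregate hadoop-client artifact pulls in both hadoop-common (which provides org.apache.hadoop.security.Credentials) and hadoop-hdfs (which provides ConfiguredFailoverProxyProvider), so version mismatches between the two are avoided. This is only a sketch; the version shown is an assumption and should be replaced with the Hadoop version actually running on the cluster:

```xml
<!-- Hypothetical version number: match it to the cluster's Hadoop release -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
</dependency>
```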
Regarding "java - Exception accessing the HDFS file system in Java", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37649130/