apache - Hive error: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider

Tags: apache hadoop hive hiveql hadoop-streaming

I have a Hive query that sometimes runs successfully, but most of the time it fails with the error "java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider".

Below is my error log:

java.lang.RuntimeException: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
    at org.apache.hadoop.mapred.lib.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:154)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getMoreSplits(CombineFileInputFormat.java:283)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:239)
    at org.apache.hadoop.mapred.lib.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:75)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:336)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getSplits(HadoopShimsSecure.java:302)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:435)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:525)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:517)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:399)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:564)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:559)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:559)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:550)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1516)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1283)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1101)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:924)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:914)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:269)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:221)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:431)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:367)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:464)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:474)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:756)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:694)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:633)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider
    at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:475)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:632)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:570)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:147)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at org.apache.hadoop.mapred.lib.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:151)
    ... 45 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:458)
    ... 53 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.util.Arrays.copyOf(Arrays.java:2219)
    at java.util.ArrayList.grow(ArrayList.java:242)
    at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:216)
    at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:208)
    at java.util.ArrayList.add(ArrayList.java:440)
    at java.lang.String.split(String.java:2288)
    at sun.net.util.IPAddressUtil.textToNumericFormatV4(IPAddressUtil.java:47)
    at java.net.InetAddress.getAllByName(InetAddress.java:1129)
    at java.net.InetAddress.getAllByName(InetAddress.java:1098)
    at java.net.InetAddress.getByName(InetAddress.java:1048)
    at org.apache.hadoop.security.SecurityUtil$StandardHostResolver.getByName(SecurityUtil.java:474)
    at org.apache.hadoop.security.SecurityUtil.getByName(SecurityUtil.java:461)
    at org.apache.hadoop.net.NetUtils.createSocketAddrForHost(NetUtils.java:235)
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:215)
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:163)
    at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:152)
    at org.apache.hadoop.hdfs.DFSUtil.getAddressesForNameserviceId(DFSUtil.java:677)
    at org.apache.hadoop.hdfs.DFSUtil.getAddressesForNsIds(DFSUtil.java:645)
    at org.apache.hadoop.hdfs.DFSUtil.getAddresses(DFSUtil.java:628)
    at org.apache.hadoop.hdfs.DFSUtil.getHaNnRpcAddresses(DFSUtil.java:727)
    at org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.<init>(ConfiguredFailoverProxyProvider.java:88)
    at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hdfs.NameNodeProxies.createFailoverProxyProvider(NameNodeProxies.java:458)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:148)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:632)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:570)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:147)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)

Job Submission failed with exception 'java.lang.RuntimeException(java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider)'

Can anyone tell me why this happens?

Best Answer

I stumbled on a similar exception myself, and increasing the Hive client heap did not help. I found that I could clear the OutOfMemory GC overhead exception by adding a partition column to the query's WHERE clause, so I concluded that a very large number of splits was causing it. I haven't dug into the code, but I believe I have seen loops where string concatenation triggers GC thrashing, and something similar may be happening in the CombineHiveInputFormat.getSplits method.
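To make the workaround concrete, here is a minimal HiveQL sketch. The table name access_log, the partition column dt, and the date literal are hypothetical, invented for this illustration and not taken from the original question. The idea is that filtering on a partition column lets Hive prune input directories before splits are computed, so far fewer splits have to be built on the client at job-submission time:

    -- Hypothetical table partitioned by dt (a date string); all names here
    -- are illustrative only.

    -- A query with no partition filter scans every partition, so split
    -- computation walks the whole table and can exhaust the client heap:
    SELECT user_id, COUNT(*)
    FROM access_log
    GROUP BY user_id;

    -- Adding the partition column to the WHERE clause prunes input paths
    -- before splits are computed, which is what cleared the GC overhead
    -- exception in this case:
    SELECT user_id, COUNT(*)
    FROM access_log
    WHERE dt = '2015-08-01'
    GROUP BY user_id;

If the query genuinely has to scan many partitions, the other knob is the client-side heap itself (for example via the HADOOP_CLIENT_OPTS or HADOOP_HEAPSIZE environment variables before launching the Hive CLI), although, as noted above, that did not help here.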

Regarding this apache - Hive error (java.io.IOException: Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31854577/
