Java Web Service Contacting Hive - DataNucleus ClassLoaderResolver Error

Tags: java hive jdo datanucleus

I have a Java web service (built with my company's proprietary technology) that serves requests and responses. While handling a request, it tries to contact Hadoop's Hive and run a query. However, it fails immediately when I simply try to initialize the connection.

Here is the line of code that fails. I am mostly following the code sample from https://cwiki.apache.org/confluence/display/Hive/HiveClient:

String connString = "jdbc:hive://";
Connection con = DriverManager.getConnection(connString, "", "");
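
For reference, the fuller pattern I am working from on the HiveClient wiki looks roughly like the sketch below (the HiveJdbcSketch class name and the SHOW TABLES query are only illustrative). The point is that the empty jdbc:hive:// URL runs Hive in embedded mode, so a HiveServer and metastore are started inside this JVM, which is why DataNucleus gets initialized at all:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver named in the stack trace below.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

        // No host/port => embedded mode: HiveServer and the metastore
        // (and therefore DataNucleus) are initialized inside this JVM.
        String connString = "jdbc:hive://";
        Connection con = DriverManager.getConnection(connString, "", "");

        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SHOW TABLES");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        con.close();
    }
}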

Here is the stack trace:

javax.jdo.JDOFatalInternalException: Unexpected exception caught.
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:131)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:121)
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:76)
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at (...my package...).RemoteCtrbTest.kickOffRemoteTest(RemoteCtrbTest.java:52)

NestedThrowablesStackTrace:
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:131)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:121)
    at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:76)
    at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at (...my package...).RemoteCtrbTest.kickOffRemoteTest(RemoteCtrbTest.java:52)

Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "jdo" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.
    at org.datanucleus.OMFContext.getClassLoaderResolver(OMFContext.java:319)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:165)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:137)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:307)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    ... 35 more

I found another question with a similar error message, but it is about Maven and does not involve Hive (Hive is what uses DataNucleus in its code): Datanucleus, JDO and executable jar - how to do it?

I am using a hive-site.xml file to specify some properties for Hive and DataNucleus; the DataNucleus ones are shown below. I added the last two while trying to fix this problem, and whenever I change the value specified for datanucleus.classLoaderResolverName, the error message changes to quote whatever I put there.

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>

<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>

<property>
  <name>datanucleus.classLoaderResolverName</name>
  <value>jdo</value>
</property>

<property>
  <name>javax.jdo.PersistenceManagerFactoryClass</name>
  <value>org.datanucleus.jdo.JDOPersistenceManagerFactory</value>
</property>
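
For what it is worth, the stack trace shows ObjectStore.getPMF() handing these datanucleus/javax.jdo properties to JDOHelper.getPersistenceManagerFactory(), so the failure can probably be reproduced outside the web service with a small standalone test. The following is only a sketch under that assumption (a usable factory would also need the javax.jdo.option.ConnectionURL and driver properties from hive-site.xml, but the ClassLoaderResolver check is hit first when the plugin metadata is not visible):

import java.util.Properties;

import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManagerFactory;

public class PmfRepro {
    public static void main(String[] args) {
        // Same DataNucleus/JDO properties as in hive-site.xml above.
        Properties props = new Properties();
        props.setProperty("javax.jdo.PersistenceManagerFactoryClass",
                "org.datanucleus.jdo.JDOPersistenceManagerFactory");
        props.setProperty("datanucleus.classLoaderResolverName", "jdo");
        props.setProperty("datanucleus.autoCreateSchema", "false");
        props.setProperty("datanucleus.fixedDatastore", "true");
        // Plus the javax.jdo.option.ConnectionURL / ConnectionDriverName etc.
        // entries from hive-site.xml, omitted here.

        // Throws the same NucleusUserException ("ClassLoaderResolver of name
        // \"jdo\" ... has not been found") when the DataNucleus plugin.xml /
        // MANIFEST.MF metadata is not visible on this classpath.
        PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory(props);
        System.out.println("PMF created: " + pmf);
    }
}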

The part I cannot figure out is whether the service is somehow re-bundling the jars, like in the other Stack Overflow question I linked above, and messing up the location of the plugin.xml and/or MANIFEST.MF files. I am also not sure how the plugin files interact with the hive-site.xml file.

Note that the classpath here needs the specific jars added to it, not just a classpath entry. I am using the following DataNucleus jars:

* datanucleus-connectionpool-2.0.3.jar
* datanucleus-enhancer-2.0.3.jar
* datanucleus-rdbms-2.0.3.jar
* datanucleus-core-2.0.3.jar
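
One hypothetical way to test the re-bundling suspicion (my addition, not something from the original post): ask the service's classloader how many copies of plugin.xml it can actually see. The DataNucleus jars ship their plugin metadata as a plugin.xml (plus MANIFEST.MF entries) at the jar root, so an intact classpath should list several; a repackaged uber-jar typically leaves a single merged or clobbered copy, which is what breaks the plugin mechanism's lookup of the "jdo" ClassLoaderResolver.

import java.net.URL;
import java.util.Enumeration;

public class PluginXmlCheck {
    public static void main(String[] args) throws Exception {
        // Print every plugin.xml visible to this classloader. Run this inside
        // the web service (or with the same classpath) and compare against the
        // DataNucleus jars listed above: missing or merged entries point to the
        // re-bundling problem described in the linked question.
        Enumeration<URL> plugins =
                PluginXmlCheck.class.getClassLoader().getResources("plugin.xml");
        while (plugins.hasMoreElements()) {
            System.out.println(plugins.nextElement());
        }
    }
}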

Any input you can give to help me would be greatly appreciated. If you need more information, just ask and I can provide it.

Best Answer

If you are using the Spring Framework in the application built with your company's proprietary technology, you can take advantage of Spring for Apache Hadoop (Spring-Hadoop) support.

All you have to do is add the following configuration to your applicationContext:

<hdp:configuration>
    fs.default.name=${fs.default.name.url}
    mapred.job.tracker=${mapred.job.tracker.url}
</hdp:configuration>

<hdp:hive-client-factory host="${hadoop.hive.host.url}" port="10000"
    xmlns="http://www.springframework.org/schema/hadoop" />

<hdp:hive-template />

After that, autowire the HiveTemplate:

@Autowired
HiveTemplate hiveTemplate;

Then query Hive like this:

List<String> list = hiveTemplate.query(queryString, parameterMap);
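
Put together, a consuming bean might look like the sketch below (the HiveQueryClient class and runQuery method are illustrative names, not part of Spring-Hadoop):

import java.util.List;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.hadoop.hive.HiveTemplate;
import org.springframework.stereotype.Component;

@Component
public class HiveQueryClient {

    @Autowired
    private HiveTemplate hiveTemplate;

    // Delegates to the HiveTemplate bean declared by <hdp:hive-template/>;
    // Hive errors are translated into Spring's DataAccessException hierarchy.
    public List<String> runQuery(String queryString, Map<String, String> parameterMap) {
        return hiveTemplate.query(queryString, parameterMap);
    }
}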

The original question and accepted answer are on Stack Overflow: https://stackoverflow.com/questions/17681913/
