hadoop - Hive: SHOW CONF does not respect hive-site.xml

Tags: hadoop, hive

I have the following hive-site.xml configuration:

[hadoop@ip-10-102-201-205 ~]$ cat /etc/hive/conf.dist/hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Licensed to the Apache Software Foundation (ASF) under one or more       -->
<!-- contributor license agreements.  See the NOTICE file distributed with    -->
<!-- this work for additional information regarding copyright ownership.      -->
<!-- The ASF licenses this file to You under the Apache License, Version 2.0  -->
<!-- (the "License"); you may not use this file except in compliance with     -->
<!-- the License.  You may obtain a copy of the License at                    -->
<!--                                                                          -->
<!--     http://www.apache.org/licenses/LICENSE-2.0                           -->
<!--                                                                          -->
<!-- Unless required by applicable law or agreed to in writing, software      -->
<!-- distributed under the License is distributed on an "AS IS" BASIS,        -->
<!-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -->
<!-- See the License for the specific language governing permissions and      -->
<!-- limitations under the License.                                           -->

<configuration>

<!-- Hive Configuration can either be stored in this file or in the hadoop configuration files  -->
<!-- that are implied by Hadoop setup variables.                                                -->
<!-- Aside from Hadoop setup variables - this file is provided as a convenience so that Hive    -->
<!-- users do not have to edit hadoop configuration files (that may be managed as a centralized -->
<!-- resource).                                                                                 -->

<!-- Hive Execution Parameters -->


<property>
  <name>hbase.zookeeper.quorum</name>
  <value>ip-10-102-201-205.ec2.internal</value>
  <description>http://wiki.apache.org/hadoop/Hive/HBaseIntegration</description>
</property>

<property>
  <name>hive.execution.engine</name>
  <value>mr</value>
</property>

  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://ip-10-102-201-205.ec2.internal:8020</value>
  </property>

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://ip-10-102-201-205.ec2.internal:9083</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://ip-10-102-201-205.ec2.internal:3306/hive?createDatabaseIfNotExist=true</value>
    <description>username to use against metastore database</description>
</property>

<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.mariadb.jdbc.Driver</value>
    <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>nsqiBHEs6Wn8mz9b</value>
  <description>password to use against metastore database</description>
</property>

  <property>
    <name>datanucleus.fixedDatastore</name>
    <value>true</value>
  </property>

  <property>
    <name>mapred.reduce.tasks</name>
    <value>-1</value>
  </property>

  <property>
    <name>mapred.max.split.size</name>
    <value>256000000</value>
  </property>

  <property>
    <name>hive.metastore.connect.retries</name>
    <value>5</value>
  </property>

  <property>
    <name>hive.optimize.sort.dynamic.partition</name>
    <value>true</value>
  </property>

  <property>
    <name>hive.aux.jars.path</name>
    <value>file:///etc/hive/conf/jar/json-serde-1.3.8-SNAPSHOT-jar-with-dependencies.jar</value>
  </property>

</configuration>

This is just the default hive-site.xml from AWS EMR, with hive.aux.jars.path set at the bottom.

When I start a session, Hive claims to have loaded the right config file, yet the setting is either not set or set incorrectly:

[hadoop@ip-10-102-201-205 ~]$ echo 'SHOW CONF "hive.aux.jars.path"; EXIT;' | hive --hiveconf hive.root.logger=DEBUG,console
...
16/04/27 20:49:43 [main()]: DEBUG common.LogUtils: Using hive-site.xml found on CLASSPATH at /etc/hive/conf.dist/hive-site.xml
...

hive> SHOW CONF "hive.aux.jars.path"; EXIT;
    STRING  The location of the plugin jars that contain implementations of user defined functions and serdes.
Time taken: 0.741 seconds, Fetched: 1 row(s)
...

The first (empty) column is where the value would go. The same happens with hive.optimize.sort.dynamic.partition: it is set to true in the config, yet shows up as false when I actually run the console. The DEBUG and TRACE logs show no configuration errors. Why are these values not being set?

Full debug output, in case anyone is curious:

[hadoop@ip-10-102-201-205 ~]$ echo 'SHOW CONF "hive.aux.jars.path"; EXIT;' | hive --hiveconf hive.root.logger=DEBUG,console
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
16/04/27 20:49:43 [main()]: DEBUG common.LogUtils: Using hive-site.xml found on CLASSPATH at /etc/hive/conf.dist/hive-site.xml

Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
16/04/27 20:49:43 [main()]: INFO SessionState: 
Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
16/04/27 20:49:43 [main()]: DEBUG parse.VariableSubstitution: Substitution is on: hive
16/04/27 20:49:43 [main()]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
16/04/27 20:49:43 [main()]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
16/04/27 20:49:43 [main()]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
16/04/27 20:49:43 [main()]: DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
16/04/27 20:49:43 [main()]: DEBUG util.KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
16/04/27 20:49:43 [main()]: DEBUG security.Groups:  Creating new Groups object
16/04/27 20:49:43 [main()]: DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/04/27 20:49:43 [main()]: DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
16/04/27 20:49:43 [main()]: DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
16/04/27 20:49:43 [main()]: DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
16/04/27 20:49:43 [main()]: DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: hadoop login
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: hadoop login commit
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: hadoop" with name hadoop
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: User entry: "hadoop"
16/04/27 20:49:43 [main()]: DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
16/04/27 20:49:43 [main()]: INFO hive.metastore: Trying to connect to metastore with URI thrift://ip-10-102-201-205.ec2.internal:9083
16/04/27 20:49:43 [main()]: INFO hive.metastore: Connected to metastore.
16/04/27 20:49:44 [main()]: DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
16/04/27 20:49:44 [main()]: DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
16/04/27 20:49:44 [main()]: DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
16/04/27 20:49:44 [main()]: DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = 
16/04/27 20:49:44 [main()]: DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
16/04/27 20:49:44 [main()]: DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@258d79be
16/04/27 20:49:44 [main()]: DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7fcf2fc1
16/04/27 20:49:44 [Finalizer()]: DEBUG azure.NativeAzureFileSystem: finalize() called.
16/04/27 20:49:44 [Finalizer()]: DEBUG azure.NativeAzureFileSystem: finalize() called.
16/04/27 20:49:44 [client DomainSocketWatcher()]: DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@4ce9d42d: starting with interruptCheckPeriodMs = 60000
16/04/27 20:49:44 [main()]: DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
16/04/27 20:49:44 [main()]: DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
16/04/27 20:49:44 [main()]: DEBUG ipc.Client: The ping interval is 60000 ms.
16/04/27 20:49:44 [main()]: DEBUG ipc.Client: Connecting to ip-10-102-201-205.ec2.internal/10.102.201.205:8020
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop: starting, having connections 1
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #0
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #0
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 45ms
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #1
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #1
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/27 20:49:44 [main()]: DEBUG session.SessionState: HDFS root scratch dir: /tmp/hive, permission: rwx-wx-wx
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #2
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #2
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/27 20:49:44 [main()]: DEBUG nativeio.NativeIO: Initialized cache for IDs to User/Group mapping with a  cache timeout of 14400 seconds.
16/04/27 20:49:44 [main()]: INFO session.SessionState: Created local directory: /mnt/tmp/f113c3f9-3316-4998-8e80-9bd4c3fc079c_resources
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #3
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #3
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/04/27 20:49:44 [main()]: DEBUG hdfs.DFSClient: /tmp/hive/hadoop/f113c3f9-3316-4998-8e80-9bd4c3fc079c: masked=rwx------
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #4
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #4
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 3ms
16/04/27 20:49:44 [main()]: INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/f113c3f9-3316-4998-8e80-9bd4c3fc079c
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #5
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #5
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/04/27 20:49:44 [main()]: INFO session.SessionState: Created local directory: /mnt/tmp/hadoop/f113c3f9-3316-4998-8e80-9bd4c3fc079c
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #6
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #6
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/27 20:49:44 [main()]: DEBUG hdfs.DFSClient: /tmp/hive/hadoop/f113c3f9-3316-4998-8e80-9bd4c3fc079c/_tmp_space.db: masked=rwx------
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #7
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #7
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
16/04/27 20:49:44 [main()]: INFO session.SessionState: Created HDFS directory: /tmp/hive/hadoop/f113c3f9-3316-4998-8e80-9bd4c3fc079c/_tmp_space.db
16/04/27 20:49:44 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #8
16/04/27 20:49:44 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #8
16/04/27 20:49:44 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/27 20:49:44 [main()]: INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/04/27 20:49:44 [main()]: DEBUG exec.Utilities: Use session specified class loader
16/04/27 20:49:44 [main()]: DEBUG exec.Utilities: Use session specified class loader
16/04/27 20:49:44 [main()]: DEBUG exec.Utilities: Use session specified class loader
<this line repeats many times>
hive> SHOW CONF "hive.aux.jars.path"; EXIT;
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: DEBUG parse.VariableSubstitution: Substitution is on: SHOW CONF "hive.aux.jars.path"
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: INFO parse.ParseDriver: Parsing command: SHOW CONF "hive.aux.jars.path"
16/04/27 20:49:45 [main()]: INFO parse.ParseDriver: Parse Completed
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: </PERFLOG method=parse start=1461790185397 end=1461790185846 duration=449 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: DEBUG ql.Driver: Encoding valid txns info 9223372036854775807:
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: INFO ql.Driver: Semantic Analysis Completed
16/04/27 20:49:45 [main()]: INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1461790185852 end=1461790185931 duration=79 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:45 [main()]: DEBUG exec.Utilities: Use session specified class loader
16/04/27 20:49:46 [main()]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[default, type, desc] columnTypes=[string, string, string] separator=[[B@1e63d216] nullstring=  lastColumnTakesRest=false
16/04/27 20:49:46 [main()]: INFO exec.ListSinkOperator: Initializing Self OP[0]
16/04/27 20:49:46 [main()]: DEBUG exec.Utilities: Use session specified class loader
16/04/27 20:49:46 [main()]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.DelimitedJSONSerDe initialized with: columnNames=[] columnTypes=[] separator=[[B@1280851e] nullstring=  lastColumnTakesRest=false
16/04/27 20:49:46 [main()]: INFO exec.ListSinkOperator: Operator 0 OP initialized
16/04/27 20:49:46 [main()]: INFO exec.ListSinkOperator: Initialization Done 0 OP
16/04/27 20:49:46 [main()]: DEBUG exec.Utilities: Use session specified class loader
16/04/27 20:49:46 [main()]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[default, type, desc] columnTypes=[string, string, string] separator=[[B@5e840abf] nullstring=  lastColumnTakesRest=false
16/04/27 20:49:46 [main()]: INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:default, type:string, comment:from deserializer), FieldSchema(name:type, type:string, comment:from deserializer), FieldSchema(name:desc, type:string, comment:from deserializer)], properties:null)
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=compile start=1461790185344 end=1461790186055 duration=711 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO ql.Driver: Starting command: SHOW CONF "hive.aux.jars.path"
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1461790185344 end=1461790186062 duration=718 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO ql.Driver: Starting task [Stage-0:DDL] in serial mode
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=runTasks start=1461790186062 end=1461790186078 duration=16 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1461790186055 end=1461790186078 duration=23 from=org.apache.hadoop.hive.ql.Driver>
OK
16/04/27 20:49:46 [main()]: INFO ql.Driver: OK
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1461790186079 end=1461790186079 duration=0 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO log.PerfLogger: </PERFLOG method=Driver.run start=1461790185344 end=1461790186079 duration=735 from=org.apache.hadoop.hive.ql.Driver>
16/04/27 20:49:46 [main()]: INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
16/04/27 20:49:46 [main()]: INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
16/04/27 20:49:46 [main()]: INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 72c57a1c06c471da40827b432ecff0de6a5c6dcc]
16/04/27 20:49:46 [main()]: DEBUG mapred.FileInputFormat: Time taken to get FileStatuses: 3
16/04/27 20:49:46 [main()]: INFO mapred.FileInputFormat: Total input paths to process : 1
16/04/27 20:49:46 [main()]: DEBUG mapred.FileInputFormat: Total # of splits generated by getSplits: 1, TimeTaken: 12
16/04/27 20:49:46 [main()]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[default, type, desc] columnTypes=[string, string, string] separator=[[B@2dbd803f] nullstring=  lastColumnTakesRest=false
16/04/27 20:49:46 [main()]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[default, type, desc] columnTypes=[string, string, string] separator=[[B@3e48e859] nullstring=  lastColumnTakesRest=false
16/04/27 20:49:46 [main()]: DEBUG exec.FetchOperator: Creating fetchTask with deserializer typeinfo: struct<default:string,type:string,desc:string>
16/04/27 20:49:46 [main()]: DEBUG exec.FetchOperator: deserializer properties:
table properties: {columns=default,type,desc, serialization.null.format= , serialization.lib=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, serialization.format=9, columns.types=string,string,string}
partition properties: {columns=default,type,desc, serialization.null.format= , serialization.lib=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, serialization.format=9, columns.types=string,string,string}
    STRING  The location of the plugin jars that contain implementations of user defined functions and serdes.
16/04/27 20:49:46 [main()]: INFO exec.ListSinkOperator: 0 finished. closing... 
16/04/27 20:49:46 [main()]: INFO exec.ListSinkOperator: 0 Close done
16/04/27 20:49:46 [main()]: DEBUG ql.Driver: Shutting down query SHOW CONF "hive.aux.jars.path"
Time taken: 0.741 seconds, Fetched: 1 row(s)
16/04/27 20:49:46 [main()]: INFO CliDriver: Time taken: 0.741 seconds, Fetched: 1 row(s)
16/04/27 20:49:46 [main()]: DEBUG session.SessionState: Removing resource dir /mnt/tmp/f113c3f9-3316-4998-8e80-9bd4c3fc079c_resources
16/04/27 20:49:46 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #9
16/04/27 20:49:46 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #9
16/04/27 20:49:46 [main()]: DEBUG ipc.ProtobufRpcEngine: Call: delete took 3ms
16/04/27 20:49:46 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #10
16/04/27 20:49:46 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #10
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
16/04/27 20:49:46 [IPC Parameter Sending Thread #0()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop sending #11
16/04/27 20:49:46 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop got value #11
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@7fcf2fc1
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@7fcf2fc1
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@7fcf2fc1
16/04/27 20:49:46 [Thread-0()]: DEBUG ipc.Client: Stopping client
16/04/27 20:49:46 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop: closed
16/04/27 20:49:46 [IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop()]: DEBUG ipc.Client: IPC Client (1309129055) connection to ip-10-102-201-205.ec2.internal/10.102.201.205:8020 from hadoop: stopped, remaining connections 0

Best Answer

From the Hive AdminManual:

you can display information about a configuration variable with the SHOW CONF command.
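The key word is "information": SHOW CONF returns three columns (default, type, desc, matching the result schema visible in the debug log), and the first column is the variable's hard-coded default value, not the value currently in effect in the session. hive.aux.jars.path has no hard-coded default, which is why the question's output begins with a blank column:

hive> SHOW CONF "hive.aux.jars.path";
    STRING  The location of the plugin jars that contain implementations of user defined functions and serdes.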

From the Hive LanguageManual:

set <key>=<value> Sets the value of a particular configuration variable (key).

What the manuals do not make clear is that you can also display the current value with set <key>.
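For example, against the hive-site.xml above, a session should look roughly like this (output illustrative, derived from the config file rather than an actual run):

hive> set hive.aux.jars.path;
hive.aux.jars.path=file:///etc/hive/conf/jar/json-serde-1.3.8-SNAPSHOT-jar-with-dependencies.jar
hive> set hive.optimize.sort.dynamic.partition;
hive.optimize.sort.dynamic.partition=true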

Bottom line: running set hive.aux.jars.path; in the Hive CLI displays the current value, whether it was set (a) by an earlier set command, (b) as a command-line argument, (c) in hive-site.xml, or (d), failing all of those, left at the hard-coded default.
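For the record, option (b) means passing --hiveconf key=value on the hive command line, the same mechanism the question already uses for hive.root.logger. A hypothetical example, with an illustrative jar path:

hive --hiveconf hive.aux.jars.path=file:///path/to/extra.jar   # illustrative path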

SHOW CONF, by contrast, is only really useful when you need some built-in documentation about a hard-coded default and have no Internet access...

The original question, "hadoop - Hive: SHOW CONF does not respect hive-site.xml", can be found on Stack Overflow: https://stackoverflow.com/questions/36900873/
