
Hive reports an error when running the show databases command

Background: CDH 5.2.1, 30 hosts in total, 25 of them data nodes. One data node had its operating system reinstalled. After reinstalling CM on it, I added the node back into the cluster normally.

Not long after starting its roles, though, I found the TaskTracker had been blacklisted. Running the show databases command in Hive on the problem node fails as follows:


[root@node07 bin]# hive -e "show databases"
log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
17/06/06 18:34:24 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars/hive-common-0.13.1-cdh5.2.1.jar!/hive-log4j.properties
OK
Failed with exception java.io.IOException:java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
Time taken: 0.325 seconds

Turning on debug mode shows the following error:

[root@node07 bin]# hive -hiveconf hive.root.logger=DEBUG,console
log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
17/06/06 18:25:02 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
17/06/06 18:25:02 [main]: DEBUG common.LogUtils: Using hive-site.xml found on CLASSPATH at /etc/hive/conf.cloudera.hive/hive-site.xml
17/06/06 18:25:02 [main]: WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars/hive-common-0.13.1-cdh5.2.1.jar!/hive-log4j.properties
17/06/06 18:25:02 [main]: INFO SessionState:
Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars/hive-common-0.13.1-cdh5.2.1.jar!/hive-log4j.properties
17/06/06 18:25:02 [main]: DEBUG parse.VariableSubstitution: Substitution is on: hive
17/06/06 18:25:02 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
17/06/06 18:25:02 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
17/06/06 18:25:02 [main]: DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[GetGroups], always=false, type=DEFAULT, sampleName=Ops)
17/06/06 18:25:02 [main]: DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
17/06/06 18:25:02 [main]: DEBUG security.Groups:  Creating new Groups object
17/06/06 18:25:02 [main]: DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
17/06/06 18:25:02 [main]: DEBUG security.UserGroupInformation: hadoop login
17/06/06 18:25:02 [main]: DEBUG security.UserGroupInformation: hadoop login commit
17/06/06 18:25:02 [main]: DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root
17/06/06 18:25:02 [main]: DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)
17/06/06 18:25:02 [main]: INFO hive.metastore: Trying to connect to metastore with URI thrift://YUCLIENT:9083
17/06/06 18:25:02 [main]: DEBUG security.Groups: Returning fetched groups for 'root'
17/06/06 18:25:02 [main]: INFO hive.metastore: Connected to metastore.
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/run/hdfs-sockets/dn
17/06/06 18:25:02 [main]: DEBUG hdfs.DFSClient: No KeyProvider found.
17/06/06 18:25:02 [main]: DEBUG hdfs.HAUtil: No HA service delegation token found for logical URI hdfs://nameservice1
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
17/06/06 18:25:02 [main]: DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/run/hdfs-sockets/dn
17/06/06 18:25:02 [main]: DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
17/06/06 18:25:02 [main]: DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@ace16ad
17/06/06 18:25:02 [main]: DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@21c783c5
17/06/06 18:25:03 [main]: DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/06/06 18:25:03 [main]: DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
17/06/06 18:25:03 [Thread-4]: DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@5f0f0625: starting with interruptCheckPeriodMs = 60000
17/06/06 18:25:03 [main]: DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
17/06/06 18:25:03 [main]: DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
17/06/06 18:25:03 [main]: INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
hive> show databases;
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: DEBUG parse.VariableSubstitution: Substitution is on: show databases
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO parse.ParseDriver: Parsing command: show databases
17/06/06 18:25:35 [main]: INFO parse.ParseDriver: Parse Completed
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=parse start=1496744735552 end=1496744735661 duration=109 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: DEBUG exec.Utilities: Create dirs file:/tmp/root/hive_2017-06-06_18-25-35_552_4423721607896700666-1 with permission rwxrwxrwx recursive false
17/06/06 18:25:35 [main]: DEBUG nativeio.NativeIO: Initialized cache for IDs to User/Group mapping with a  cache timeout of 14400 seconds.
17/06/06 18:25:35 [main]: INFO ql.Driver: Semantic Analysis Completed
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1496744735661 end=1496744735710 duration=49 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[database_name] columnTypes=[string] separator=[[B@39a4036f] nullstring=  lastColumnTakesRest=false
17/06/06 18:25:35 [main]: INFO exec.ListSinkOperator: Initializing Self 0 OP
17/06/06 18:25:35 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.DelimitedJSONSerDe initialized with: columnNames=[] columnTypes=[] separator=[[B@7846a55e] nullstring=  lastColumnTakesRest=false
17/06/06 18:25:35 [main]: INFO exec.ListSinkOperator: Operator 0 OP initialized
17/06/06 18:25:35 [main]: INFO exec.ListSinkOperator: Initialization Done 0 OP
17/06/06 18:25:35 [main]: DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[database_name] columnTypes=[string] separator=[[B@685f1ba8] nullstring=  lastColumnTakesRest=false
17/06/06 18:25:35 [main]: INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:database_name, type:string, comment:from deserializer)], properties:null)
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=compile start=1496744735520 end=1496744735759 duration=239 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO ql.Driver: Starting command: show databases
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1496744735516 end=1496744735763 duration=247 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO exec.DDLTask: results : 1
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=runTasks start=1496744735763 end=1496744735782 duration=19 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1496744735759 end=1496744735782 duration=23 from=org.apache.hadoop.hive.ql.Driver>
OK
17/06/06 18:25:35 [main]: INFO ql.Driver: OK
17/06/06 18:25:35 [main]: INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1496744735782 end=1496744735782 duration=0 from=org.apache.hadoop.hive.ql.Driver>
17/06/06 18:25:35 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.run start=1496744735516 end=1496744735782 duration=266 from=org.apache.hadoop.hive.ql.Driver>
Failed with exception java.io.IOException:java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
17/06/06 18:25:35 [main]: ERROR CliDriver: Failed with exception java.io.IOException:java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
java.io.IOException: java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:636)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:534)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:137)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1532)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:289)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:221)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:431)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:800)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:694)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:633)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:234)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:416)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:561)
        ... 14 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:230)
        ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
        ... 19 more
Caused by: java.lang.IllegalArgumentException: Compression codec com.googlecode.lzc.ZCodec not found.

Searching on Google suggested that the class file hadoop-lzo.*.jar is missing, but the other, healthy nodes have no such file either. Has anyone else run into this kind of problem? Please share your advice and let this newbie learn!
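
One way to narrow this down is to compare the rebuilt node's jar directory against a healthy node's. A minimal sketch, assuming node05 stands in for any healthy data node and using the parcel path that appears in the log above:

# Diff the CDH parcel jar directory on a healthy node against this node
# to spot jars that were never redistributed after the reinstall.
diff <(ssh node05 ls /opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars) \
     <(ls /opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars)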

6 replies so far

easthome001 posted on 2017-6-6 19:16:29
Didn't you say the TaskTracker had been blacklisted?
Remove it from the blacklist first, then try again.
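
One way to check the blacklist under MR1 (a sketch; I believe the subcommand below is available in Hadoop 1.x / MR1, and restarting the TaskTracker role from Cloudera Manager usually clears its entry):

# List TaskTrackers currently blacklisted by the JobTracker (MR1).
hadoop job -list-blacklisted-trackers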

a506488046 posted on 2017-6-6 19:24:06
easthome001 posted on 2017-6-6 19:16:
Didn't you say the TaskTracker had been blacklisted?
Remove it from the blacklist first, then try again.

Sorry, that is exactly what I get after removing it from the blacklist.

desehawk posted on 2017-6-6 19:51:19
a506488046 posted on 2017-6-6 19:24:
Sorry, that is exactly what I get after removing it from the blacklist.

Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
This warning points either to a problem with the URIs, or to a network issue such as a blocked port or general connectivity preventing the connection.
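
A quick way to rule that out (a sketch; the hive-site.xml path and the YUCLIENT:9083 address are taken from the debug log above):

# Confirm the metastore URI configured on the node.
grep -A1 hive.metastore.uris /etc/hive/conf.cloudera.hive/hive-site.xml
# Confirm the metastore port is reachable from the node.
nc -zv YUCLIENT 9083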

a506488046 posted on 2017-6-7 10:15:05
desehawk posted on 2017-6-6 19:51:
Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metas ...

That is not the problem here. Every one of my data nodes prints this WARN when running Hive, yet only this one node throws the error: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!

einhep posted on 2017-6-8 09:28:48
If that is the error, it is still worth trying to resolve the hadoop-lzo.*.jar issue, for example starting from the configuration:
<property>
<name>io.compression.codecs</name>
<value>com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

<property>
<name>io.compression.codec.lzo.class</name>
<value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

Or try other approaches along these lines, such as the check sketched below.
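
A quick check of that configuration (a sketch; the commands are standard, but the grep pattern is only an example):

# Print the codec list Hive actually sees on this node.
hive -e "set io.compression.codecs;"
# Check whether an LZO jar is visible on the node's Hadoop classpath.
hadoop classpath | tr ':' '\n' | grep -i lzo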

a506488046 posted on 2017-7-26 21:36:47
Closing the thread:

Root cause: I was not the original maintainer of this platform. The previous ops engineer and the vendor had apparently applied a patch upgrade, presumably to make Hive support certain compressed file formats, and distributed some jar packages from cloudera-scm-server to each data node.

When the rebuilt data node had CM reinstalled, those patch jars were not pushed back onto it, so Hive failed when loading certain classes.

Copying the relevant files from a healthy node into $HADOOP_HOME/lib/hadoop-0.20-mapreduce/lib resolved the problem.
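
For reference, a minimal sketch of that fix (node05 and the hadoop-lzo*.jar glob are placeholders; copy whichever jars exist on the healthy node but are missing on the rebuilt one, and note that $HADOOP_HOME expands locally, so this assumes the same layout on both hosts):

# Copy the patch jars from a healthy node into the MR1 lib directory,
# then restart the affected roles from Cloudera Manager.
scp node05:$HADOOP_HOME/lib/hadoop-0.20-mapreduce/lib/hadoop-lzo*.jar \
    $HADOOP_HOME/lib/hadoop-0.20-mapreduce/lib/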
