
Hive data insert error

sunt99 posted on 2016-9-13 15:50:56
When executing the Hive insert statement, the map tasks complete without error, but the reduce tasks fail with the following error:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":41774},"value":{"_col0":"sr","_col1":"sr601   ","_col2":"02582579 ","_col3":"13:47:04","_col4":"卖","_col5":"投机","_col6":5342,"_col7":1,"_col8":53420,"_col9":"开","_col10":4,"_col11":0,"_col12":0,"_col13":0,"_col14":"2015-07-16"}}
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":41774},"value":{"_col0":"sr","_col1":"sr601   ","_col2":"02582579 ","_col3":"13:47:04","_col4":"卖","_col5":"投机","_col6":5342,"_col7":1,"_col8":53420,"_col9":"开","_col10":4,"_col11":0,"_col12":0,"_col13":0,"_col14":"2015-07-16"}}
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:253)
        ... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: Failing write. Tried pipeline recovery 5 times without success.
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:731)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at



3 replies

easthome001 posted on 2016-9-13 16:34:51


An I/O exception occurred while the data was being written; my guess is that it is a hardware problem.


Details below:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The error-triggered pipeline rebuild happens in the DataNode I/O handling code:
/**
 * If this stream has encountered any errors, shutdown threads
 * and mark the stream as closed.
 *
 * @return true if it should sleep for a while after returning.
 */
private boolean processDatanodeOrExternalError() throws IOException {
  if (!errorState.hasDatanodeError() && !shouldHandleExternalError()) {
    return false;
  }

  ...

  if (response != null) {
    LOG.info("Error Recovery for " + block +
        " waiting for responder to exit. ");
    return true;
  }
  closeStream();

  // move packets from ack queue to front of the data queue
  synchronized (dataQueue) {
    dataQueue.addAll(0, ackQueue);
    ackQueue.clear();
  }

  // If we had to recover the pipeline five times in a row for the
  // same packet, this client likely has corrupt data or corrupting
  // during transmission.
  if (!errorState.isRestartingNode() && ++pipelineRecoveryCount > 5) {
    LOG.warn("Error recovering pipeline for writing " +
        block + ". Already retried 5 times for the same packet.");
    lastException.set(new IOException("Failing write. Tried pipeline " +
        "recovery 5 times without success."));
    streamerClosed = true;
    return false;
  }

  setupPipelineForAppendOrRecovery();


sunt99 posted on 2016-9-13 17:41:46
easthome001 posted on 2016-9-13 16:34:
"An I/O exception occurred while the data was being written; my guess is that it is a hardware problem."

I have run it several times and it fails with this same error every time, but the DataNodes involved are not always the same. How can I pin down where the actual problem is? Thanks for the help.
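
One common way to narrow this down (not from the thread, just a general approach) is to grep each DataNode's log around the failure time for write-pipeline I/O errors. The log path and the grep pattern below are assumptions; the pattern is demonstrated against an inline sample line so the snippet runs anywhere:

```shell
# Hypothetical sketch: pattern for scanning DataNode logs, e.g.
# $HADOOP_LOG_DIR/hadoop-*-datanode-*.log on each node (path is an
# assumption). -c counts matching lines.
sample='2015-07-16 13:47:10,123 ERROR datanode.DataNode: IOException in offerService'
echo "$sample" | grep -cE 'IOException|Premature EOF|Broken pipe'
```

On a real cluster, run the same grep over the DataNode log files on every node for the time window of the failing reduce tasks; a node that repeatedly logs disk or network I/O errors is the likely culprit even if the client-side stack trace names different nodes each run.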


arsenduan posted on 2016-9-13 18:58:38
sunt99 posted on 2016-9-13 17:41:
"I have run it several times and it fails with the same error every time, but the DataNodes are not always the same. How can I pin down where the actual problem is ..."

Do any of the inserts succeed?
What is your cluster configuration?
