
Setting up an HBase development environment and running a small HBase example (HBase 0.98.3 new API)

pig2, posted 2014-7-10 18:55:54 in 实操演练
This post was last edited by nettman on 2014-7-11 11:53.

Questions this post answers:
1. How do you set up an HBase development environment?
2. What has changed in how HTableDescriptor is initialized?
3. How do you connect to an HBase cluster from Eclipse?





Setting up an HBase development environment is much like setting up a Hadoop development environment. This walkthrough uses Windows 7 as an example.
First, take a look at how the Hadoop development environment is set up; see "hadoop开发方式总结及操作指导", which covers two approaches: one uses the Eclipse plugin and the other does not.
So what does an HBase development environment look like, and how do you set it up?
Here we take the approach of adding the HBase JARs to the project.
First, download the installation package.

1. Download the installation package


You can download it from the official mirror:
http://mirror.bit.edu.cn/apache/hbase/hbase-0.98.3/

[Screenshot hbase.png: the HBase 0.98.3 download page]


Baidu Netdisk download:
hbase-0.98.3-hadoop2-bin.tar.gz
Link: http://pan.baidu.com/s/1mguTsRu  Password: xlhc


2. Add the JARs

(1) Extract the archive.
After extraction you get the following contents:
[Screenshot hbasehadoop2.png: contents of the extracted hbase-0.98.3-hadoop2 directory]


(2) Add the JARs.
The steps for adding the JARs:

[Screenshot tianjiabao.png: the five steps for adding the JARs to the Eclipse build path]

As shown in the five steps above:
Right-click the hbase project and choose Properties; the "Properties for hbase" dialog opens (step 2 in the figure).

Then, at step 5, click "Add External JARs...".
Browse to hbase_home/lib; in my case that is D:\hadoop2\hbase-0.98.3-hadoop2\lib.

[Screenshot xunzhaobao.png: selecting the JARs under hbase-0.98.3-hadoop2\lib]


Once the JARs are added, the development environment is ready.





Now that the environment is set up, let's walk through a simple example:
creating a table named blog.
1. First, list the existing tables with the list command:

[Screenshot list.png: output of the list command in the hbase shell]
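If you prefer to check from Java rather than the shell, the same information is available through the client API. The sketch below is not part of the original post: the class name ListTables is made up, and the ZooKeeper settings assume the same "master" host used in the example further down. HBaseAdmin.listTables() returns roughly what the shell's list command prints.

package www.aboutyun.com.hbase;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class ListTables {
        public static void main(String[] args) throws IOException {
                Configuration conf = HBaseConfiguration.create();
                // assumed ZooKeeper quorum; point this at your own cluster
                conf.set("hbase.zookeeper.quorum", "master");
                conf.set("hbase.zookeeper.property.clientPort", "2181");

                HBaseAdmin admin = new HBaseAdmin(conf);
                try {
                        // listTables() returns a descriptor for every user table,
                        // much like the list command in the hbase shell
                        for (HTableDescriptor d : admin.listTables()) {
                                System.out.println(d.getNameAsString());
                        }
                } finally {
                        admin.close();
                }
        }
}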



2. Run the following program.

Run it as follows:
[Screenshot yunxing.png: running the program from Eclipse]


package www.aboutyun.com.hbase;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class OperateTable {
        public static void main(String[] args) throws IOException {

                Configuration conf = HBaseConfiguration.create();
                // required when running from Eclipse, otherwise the client cannot locate the cluster
                conf.set("hbase.zookeeper.quorum", "master");
                conf.set("hbase.zookeeper.property.clientPort", "2181");
                HBaseAdmin admin = new HBaseAdmin(conf); // create an admin client (new API)
                HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("blog"));
                // HTableDescriptor desc = new HTableDescriptor("blog");  // old API
                desc.addFamily(new HColumnDescriptor("article"));
                desc.addFamily(new HColumnDescriptor("author"));
                admin.createTable(desc);
                admin.close();
                // admin.disableTable("blog");
                // admin.deleteTable("blog");
                // assertThat(admin.tableExists("blog"), is(false));
        }
}





Notes:
   conf.set("hbase.zookeeper.quorum", "master"); // required when running from Eclipse, otherwise the client cannot locate the cluster
Because this runs on Windows 7, the hostname master must be mapped to the cluster's IP in the hosts file, as shown below.
[Screenshot master.png: hosts configuration for the master host]



The hosts file is located at C:\Windows\System32\drivers\etc.
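For example, if the node named master had the (hypothetical) address 192.168.1.100, the hosts file would need an entry like the one below; substitute your cluster's real IP and hostname.

192.168.1.100          master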







Result: you can see the blog table has been created successfully.

[Screenshot chuangjianwanbi.png: the blog table now appears in the table list]
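You can also confirm the result from Java. Below is a minimal sketch (the class name CheckBlogTable is made up and the connection settings are the same assumed ones used in OperateTable); tableExists() is the programmatic counterpart of checking list in the shell, and the commented-out lines mirror the cleanup calls from OperateTable in case you want to drop the table and re-run the example.

package www.aboutyun.com.hbase;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CheckBlogTable {
        public static void main(String[] args) throws IOException {
                Configuration conf = HBaseConfiguration.create();
                conf.set("hbase.zookeeper.quorum", "master");            // assumed, same as OperateTable
                conf.set("hbase.zookeeper.property.clientPort", "2181");

                HBaseAdmin admin = new HBaseAdmin(conf);
                // true once OperateTable has run successfully
                System.out.println("blog exists: " + admin.tableExists("blog"));

                // to re-run OperateTable, drop the table first; it must be disabled before deletion
                // admin.disableTable("blog");
                // admin.deleteTable("blog");
                admin.close();
        }
}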

One more thing worth pointing out:
the way HTableDescriptor is initialized has changed:
[Screenshot duibio.png: comparison of the new and old constructors]

New API
  HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("blog"));
Old API
  HTableDescriptor desc = new HTableDescriptor("blog");
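The reason for the change is that table names are now wrapped in a TableName object, which validates the name and can carry an explicit namespace, while the plain String constructor was deprecated. The short sketch below is only illustrative and is not from the original post; it needs no running cluster.

package www.aboutyun.com.hbase;

import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;

public class TableNameDemo {
        public static void main(String[] args) {
                // both of these refer to the same table; the namespace defaults to "default"
                TableName plain = TableName.valueOf("blog");
                TableName qualified = TableName.valueOf("default", "blog");

                HTableDescriptor desc = new HTableDescriptor(qualified);
                System.out.println(desc.getTableName().getQualifierAsString());   // blog
                System.out.println(plain.getNamespaceAsString());                 // default
        }
}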






More programming resources:

Programming HBase with Java
http://www.aboutyun.com/thread-7075-1-1.html


Using Spark with Java to read HBase data for distributed computation
http://www.aboutyun.com/thread-8242-1-1.html

HBase programming: operating HBase through the Java API
http://www.aboutyun.com/thread-7151-1-1.html

Source-code analysis of HTable Put, Delete, Get, and related operations
http://www.aboutyun.com/thread-7644-1-1.html

Implementing create, read, update, and delete in HBase with Java
http://www.aboutyun.com/thread-6901-1-1.html

A roundup of problems and solutions when connecting Eclipse to HBase remotely
http://www.aboutyun.com/thread-5866-1-1.html

How to develop a LoadBalance plugin for HBase
http://www.aboutyun.com/thread-8350-1-1.html

A first example of integrating HBase with Eclipse
http://www.aboutyun.com/thread-7837-1-1.html

HBase pagination: use cases, approach, and code implementation
http://www.aboutyun.com/thread-7030-1-1.html


HBase MapReduce sorting: Secondary Sort
http://www.aboutyun.com/thread-7304-1-1.html


Building an HBase development environment from CDH4 source
http://www.aboutyun.com/thread-7259-1-1.html

Understanding Thrift (4): operating HBase from C# through Thrift
http://www.aboutyun.com/thread-7142-1-1.html




HBase API

Hadoop 2.2.0 reference manual and HBase 0.98.1-hadoop2 API downloads
http://www.aboutyun.com/thread-6113-1-1.html

HBase data migration (1): using the Put method of the HBase API
http://www.aboutyun.com/thread-8336-1-1.html


HBase programming: worked examples of CRUD against HBase through the Java API
http://www.aboutyun.com/thread-8290-1-1.html








Comments (33)

dipwater posted on 2014-7-29 22:02:05
It worked, awesome!

ascentzhen posted on 2014-8-5 18:18:37
It worked, great.

ascentzhen posted on 2014-8-5 18:34:22
Why does each storage cell in HBase allow multiple versions of a value? Could someone explain?

pig2 posted on 2014-8-5 19:06:26
Quoting ascentzhen (2014-8-5 18:34): "Why does each storage cell in HBase allow multiple versions of a value?"
Keeping multiple versions means each write to a cell is stored as a new timestamped version (up to the column family's VERSIONS setting), so you retain a history of the data and can recover earlier values instead of losing them on overwrite. This is separate from HDFS replication, which keeps multiple copies of the same blocks to protect against data loss.
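To make the versions behaviour concrete, here is a minimal sketch (the table name versiondemo is made up, and the connection settings assume the same master host as the main post): each write to the same row and column becomes a new timestamped version, and the column family's VERSIONS setting controls how many versions are kept.

package www.aboutyun.com.hbase;

import java.io.IOException;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class CellVersions {
        public static void main(String[] args) throws IOException {
                Configuration conf = HBaseConfiguration.create();
                conf.set("hbase.zookeeper.quorum", "master");            // assumed quorum
                conf.set("hbase.zookeeper.property.clientPort", "2181");

                // create a table whose "cf" family keeps up to 3 versions per cell
                HBaseAdmin admin = new HBaseAdmin(conf);
                if (!admin.tableExists("versiondemo")) {
                        HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("versiondemo"));
                        HColumnDescriptor cf = new HColumnDescriptor("cf");
                        cf.setMaxVersions(3);                            // shell equivalent: VERSIONS => 3
                        desc.addFamily(cf);
                        admin.createTable(desc);
                }
                admin.close();

                // write the same cell twice; each write becomes a new timestamped version
                HTable table = new HTable(conf, "versiondemo");
                table.put(new Put(Bytes.toBytes("row1"))
                                .add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes("old")));
                table.put(new Put(Bytes.toBytes("row1"))
                                .add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes("new")));

                // read back all stored versions, newest first
                Get get = new Get(Bytes.toBytes("row1"));
                get.setMaxVersions();
                Result result = table.get(get);
                List<Cell> cells = result.getColumnCells(Bytes.toBytes("cf"), Bytes.toBytes("col"));
                for (Cell cell : cells) {
                        System.out.println(cell.getTimestamp() + " -> "
                                        + Bytes.toString(CellUtil.cloneValue(cell)));
                }
                table.close();
        }
}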

ascentzhen posted on 2014-8-6 09:56:42
Oh, I see. Thanks.

XiaoLuo posted on 2014-10-22 13:36:10
Thanks a lot. Could you give me some pointers on how to do fuzzy (pattern) queries in HBase?

quenlang posted on 2014-11-1 16:44:41
Great stuff, +1. I'll set it up myself. Thanks for sharing.

quenlang posted on 2014-11-2 14:22:37
The configuration failed for me. Contents of the Windows hosts file:
192.168.0.101                  hadoop1
192.168.0.102                  hadoop2
192.168.0.103                  hadoop3
192.168.0.104                  hadoop4
192.168.0.105                  hadoop5

Console output after running:
2014-11-02 14:00:41,240 INFO  [main] zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2014-11-02 14:00:41,241 INFO  [main] zookeeper.ZooKeeper: Client environment:host.name=MBETUKPOUEDZLGC
2014-11-02 14:00:41,241 INFO  [main] zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
2014-11-02 14:00:41,241 INFO  [main] zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
2014-11-02 14:00:41,241 INFO  [main] zookeeper.ZooKeeper: Client environment:java.home=C:\Java\jdk1.7.0_67\jre
2014-11-02 14:00:41,241 INFO  [main] zookeeper.ZooKeeper: Client environment:java.class.path=F:\项目目录\JavaProject\.metadata\.plugins\org.apache.hadoop.eclipse\hadoop-conf-7770220133029578856;F:\项目目录\JavaProject\HbasePut\bin;C:\hbase-0.98.4-hadoop2\lib\activation-1.1.jar;C:\hbase-0.98.4-hadoop2\lib\aopalliance-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\asm-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\avro-1.7.4.jar;C:\hbase-0.98.4-hadoop2\lib\commons-beanutils-1.7.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-beanutils-core-1.8.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-cli-1.2.jar;C:\hbase-0.98.4-hadoop2\lib\commons-codec-1.7.jar;C:\hbase-0.98.4-hadoop2\lib\commons-collections-3.2.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-compress-1.4.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-configuration-1.6.jar;C:\hbase-0.98.4-hadoop2\lib\commons-daemon-1.0.13.jar;C:\hbase-0.98.4-hadoop2\lib\commons-digester-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\commons-el-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-httpclient-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-io-2.4.jar;C:\hbase-0.98.4-hadoop2\lib\commons-lang-2.6.jar;C:\hbase-0.98.4-hadoop2\lib\commons-logging-1.1.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-math-2.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-net-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\findbugs-annotations-1.3.9-1.jar;C:\hbase-0.98.4-hadoop2\lib\gmbal-api-only-3.0.0-b023.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-framework-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-server-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-servlet-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-rcm-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\guava-12.0.1.jar;C:\hbase-0.98.4-hadoop2\lib\guice-3.0.jar;C:\hbase-0.98.4-hadoop2\lib\guice-servlet-3.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-annotations-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-auth-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-client-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-hdfs-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-app-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-core-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-api-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-client-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-server-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-server-nodemanager-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hamcrest-core-1.3.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-client-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-common-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-common-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-examples-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-hadoop2-compat-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-hadoop-compat-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-it-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-it-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-prefix-tree-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-protocol-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-server-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-server-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-shell-0.98.4-hadoop2.jar;C:\hbase-0.98
.4-hadoop2\lib\hbase-testing-util-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-thrift-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\high-scale-lib-1.1.1.jar;C:\hbase-0.98.4-hadoop2\lib\htrace-core-2.04.jar;C:\hbase-0.98.4-hadoop2\lib\httpclient-4.1.3.jar;C:\hbase-0.98.4-hadoop2\lib\httpcore-4.1.3.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-core-asl-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-jaxrs-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-mapper-asl-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-xc-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jamon-runtime-2.3.1.jar;C:\hbase-0.98.4-hadoop2\lib\jasper-compiler-5.5.23.jar;C:\hbase-0.98.4-hadoop2\lib\jasper-runtime-5.5.23.jar;C:\hbase-0.98.4-hadoop2\lib\javax.inject-1.jar;C:\hbase-0.98.4-hadoop2\lib\javax.servlet-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\javax.servlet-api-3.0.1.jar;C:\hbase-0.98.4-hadoop2\lib\jaxb-api-2.2.2.jar;C:\hbase-0.98.4-hadoop2\lib\jaxb-impl-2.2.3-1.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-client-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-core-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-grizzly2-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-guice-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-json-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-server-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-test-framework-core-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-test-framework-grizzly2-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jets3t-0.6.1.jar;C:\hbase-0.98.4-hadoop2\lib\jettison-1.3.1.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-sslengine-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-util-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jruby-complete-1.6.8.jar;C:\hbase-0.98.4-hadoop2\lib\jsch-0.1.42.jar;C:\hbase-0.98.4-hadoop2\lib\jsp-2.1-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\jsp-api-2.1-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\jsr305-1.3.9.jar;C:\hbase-0.98.4-hadoop2\lib\junit-4.11.jar;C:\hbase-0.98.4-hadoop2\lib\libthrift-0.9.0.jar;C:\hbase-0.98.4-hadoop2\lib\log4j-1.2.17.jar;C:\hbase-0.98.4-hadoop2\lib\management-api-3.0.0-b012.jar;C:\hbase-0.98.4-hadoop2\lib\metrics-core-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\netty-3.6.6.Final.jar;C:\hbase-0.98.4-hadoop2\lib\paranamer-2.3.jar;C:\hbase-0.98.4-hadoop2\lib\protobuf-java-2.5.0.jar;C:\hbase-0.98.4-hadoop2\lib\servlet-api-2.5-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\slf4j-api-1.6.4.jar;C:\hbase-0.98.4-hadoop2\lib\slf4j-log4j12-1.6.4.jar;C:\hbase-0.98.4-hadoop2\lib\snappy-java-1.0.4.1.jar;C:\hbase-0.98.4-hadoop2\lib\xmlenc-0.52.jar;C:\hbase-0.98.4-hadoop2\lib\xz-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\zookeeper-3.4.6.jar
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:java.library.path=C:\Java\jdk1.7.0_67\jre\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:/Users/Administrator/AppData/Local/Genuitec/Common/binary/com.sun.java.jdk.win32.x86_1.6.0.013/jre/bin/client;C:/Users/Administrator/AppData/Local/Genuitec/Common/binary/com.sun.java.jdk.win32.x86_1.6.0.013/jre/bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;d:\oracle;C:\Program Files\ibm\gsk8\lib64;C:\Program Files (x86)\ibm\gsk8\lib;D:\IBM\SQLLIB\BIN;D:\IBM\SQLLIB\FUNCTION;D:\IBM\SQLLIB\SAMPLES\REPL;C:\Program Files\MIT\Kerberos\bin;C:\strawberry\c\bin;C:\strawberry\perl\bin;C:\Java\jdk1.7.0_67\bin;C:\hadoop-2.4.1\bin;C:\apache-maven-3.2.3\bin;D:\UltraEdit\;.
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:os.name=Windows 7
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:os.arch=amd64
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:os.version=6.1
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:user.name=Administrator
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:user.home=C:\Users\Administrator
2014-11-02 14:00:41,242 INFO  [main] zookeeper.ZooKeeper: Client environment:user.dir=F:\项目目录\JavaProject\HbasePut
2014-11-02 14:00:41,243 INFO  [main] zookeeper.ZooKeeper: Initiating client connection, connectString=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181 sessionTimeout=90000 watcher=hconnection-0x73cfc664, quorum=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181, baseZNode=/hbase
2014-11-02 14:00:41,282 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x73cfc664 connecting to ZooKeeper ensemble=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181
2014-11-02 14:00:41,283 INFO  [main-SendThread(hadoop5:2181)] zookeeper.ClientCnxn: Opening socket connection to server hadoop5/192.168.0.105:2181. Will not attempt to authenticate using SASL (unknown error)
2014-11-02 14:00:41,284 INFO  [main-SendThread(hadoop5:2181)] zookeeper.ClientCnxn: Socket connection established to hadoop5/192.168.0.105:2181, initiating session
2014-11-02 14:00:41,490 INFO  [main-SendThread(hadoop5:2181)] zookeeper.ClientCnxn: Session establishment complete on server hadoop5/192.168.0.105:2181, sessionid = 0x54957332c09000a, negotiated timeout = 40000

But when I query in the hbase shell, the table is not there.

The program code is as follows:
package com.apache.hbase.kora;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PutExample {

        /**
         * @param args
         * @throws IOException
         */
        public static void main(String[] args) throws IOException {
                // create the required configuration
                Configuration conf = HBaseConfiguration.create();

                conf.set("hbase.master", "hadoop1");
                conf.set("hbase.zookeeper.quorum", "hadoop1, hadoop2, hadoop3, hadoop4, hadoop5");
                conf.set("hbase.zookeeper.property.clientPort", "2181");
                // instantiate a client for the table
                HTable table = new HTable(conf, "testtable");

                // create a Put for a specific row
                Put put = new Put(Bytes.toBytes("row1"));

                // add two columns to the put
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("col1"), Bytes.toBytes("val1"));
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("col2"), Bytes.toBytes("val2"));

                table.put(put);

        }

}

So frustrating: my configuration is the same as the OP's and I have no idea why it doesn't work. I'd really appreciate your guidance.

quenlang posted on 2014-11-2 18:32:22
Quoting quenlang (2014-11-2 14:22): "The configuration failed for me. Contents of the Windows hosts file: 192.168.0.101 hadoop1, 192.168.0.102 hadoop2, ..."

After waiting a long time, the console repeatedly printed the following:
Sun Nov 02 16:19:24 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@37734b10, org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]

    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:129)
    at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:714)
    at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1159)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1223)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1111)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1068)
    at org.apache.hadoop.hbase.client.AsyncProcess.findDestLocation(AsyncProcess.java:365)
    at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:310)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:964)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1252)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:910)
    at com.apache.hbase.kora.PutExample.main(PutExample.java:34)
Caused by: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:532)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupConnection(RpcClient.java:578)
    at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:868)
    at org.apache.hadoop.hbase.ipc.RpcClient.getConnection(RpcClient.java:1538)
    at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1437)
    at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1656)
    at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1714)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:29876)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1502)
    at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:710)
    at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:708)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
    ... 12 more
2014-11-02 16:19:24,683 DEBUG [main] client.HConnectionManager$HConnectionImplementation: locateRegionInMeta parentTable=hbase:meta, metaLocation={region=hbase:meta,,1.1588230740, hostname=hadoop4.updb.com,60020,1414507297422, seqNum=0}, attempt=6 of 35 failed; retrying after sleep of 4009 because: This server is in the failed servers list: hadoop4.updb.com/72.52.4.120:60020
2014-11-02 16:39:19,606 WARN  [main] client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch hbase:meta table:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions:
Sun Nov 02 16:19:48 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]
Sun Nov 02 16:19:49 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.hbase.ipc.RpcClient$FailedServerException: This server is in the failed servers list: hadoop4.updb.com/72.52.4.120:60020
Sun Nov 02 16:19:49 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.hbase.ipc.RpcClient$FailedServerException: This server is in the failed servers list: hadoop4.updb.com/72.52.4.120:60020
Sun Nov 02 16:19:50 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.hbase.ipc.RpcClient$FailedServerException: This server is in the failed servers list: hadoop4.updb.com/72.52.4.120:60020
Sun Nov 02 16:20:12 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]
    ... (the same 20000 millis ConnectTimeoutException retry line repeats for each attempt, with timestamps from 16:20:36 through 16:38:39) ...
Sun Nov 02 16:39:19 CST 2014, org.apache.hadoop.hbase.client.RpcRetryingCaller@42e3f9e3, org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]

    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:129)
    at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:714)
    at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1159)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1223)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1111)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1068)
    at org.apache.hadoop.hbase.client.AsyncProcess.findDestLocation(AsyncProcess.java:365)
    at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:310)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:964)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1252)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:910)
    at com.apache.hbase.kora.PutExample.main(PutExample.java:34)
Caused by: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=hadoop4.updb.com/72.52.4.120:60020]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:532)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupConnection(RpcClient.java:578)
    at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:868)
    at org.apache.hadoop.hbase.ipc.RpcClient.getConnection(RpcClient.java:1538)
    at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1437)
    at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1656)
    at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1714)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:29876)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1502)
    at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:710)
    at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:708)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
    ... 12 more
2014-11-02 16:39:19,608 DEBUG [main] client.HConnectionManager$HConnectionImplementation: locateRegionInMeta parentTable=hbase:meta, metaLocation={region=hbase:meta,,1.1588230740, hostname=hadoop4.updb.com,60020,1414507297422, seqNum=0}, attempt=7 of 35 failed; retrying after sleep of 10042 because: This server is in the failed servers list: hadoop4.updb.com/72.52.4.120:60020


hadoop4.updb.com/72.52.4.120:60020: the odd thing is that my hadoop4's IP is 192.168.0.104, not 72.52.4.120. I looked up that IP and it is located in the US. Very strange.
