
Error while importing data into HBase via Sqoop2

blackmoon posted on 2015-5-13 09:38:29
Hi all. Over the past couple of days I deployed a 3-node CDH cluster for testing and tried to use Sqoop2 to import data from MySQL into HBase. I hit an error along the way and my attempts to resolve it have failed; could anyone take a look and suggest an approach?

Environment:
Centos 6.5 64bit
Cloudera Manager 5.4
CDH 5.3.3

Relevant component versions in CDH:
Sqoop2 1.99.4
HBase 0.98.6
Hadoop 2.5.0

The HBase JDBC driver in use is phoenix-4.3.1.
The create link & create job steps both succeed now, but start job fails with:
java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString

Following the instructions in this article (http://my.oschina.net/cloudcoder/blog/291486), I used ln to symlink hbase-protocol-0.98.6-cdh5.3.3.jar and hbase-protocol.jar into the two directories below (a reconstruction of the commands follows the list), then restarted the entire CDH cluster, but the error persists:
/opt/cloudera/parcels/CDH/lib/hadoop/
/opt/cloudera/parcels/CDH/lib/hadoop/lib
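
(For reference, the symlinks were created roughly along these lines; this is a reconstruction, not the exact commands, and the jar's source path under the parcel is an assumption based on the lsof output later in the thread:)

# Hypothetical reconstruction of the symlink step described above; the source
# location of the jar assumes the CDH 5.3.3 parcel layout.
cd /opt/cloudera/parcels/CDH/lib/hadoop
sudo ln -s ../../jars/hbase-protocol-0.98.6-cdh5.3.3.jar hbase-protocol-0.98.6-cdh5.3.3.jar
sudo ln -s hbase-protocol-0.98.6-cdh5.3.3.jar hbase-protocol.jar
sudo ln -s ../hbase-protocol-0.98.6-cdh5.3.3.jar lib/hbase-protocol-0.98.6-cdh5.3.3.jar
sudo ln -s ../hbase-protocol.jar lib/hbase-protocol.jar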

Below, for reference, are the link & job creation steps and the complete error:

Creating the link (the HBase link is shown below; the MySQL link creation is omitted):
sqoop:000> create link -c 2
Creating link for connector with id 2
Please fill following values to create new link object
Name: hbasetest

Link configuration

JDBC Driver Class: org.apache.phoenix.jdbc.PhoenixDriver
JDBC Connection String: jdbc:phoenix:master:2181
Username:
Password:
JDBC Connection Properties:
There are currently 0 values in the map:
entry#
New link was successfully created with validation status OK and persistent id 14

Creating the job:
sqoop:000> create job -f 6 -t 14
Creating job for links with from id 6 and to id 14
Please fill following values to create new job object
Name: MysqlToHbase

From database configuration

Schema name: clouderam
Table name: users
Table SQL statement:
Table column names:
Partition column name:
Null value allowed for the partition column:
Boundary query:

To database configuration

Schema name: main
Table name: users
Table SQL statement:
Table column names:
Stage table name:
Should clear stage table:

Throttling resources

Extractors:
Loaders:

New job was successfully created with validation status OK  and persistent id 4

Complete error; the log file is /var/log/sqoop2/sqoop-cmf-sqoop-SQOOP_SERVER-slave2.log.out:
2015-05-12 17:23:26,138 INFO org.apache.sqoop.repository.JdbcRepositoryTransaction: Attempting transaction commit
2015-05-12 17:23:26,417 INFO org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer: Boundaries: min=1, max=4, columnType=-5
2015-05-12 17:23:26,419 INFO org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer: Using dataSql: SELECT * FROM clouderam.users WHERE ${CONDITIONS}
2015-05-12 17:23:26,419 INFO org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer: Field names: USER_ID,USER_NAME,PASSWORD_HASH,PASSWORD_SALT,PASSWORD_LOGIN,OPTIMISTIC_LOCK_VERSION
2015-05-12 17:23:29,446 ERROR org.apache.sqoop.connector.jdbc.GenericJdbcExecutor: Caught SQLException:
java.sql.SQLException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1024)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1257)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:350)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:311)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:307)
        at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:333)
        at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:237)
        at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:160)
        at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:340)
        at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:330)
        at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:240)
        at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:235)
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:234)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:1073)
        at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor.getQueryColumns(GenericJdbcExecutor.java:227)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.configureTableProperties(GenericJdbcToInitializer.java:169)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.initialize(GenericJdbcToInitializer.java:49)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.initialize(GenericJdbcToInitializer.java:39)
        at org.apache.sqoop.driver.JobManager.initializeConnector(JobManager.java:451)
        at org.apache.sqoop.driver.JobManager.createJobRequest(JobManager.java:372)
        at org.apache.sqoop.driver.JobManager.start(JobManager.java:277)
        at org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:367)
        at org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:112)
        at org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
        at org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:555)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2854)
        at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1159)
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1647)
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1265)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1258)
        at org.apache.hadoop.hbase.client.HTable$17.call(HTable.java:1571)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        ... 1 more
2015-05-12 17:23:29,463 ERROR org.apache.sqoop.server.SqoopProtocolServlet: Exception in PUT http://slave2:15002/sqoop/v1/job/4/start
org.apache.sqoop.common.SqoopException: GENERIC_JDBC_CONNECTOR_0003:Unable to access meta data
        at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor.getQueryColumns(GenericJdbcExecutor.java:240)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.configureTableProperties(GenericJdbcToInitializer.java:169)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.initialize(GenericJdbcToInitializer.java:49)
        at org.apache.sqoop.connector.jdbc.GenericJdbcToInitializer.initialize(GenericJdbcToInitializer.java:39)
        at org.apache.sqoop.driver.JobManager.initializeConnector(JobManager.java:451)
        at org.apache.sqoop.driver.JobManager.createJobRequest(JobManager.java:372)
        at org.apache.sqoop.driver.JobManager.start(JobManager.java:277)
        at org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:367)
        at org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:112)
        at org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
        at org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:555)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1024)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1257)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:350)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:311)
        at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:307)
        at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:333)
        at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:237)
        at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:160)
        at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:340)
        at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:330)
        at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:240)
        at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:235)
        at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:234)
        at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:1073)
        at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor.getQueryColumns(GenericJdbcExecutor.java:227)
        ... 28 more
Caused by: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2854)
        at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1159)
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1647)
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1265)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1258)
        at org.apache.hadoop.hbase.client.HTable$17.call(HTable.java:1571)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        ... 1 more
2015-05-12 17:25:51,148 INFO org.apache.sqoop.repository.JdbcRepositoryTransaction: Attempting transaction commit

2015-05-12 17:30:51,149 INFO org.apache.sqoop.repository.JdbcRepositoryTransaction: Attempting transaction commit

Replies (17):

tntzbzc posted on 2015-5-13 11:13:16

First, confirm whether hbase-protocol.jar is on Hadoop's classpath.
If your job jar bundles its dependencies, use one of the following three forms:

$ HADOOP_CLASSPATH=/path/to/hbase-protocol.jar:/path/to/hbase/conf hadoop jar MyJob.jar MyJobMainClass
$ HADOOP_CLASSPATH=$(hbase mapredcp):/path/to/hbase/conf hadoop jar MyJob.jar MyJobMainClass
$ HADOOP_CLASSPATH=$(hbase classpath) hadoop jar MyJob.jar MyJobMainClass

If dependencies are not bundled, use the following command structure:

$ HADOOP_CLASSPATH=$(hbase mapredcp):/etc/hbase/conf hadoop jar MyApp.jar MyJobMainClass -libjars $(hbase mapredcp | tr ':' ',') ...




bob007 posted on 2015-5-13 11:19:39


The approach above should be right. OP, have you set the environment variable?

blackmoon posted on 2015-5-13 11:35:16
Quoting tntzbzc (2015-5-13 11:13): First, confirm whether hbase-protocol.jar is on Hadoop's classpath; if your job jar bundles its dependencies, use one of the three forms ...

Hi, the whole cluster was deployed automatically with Cloudera Manager. I tried to locate the classpath by looking through the config files but couldn't find it. How do I track down the classpath?

s060403072 posted on 2015-5-13 11:43:00
Quoting blackmoon (2015-5-13 11:35): Hi, the whole cluster was deployed automatically with Cloudera Manager; I tried to locate the classpath by looking through the config files but ...

Try looking it up with this command:
echo $HADOOP_CLASSPATH
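
(Two other checks that generally work even when no environment variable is set, sketched below: the hadoop classpath subcommand prints the classpath the client wrapper scripts resolve, and inspecting a running daemon shows what it was actually launched with:)

# Print the classpath the hadoop wrapper scripts resolve on this node:
hadoop classpath
# Or inspect the -classpath argument a running daemon was started with:
ps -ef | grep -i '[r]esourcemanager'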

blackmoon posted on 2015-5-13 12:21:30 (last edited by blackmoon on 2015-5-13 12:30)
Quoting s060403072 (2015-5-13 11:43): Try looking it up with this command:
echo $HADOOP_CLASSPATH

I already tried echo $HADOOP_CLASSPATH; CDH does not set that environment variable.


I tried ps -ef | grep -i resourcemanager; the output is below (note the -classpath portion, which was highlighted in red in the original post). Is this a reliable way to check? If it is, the classpath should include /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/*

yarn     20699  1398  1 May12 ?        00:13:33 /usr/java/jdk1.7.0_75/bin/java -Dproc_resourcemanager -Xmx1000m -Djava.net.preferIPv4Stack=true -Xms52428800 -Xmx52428800 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:-CMSConcurrentMTEnabled -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -Dhadoop.event.appender=,EventCatcher -XX:OnOutOfMemoryError=/opt/cm-5.4.0/lib64/cmf/service/common/killparent.sh -Dhadoop.log.dir=/var/log/hadoop-yarn -Dyarn.log.dir=/var/log/hadoop-yarn -Dhadoop.log.file=hadoop-cmf-yarn-RESOURCEMANAGER-master.log.out -Dyarn.log.file=hadoop-cmf-yarn-RESOURCEMANAGER-master.log.out -Dyarn.home.dir=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn -Dhadoop.home.dir=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn -Dhadoop.root.logger=INFO,RFA -Dyarn.root.logger=INFO,RFA -Djava.library.path=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/native -classpath /opt/cm-5.4.0/run/cloudera-scm-agent/process/279-yarn-RESOURCEMANAGER:/opt/cm-5.4.0/run/cloudera-scm-agent/process/279-yarn-RESOURCEMANAGER:/opt/cm-5.4.0/run/cloudera-scm-agent/process/279-yarn-RESOURCEMANAGER:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-mapreduce/.//*:/opt/cm-5.4.0/share/cmf/lib/plugins/event-publish-5.4.0-shaded.jar:/opt/cm-5.4.0/share/cmf/lib/plugins/tt-instrumentation-5.4.0.jar:/opt/cm-5.4.0/share/cmf/lib/plugins/cdh5/audit-plugin-cdh5-2.3.0-shaded.jar:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/lib/*:/opt/cm-5.4.0/run/cloudera-scm-agent/process/279-yarn-RESOURCEMANAGER/rm-config/log4j.properties org.apache.hadoop.yarn.server.resourcemanager.ResourceManager

I went back and double-checked each node; the hbase-protocol jars are now present in that directory on every node.
[root@master hadoop]# ll /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib |grep hbase
lrwxrwxrwx 1 root root       37 May 12 15:44 hbase-protocol-0.98.6-cdh5.3.3.jar -> ../hbase-protocol-0.98.6-cdh5.3.3.jar
lrwxrwxrwx 1 root root       21 May 12 16:44 hbase-protocol.jar -> ../hbase-protocol.jar


s060403072 posted on 2015-5-13 12:31:16
Quoting blackmoon (2015-5-13 12:21): I already tried echo $HADOOP_CLASSPATH; CDH does not set that environment variable.




If it isn't set, try configuring it, then run Sqoop along the lines of the command below:
$ HADOOP_CLASSPATH=/path/to/hbase-protocol.jar:/path/to/hbase/conf hadoop jar MyJob.jar MyJobMainClass
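
(A minimal sketch of making that setting persistent, assuming a profile.d approach is acceptable on these nodes and that /opt/cloudera/parcels/CDH points at the active parcel; the exact file Cloudera Manager honors may differ:)

# Hypothetical: persist HADOOP_CLASSPATH for login shells so hadoop client
# invocations pick up hbase-protocol; the jar path assumes the CDH parcel layout.
cat >> /etc/profile.d/hadoop-classpath.sh <<'EOF'
export HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/hbase-protocol.jar:$HADOOP_CLASSPATH
EOF
source /etc/profile.d/hadoop-classpath.sh && echo $HADOOP_CLASSPATH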

s060403072 posted on 2015-5-13 12:33:55
Quoting blackmoon (2015-5-13 12:21): I already tried echo $HADOOP_CLASSPATH; CDH does not set that environment variable.

The core idea is that the failing process has to be able to reference this jar.
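
(One observation, hedged: in the stack trace the IllegalAccessError is thrown from Tomcat's WebappClassLoader, i.e. inside the Sqoop2 server itself, so the jar may need to be visible to the Sqoop2 server and not only to the YARN daemons. A sketch, assuming the CDH convention of placing extra jars for Sqoop2 in /var/lib/sqoop2:)

# Hypothetical workaround: expose hbase-protocol to the Sqoop2 server's classloader.
# /var/lib/sqoop2 is where CDH documents adding extra JDBC driver jars; whether it
# also cures this particular IllegalAccessError is an assumption to verify.
sudo ln -s /opt/cloudera/parcels/CDH/lib/hbase/hbase-protocol.jar /var/lib/sqoop2/
# then restart the Sqoop2 service from Cloudera Manager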

blackmoon posted on 2015-5-13 13:05:01 (last edited by blackmoon on 2015-5-13 13:09)
Quoting s060403072 (2015-5-13 12:33): The core idea is that the failing process has to be able to reference this jar.

Right, that's my thinking too, but I'm not clear on how to configure it manually. In the ps -ef output above you can see the classpath already contains a very long list of entries...

Also, the error is java.lang.IllegalAccessError, not a ClassNotFound-style error where you can tell at a glance whether the class was loaded. I can't tell whether the jar wasn't picked up at all, or was picked up but had no effect...
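
(One way to tell, sketched below: this IllegalAccessError typically means LiteralByteString and HBaseZeroCopyByteString were defined by different classloaders, which happens when more than one jar supplies the protobuf classes. Scanning which jars contain each class can confirm a duplicate; the directory is an assumption for the parcel layout:)

# Hypothetical diagnostic: list every jar that ships either of the two classes;
# duplicates visible to different classloaders are the usual cause of this error.
for jar in /opt/cloudera/parcels/CDH/jars/*.jar; do
  unzip -l "$jar" 2>/dev/null | grep -q 'com/google/protobuf/LiteralByteString.class' \
    && echo "LiteralByteString:       $jar"
  unzip -l "$jar" 2>/dev/null | grep -q 'com/google/protobuf/HBaseZeroCopyByteString.class' \
    && echo "HBaseZeroCopyByteString: $jar"
done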

Update:
I just took a look using ps combined with lsof:
[root@slave2 sqoop2]# ps -ef|grep -i nodemanager
yarn    21652 1085  0 12:36 ?        00:00:12 /usr/java/jdk1.7.0_75/bin/java -Dproc_nodemanager -Xmx1000m -Djava.net.preferIPv4Stack=true -server -Xms126877696 -Xmx126877696 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:-CMSConcurrentMTEnabled -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -Dhadoop.event.appender=,EventCatcher -XX:OnOutOfMemoryError=/opt/cm-5.4.0/lib64/cmf/service/common/killparent.sh -Dhadoop.log.dir=/var/log/hadoop-yarn -Dyarn.log.dir=/var/log/hadoop-yarn -Dhadoop.log.file=hadoop-cmf-yarn-NODEMANAGER-slave2.log.out -Dyarn.log.file=hadoop-cmf-yarn-NODEMANAGER-slave2.log.out -Dyarn.home.dir=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn -Dhadoop.home.dir=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn -Dhadoop.root.logger=INFO,RFA -Dyarn.root.logger=INFO,RFA -Djava.library.path=/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/native -classpath /opt/cm-5.4.0/run/cloudera-scm-agent/process/303-yarn-NODEMANAGER:/opt/cm-5.4.0/run/cloudera-scm-agent/process/303-yarn-NODEMANAGER:/opt/cm-5.4.0/run/cloudera-scm-agent/process/303-yarn-NODEMANAGER:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-mapreduce/.//*:/opt/cm-5.4.0/share/cmf/lib/plugins/event-publish-5.4.0-shaded.jar:/opt/cm-5.4.0/share/cmf/lib/plugins/tt-instrumentation-5.4.0.jar:/opt/cm-5.4.0/share/cmf/lib/plugins/cdh5/audit-plugin-cdh5-2.3.0-shaded.jar:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop-yarn/lib/*:/opt/cm-5.4.0/run/cloudera-scm-agent/process/303-yarn-NODEMANAGER/nm-config/log4j.properties org.apache.hadoop.yarn.server.nodemanager.NodeManager
root     25839 23258  0 13:00 pts/0    00:00:00 grep -i nodemanager
[root@slave2 sqoop2]# lsof -p 21652|grep hbase
java    21652 yarn  mem    REG              202,1  3553838  669133 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/jars/hbase-protocol-0.98.6-cdh5.3.3.jar
java    21652 yarn  mem    REG              202,1  3553838 1072039 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/hbase-protocol-0.98.6-cdh5.3.3.jar
java    21652 yarn  mem    REG              202,1  3553838 1072040 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/hbase-protocol-0.98.6-cdh5.3.3.jar
java    21652 yarn   13r   REG              202,1  3553838 1072040 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/lib/hbase-protocol-0.98.6-cdh5.3.3.jar
java    21652 yarn   74r   REG              202,1  3553838 1072039 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/lib/hadoop/hbase-protocol-0.98.6-cdh5.3.3.jar
java    21652 yarn   98r   REG              202,1  3553838  669133 /opt/cloudera/parcels/CDH-5.3.3-1.cdh5.3.3.p0.5/jars/hbase-protocol-0.98.6-cdh5.3.3.jar

So at least the NodeManager process (I confirmed the ResourceManager and the other NodeManager too) has the hbase-protocol jar open. Can I take that to mean it has been loaded?
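
(Hedged note: lsof only shows that the YARN daemons have the file mapped, while the failing classloader in the stack trace is the Sqoop2 server's WebappClassLoader, which lsof on YARN processes says nothing about. To see definitively where a class is loaded from, the JVM flag -verbose:class can be added to the Sqoop2 server's Java options in Cloudera Manager; the exact CM option name is an assumption. Then grep the server log:)

# Hypothetical: with -verbose:class on the Sqoop2 server JVM, each loaded class is
# logged as "[Loaded ... from file:...jar]"; grep for the two suspects.
grep -E 'Loaded com\.google\.protobuf\.(Literal|HBaseZeroCopy)ByteString' \
  /var/log/sqoop2/*.out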



s060403072 posted on 2015-5-13 13:07:42
Quoting blackmoon (2015-5-13 13:05): Right, that's my thinking too, but I'm not clear on how to configure it manually; in the ps -ef output the classpath ...

It says the class cannot be accessed. Could it be a permissions problem?
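
(For completeness, a filesystem check might look like the following, though java.lang.IllegalAccessError is a JVM class-visibility error between classloaders, not a file-permission error, so this is unlikely to be the cause. The sqoop2 service user is an assumption:)

# Quick sanity check that the jar is readable through the symlink by the service user:
ls -lL /opt/cloudera/parcels/CDH/lib/hadoop/lib/hbase-protocol.jar
sudo -u sqoop2 head -c 4 /opt/cloudera/parcels/CDH/lib/hadoop/lib/hbase-protocol.jar | od -c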
