
sqoop2 1.99.5: error when running start job — any pointers would be appreciated

Sqoop2 version: 1.99.5, on CDH 5.11.1.
I created two links:
mysql_link_124:
sqoop:000> show link -l 1
1 link(s) to show:
link with id 1 and name mysql_link_124 (Enabled: true, Created by root at 7/21/17 11:09 AM, Updated by root at 7/21/17 5:52 PM)
Using Connector generic-jdbc-connector with id 1
  Link configuration
    JDBC Driver Class: com.mysql.jdbc.Driver
    JDBC Connection String: jdbc:mysql://192.168.16.14:3306/test
    Username: root
    Password:
    JDBC Connection Properties:
      protocol = tcp

hdfs_link:

sqoop:000> show link -l 2
1 link(s) to show:
link with id 2 and name cloudera1_hdfs_link (Enabled: true, Created by root at 7/21/17 11:11 AM, Updated by root at 7/21/17 11:11 AM)
Using Connector hdfs-connector with id 3
  Link configuration
    HDFS URI: hdfs://192.168.16.162:8020


The job I created:
sqoop:000> show job -j 1
1 job(s) to show:
Job with id 1 and name mysql_to_hdfs (Enabled: true, Created by root at 7/21/17 11:44 AM, Updated by root at 7/21/17 6:29 PM)
Using link id 1 and Connector id 1
  From database configuration
    Schema name:
    Table name: people
    Table SQL statement:
    Table column names:
    Partition column name: id
    Null value allowed for the partition column:
    Boundary query:
  Throttling resources
    Extractors: 2
    Loaders: 2
  ToJob configuration
    Override null value:
    Null value:
    Output format: TEXT_FILE
    Compression format: NONE
    Custom compression format:
    Output directory: /user/sqoop2/


Then when I start the job, it fails with the following error:

sqoop:000> start job -jid 1
0    [main] WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception - HTTP Status 500 - Servlet execution threw an exception (Apache Tomcat/6.0.48)

javax.servlet.ServletException: Servlet execution threw an exception
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:622)
        org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:301)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:574)

root cause: java.lang.NoClassDefFoundError: org/codehaus/jackson/map/JsonMappingException
        org.apache.hadoop.mapreduce.Job.getJobSubmitter(Job.java:1291)
        org.apache.hadoop.mapreduce.Job.submit(Job.java:1302)
        org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submitToCluster(MapreduceSubmissionEngine.java:274)
        org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:255)
        org.apache.sqoop.driver.JobManager.start(JobManager.java:288)
        org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:380)
        org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:116)
        org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
        org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:622)
        org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:301)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:574)

root cause: java.lang.ClassNotFoundException: org.codehaus.jackson.map.JsonMappingException
        java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        org.apache.hadoop.mapreduce.Job.getJobSubmitter(Job.java:1291)
        org.apache.hadoop.mapreduce.Job.submit(Job.java:1302)
        org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submitToCluster(MapreduceSubmissionEngine.java:274)
        org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:255)
        org.apache.sqoop.driver.JobManager.start(JobManager.java:288)
        org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:380)
        org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:116)
        org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
        org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:622)
        org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:301)
        org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:574)

note: The full stack trace of the root cause is available in the Apache Tomcat/6.0.48 logs.


Replies (7)
desehawk replied on 2017-7-24 15:30:16:
Everything is wrapped up in that HTML response, so I can't tell what the problem is. And the error looks like it's coming from Tomcat.

大青山 replied on 2017-7-24 15:43:36, quoting desehawk's reply above:

I installed sqoop2 through Cloudera Manager.

The same error shows up in the localhost.2017-07-24.log file under /var/log/sqoop2.
Everything I found while searching for this problem says the two jars jackson-core-asl-1.8.8.jar and jackson-mapper-asl-1.8.8.jar are missing,
but I checked the relevant directories and both jars are there, so I don't know what else to try.
If you have any other ideas for fixing this, thank you.


[root@cloudera3 sqoop2]# cat localhost.2017-07-24.log
Jul 24, 2017 9:49:33 AM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet v1.JobServlet threw exception
java.lang.ClassNotFoundException: org.codehaus.jackson.map.JsonMappingException
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.hadoop.mapreduce.Job.getJobSubmitter(Job.java:1291)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1302)
        at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submitToCluster(MapreduceSubmissionEngine.java:274)
        at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:255)
        at org.apache.sqoop.driver.JobManager.start(JobManager.java:288)
        at org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:380)
        at org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:116)
        at org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
        at org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:622)
        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:301)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:574)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:610)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:503)
        at java.lang.Thread.run(Thread.java:745)

Jul 24, 2017 10:53:32 AM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet v1.JobServlet threw exception
java.lang.ClassNotFoundException: org.codehaus.jackson.map.JsonMappingException
        (identical stack trace to the 9:49:33 AM entry above)



desehawk replied on 2017-7-24 17:24:14, quoting 大青山's reply above:

It's your Sqoop that's wrapping the output, not Cloudera.
Also, where does Spring come into your setup? I suspect this may not have much to do with Sqoop itself.
Sort out the Spring issue first.

大青山 replied on 2017-7-25 13:55:15:
The fix is to run the following on the server where Sqoop is installed:

cd /opt/cloudera/parcels/CDH/lib/hadoop/client
sudo ln -s ../../hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar
sudo ln -s ../../hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar

Reference: https://community.cloudera.com/t ... job/m-p/45359#M1890
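For what it's worth, the `ln -s` targets in the fix above are relative paths, and a relative symlink target is resolved from the directory the link lives in, not from wherever you happened to run the command. A minimal self-contained sketch of this behavior, using a scratch directory rather than the real CDH paths (the directory names below just mimic the parcel layout for illustration):

```shell
set -e
tmp=$(mktemp -d)

# Mimic the relevant slice of the CDH parcel layout in a scratch directory
mkdir -p "$tmp/lib/hadoop-hdfs/lib" "$tmp/lib/hadoop/client"
touch "$tmp/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar"

# Same shape as the fix: cd into the client dir and link with a relative target
cd "$tmp/lib/hadoop/client"
ln -s ../../hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar

# readlink -f follows the link: ../../ climbs from client/ up to lib/,
# so the link resolves to the real jar under hadoop-hdfs/lib
readlink -f jackson-mapper-asl-1.8.8.jar
```

After creating the links on the real server, restarting the Sqoop2 service (e.g. from Cloudera Manager) is likely needed so the Tomcat classloader picks the jars up.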



starrycheng replied on 2017-7-25 14:27:36, quoting 大青山's solution above:

In other words, this adds the corresponding jars under the /opt/cloudera/parcels/CDH/lib/hadoop/client path.

