
Sqoop import fails with an error — can any expert help me solve this?

spftoto posted on 2018-4-21 15:56:26
[hadoop@djt11 hadoop]$ sqoop import --connect jdbc:mysql://192.168.221.1/test --username root --password 123456 --table box
Warning: /home/hadoop/app/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/app/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/04/21 22:58:26 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
18/04/21 22:58:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/04/21 22:58:27 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/04/21 22:58:27 INFO tool.CodeGenTool: Beginning code generation
18/04/21 22:58:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `box` AS t LIMIT 1
18/04/21 22:58:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `box` AS t LIMIT 1
18/04/21 22:58:28 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/app/hadoop
Note: /tmp/sqoop-hadoop/compile/119c7fa2a25f4928810f7f7dc3ce407a/box.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/04/21 22:58:37 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/119c7fa2a25f4928810f7f7dc3ce407a/box.jar
18/04/21 22:58:37 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/04/21 22:58:37 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/04/21 22:58:37 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/04/21 22:58:37 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/04/21 22:58:37 INFO mapreduce.ImportJobBase: Beginning import of box
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/hbase/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/21 22:58:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/04/21 22:58:38 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/04/21 22:58:54 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/04/21 22:59:15 INFO db.DBInputFormat: Using read commited transaction isolation
18/04/21 22:59:15 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`name`), MAX(`name`) FROM `box`
18/04/21 22:59:16 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1524319265805_0001
18/04/21 22:59:16 ERROR tool.ImportTool: Import failed: java.io.IOException: Generating splits for a textual index column allowed only in case of "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" property passed as a parameter
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:204)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: Generating splits for a textual index column allowed only in case of "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" property passed as a parameter
        at org.apache.sqoop.mapreduce.db.TextSplitter.split(TextSplitter.java:67)
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:201)
        ... 23 more

[hadoop@djt11 hadoop]$


1 reply

langke93 posted on 2018-4-21 22:26:10
What exactly are you trying to do here? Importing from MySQL into Hive?
First add the parameter -Dorg.apache.sqoop.splitter.allow_text_splitter=true; also, if the target is Hive, you need to specify a field delimiter.
sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true --connect jdbc:mysql://10.20.30.105/appbase  
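The error itself explains the cause: Sqoop is splitting the import on the table's text column `name` (see the `BoundingValsQuery: SELECT MIN(`name`), MAX(`name`)` line in the log), and since Sqoop 1.4.7 text-column splitting is refused unless explicitly allowed. A sketch of the two usual fixes, using the connection details from the original post (the numeric `id` split column in option 2 is an assumption — substitute any numeric column your `box` table actually has):

```shell
# Option 1: explicitly allow splitting on a text column.
# Note: the -D generic argument must come immediately after "import",
# before the tool-specific options.
sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect jdbc:mysql://192.168.221.1/test \
  --username root -P \
  --table box

# Option 2 (usually better): split on a numeric column, or skip
# splitting entirely by running a single mapper.
sqoop import \
  --connect jdbc:mysql://192.168.221.1/test \
  --username root -P \
  --table box \
  --split-by id      # hypothetical numeric column; use one from your schema
# or, with no suitable numeric column:
#   -m 1             # one mapper, no split query is issued
```

Using `-P` to prompt for the password also addresses the `WARN tool.BaseSqoopTool` message in the log about passing the password on the command line.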
