
einhep's personal space https://www.aboutyun.com/?1418


Sqoop export to MySQL throws an exception — the statement looks correct, so I couldn't see why it failed

Viewed 1617 times | 2016-6-8 15:31 | mysql

[root@cloud4 conf]# sqoop export --connect jdbc:mysql://192.168.56.1:3306/hive --username root --password root --table pv_info --export-dir /hive/hmbbs.db/result/dat2=2013-05-30 --input-fields-terminated-by '\t';
13/10/31 02:24:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/10/31 02:24:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/10/31 02:24:43 INFO tool.CodeGenTool: Beginning code generation
13/10/31 02:24:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pv_info` AS t LIMIT 1
13/10/31 02:24:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pv_info` AS t LIMIT 1
13/10/31 02:24:43 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-root/compile/4ab5f7e76b5428c59fb2869c08077146/pv_info.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/10/31 02:24:44 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/4ab5f7e76b5428c59fb2869c08077146/pv_info.jar
13/10/31 02:24:44 INFO mapreduce.ExportJobBase: Beginning export of pv_info
13/10/31 02:24:45 INFO input.FileInputFormat: Total input paths to process : 1
13/10/31 02:24:45 INFO input.FileInputFormat: Total input paths to process : 1
13/10/31 02:24:45 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/10/31 02:24:45 WARN snappy.LoadSnappy: Snappy native library not loaded
13/10/31 02:24:45 INFO mapred.JobClient: Running job: job_201310310028_0018
13/10/31 02:24:46 INFO mapred.JobClient:  map 0% reduce 0%
13/10/31 02:24:54 INFO mapred.JobClient:  map 14% reduce 0%
13/10/31 02:24:56 INFO mapred.JobClient:  map 85% reduce 0%
13/10/31 02:24:56 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at pv_info.__loadFromFields(pv_info.java:198)
        at pv_info.parse(pv_info.java:147)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more

13/10/31 02:25:01 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_1, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at pv_info.__loadFromFields(pv_info.java:198)
        at pv_info.parse(pv_info.java:147)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more

13/10/31 02:25:06 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_2, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at pv_info.__loadFromFields(pv_info.java:198)
        at pv_info.parse(pv_info.java:147)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more

13/10/31 02:25:12 INFO mapred.JobClient: Job complete: job_201310310028_0018
13/10/31 02:25:12 INFO mapred.JobClient: Counters: 20
13/10/31 02:25:12 INFO mapred.JobClient:   Job Counters
13/10/31 02:25:12 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=48806
13/10/31 02:25:12 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/10/31 02:25:12 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/10/31 02:25:12 INFO mapred.JobClient:     Rack-local map tasks=7
13/10/31 02:25:12 INFO mapred.JobClient:     Launched map tasks=10
13/10/31 02:25:12 INFO mapred.JobClient:     Data-local map tasks=3
13/10/31 02:25:12 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/10/31 02:25:12 INFO mapred.JobClient:     Failed map tasks=1
13/10/31 02:25:12 INFO mapred.JobClient:   File Output Format Counters
13/10/31 02:25:12 INFO mapred.JobClient:     Bytes Written=0
13/10/31 02:25:12 INFO mapred.JobClient:   FileSystemCounters
13/10/31 02:25:12 INFO mapred.JobClient:     HDFS_BYTES_READ=801
13/10/31 02:25:12 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=371720
13/10/31 02:25:12 INFO mapred.JobClient:   File Input Format Counters
13/10/31 02:25:12 INFO mapred.JobClient:     Bytes Read=0
13/10/31 02:25:12 INFO mapred.JobClient:   Map-Reduce Framework
13/10/31 02:25:12 INFO mapred.JobClient:     Map input records=0
13/10/31 02:25:12 INFO mapred.JobClient:     Physical memory (bytes) snapshot=205475840
13/10/31 02:25:12 INFO mapred.JobClient:     Spilled Records=0
13/10/31 02:25:12 INFO mapred.JobClient:     CPU time spent (ms)=2780
13/10/31 02:25:12 INFO mapred.JobClient:     Total committed heap usage (bytes)=40501248
13/10/31 02:25:12 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=2130132992
13/10/31 02:25:12 INFO mapred.JobClient:     Map output records=0
13/10/31 02:25:12 INFO mapred.JobClient:     SPLIT_RAW_BYTES=762
13/10/31 02:25:12 INFO mapreduce.ExportJobBase: Transferred 801 bytes in 27.6045 seconds (29.017 bytes/sec)
13/10/31 02:25:12 INFO mapreduce.ExportJobBase: Exported 0 records.
13/10/31 02:25:12 ERROR tool.ExportTool: Error during export: Export job failed!
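The real cause is buried in the `Caused by: java.util.NoSuchElementException` at `pv_info.__loadFromFields`: the Sqoop-generated record class pulls one field per table column from the split input line, and the iterator runs dry when a line yields fewer `\t`-separated fields than `pv_info` has columns — which happens when the export dir contains content that is not plain delimited data. A minimal sketch of a pre-flight check along those lines (the expected column count of 3 and the sample file are assumptions, not values from the post):

```shell
# Hypothetical pre-flight check: flag any line whose tab-separated field
# count differs from the target table's column count (assumed here to be 3).
EXPECTED=3
# Sample data standing in for the HDFS export file (assumed content).
printf 'u1\t2013-05-30\t42\nbadline\n' > /tmp/sample_export_data
# NF is awk's per-line field count after splitting on the -F delimiter.
awk -F'\t' -v n="$EXPECTED" \
  'NF != n { printf "line %d: expected %d fields, got %d\n", NR, n, NF }' \
  /tmp/sample_export_data
# → line 2: expected 3 fields, got 1
```

Running a check like this against the actual export file (via `hadoop fs -cat`) would have shown the parse mismatch before the job was submitted.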


Solution:
Write the file name out in full — really careless of me. `--export-dir` was pointing at the partition directory instead of the data file itself. The full path is:
/hive/hmbbs.db/result/dat2=2013-05-30/000000_0
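The corrected invocation would then look like the sketch below — same connection string, table, and delimiter as the failing command above, with only `--export-dir` changed to the full file path (and `-P` substituted for the plaintext password, as the Sqoop warning in the log suggests). This cannot be run outside the original cluster, so treat it as illustrative:

```shell
# Corrected export: --export-dir now names the concrete data file,
# not the partition directory that contained it.
sqoop export \
  --connect jdbc:mysql://192.168.56.1:3306/hive \
  --username root -P \
  --table pv_info \
  --export-dir /hive/hmbbs.db/result/dat2=2013-05-30/000000_0 \
  --input-fields-terminated-by '\t'
```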

Summarized from: CSDN
