
Local Eclipse MR development: problem connecting to the cluster

aqi915 posted on 2015-8-25 09:29:14 (6 replies, 23403 views)
Hadoop 2.7.1. I had always used local mode before, with only the results going onto Hadoop; recently I modified the code to submit to the cluster and it now fails with an error.

Could someone help me understand the ports below?

"yarn.resourcemanager.address", "192.168.100.141:8032"
"fs.defaultFS", "hdfs://192.168.100.141:9000"
"mapred.job.tracker", "192.168.1.2:9001"

The cluster and Win7 port configuration are shown in the attached images. The upload has no preview, so I'm not sure where the pictures ended up; I'm noting it here for now.




系统Linux
15/08/25 09:14:07 INFO client.RMProxy: Connecting to ResourceManager at /192.168.100.141:8032
15/08/25 09:14:07 WARN mapreduce.JobResourceUploader: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
15/08/25 09:14:07 INFO input.FileInputFormat: Total input paths to process : 2
15/08/25 09:14:09 INFO mapreduce.JobSubmitter: number of splits:2
15/08/25 09:14:09 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1440419740355_0004
15/08/25 09:14:09 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
15/08/25 09:14:09 INFO impl.YarnClientImpl: Submitted application application_1440419740355_0004
15/08/25 09:14:09 INFO mapreduce.Job: The url to track the job: http://ktbigdata1:8088/proxy/application_1440419740355_0004/
15/08/25 09:14:09 INFO mapreduce.Job: Running job: job_1440419740355_0004
15/08/25 09:14:12 INFO mapreduce.Job: Job job_1440419740355_0004 running in uber mode : false
15/08/25 09:14:12 INFO mapreduce.Job:  map 0% reduce 0%
15/08/25 09:14:12 INFO mapreduce.Job: Job job_1440419740355_0004 failed with state FAILED due to: Application application_1440419740355_0004 failed 2 times due to AM Container for appattempt_1440419740355_0004_000002 exited with  exitCode: 1
For more detailed output, check application tracking page:http://ktbigdata1:8088/cluster/app/application_1440419740355_0004Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1440419740355_0004_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
        at org.apache.hadoop.util.Shell.run(Shell.java:456)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
15/08/25 09:14:12 INFO mapreduce.Job: Counters: 0

(Attachments: 1.png, 2.png — cluster and Win7 port configuration screenshots)

Replies (6)

arsenduan posted on 2015-8-25 10:40:08

"yarn.resourcemanager.address", "192.168.100.141:8032"
yarn.resourcemanager.address is the ResourceManager's ApplicationsManager (ASM) port, i.e. the address clients submit applications to.

fs.default.name / fs.defaultFS is the default distributed filesystem URI, i.e. the NameNode's RPC address.

"mapred.job.tracker", "192.168.1.2:9001"
In the new framework this has been replaced by the ResourceManager and NodeManager settings in yarn-site.xml; job-history lookup has been split off from the JobTracker into the separate mapreduce.jobhistory configuration.

More reference:
Hadoop 2.x common ports, how to define them, their defaults, and a comparison with Hadoop 1.x ports:
http://www.aboutyun.com/thread-7513-1-1.html
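[Editor's note] For reference, a minimal sketch of the client-side settings discussed above, annotated with the daemon each address belongs to. The addresses are the ones from this thread; this is an illustration, not the poster's exact setup.

import org.apache.hadoop.conf.Configuration;

// Sketch only: the YARN-era client configuration that replaces the old
// mapred.job.tracker setting.
public class RemoteSubmitConf {
    public static Configuration create() {
        Configuration conf = new Configuration();
        // NameNode RPC address: the default (distributed) filesystem URI.
        conf.set("fs.defaultFS", "hdfs://192.168.100.141:9000");
        // Submit through YARN; mapred.job.tracker / the JobTracker no longer exist in Hadoop 2.x.
        conf.set("mapreduce.framework.name", "yarn");
        // ResourceManager's ApplicationsManager (ASM) port, where clients submit applications.
        conf.set("yarn.resourcemanager.address", "192.168.100.141:8032");
        return conf;
    }
}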




aqi915 posted on 2015-8-25 10:43:59
Answering my own error above: the lib configuration was wrong.

NEOGX posted on 2015-8-25 10:43:49
Check the output path.
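[Editor's note] For reference, a minimal sketch of one way to act on this hint: FileOutputFormat aborts the job if the output directory already exists, so a stale output directory from a previous run is worth checking. The path follows the Dedup example later in this thread; deleting HDFS data is destructive, so treat this purely as an illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: remove a leftover output directory before resubmitting the job.
public class ClearOutputDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.100.141:9000");
        FileSystem fs = FileSystem.get(conf);
        Path out = new Path("/dedup_out");   // output path used later in this thread
        if (fs.exists(out)) {
            fs.delete(out, true);            // recursive delete of the previous run's output
        }
        fs.close();
    }
}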

aqi915 posted on 2015-8-25 10:50:53

Local Eclipse can't find my own class??


Things have improved a bit now. Please take a look at the class I wrote and tell me whether anything is wrong:

package com.ktbigdata.mr;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class Dedup {

    // The map copies each input value into the output key and emits it directly.
    public static class Map extends Mapper<Object, Text, Text, Text> {
        private static Text line = new Text();   // one line of input

        // map implementation
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            line = value;
            context.write(line, new Text(""));
        }
    }

    // The reduce copies each input key into the output key and emits it directly,
    // which collapses duplicate lines.
    public static class Reduce extends Reducer<Text, Text, Text, Text> {
        // reduce implementation
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            context.write(key, new Text(""));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // These settings are the critical part for submitting to the cluster.
        // conf.set("mapred.job.tracker", "192.168.1.2:9001");   // obsolete in Hadoop 2.x
        conf.set("fs.defaultFS", "hdfs://192.168.100.141:9000");
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.address", "192.168.100.141:8032");
        conf.set("mapred.remote.os", "Linux");
        System.out.println("系统" + conf.get("mapred.remote.os"));   // prints the "系统Linux" line seen in the logs

        String[] ioArgs = new String[] { "hdfs://192.168.100.141:9000/dedup_in",
                "hdfs://192.168.100.141:9000/dedup_out" };
        String[] otherArgs = new GenericOptionsParser(conf, ioArgs).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: Dedup <in> <out>");
            System.exit(2);
        }

        Job job = new Job(conf, "Dedup");
        job.setJarByClass(Dedup.class);

        // Set the Map, Combine and Reduce classes.
        job.setMapperClass(Map.class);
        job.setCombinerClass(Reduce.class);
        job.setReducerClass(Reduce.class);

        // Set the output types.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        // Set the input and output directories.
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}




The error output:
系统Linux
15/08/25 10:33:38 INFO client.RMProxy: Connecting to ResourceManager at /192.168.100.141:8032
15/08/25 10:33:39 WARN mapreduce.JobResourceUploader: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
15/08/25 10:33:39 INFO input.FileInputFormat: Total input paths to process : 2
15/08/25 10:33:40 INFO mapreduce.JobSubmitter: number of splits:2
15/08/25 10:33:40 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1440469760662_0001
15/08/25 10:33:40 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
15/08/25 10:33:41 INFO impl.YarnClientImpl: Submitted application application_1440469760662_0001
15/08/25 10:33:41 INFO mapreduce.Job: The url to track the job: http://ktbigdata1:8088/proxy/application_1440469760662_0001/
15/08/25 10:33:41 INFO mapreduce.Job: Running job: job_1440469760662_0001
15/08/25 10:33:50 INFO mapreduce.Job: Job job_1440469760662_0001 running in uber mode : false
15/08/25 10:33:50 INFO mapreduce.Job:  map 0% reduce 0%
15/08/25 10:33:57 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:33:57 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:34:01 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:34:01 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000001_1, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:34:06 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:34:07 INFO mapreduce.Job: Task Id : attempt_1440469760662_0001_m_000001_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: Class com.ktbigdata.mr.Dedup$Map not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 8 more

15/08/25 10:34:13 INFO mapreduce.Job:  map 100% reduce 100%
15/08/25 10:34:13 INFO mapreduce.Job: Job job_1440469760662_0001 failed with state FAILED due to: Task failed task_1440469760662_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

15/08/25 10:34:14 INFO mapreduce.Job: Counters: 13
        Job Counters
                Failed map tasks=7
                Killed map tasks=1
                Launched map tasks=8
                Other local map tasks=6
                Data-local map tasks=2
                Total time spent by all maps in occupied slots (ms)=33437
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=33437
                Total vcore-seconds taken by all map tasks=33437
                Total megabyte-seconds taken by all map tasks=34239488
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0



desehawk posted on 2015-8-25 15:09:24
aqi915 posted on 2015-8-25 10:50:
Things have improved a bit now. Please take a look at the class I wrote and tell me whether anything is wrong: package com.ktbigdata.mr;
import java.io.IOException;
import ...

It's still a class problem. Look into what is going on with this class:

com.ktbigdata.mr.Dedup$Map
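[Editor's note] The "No job jar file set" warning in the log usually means exactly this: nothing containing com.ktbigdata.mr.Dedup$Map is shipped to the cluster, because job.setJarByClass() only works when the code is already running from a jar. A minimal sketch of one common fix when submitting from Eclipse, assuming the project has been exported to a jar at a hypothetical path:

// Sketch only: inside Dedup.main(), make the exported jar explicit so the
// cluster-side map tasks can load com.ktbigdata.mr.Dedup$Map.
// "D:/workspace/Dedup.jar" is a hypothetical path; export the project there first.
Job job = Job.getInstance(conf, "Dedup");
job.setJar("D:/workspace/Dedup.jar");   // ships this jar with the submission
// (equivalently: conf.set("mapreduce.job.jar", "D:/workspace/Dedup.jar") before creating the Job)
job.setJarByClass(Dedup.class);         // on its own this only works when already running from a jar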

aqi915 posted on 2015-8-25 16:17:48
desehawk posted on 2015-8-25 15:09:
It's still a class problem. Look into what is going on with this class:

com.ktbigdata.mr.Dedup$Map

That problem is solved now; it was a silly mistake on my part, sigh!
Now I'm using MR to load data into HBase, and I'm hitting the same kind of error as the earlier HDFS run. See picture 1.
Last time the fix was to add the jar paths in mapred-site and yarn-site, but this time I don't know what to change.

(Attachment: picture 1 — screenshot of the HBase MR error)
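[Editor's note] This last question got no answer in the thread. For reference, a hedged sketch of the usual approach for an MR job that writes into HBase via TableMapReduceUtil, assuming an HBase 1.x client; the table name, column family, jar path, ZooKeeper quorum and mapper logic here are all hypothetical:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

// Sketch only: an HDFS-to-HBase loader; not the poster's actual job.
public class HBaseLoad {

    public static class LoadMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Illustrative only: use the whole input line as the row key and store it in cf:line.
            byte[] row = Bytes.toBytes(value.toString());
            Put put = new Put(row);
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("line"), row);
            context.write(new ImmutableBytesWritable(row), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "192.168.100.141");   // hypothetical ZooKeeper quorum
        Job job = Job.getInstance(conf, "HBaseLoad");
        job.setJar("D:/workspace/HBaseLoad.jar");                // same remote-submission caveat as above (hypothetical path)
        job.setMapperClass(LoadMapper.class);
        FileInputFormat.addInputPath(job, new Path("hdfs://192.168.100.141:9000/dedup_in"));
        // Wires in TableOutputFormat for the target table; a null reducer plus zero
        // reduce tasks means the mapper's Puts go straight into HBase.
        TableMapReduceUtil.initTableReducerJob("my_table", null, job);
        // Ships the HBase/ZooKeeper client jars with the job instead of editing
        // mapred-site.xml / yarn-site.xml on every node.
        TableMapReduceUtil.addDependencyJars(job);
        job.setNumReduceTasks(0);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}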
