YARN Command Usage and a WordCount Walkthrough

Published: 2020-08-14 05:38:14  Source: Web  Views: 1735  Author: wangkunj  Column: Big Data

Preface:

The previous posts covered the architecture and basic workflow of MapReduce and YARN. This post uses the wordcount program as an example to briefly show how YARN is used in practice.

1. Running the wordcount example
[root@hadoop000 ~]# su - hadoop
[hadoop@hadoop000 ~]$ jps
9201 SecondaryNameNode
9425 ResourceManager
13875 Jps
9540 NodeManager
8852 NameNode
8973 DataNode
# Create the wordcount input directory in HDFS
[hadoop@hadoop000 ~]$ hdfs dfs -mkdir -p /wordcount/input
[hadoop@hadoop000 ~]$ vi test.log
jepson ruoze
hero yimi xjp
123
a b a
[hadoop@hadoop000 ~]$ hdfs dfs -put test.log /wordcount/input
[hadoop@hadoop000 ~]$ hdfs dfs -ls /wordcount/input           
Found 1 items
-rw-r--r--   1 hadoop supergroup         37 2018-05-29 20:38 /wordcount/input/test.log
# Run the wordcount example jar
[hadoop@hadoop000 ~]$ yarn jar \
> /opt/software/hadoop-2.8.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar \
> wordcount \
> /wordcount/input \
> /wordcount/output
18/05/29 20:40:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
18/05/29 20:40:59 INFO input.FileInputFormat: Total input files to process : 1
18/05/29 20:41:00 INFO mapreduce.JobSubmitter: number of splits:1
18/05/29 20:41:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526991305992_0001
18/05/29 20:41:01 INFO impl.YarnClientImpl: Submitted application application_1526991305992_0001
18/05/29 20:41:01 INFO mapreduce.Job: The url to track the job: http://hadoop000:8088/proxy/application_1526991305992_0001/
18/05/29 20:41:01 INFO mapreduce.Job: Running job: job_1526991305992_0001
18/05/29 20:41:14 INFO mapreduce.Job: Job job_1526991305992_0001 running in uber mode : false
18/05/29 20:41:14 INFO mapreduce.Job:  map 0% reduce 0%
18/05/29 20:41:23 INFO mapreduce.Job:  map 100% reduce 0%
18/05/29 20:41:29 INFO mapreduce.Job:  map 100% reduce 100%
18/05/29 20:41:30 INFO mapreduce.Job: Job job_1526991305992_0001 completed successfully
18/05/29 20:41:30 INFO mapreduce.Job: Counters: 49
# Check the results
[hadoop@hadoop000 ~]$ hdfs dfs -ls /wordcount/output
Found 2 items
-rw-r--r--   1 hadoop supergroup          0 2018-05-29 20:41 /wordcount/output/_SUCCESS
-rw-r--r--   1 hadoop supergroup         51 2018-05-29 20:41 /wordcount/output/part-r-00000
[hadoop@hadoop000 ~]$ hdfs dfs -cat /wordcount/output/part-r-00000
123     1
a       2
b       1
hero    1
jepson  1
ruoze   1
xjp     1
yimi    1
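
The example jar runs Hadoop's bundled WordCount job. For reference, below is a minimal sketch of an equivalent mapper and reducer written against the standard Hadoop 2.x MapReduce API; the class names WordCountMapper and WordCountReducer are illustrative, not the exact classes inside the example jar.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Map phase: split each input line into tokens and emit (word, 1) for every token.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);   // e.g. ("jepson", 1), ("ruoze", 1), ...
        }
    }
}

// Reduce phase: after the shuffle groups pairs by key, sum the counts for each word.
class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        result.set(sum);
        context.write(key, result);     // e.g. ("a", 2) -- matches part-r-00000 above
    }
}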

Open the web UI to check the job's status and details: http://192.168.6.217:8088/cluster
(Screenshot: ResourceManager web UI showing the wordcount application.)

2. Common YARN commands
yarn jar <jar>              -- run a jar file
yarn application -list      -- list running applications
yarn application -kill application_1526991305992_0001    -- kill a running application (the argument is the application id)
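
The list and kill commands above can also be issued programmatically through the YarnClient API. The following is a rough sketch, assuming a Hadoop 2.x client on the classpath and a yarn-site.xml reachable from the default Configuration; the application id is the one from the run above, split into its cluster timestamp and sequence number.

import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.client.api.YarnClient;

public class YarnAppAdmin {
    public static void main(String[] args) throws Exception {
        // Build a client from the cluster configuration (reads yarn-site.xml on the classpath).
        YarnClient client = YarnClient.createYarnClient();
        client.init(new Configuration());
        client.start();
        try {
            // Equivalent of "yarn application -list": print every application the RM reports.
            List<ApplicationReport> apps = client.getApplications();
            for (ApplicationReport app : apps) {
                System.out.println(app.getApplicationId() + "\t" + app.getName()
                        + "\t" + app.getYarnApplicationState());
            }

            // Equivalent of "yarn application -kill <id>": kill one application by id,
            // built from the cluster timestamp and sequence in application_1526991305992_0001.
            ApplicationId target = ApplicationId.newInstance(1526991305992L, 1);
            client.killApplication(target);
        } finally {
            client.stop();
        }
    }
}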
3. WordCount execution flow

(Diagram: wordcount MapReduce execution flow.)
Reference: https://blog.csdn.net/yczws1/article/details/21794873
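
The stages in the diagram (input split -> map -> shuffle/sort -> reduce -> output files) are wired together by the job driver. As a hedged sketch of how such a driver typically looks with the Hadoop 2.x API (the class name WordCountDriver is illustrative and reuses the mapper/reducer sketched in section 1):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountDriver.class);

        // Map phase: one map task per input split (the 37-byte test.log yields a single split).
        job.setMapperClass(WordCountMapper.class);
        // Optional combiner: pre-aggregates counts on the map side before the shuffle.
        job.setCombinerClass(WordCountReducer.class);
        // Reduce phase: receives (word, [1, 1, ...]) after the shuffle/sort and sums them.
        job.setReducerClass(WordCountReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Same paths as the yarn jar invocation in section 1.
        FileInputFormat.addInputPath(job, new Path("/wordcount/input"));
        FileOutputFormat.setOutputPath(job, new Path("/wordcount/output"));

        // Submits the job to YARN and waits; the reducer writes part-r-00000 plus _SUCCESS.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}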
