

Analyzing a Hadoop Example Run

This article walks through running one of the Hadoop example jobs end to end. Each step is short and easy to follow, so work through them in order and study the console output along the way.


1. Locate the examples JAR that ships with the Hadoop distribution.
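The screenshot for this step was not preserved. A minimal way to find the JAR, assuming the install path used by the command in step 5, is:

 [root@hadoop ~]# ls /hadoop_soft/hadoop-2.7.2/share/hadoop/mapreduce/ | grep examples
 hadoop-mapreduce-examples-2.7.2.jar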

2. Create the input directory on HDFS. (Do not create the output directory in advance: the job creates it itself, and fails with an "output directory already exists" error if it is already there.)
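A sketch of the directory setup, assuming the HDFS paths used by the job command below:

 [root@hadoop ~]# hdfs dfs -mkdir /wc_input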

3. Upload the files whose words are to be counted into the wc_input directory, as sketched below.
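The upload screenshot is missing; the local file names here are hypothetical, but uploading two files matches the "Total input paths to process : 2" line in the job log:

 [root@hadoop input]# hdfs dfs -put file1.txt file2.txt /wc_input/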

4. Verify the uploaded files.
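The verification screenshot is also missing; listing and inspecting the uploads would look like this (file name hypothetical):

 [root@hadoop input]# hdfs dfs -ls /wc_input
 [root@hadoop input]# hdfs dfs -cat /wc_input/file1.txt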

5. Run the wordcount example against those paths; the full command and its console output follow:

 [root@hadoop input]# hadoop jar /hadoop_soft/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar wordcount /wc_input/* /wc_output/
17/08/15 10:25:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/15 10:25:25 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.120:18040
17/08/15 10:25:27 INFO input.FileInputFormat: Total input paths to process : 2
17/08/15 10:25:27 INFO mapreduce.JobSubmitter: number of splits:2
17/08/15 10:25:28 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1502762082449_0001
17/08/15 10:25:28 INFO impl.YarnClientImpl: Submitted application application_1502762082449_0001
17/08/15 10:25:29 INFO mapreduce.Job: The url to track the job: http://hadoop:18088/proxy/application_1502762082449_0001/
17/08/15 10:25:29 INFO mapreduce.Job: Running job: job_1502762082449_0001
17/08/15 10:25:48 INFO mapreduce.Job: Job job_1502762082449_0001 running in uber mode : true
17/08/15 10:25:48 INFO mapreduce.Job:  map 0% reduce 0%
17/08/15 10:25:50 INFO mapreduce.Job:  map 100% reduce 0%
17/08/15 10:25:51 INFO mapreduce.Job:  map 100% reduce 100%
17/08/15 10:25:51 INFO mapreduce.Job: Job job_1502762082449_0001 completed successfully
17/08/15 10:25:52 INFO mapreduce.Job: Counters: 52
 File System Counters
  FILE: Number of bytes read=276
  FILE: Number of bytes written=545
  FILE: Number of read operations=0
  FILE: Number of large read operations=0
  FILE: Number of write operations=0
  HDFS: Number of bytes read=798
  HDFS: Number of bytes written=398613
  HDFS: Number of read operations=66
  HDFS: Number of large read operations=0
  HDFS: Number of write operations=23
 Job Counters
  Launched map tasks=2
  Launched reduce tasks=1
  Other local map tasks=2
  Total time spent by all maps in occupied slots (ms)=1972
  Total time spent by all reduces in occupied slots (ms)=803
  TOTAL_LAUNCHED_UBERTASKS=3
  NUM_UBER_SUBMAPS=2
  NUM_UBER_SUBREDUCES=1
  Total time spent by all map tasks (ms)=1972
  Total time spent by all reduce tasks (ms)=803
  Total vcore-milliseconds taken by all map tasks=1972
  Total vcore-milliseconds taken by all reduce tasks=803
  Total megabyte-milliseconds taken by all map tasks=2019328
  Total megabyte-milliseconds taken by all reduce tasks=822272
 Map-Reduce Framework
  Map input records=5
  Map output records=11
  Map output bytes=111
  Map output materialized bytes=109
  Input split bytes=210
  Combine input records=11
  Combine output records=8
  Reduce input groups=7
  Reduce shuffle bytes=109
  Reduce input records=8
  Reduce output records=7
  Spilled Records=16
  Shuffled Maps =2
  Failed Shuffles=0
  Merged Map outputs=2
  GC time elapsed (ms)=637
  CPU time spent (ms)=1820
  Physical memory (bytes) snapshot=830070784
  Virtual memory (bytes) snapshot=8998096896
  Total committed heap usage (bytes)=500510720
 Shuffle Errors
  BAD_ID=0
  CONNECTION=0
  IO_ERROR=0
  WRONG_LENGTH=0
  WRONG_MAP=0
  WRONG_REDUCE=0
 File Input Format Counters
  Bytes Read=70
 File Output Format Counters
  Bytes Written=57
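One detail worth noting in the log: "running in uber mode : true" means the input was small enough that YARN ran all the map and reduce tasks inside the ApplicationMaster's own JVM instead of launching separate containers, which is why the TOTAL_LAUNCHED_UBERTASKS, NUM_UBER_SUBMAPS, and NUM_UBER_SUBREDUCES counters appear above.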

6. View the job results.
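The results screenshot is lost. A MapReduce job writes an empty _SUCCESS marker plus one part file per reducer, so with the single reducer shown in the counters the listing should resemble:

 [root@hadoop input]# hdfs dfs -ls /wc_output
 /wc_output/_SUCCESS
 /wc_output/part-r-00000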

7. Inspect the output data.
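To dump the actual counts, print the part file; each line is a word, a tab, and its count. Per the counters above, expect 7 lines (Reduce output records=7) totaling 57 bytes (Bytes Written=57):

 [root@hadoop input]# hdfs dfs -cat /wc_output/part-r-00000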

Thanks for reading; that concludes this walkthrough of a Hadoop example run. Having worked through it, you should have a clearer picture of how a MapReduce job is submitted and what its output looks like. The specifics are best confirmed by trying it yourself.

