
Running a Java program (jar) on Hadoop with dynamically specified runtime arguments

 更新時(shí)間:2021年06月23日 15:35:07   作者:AKA石頭  
This article shows how to run a Java program (a jar) on Hadoop and pass its arguments dynamically at run time. When launching a jar with `hadoop jar`, the main class must be given by its fully qualified name.

1) First start the HDFS and YARN daemons: go into the hadoop/sbin directory and run the following commands in order.

[root@node02 sbin]# pwd
/usr/server/hadoop/hadoop-2.7.0/sbin
sh start-dfs.sh
sh start-yarn.sh
jps

2) Check with jps that everything came up correctly; all six of the following processes should be running.

[root@node02 sbin]# jps
10096 DataNode
6952 NodeManager
9962 NameNode
10269 SecondaryNameNode
12526 Jps
6670 ResourceManager

3) If the job reads an input file, put that file under /input on HDFS. If the following warning appears:

[root@node02 hadoop-2.7.0]# hadoop fs -put sample.txt /input
21/01/02 01:13:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

在環(huán)境變量中添加如下字段

[root@node02 ~]# vim /etc/profile
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"

4)進(jìn)入到hadoop根目錄,根據(jù)存放位置決定

[root@node02 hadoop-2.7.0]# pwd
/usr/server/hadoop/hadoop-2.7.0

5) Create the /input directory on the HDFS filesystem (it will hold the input files):

hadoop fs -mkdir /input

6) Upload the test file; here it was placed in the Hadoop root directory beforehand (fine for testing — in production, keep files in a dedicated directory):

[root@node02 hadoop-2.7.0]# hadoop fs -put sample.txt /input

7) Check that the uploaded file exists:

[root@node02 hadoop-2.7.0]# hadoop fs -ls /input
-rw-r--r--   1 root supergroup        529 2021-01-02 01:13 /input/sample.txt

8) Upload the jar to the Hadoop root directory (again, use a dedicated directory in production); the test jar is study_demo.jar:

[root@node02 hadoop-2.7.0]# ll
total 1968
drwxr-xr-x. 2 10021 10021    4096 Apr 11  2015 bin
drwxr-xr-x. 3 10021 10021    4096 Apr 11  2015 etc
drwxr-xr-x. 2 10021 10021    4096 Apr 11  2015 include
drwxr-xr-x. 3 10021 10021    4096 Apr 11  2015 lib
drwxr-xr-x. 2 10021 10021    4096 Apr 11  2015 libexec
-rw-r--r--. 1 10021 10021   15429 Apr 11  2015 LICENSE.txt
drwxr-xr-x. 3 root  root     4096 Jan  2 01:36 logs
-rw-r--r--. 1 10021 10021     101 Apr 11  2015 NOTICE.txt
-rw-r--r--. 1 10021 10021    1366 Apr 11  2015 README.txt
drwxr-xr-x. 2 10021 10021    4096 Apr 11  2015 sbin
drwxr-xr-x. 4 10021 10021    4096 Apr 11  2015 share
-rw-r--r--. 1 root  root  1956989 Jun 14  2021 study_demo.jar

9) Run the jar with `hadoop jar`; the main class must be given by its fully qualified name, and everything after it is passed to the program as arguments:

hadoop jar study_demo.jar com.ncst.hadoop.MaxTemperature /input/sample.txt /output
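Everything after the fully qualified class name on that command line lands in the program's `main(String[] args)` — that is the whole "dynamic parameters" mechanism. A minimal plain-Java sketch (class and message text here are illustrative, not taken from study_demo.jar):

```java
public class ArgsDemo {
    // Invoked as: hadoop jar study_demo.jar com.ncst.hadoop.MaxTemperature /input/sample.txt /output
    // Everything after the fully qualified class name is passed straight to main().
    public static void main(String[] args) {
        if (args.length != 2) {
            System.err.println("Usage: <main-class> <input path> <output path>");
            System.exit(2);
        }
        String input = args[0];   // e.g. /input/sample.txt
        String output = args[1];  // e.g. /output
        System.out.println("input  = " + input);
        System.out.println("output = " + output);
    }
}
```

In a real driver, `args[0]` and `args[1]` would typically be wrapped in `org.apache.hadoop.fs.Path` objects and handed to `FileInputFormat.addInputPath` / `FileOutputFormat.setOutputPath`; implementing Hadoop's `Tool` interface and launching through `ToolRunner` additionally lets callers pass `-D property=value` options that are parsed for you.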

10)運(yùn)行結(jié)果縮略圖

21/01/02 01:37:54 INFO mapreduce.Job: Counters: 49
	File System Counters
		FILE: Number of bytes read=61
		FILE: Number of bytes written=342877
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=974
		HDFS: Number of bytes written=17
		HDFS: Number of read operations=9
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters 
		Launched map tasks=2
		Launched reduce tasks=1
		Data-local map tasks=2
		Total time spent by all maps in occupied slots (ms)=14668
		Total time spent by all reduces in occupied slots (ms)=4352
		Total time spent by all map tasks (ms)=14668
		Total time spent by all reduce tasks (ms)=4352
		Total vcore-seconds taken by all map tasks=14668
		Total vcore-seconds taken by all reduce tasks=4352
		Total megabyte-seconds taken by all map tasks=15020032
		Total megabyte-seconds taken by all reduce tasks=4456448
	Map-Reduce Framework
		Map input records=5
		Map output records=5
		Map output bytes=45
		Map output materialized bytes=67
		Input split bytes=180
		Combine input records=0
		Combine output records=0
		Reduce input groups=2
		Reduce shuffle bytes=67
		Reduce input records=5
		Reduce output records=2
		Spilled Records=10
		Shuffled Maps =2
		Failed Shuffles=0
		Merged Map outputs=2
		GC time elapsed (ms)=525
		CPU time spent (ms)=2510
		Physical memory (bytes) snapshot=641490944
		Virtual memory (bytes) snapshot=6241415168
		Total committed heap usage (bytes)=476053504
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters 
		Bytes Read=794
	File Output Format Counters 
		Bytes Written=17

10)運(yùn)行成功后執(zhí)行命令查看,此時(shí)多出一個(gè) /output 文件夾

[root@node02 hadoop-2.7.0]# hadoop fs -ls /
drwxr-xr-x   - root supergroup          0 2021-01-02 01:13 /input
drwxr-xr-x   - root supergroup          0 2021-01-02 01:37 /output
drwx------   - root supergroup          0 2021-01-02 01:37 /tmp

12) List the files inside /output:

[root@node02 hadoop-2.7.0]# hadoop fs -ls /output
-rw-r--r--   1 root supergroup          0 2021-01-02 01:37 /output/_SUCCESS
-rw-r--r--   1 root supergroup         17 2021-01-02 01:37 /output/part-00000

13) View the contents of the part-00000 file; this test case extracts the highest temperature (in Fahrenheit) recorded in 1949 and in 1950:

[root@node02 hadoop-2.7.0]# hadoop fs -cat /output/part-00000
1949	111
1950	22
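The counters above (5 map input records, 2 reduce input groups) line up with a per-year maximum. As a plain-Java sketch of the aggregation the job presumably performs — the sample records below are made up for illustration, since sample.txt's real contents are not reproduced in the article:

```java
import java.util.Map;
import java.util.TreeMap;

public class MaxTemperatureSketch {
    // Collapse (year, temperature) records to the maximum temperature per year,
    // mirroring in miniature what a MaxTemperature map/reduce pair computes.
    static Map<String, Integer> maxByYear(String[] lines) {
        Map<String, Integer> max = new TreeMap<>();
        for (String line : lines) {
            String[] parts = line.trim().split("\\s+"); // assumed "year temperature" layout
            max.merge(parts[0], Integer.parseInt(parts[1]), Math::max);
        }
        return max;
    }

    public static void main(String[] args) {
        // Hypothetical records; the actual sample.txt format may differ.
        String[] sample = {"1949 111", "1949 78", "1950 0", "1950 22", "1950 -11"};
        System.out.println(maxByYear(sample)); // prints {1949=111, 1950=22}
    }
}
```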

14) You can also browse the cluster through the web UI, at the Hadoop server's address and the configured port (50070 is the default NameNode web port in Hadoop 2.x); there you can see the sample.txt file just uploaded under /input:
http://192.168.194.XXX:50070/


14)測(cè)試程序jar包和測(cè)試文件已上傳到github上面,此目錄有面經(jīng)和我自己總結(jié)的面試題

GitHub
如有興趣的同學(xué)也可以查閱我的秒殺系統(tǒng)
秒殺系統(tǒng)

That concludes this walkthrough of running a Java program (jar) on Hadoop with dynamically specified runtime arguments.

相關(guān)文章

最新評(píng)論