

Spark 2.1.0 standalone mode configuration and submitting a jar with spark-submit

This article walks through configuring Spark 2.1.0 in standalone mode and submitting a packaged jar with spark-submit.


Configuration
spark-env.sh
	export JAVA_HOME=/apps/jdk1.8.0_181
	export SPARK_MASTER_HOST=bigdata00
	export SPARK_MASTER_PORT=7077
slaves
	bigdata01
	bigdata02
	bigdata03
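
With spark-env.sh and slaves distributed to every node, the cluster can be started from the master. A minimal sketch, assuming Spark is installed under /apps/spark-2.1.0 on all hosts and passwordless SSH is set up (the install path is an assumption; the sbin scripts are standard Spark):

	cd /apps/spark-2.1.0
	sbin/start-all.sh    # starts a Master on this host and a Worker on each host in slaves
	jps                  # Master should appear here; Worker on bigdata01/02/03

The master web UI at http://bigdata00:8080 should then list the three workers.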
Start the Spark shell
./spark-shell  --master spark://bigdata00:7077 --executor-memory 512M 
Run a word count from the Spark shell:
scala> sc.textFile("hdfs://bigdata00:9000/words").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
Result:
res3: Array[(String, Int)] = Array((this,1), (is,4), (girl,3), (love,1), (will,1), (day,1), (boreing,1), (my,1), (miss,2), (test,2), (forget,1), (spark,2), (soon,1), (most,1), (that,1), (a,2), (afternonn,1), (i,3), (might,1), (of,1), (today,2), (good,1), (for,1), (beautiful,1), (time,1), (and,1), (the,5))
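
For reference, the one-liner above decomposes into these steps; a commented sketch of the same logic:

	scala> val lines  = sc.textFile("hdfs://bigdata00:9000/words") // one RDD element per line
	scala> val words  = lines.flatMap(_.split(" "))                // split each line into words
	scala> val pairs  = words.map((_, 1))                          // pair every word with a 1
	scala> val counts = pairs.reduceByKey(_ + _)                   // sum the 1s per distinct word
	scala> counts.collect                                          // bring the array back to the driver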
// Main class
package hgs.sparkwc

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    // pass the conf in so the app name takes effect
    val context = new SparkContext(conf)
    context.textFile(args(0), 1)          // args(0): input path
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .sortBy(_._2)
      .saveAsTextFile(args(1))            // args(1): output path
    context.stop()
  }
}
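
For a quick test before packaging, the same logic can be run against a local master; a sketch, with hypothetical local file paths (setMaster is hard-coded here only for testing and should not be left in the submitted jar, since spark-submit supplies the master):

	// testing-only sketch; the paths are assumptions
	val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
	val sc = new SparkContext(conf)
	sc.textFile("file:///tmp/words").flatMap(_.split(" ")).map((_, 1))
	  .reduceByKey(_ + _).sortBy(_._2).saveAsTextFile("file:///tmp/wordsout")
	sc.stop()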
//------------------------------------------------------------------------------------------
// pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>hgs</groupId>
  <artifactId>sparkwc</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <name>sparkwc</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.8</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.6.1</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- builds the fat jar with the main class in the manifest -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <archive>
            <manifest>
              <mainClass>hgs.sparkwc.WordCount</mainClass>
            </manifest>
          </archive>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <!-- <arg>-make:transitive</arg>  commented out, see the note below -->
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
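
With this pom, the assembly plugin produces the fat jar during the package phase; a sketch of the build, run from the project root (mvn assembly:assembly, as referenced below, also triggers it):

	mvn clean package
	# expected artifact: target/sparkwc-1.0.0-jar-with-dependencies.jar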
When building with assembly:assembly, the following error came up:
      scalac error: bad option: '-make:transitive'
The cause is the -make:transitive argument in the scala-maven-plugin configuration; commenting out that line fixes it.

Answers found online: delete -make:transitive, or add this dependency:

<dependency>
    <groupId>org.specs2</groupId>
    <artifactId>specs2-junit_${scala.compat.version}</artifactId>
    <version>2.4.16</version>
    <scope>test</scope>
</dependency>

Finally, submit the job on the server:
./spark-submit --master spark://bigdata00:7077  --executor-memory 512M --total-executor-cores 3  /home/sparkwc.jar   hdfs://bigdata00:9000/words  hdfs://bigdata00:9000/wordsout2
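
Once the job finishes, the output directory can be checked on HDFS; a sketch, assuming the standard Hadoop CLI is on the PATH:

	hdfs dfs -ls /wordsout2            # _SUCCESS plus part-* files
	hdfs dfs -cat /wordsout2/part-*    # the (word,count) pairs, sorted by count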

That covers configuring Spark 2.1.0 in standalone mode and submitting a jar with spark-submit.

