This article explains how to fix an exception thrown when submitting a remote Spark task from Eclipse. The walkthrough is short and straightforward; follow along below. When running the submit program from Eclipse, the following exception appeared:
創(chuàng)新互聯(lián)專注于宜陽(yáng)網(wǎng)站建設(shè)服務(wù)及定制,我們擁有豐富的企業(yè)做網(wǎng)站經(jīng)驗(yàn)。 熱誠(chéng)為您提供宜陽(yáng)營(yíng)銷型網(wǎng)站建設(shè),宜陽(yáng)網(wǎng)站制作、宜陽(yáng)網(wǎng)頁(yè)設(shè)計(jì)、宜陽(yáng)網(wǎng)站官網(wǎng)定制、微信小程序開(kāi)發(fā)服務(wù),打造宜陽(yáng)網(wǎng)絡(luò)公司原創(chuàng)品牌,更為您提供宜陽(yáng)網(wǎng)站排名全網(wǎng)營(yíng)銷落地服務(wù)。
java.lang.IllegalStateException: Library directory scala-2.11\jars' does not exist; make sure Spark is built.
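This error typically means that SparkSubmit could not locate the Spark runtime jars: it resolves the library directory relative to the SPARK_HOME environment variable, so an unset SPARK_HOME (common in a bare Eclipse run configuration) or one pointing at an unbuilt source tree produces exactly this message. The sketch below is a defensive pre-check you can run before calling SparkSubmit; the `$SPARK_HOME/jars` layout is an assumption based on a standard Spark 2.x binary distribution, and the class and method names are hypothetical.

```java
import java.io.File;

public class SparkHomeCheck {
    // Illustrative sketch: verify that sparkHome is set and contains the
    // "jars" directory a Spark 2.x binary distribution ships with.
    public static File sparkJarsDir(String sparkHome) {
        if (sparkHome == null || sparkHome.trim().isEmpty()) {
            throw new IllegalStateException(
                "SPARK_HOME is not set; configure it in the Eclipse run configuration");
        }
        // Assumption: binary distributions keep runtime jars in $SPARK_HOME/jars
        File jars = new File(sparkHome, "jars");
        if (!jars.isDirectory()) {
            throw new IllegalStateException(
                "Library directory " + jars + " does not exist; make sure Spark is built");
        }
        return jars;
    }

    public static void main(String[] args) {
        try {
            System.out.println("Spark jars found at: " + sparkJarsDir(System.getenv("SPARK_HOME")));
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Running this before the submit code fails fast with a clear message instead of the opaque `Library directory ... does not exist` error deep inside SparkSubmit.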
package my.test;

import java.io.IOException;

import org.apache.spark.deploy.SparkSubmit;

import com.huangyueran.spark.utils.Constant;

public class Main {
    public static void main(String[] args) throws IOException {
        System.setProperty("HADOOP_USER_NAME", "hadoop");
        System.setProperty("user.name", "hadoop");
        // System.setProperty("HADOOP_CONF_DIR", "C:\\eclipse-workspace\\SparkDemo\\src\\main\\resources");
        // System.setProperty("HADOOP_CONF_DIR", "C:\\my\\soft\\hadoop\\hadoop-2.8.5\\hadoop-2.8.5\\etc\\hadoop");
        System.out.println("------------" + System.getenv("HADOOP_CONF_DIR"));
        System.out.println("------------" + System.getenv("HADOOP_HOME"));

        String appName = "wordCount-yarn-cluster";
        String className = "my.test.WordCount";
        String path = "C:\\eclipse-workspace\\SparkDemo\\target\\SparkDemo-1.0-SNAPSHOT.jar";
        path = Constant.HDFS_FILE_PREX + "/user/zzm/SparkDemo-1.0-SNAPSHOT.jar";

        String[] arg0 = new String[] {
            // "--jars", Constant.HDFS_FILE_PREX + "/user/zzm/spark-lib",
            "--master", "yarn",            // cluster manager (ip:port resolved from the Hadoop config)
            "--deploy-mode", "cluster",
            "--name", appName,
            "--class", className,          // main class to run
            // "--spark.yarn.archive", Constant.HDFS_FILE_PREX + "/user/zzm/spark-lib",
            "--executor-memory", "2G",
            "--total-executor-cores", "10",
            "--executor-cores", "2",
            path,                          // application jar: a path on the Linux cluster, or an HDFS path
            // "LR", "20180817111111", "66" // arguments passed to the jar's main method; note the positional style
        };
        SparkSubmit.main(arg0);
    }
}
You need to configure local Spark and Hadoop environments before you can submit to a YARN cluster from your local machine.
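Since the submit code above prints HADOOP_CONF_DIR and HADOOP_HOME, a quick way to catch a half-configured machine is to check all the required variables up front. The variable list below is an assumption based on this setup (HADOOP_CONF_DIR may instead be derived from HADOOP_HOME), and the class name is hypothetical.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

public class EnvCheck {
    // Environment variables this local-submit setup appears to rely on
    // (an assumption; adjust to match your own installation).
    static final List<String> REQUIRED =
        Arrays.asList("SPARK_HOME", "HADOOP_HOME", "HADOOP_CONF_DIR");

    // Return the subset of REQUIRED that is unset or blank in the given map.
    public static List<String> missing(Map<String, String> env) {
        List<String> out = new ArrayList<>();
        for (String key : REQUIRED) {
            String value = env.get(key);
            if (value == null || value.trim().isEmpty()) {
                out.add(key);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> gaps = missing(System.getenv());
        if (gaps.isEmpty()) {
            System.out.println("Environment looks complete.");
        } else {
            System.out.println("Missing environment variables: " + gaps);
        }
    }
}
```

Calling this at the top of `Main.main` turns a confusing mid-submit failure into an immediate, readable report of what still needs to be configured.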
Thanks for reading. That covers how to resolve the exception when submitting a remote Spark task from Eclipse; the exact steps may vary with your environment, so verify them against your own setup in practice.