How do you convert an RDD to a DataFrame in Spark 2.2.0? Many people without prior experience are unsure where to start, so this article summarizes the approach and walks through a complete working example you can adapt to your own use case.
Spark SQL將現(xiàn)有的RDDs轉(zhuǎn)換為數(shù)據(jù)集。
方法:使用反射來推斷包含特定對象類型的RDD的模式。這種基于反射的方法使代碼更加簡潔,并且當您在編寫Spark應(yīng)用程序時已經(jīng)了解了模式時,它可以很好地工作。
Java implementation of the first (reflection-based) approach.
Prepare the data file studentData.txt:
1001,20,zhangsan
1002,17,lisi
1003,24,wangwu
1004,16,zhaogang
Local-mode implementation:
package com.unicom.ljs.spark220.study;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;
/**
* @author: Created By lujisen
* @company ChinaUnicom Software JiNan
* @date: 2020-01-20 08:58
* @version: v1.0
* @description: com.unicom.ljs.spark220.study
*/
public class RDD2DataFrameReflect {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setMaster("local[*]").setAppName("RDD2DataFrameReflect");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        SQLContext sqlContext = new SQLContext(sc);

        JavaRDD<String> lines = sc.textFile("C:\\Users\\Administrator\\Desktop\\studentData.txt");

        // Parse each line of the text file into a Student2 bean
        JavaRDD<Student2> studentRDD = lines.map(new Function<String, Student2>() {
            @Override
            public Student2 call(String line) throws Exception {
                String[] split = line.split(",");
                Student2 student = new Student2();
                student.setId(Integer.valueOf(split[0]));
                student.setAge(Integer.valueOf(split[1]));
                student.setName(split[2]);
                return student;
            }
        });

        // Convert the RDD to a DataFrame via reflection:
        // passing Student2.class lets Spark infer the schema from the bean's getters
        Dataset<Row> dataFrame = sqlContext.createDataFrame(studentRDD, Student2.class);

        // Register the DataFrame as a temporary table so we can run SQL against it
        // (in Spark 2.x, createOrReplaceTempView is the preferred replacement for registerTempTable)
        dataFrame.registerTempTable("studentTable");

        // Query the temporary table for students younger than 18
        Dataset<Row> dataset = sqlContext.sql("select * from studentTable where age < 18");

        JavaRDD<Row> rowJavaRDD = dataset.toJavaRDD();
        JavaRDD<Student2> ageRDD = rowJavaRDD.map(new Function<Row, Student2>() {
            @Override
            public Student2 call(Row row) throws Exception {
                // Access columns by name: the schema inferred from a JavaBean orders its
                // columns alphabetically (age, id, name), so positional access by index
                // would swap id and age here
                Student2 student = new Student2();
                student.setId(row.<Integer>getAs("id"));
                student.setAge(row.<Integer>getAs("age"));
                student.setName(row.<String>getAs("name"));
                return student;
            }
        });

        ageRDD.foreach(new VoidFunction<Student2>() {
            @Override
            public void call(Student2 student) throws Exception {
                System.out.println(student.toString());
            }
        });
    }
}
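With the sample data above, running this in local mode should print the two students whose age is below 18, roughly:
Student2{id=1002, age=17, name='lisi'}
Student2{id=1004, age=16, name='zhaogang'}
(The print order is not guaranteed, since foreach processes the partitions in parallel.)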
The Student2 class:
package com.unicom.ljs.spark220.study;
import java.io.Serializable;
/**
* @author: Created By lujisen
* @company ChinaUnicom Software JiNan
* @date: 2020-01-20 08:57
* @version: v1.0
* @description: com.unicom.ljs.spark220.study
*/
public class Student2 implements Serializable {
    int id;
    int age;
    String name;

    public int getId() {
        return id;
    }
    public void setId(int id) {
        this.id = id;
    }
    public int getAge() {
        return age;
    }
    public void setAge(int age) {
        this.age = age;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }

    @Override
    public String toString() {
        return "Student2{" +
                "id=" + id +
                ", age=" + age +
                ", name='" + name + '\'' +
                '}';
    }
}
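Note that for the reflection-based approach the element class should be a proper JavaBean: a public class with a no-argument constructor and public getters and setters for every field, and it should implement Serializable so Spark can ship instances between driver and executors. Student2 above satisfies all of these requirements.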
pom.xml關(guān)鍵依賴:
<properties>
    <spark.version>2.2.0</spark.version>
    <scala.version>2.11.8</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
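For comparison, Spark 2.x recommends SparkSession rather than SQLContext as the entry point. The following is a minimal sketch, not from the original example, of the same reflection-based conversion using SparkSession; the class name RDD2DataFrameReflectSession and the lambda-based map are my own choices, while the file path and the Student2 bean are the same as above.

package com.unicom.ljs.spark220.study;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class RDD2DataFrameReflectSession {
    public static void main(String[] args) {
        // SparkSession wraps SparkContext and SQLContext in Spark 2.x
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("RDD2DataFrameReflectSession")
                .getOrCreate();

        // Read the same sample file and map each line to a Student2 bean,
        // then let Spark infer the schema from the bean class via reflection
        Dataset<Row> dataFrame = spark.createDataFrame(
                spark.read().textFile("C:\\Users\\Administrator\\Desktop\\studentData.txt")
                        .javaRDD()
                        .map(line -> {
                            String[] split = line.split(",");
                            Student2 s = new Student2();
                            s.setId(Integer.valueOf(split[0]));
                            s.setAge(Integer.valueOf(split[1]));
                            s.setName(split[2]);
                            return s;
                        }),
                Student2.class);

        // createOrReplaceTempView replaces the deprecated registerTempTable
        dataFrame.createOrReplaceTempView("studentTable");
        spark.sql("select * from studentTable where age < 18").show();

        spark.stop();
    }
}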
看完上述內(nèi)容,你們掌握Spark2.2.0中RDD轉(zhuǎn)DataFrame的方式是怎樣的的方法了嗎?如果還想學(xué)到更多技能或想了解更多相關(guān)內(nèi)容,歡迎關(guān)注創(chuàng)新互聯(lián)行業(yè)資訊頻道,感謝各位的閱讀!