This article shares how to operate Hive through the Java API. It is quite practical, so it is shared here for your reference; follow along and take a look.
成都創(chuàng)新互聯(lián)專注于寶山企業(yè)網(wǎng)站建設(shè),成都響應(yīng)式網(wǎng)站建設(shè),商城網(wǎng)站定制開發(fā)。寶山網(wǎng)站建設(shè)公司,為寶山等地區(qū)提供建站服務(wù)。全流程按需求定制制作,專業(yè)設(shè)計,全程項目跟蹤,成都創(chuàng)新互聯(lián)專業(yè)和態(tài)度為您提供的服務(wù)
Environment:
IDEA 2017.3 + Maven 3.3.9 + Hive 1.1.0
1. pom.xml里面的依賴包配置
<properties>
    <hive.version>1.1.0</hive.version>
</properties>

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive.version}</version>
</dependency>
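If the dependency resolves correctly, the driver class should be loadable from the project's classpath. Below is a quick, optional check; it is only a sketch, and the class name DriverCheck is an illustration, not part of the original article.

// Optional sketch: verify that the hive-jdbc dependency put the driver on the classpath.
public class DriverCheck {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            System.out.println("hive-jdbc driver found on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("hive-jdbc driver missing - check the pom.xml dependency");
        }
    }
}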
2. Create a new HiveJdbcClient.java file

package com.ruozedata.day6;

import java.sql.*;

public class HiveJdbcClient {

    /**
     * The driver name must be changed from
     * org.apache.hadoop.hive.jdbc.HiveDriver
     * to org.apache.hive.jdbc.HiveDriver.
     */
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        /**
         * The URL must be changed from
         * jdbc:hive://localhost:10000/default
         * to jdbc:hive2://192.168.1.108:10000/ruozedata_test.
         */
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.1.108:10000/ruozedata_test", "root", "root");
        Statement stmt = con.createStatement();
        String tableName = "a";

        /**
         * Schema of table a:
         * # col_name    data_type
         * id            int
         * name          string
         * age           int
         */
        String sql = "select id,name,age from " + tableName;
        System.out.println("Running: " + sql);
        ResultSet res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getInt(1) + "\t" + res.getString(2) + "\t" + res.getInt(3));
        }
    }
}

Two points to note:
1. The driver name given in the official docs, org.apache.hadoop.hive.jdbc.HiveDriver, must be changed to org.apache.hive.jdbc.HiveDriver.
2. The connection URL given in the official docs, jdbc:hive://localhost:10000/default, must be changed to jdbc:hive2://localhost:10000/default, where localhost is replaced with the database server's IP and default with the name of the database to connect to.

3. The Hive hiveserver2 service must be started.
If hiveserver2 is not running, the program fails with the following error:

Exception in thread "main" java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.108:10000/ruozedata_test: java.net.SocketException: Connection reset

4. Result of running the program:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/software/apache-maven-3.3.9/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/software/apache-maven-3.3.9/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Running: select id,name,age from a
1	zhangsan	15

Process finished with exit code 0
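One thing the sample above does not do is close its JDBC resources. A minimal try-with-resources variant, assuming the same driver, URL, credentials and table as above (the class name HiveJdbcClientClosing is only an illustration), could look like this:

package com.ruozedata.day6;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcClientClosing {
    public static void main(String[] args) throws Exception {
        // Register the driver explicitly, same as in the sample above.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // try-with-resources closes the ResultSet, Statement and Connection
        // automatically, even if the query throws.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://192.168.1.108:10000/ruozedata_test", "root", "root");
             Statement stmt = con.createStatement();
             ResultSet res = stmt.executeQuery("select id,name,age from a")) {
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2) + "\t" + res.getInt(3));
            }
        }
    }
}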
Thank you for reading! That is all for "How to operate Hive with the Java API". Hopefully the content above is of some help and lets you learn a bit more. If you found the article useful, feel free to share it so more people can see it.