This post walks through Hive's basic operations over JDBC. Many readers are not yet familiar with them, so I'm sharing this write-up as a reference; I hope you get something useful out of it. Let's take a look.
Step 1: Configure the connection properties for Hive by creating hivedb.properties as follows:
driver=org.apache.hive.jdbc.HiveDriver
url=jdbc:hive2://master:10000/default
user=hive
password=xujun
Step 2: Create HiveProperties, which extends Properties and reads hivedb.properties (a singleton):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class HiveProperties extends Properties {
    /**
     * @author Berg
     * Reads hivedb.properties.
     */
    private static final long serialVersionUID = 1L;
    private static HiveProperties hiveProperties;

    private HiveProperties() {
        InputStream is = HiveProperties.class.getClassLoader()
                .getResourceAsStream("hivedb.properties");
        try {
            super.load(is);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static HiveProperties getInstance() {
        if (hiveProperties == null) {
            hiveProperties = new HiveProperties();
        }
        return hiveProperties;
    }
}
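One caveat: the null check in getInstance() is not thread-safe, so two threads could each construct an instance. If that matters in your setup, one common alternative is the initialization-on-demand holder idiom, sketched below; SafeHiveProperties is a hypothetical name for illustration, loading the same hivedb.properties:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class SafeHiveProperties extends Properties {
    private static final long serialVersionUID = 1L;

    private SafeHiveProperties() {
        // Load hivedb.properties from the classpath, as in HiveProperties.
        InputStream in = SafeHiveProperties.class.getClassLoader()
                .getResourceAsStream("hivedb.properties");
        if (in != null) {
            try {
                load(in);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    // The JVM initializes the Holder class lazily and exactly once,
    // so no explicit locking is needed.
    private static class Holder {
        private static final SafeHiveProperties INSTANCE = new SafeHiveProperties();
    }

    public static SafeHiveProperties getInstance() {
        return Holder.INSTANCE;
    }
}
```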
Step 3: Create HiveUtil, a simple wrapper around the basic Hive operations.
Note: data can be loaded either from a local file or from a file already on HDFS.
The difference is that loading from HDFS omits the local keyword.
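Since the only difference between the two statements is that one keyword, they can be built by a single helper; the LoadSql class below is purely illustrative, not part of the utility class that follows:

```java
public class LoadSql {
    // Build a LOAD DATA statement; fromLocal=true adds the "local" keyword,
    // telling Hive to copy from the client's filesystem instead of
    // moving a file that is already on HDFS.
    public static String loadData(String tableName, String filePath, boolean fromLocal) {
        String local = fromLocal ? "local " : "";
        return "load data " + local + "inpath '" + filePath + "' into table " + tableName;
    }
}
```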
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveUtil {

    // Load the JDBC driver once, in a static block.
    static {
        try {
            Class.forName(HiveProperties.getInstance().getProperty("driver"));
        } catch (ClassNotFoundException e) {
            e.printStackTrace(); // printed to standard error
        }
    }

    // Open a Hive connection.
    public Connection getConn() throws SQLException {
        HiveProperties hp = HiveProperties.getInstance();
        return DriverManager.getConnection(hp.getProperty("url"),
                hp.getProperty("user"), hp.getProperty("password"));
    }

    // Create a table.
    public boolean createTable(Statement statement, String sql) throws SQLException {
        /*
         * Statement.execute() returns true if the first result is a ResultSet,
         * and false if it is an update count or there is no result. CREATE is
         * not DML, so it returns false on success; that is why the value is
         * inverted here. A little awkward, but that is the JDBC contract.
         */
        return !statement.execute(sql);
    }

    // Check whether a table exists.
    public boolean tableIsExists(Statement statement, String tableName) throws SQLException {
        String sql = "show tables '" + tableName + "'";
        ResultSet rs = statement.executeQuery(sql);
        return rs.next();
    }

    // Drop a table.
    public boolean dropTable(Statement statement, String tableName) throws SQLException {
        String sql = "drop table " + tableName;
        return !statement.execute(sql);
    }

    // Show one table, or all tables when tableName is null or empty.
    public ResultSet showTables(Statement statement, String tableName) throws SQLException {
        String sql;
        if (tableName == null || tableName.isEmpty()) {
            sql = "show tables";
        } else {
            sql = "show tables '" + tableName + "'";
        }
        return statement.executeQuery(sql);
    }

    // Describe a table.
    public ResultSet describeTable(Statement statement, String tableName) throws SQLException {
        return statement.executeQuery("describe " + tableName);
    }

    /**
     * Load data into a table from a local file.
     * @param filePath a path such as /home/hadoop/mytestdata/hive.txt;
     *                 note that this path is on the (virtual) machine
     *                 the client runs on.
     */
    public boolean loadDataFromLocal(Statement statement, String tableName, String filePath) throws SQLException {
        String sql = "load data local inpath '" + filePath + "' into table " + tableName;
        return !statement.execute(sql);
    }

    // Load data into a table from a file on HDFS (no "local" keyword).
    public boolean loadDataFromHDFS(Statement statement, String tableName, String filePath) throws SQLException {
        String sql = "load data inpath '" + filePath + "' into table " + tableName;
        return !statement.execute(sql);
    }

    // Query all rows of a table.
    public ResultSet selectData(Statement statement, String tableName) throws SQLException {
        return statement.executeQuery("select * from " + tableName);
    }

    // Run a count query.
    public ResultSet countData(Statement statement, String sql) throws SQLException {
        return statement.executeQuery(sql);
    }

    // Close the statement first, then the connection that created it.
    public void close(Connection conn, Statement statement) throws SQLException {
        if (statement != null) {
            statement.close();
        }
        if (conn != null) {
            conn.close();
        }
    }
}
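The close() helper must be remembered in every caller, and a forgotten call leaks the connection. Since Connection and Statement are both AutoCloseable, try-with-resources closes them automatically, in reverse order of declaration. The sketch below shows that ordering with stand-in resources; the Resource class is hypothetical, just to make the close order observable without a live Hive server:

```java
import java.util.ArrayList;
import java.util.List;

public class CloseOrderDemo {
    static final List<String> closed = new ArrayList<>();

    // Stand-in for a JDBC resource such as Connection or Statement.
    static class Resource implements AutoCloseable {
        final String name;
        Resource(String name) { this.name = name; }
        @Override public void close() { closed.add(name); }
    }

    public static List<String> run() {
        // Resources declared first are closed last, so the "statement"
        // is closed before the "connection" that created it.
        try (Resource conn = new Resource("connection");
             Resource stmt = new Resource("statement")) {
            // ... execute queries here ...
        }
        return closed;
    }
}
```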
Step 4: Test the basic operations by creating TestMain as follows:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.junit.Test;

public class TestMain {
    private HiveUtil hu = new HiveUtil();
    private Connection conn = null;
    private Statement statement;

    // 1. Test the connection.
    @Test
    public void testConn() throws SQLException {
        conn = hu.getConn();
        System.out.println("Got connection: " + conn);
    }

    // 2. Create a table.
    @Test
    public void testCreateTable() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        String sql = "create table " + tableName
                + " (key int, value string) row format delimited fields terminated by '\t'";
        boolean result = hu.createTable(statement, sql);
        if (result) {
            System.out.println("Table created. " + result);
        } else {
            System.out.println("Failed to create table. " + result);
        }
    }

    // 3. Check whether the table exists.
    @Test
    public void testTableIsExists() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        if (hu.tableIsExists(statement, tableName)) {
            System.out.println("Table already exists.");
        } else {
            System.out.println("Table does not exist; create it first.");
        }
    }

    // 4. Drop the table, checking first that it exists.
    @Test
    public void testDropTable() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        if (hu.tableIsExists(statement, tableName)) {
            System.out.println("Table exists; dropping it:");
            if (hu.dropTable(statement, tableName)) {
                System.out.println("Table dropped.");
            } else {
                System.out.println("Failed to drop table.");
            }
        } else {
            System.out.println("Table does not exist; create it first.");
        }
    }

    // 5. Show tables.
    @Test
    public void testShowTables() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        ResultSet rs = hu.showTables(statement, tableName);
        while (rs.next()) {
            System.out.println("Table name: " + rs.getString(1));
        }
    }

    // 6. Describe the table.
    @Test
    public void testDescribeTable() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        // Check that the table exists first.
        if (hu.tableIsExists(statement, tableName)) {
            ResultSet rs = hu.describeTable(statement, tableName);
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        } else {
            System.out.println("Error: table does not exist, cannot describe it.");
        }
    }

    // 7. Load data, from the local filesystem or from HDFS.
    @Test
    public void testLoadData() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        //String localFilePath = "/home/hadoop/mytestdata/hive.txt";
        String hdfsFilePath = "hdfs://master:9000/user/hive/hive.txt";
        // Check that the table exists first.
        if (hu.tableIsExists(statement, tableName)) {
            // Load from a local file:
            //boolean result = hu.loadDataFromLocal(statement, tableName, localFilePath);
            // Load from a file on HDFS:
            boolean result = hu.loadDataFromHDFS(statement, tableName, hdfsFilePath);
            if (result) {
                System.out.println("File data loaded into table " + tableName);
            } else {
                System.out.println("Load failed.");
            }
        } else {
            System.out.println("Error: table does not exist, cannot load data into it.");
        }
    }

    // 8. Query the data in the table.
    @Test
    public void testSelectData() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        // Check that the table exists first.
        if (hu.tableIsExists(statement, tableName)) {
            ResultSet rs = hu.selectData(statement, tableName);
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getString(2));
            }
        } else {
            System.out.println("Error: table does not exist, cannot query it.");
        }
    }

    // 9. Count the rows in the table.
    @Test
    public void testCountData() throws SQLException {
        conn = hu.getConn();
        statement = conn.createStatement();
        String tableName = "testhive";
        // Check that the table exists first.
        if (hu.tableIsExists(statement, tableName)) {
            String sql = "select count(1) from " + tableName;
            ResultSet rs = hu.countData(statement, sql);
            while (rs.next()) {
                System.out.println("Current row count: " + rs.getString(1));
            }
        } else {
            System.out.println("Error: table does not exist, cannot count rows.");
        }
    }
}
Step 5: Test data (tab-separated):
1	Berg
1	Berg
2	Cccc
3	Xxxx
4	Jjjj
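The rows must use the same '\t' delimiter declared in the table's row format, or every value will land in the key column as NULL-padded garbage. One quick way to generate such a file is sketched below; the TestDataWriter class and the output path are illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;

public class TestDataWriter {
    // Write tab-separated (key, value) rows matching the
    // "fields terminated by '\t'" clause of the testhive table.
    public static Path write(Path file) throws IOException {
        List<String> rows = Arrays.asList(
                "1\tBerg", "1\tBerg", "2\tCccc", "3\tXxxx", "4\tJjjj");
        return Files.write(file, rows);
    }
}
```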
You can also pull in the dependencies with Maven; mine are as follows:
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.6.4</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.4</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>2.0.0</version>
<exclusions>
<exclusion>
<artifactId>
pentaho-aggdesigner-algorithm
</artifactId>
<groupId>org.pentaho</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.5</version>
</dependency>
</dependencies>
That covers the basics of operating Hive over JDBC. Thanks for reading, and I hope you found it helpful!