This article walks through the basics of using Hive: run modes, ways to access Hive, data types, data storage, table operations, queries, joins, the JDBC client, and user-defined functions.
1. Run modes (cluster vs. local)
1.1 Cluster mode: jobs are submitted to the cluster's JobTracker, e.g. > SET mapred.job.tracker=<jobtracker-host:port>
1.2 Local mode: > SET mapred.job.tracker=local
2. Three ways to access Hive
2.1 Command-line terminal
# hive   or   # hive --service cli
2.2 Web interface, port 9999
# hive --service hwi &
2.3 Hive remote service (Thrift), port 10000
# hive --service hiveserver &
3. Data types
3.1 Primitive types:
Type     | Size
tinyint  | 1 byte (-2^7 ~ 2^7-1, i.e. -128 ~ 127)
smallint | 2 bytes (-2^15 ~ 2^15-1)
int      | 4 bytes (-2^31 ~ 2^31-1)
bigint   | 8 bytes (-2^63 ~ 2^63-1)
float    | 4 bytes, single precision
double   | 8 bytes, double precision
string   | variable length
boolean  | true/false
3.2 Complex types: ARRAY, MAP, STRUCT, UNION
4. Data storage
4.1 Based on HDFS
4.2 Storage structures: database, table, file, view
4.3 Data files are parsed simply by specifying the row and column delimiters
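As a sketch of point 4.3, a table whose field delimiter matches the data file can read it directly (the file path and contents here are hypothetical):

```sql
-- Suppose /data/users.txt contains tab-delimited lines such as:
--   1<TAB>alice
--   2<TAB>bob
-- A table declared with a matching delimiter parses the file as-is:
create table users(id bigint, name string)
row format delimited fields terminated by '\t';

load data local inpath '/data/users.txt' into table users;
select * from users;
```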
5. Basic operations
5.1 Create a database: > create database db_name;
5.2 Switch database: > use db_name;
5.3 List tables: > show tables;
5.4 Create tables
5.4.1 Managed (internal) table, the default: create table table_name(col1 type1, col2 type2, ...) row format delimited fields terminated by '<delimiter>';
Example: create table trade_detail(id bigint, account string, income double, expenses double, time string) row format delimited fields terminated by '\t';
A managed table behaves like a database table and is stored on HDFS (the location is set by the hive.metastore.warehouse.dir property; all tables except external tables live there). When the table is dropped, its data and its metadata are deleted together.
Load data: load data local inpath 'path' into table table_name;
5.4.2 Partitioned table: create table table_name(col1 type1, col2 type2, ...) partitioned by (part_col type) row format delimited fields terminated by '<delimiter>';
Example: create table td_part(id bigint, account string, income double, expenses double, time string) partitioned by (logdate string) row format delimited fields terminated by '\t';
Difference from an ordinary table: the data is split across partition files, and each partition corresponds to a subdirectory under the table's directory. Although the partition columns are not stored in the data files themselves, they can be queried like ordinary columns.
Load data: load data local inpath 'path' into table table_name partition (part_col1='value', part_col2='value', ...);
Add a partition: alter table partition_table add partition (daytime='2013-02-04', city='bj');
Drop a partition: alter table partition_table drop partition (daytime='2013-02-04', city='bj'); the partition's metadata and data files are removed, though the parent directories may remain.
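A short sketch of day-by-day loading into the td_part table above (the file paths and dates are made up for illustration):

```sql
-- Load one day's file into its own partition
load data local inpath '/data/2013-02-04.txt'
into table td_part partition (logdate='2013-02-04');

load data local inpath '/data/2013-02-05.txt'
into table td_part partition (logdate='2013-02-05');

-- List the partitions Hive knows about; each maps to one HDFS subdirectory
show partitions td_part;
```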
5.4.3 External table: create external table td_ext(id bigint, account string, income double, expenses double, time string) row format delimited fields terminated by '\t' location 'hdfs_path';
When an external table is dropped, only its metadata is removed; the data files stay in place.
Load data: load data inpath 'hdfs_path' into table table_name;
5.4.4 Bucketed table: the data is hashed on a column and distributed across several files.
Create: create table bucket_table(id string) clustered by(id) into 4 buckets;
Load data:
set hive.enforce.bucketing = true;
This setting must be enabled before loading, otherwise the data is not actually bucketed.
insert into table bucket_table select name from stu;
insert overwrite table bucket_table select name from stu;
When data is loaded into a bucketed table, Hive hashes the bucketing column and takes the result modulo the number of buckets to decide which file each row is written to.
Sampling: select * from bucket_table tablesample(bucket 1 out of 4 on id);
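The tablesample clause above selects whole buckets. As a sketch of its semantics on the 4-bucket table (bucket numbers are illustrative): tablesample(bucket x out of y on col) divides the rows into y groups by hash(col) and returns group x, so y need not equal the declared bucket count.

```sql
-- Roughly 1/4 of the rows: exactly the first of the 4 physical buckets
select * from bucket_table tablesample(bucket 1 out of 4 on id);

-- Roughly 1/2 of the rows: on a 4-bucket table this reads buckets 1 and 3
select * from bucket_table tablesample(bucket 1 out of 2 on id);
```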
6. Create a view: CREATE VIEW v1 AS select * from t1;
7. Alter a table: alter table tb_name add columns (col_name type);
8. Drop a table: drop table tb_name;
9. Importing data
9.1 Load: LOAD DATA [LOCAL] INPATH 'filepath' [OVERWRITE] INTO TABLE tablename [PARTITION (partcol1=val1, partcol2=val2 ...)]
LOAD does not transform the data in any way; it only copies (for a LOCAL load) or moves (for an HDFS load) the files into the table's location.
9.2 Table-to-table insert: INSERT OVERWRITE TABLE tablename [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement FROM from_statement
9.3 Create-table-as-select (CTAS): CREATE TABLE [IF NOT EXISTS] table_name AS SELECT * FROM tb_name; the new table's schema is derived from the SELECT list.
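For example, using the trade_detail table from section 5.4.1 (the threshold and target table name are made up):

```sql
-- Materialize a subset of trade_detail into a new table; columns and
-- types are taken from the SELECT list, not declared explicitly
create table high_income as
select account, income
from trade_detail
where income > 1000;
```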
10. Queries
10.1 Syntax
SELECT [ALL | DISTINCT] select_expr, select_expr, ...
FROM table_reference
[WHERE where_condition]
[GROUP BY col_list]
[ CLUSTER BY col_list | [DISTRIBUTE BY col_list] [SORT BY col_list] | [ORDER BY col_list] ]
[LIMIT number]
ALL (the default) returns every matching row; DISTINCT removes duplicate rows.
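A sketch of the ordering clauses using the trade_detail table from earlier: ORDER BY gives a total order through a single reducer; DISTRIBUTE BY plus SORT BY sorts within each reducer; CLUSTER BY is shorthand for both on the same column.

```sql
-- Globally sorted output (funnels through one reducer)
select account, income from trade_detail order by income desc;

-- Rows with the same account go to the same reducer, sorted there by time
select account, time, income from trade_detail
distribute by account sort by account, time;

-- Equivalent shorthand when distributing and sorting on the same column
select account, income from trade_detail cluster by account;
```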
10.2 Querying partitions
Partition pruning (input pruning) works like a "partition index": it is triggered only when the WHERE clause filters on a partition column, in which case Hive reads only the matching partition directories.
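With the partitioned td_part table from section 5.4.2, pruning looks like this:

```sql
-- Only the logdate='2013-02-04' subdirectory is scanned; every other
-- partition is pruned before the MapReduce job starts
select count(*) from td_part where logdate = '2013-02-04';
```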
10.3 LIMIT clause
LIMIT caps the number of rows returned. Without an ORDER BY, which rows are returned is arbitrary. Syntax: SELECT * FROM t1 LIMIT 5;
10.4 Top N
SET mapred.reduce.tasks = 1;
SELECT * FROM sales SORT BY amount DESC LIMIT 5;
11. Joins
11.1 Inner join: select b.name, a.* from dim_ac a join acinfo b on (a.ac = b.acip) limit 10;
11.2 Left outer join: select b.name, a.* from dim_ac a left outer join acinfo b on a.ac = b.acip limit 10;
12. Java client
12.1 Start the remote service: # hive --service hiveserver
12.2 Sample code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
Connection con = DriverManager.getConnection("jdbc:hive://192.168.1.102:10000/wlan_dw", "", "");
Statement stmt = con.createStatement();
String querySQL = "SELECT * FROM wlan_dw.dim_m order by flux desc limit 10";
ResultSet res = stmt.executeQuery(querySQL);
while (res.next()) {
    System.out.println(res.getString(1) + "\t" + res.getLong(2) + "\t"
            + res.getLong(3) + "\t" + res.getLong(4) + "\t" + res.getLong(5));
}
13. User-defined functions (UDF)
13.1 A UDF can be used directly in a SELECT statement to format the query results before they are output.
13.2 Notes on writing a UDF:
a) A custom UDF must extend org.apache.hadoop.hive.ql.exec.UDF.
b) It must implement an evaluate method; evaluate can be overloaded.
13.3 Steps
a) Package the program into a jar and copy it to the target machine.
b) In the Hive client, add the jar: hive> add jar /run/jar/udf_test.jar;
c) Create a temporary function: hive> CREATE TEMPORARY FUNCTION add_example AS 'hive.udf.Add';
d) Use it in HQL:
SELECT add_example(8, 9) FROM scores;
SELECT add_example(scores.math, scores.art) FROM scores;
SELECT add_example(6, 7, 8, 6.8) FROM scores;
e) Drop the temporary function: hive> DROP TEMPORARY FUNCTION add_example;
Note: a UDF maps one input row to one output value; for many-rows-in, one-value-out aggregation, implement a UDAF instead.
13.4 Code

package cn.itheima.bigdata.hive;

import java.util.HashMap;
import org.apache.hadoop.hive.ql.exec.UDF;

public class AreaTranslationUDF extends UDF {

    private static HashMap<String, String> areaMap = new HashMap<String, String>();
    static {
        areaMap.put("138", "beijing");
        areaMap.put("139", "shanghai");
        areaMap.put("137", "guangzhou");
        areaMap.put("136", "niuyue");
    }

    // Translates a phone number's prefix to its home area; evaluate must be
    // declared public or Hive cannot invoke it
    public String evaluate(String phonenbr) {
        String area = areaMap.get(phonenbr.substring(0, 3));
        return area == null ? "other" : area;
    }

    // Overload: returns the sum of two integer columns
    public int evaluate(int x, int y) {
        return x + y;
    }
}