
[Hadoop] Hive Installation in Practice


1. Download the Hive package:

  Official download page: http://hive.apache.org/downloads.html
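
  For example, the 1.2.1 binary release can be fetched directly from the Apache archive; a sketch (the archive URL follows the standard Apache layout and is an assumption, so adjust it to whichever mirror you actually use):

  wget https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-bin.tar.gz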

2. Upload the Hive tarball and extract it:
It is recommended to place it at the same level as the Hadoop directory, which makes later use more convenient.

  Extract: tar -zxvf apache-hive-1.2.1-bin.tar.gz -C /home/hadoop/hive

  Rename the extracted directory: mv apache-hive-1.2.1-bin hive-1.2.1
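
  Optionally, set HIVE_HOME and add Hive's bin directory to the PATH so the hive command can be run from anywhere; a minimal sketch for a bash shell, assuming the paths used above:

  # Append to ~/.bashrc for the hadoop user
  export HIVE_HOME=/home/hadoop/hive/hive-1.2.1
  export PATH=$PATH:$HIVE_HOME/bin
  # Reload the shell configuration
  source ~/.bashrc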

3. Install MySQL
  MySQL is used to store Hive's metadata (see the earlier article for the installation steps).
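
  If you would rather not reuse the root account shown in the configuration below, a dedicated metastore database and user can be prepared first; a sketch using MySQL 5.x syntax (the hive database name matches the JDBC URL in step 4, while the hiveuser account and its password are hypothetical):

mysql> CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET latin1;
mysql> GRANT ALL PRIVILEGES ON hive.* TO 'hiveuser'@'localhost' IDENTIFIED BY 'hivepassword';
mysql> FLUSH PRIVILEGES;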

4. Edit the configuration file: this mainly sets how the metastore (metadata store) is kept.
  4.1 vi /home/hadoop/hive/hive-1.2.1/conf/hive-site.xml (storage options: embedded Derby, local MySQL, remote MySQL)

  4.2 Paste the following content:

<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
        <description>username to use against metastore database</description>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>root</value> 
        <description>password to use against metastore database</description>
    </property>
</configuration>
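
With the MySQL connection in place, the metastore schema can also be initialized explicitly before the first start; a sketch using the schematool utility shipped with Hive 1.x (optional here, since createDatabaseIfNotExist=true above lets the database be created automatically):

  /home/hadoop/hive/hive-1.2.1/bin/schematool -dbType mysql -initSchema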

5. Copy the jar:

  Copy the MySQL JDBC driver jar into Hive's lib directory.

    Download link: https://pan.baidu.com/s/17iHOIjt4XZbRAngGFf_GgA
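
  For example (the connector file name below is an assumption; use whichever version you downloaded):

  cp mysql-connector-java-5.1.47-bin.jar /home/hadoop/hive/hive-1.2.1/lib/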

6. Start Hive:
  (1) Start the Hadoop cluster before starting Hive (a sketch of this follows at the end of this step).

  (2) Run Hive as the hadoop user.

  Start command: /home/hadoop/hive/hive-1.2.1/bin/hive

  A prompt like the following indicates a successful start:
hive>
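
  A sketch of bringing the cluster up first, assuming a Hadoop 2.x installation whose sbin scripts are on the PATH:

  # As the hadoop user, start HDFS and YARN before launching Hive
  start-dfs.sh
  start-yarn.sh
  # jps should list NameNode/DataNode (and ResourceManager/NodeManager)
  jps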
  

7. Verify that Hive is working. After starting Hive, run the following commands:

hive> show databases;
OK
default
test_db
Time taken: 0.567 seconds, Fetched: 2 row(s)

hive> use default;
OK
Time taken: 0.068 seconds

hive> show tables;
OK
Time taken: 0.086 seconds

8. Create a database. Its data files are stored in HDFS under /user/hive/warehouse/test_db.db:

hive> create database test_db;
OK
Time taken: 0.505 seconds
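
The warehouse location recorded for the new database can be checked from within Hive as well; a sketch (output omitted):

hive> describe database test_db;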

9. Create a table in test_db. The table's data files are stored in HDFS under /user/hive/warehouse/test_db.db/flat1_test,
and the fields in each data file are separated by '|':

use test_db;

create table flat1_test (mobile string,opr_type string,lastupdatetime string,monthly string,sp_code string,oper_code string,unknown string,subtime string)
row format delimited
fields terminated by '|';

10. Upload the data file to the table's directory in HDFS (the Hive warehouse directory for this table):
  hadoop fs -put hivefile1.txt /user/hive/warehouse/test_db.db/flat1_test
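
  Equivalently, the file can be loaded from the local filesystem through Hive itself; a sketch (the path to hivefile1.txt is assumed to be the current working directory):

hive> use test_db;
hive> load data local inpath './hivefile1.txt' into table flat1_test;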

11. Query the data with SQL:
hive> select * from flat1_test;
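
A plain SELECT * only reads the files directly, so an aggregate query can also be run to confirm that MapReduce jobs execute correctly; a sketch (output omitted):

hive> select count(*) from flat1_test;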

12. Inspect Hive's metadata by querying it in MySQL:

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| hive               |
| mysql              |
| performance_schema |
| test               |
+--------------------+
5 rows in set (0.00 sec)

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> 
mysql> show tables;
+---------------------------+
| Tables_in_hive            |
+---------------------------+
| BUCKETING_COLS            |
| CDS                       |
| COLUMNS_V2                |
| DATABASE_PARAMS           |
| DBS                       |
| FUNCS                     |
| FUNC_RU                   |
| GLOBAL_PRIVS              |
| IDXS                      |
| INDEX_PARAMS              |
| PARTITIONS                |
| PARTITION_KEYS            |
| PARTITION_KEY_VALS        |
| PARTITION_PARAMS          |
| PART_COL_PRIVS            |
| PART_COL_STATS            |
| PART_PRIVS                |
| ROLES                     |
| SDS                       |
| SD_PARAMS                 |
| SEQUENCE_TABLE            |
| SERDES                    |
| SERDE_PARAMS              |
| SKEWED_COL_NAMES          |
| SKEWED_COL_VALUE_LOC_MAP  |
| SKEWED_STRING_LIST        |
| SKEWED_STRING_LIST_VALUES |
| SKEWED_VALUES             |
| SORT_COLS                 |
| TABLE_PARAMS              |
| TAB_COL_STATS             |
| TBLS                      |
| TBL_COL_PRIVS             |
| TBL_PRIVS                 |
| VERSION                   |
+---------------------------+
35 rows in set (0.01 sec)

mysql> select * from DBS;
+-------+-----------------------+-------------------------------------------------------+---------+------------+------------+
| DB_ID | DESC                  | DB_LOCATION_URI                                       | NAME    | OWNER_NAME | OWNER_TYPE |
+-------+-----------------------+-------------------------------------------------------+---------+------------+------------+
|     1 | Default Hive database | hdfs://XXXXXXXXXX:9000/user/hive/warehouse            | default | public     | ROLE       |
|     6 | NULL                  | hdfs://XXXXXXXXXX:9000/user/hive/warehouse/test_db.db | test_db | hadoop     | USER       |
+-------+-----------------------+-------------------------------------------------------+---------+------------+------------+
2 rows in set (0.00 sec)

mysql>          
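
Per-table metadata lives in the TBLS table and can be inspected the same way; a sketch (output omitted; the columns named here exist in the Hive 1.2 metastore schema):

mysql> select TBL_ID, DB_ID, TBL_NAME, TBL_TYPE, OWNER from TBLS;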