Hive Installation, Configuration, and HWI

1. Installing Hive

  • Environment: CentOS 7
  • Hadoop 2.7.3 already installed
wget https://mirrors.cnnic.cn/apache/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz --no-check-certificate
mkdir -p /usr/local/hadoop
mv apache-hive-2.1.1-bin.tar.gz /usr/local/hadoop
cd /usr/local/hadoop
tar -zxvf apache-hive-2.1.1-bin.tar.gz
mv apache-hive-2.1.1-bin apache-hive-2.1.1

echo 'export HIVE_HOME=/usr/local/hadoop/apache-hive-2.1.1' >> /etc/profile
echo 'export PATH=$PATH:$HIVE_HOME/bin' >> /etc/profile

source /etc/profile
chown -R hadoop:hadoop /usr/local/hadoop/apache-hive-2.1.1/
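A quick sanity check that the variables took effect (assumes /etc/profile has been re-sourced in the current shell):

echo $HIVE_HOME   # should print /usr/local/hadoop/apache-hive-2.1.1
which hive        # should resolve to $HIVE_HOME/bin/hive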

Installing MySQL

wget http://dev.mysql.com/get/mysql-community-release-el7-5.noarch.rpm
rpm -ivh mysql-community-release-el7-5.noarch.rpm
yum install mysql-community-server
service mysqld status
service mysqld start
mysql -uroot -p
use mysql;
update user set password=PASSWORD("hadoop") where user="root";
flush privileges;
quit
service mysqld restart
mysql -uroot -phadoop
or: mysql -uroot -hmaster -phadoop
If you can log in successfully, MySQL has been installed correctly.
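As a non-interactive check (using the root password set above):

mysql -uroot -phadoop -e "SELECT VERSION();"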
Create the Hive user:
mysql>CREATE USER 'hive' IDENTIFIED BY 'hive';
mysql>GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' WITH GRANT OPTION;
mysql>GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' IDENTIFIED BY 'hive';
mysql>flush privileges;
Create the Hive database:
mysql>create database hive;
mysql>quit;
service mysqld restart
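A minimal check that the new account and database are usable:

mysql -uhive -phive -e "SHOW DATABASES LIKE 'hive';"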
cd /usr/local/hadoop/apache-hive-2.1.1/conf
cp hive-default.xml.template hive-site.xml
vi hive-site.xml
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value> <!-- MySQL username -->
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value> <!-- MySQL password -->
</property>
  • Add the MySQL JDBC connector
cd /usr/local/hadoop/apache-hive-2.1.1/lib
wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.38/mysql-connector-java-5.1.38.jar
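To confirm the driver class is actually inside the jar (optional; requires unzip):

unzip -l mysql-connector-java-5.1.38.jar | grep Driver.class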
Distribute to the slaves (do not run this step):
scp -r /usr/local/hadoop/apache-hive-2.1.1 slave1.whr.com:/usr/local/hadoop/
scp -r /usr/local/hadoop/apache-hive-2.1.1 slave2.whr.com:/usr/local/hadoop/
scp -r /usr/local/hadoop/apache-hive-2.1.1 slave3.whr.com:/usr/local/hadoop/
  • Configure the environment variables on each slave… (do not run this step)
  • Start Hive:
cd /usr/local/hadoop/apache-hive-2.1.1/bin
./hive --service metastore &
./hive --service hiveserver2 &
# 启动命令行
./hive
hive>show tables;
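A throwaway table makes a quick end-to-end test of the metastore (the table name is hypothetical):

hive> CREATE TABLE smoke_test (id INT, name STRING);
hive> SHOW TABLES;
hive> DROP TABLE smoke_test;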

Other ways to start Hive

# CLI mode
./hive --service cli
# web interface
./hive --service hwi
# JDBC (HiveServer1 was removed in Hive 2.x; use hiveserver2)
nohup hive --service hiveserver2 &
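Once hiveserver2 is up, a JDBC client such as Beeline can connect (10000 is the default port; the user name is an assumption):

beeline -u jdbc:hive2://localhost:10000 -n hadoop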

2. Common Problems

Required table missing : "`DBS`" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"

Fix

cd /usr/local/hadoop/apache-hive-2.1.1/bin
./schematool -dbType mysql -initSchema
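The resulting schema version can then be verified with the same tool:

./schematool -dbType mysql -info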
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/apache-hive-2.1.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

Fix

rm /usr/local/hadoop/apache-hive-2.1.1/lib/log4j-slf4j-impl-2.4.1.jar
Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}

Fix
Create a new directory:

cd /usr/local/hadoop/apache-hive-2.1.1/
mkdir tmpdir

vi conf/hive-site.xml

<property>
<name>hive.querylog.location</name>
<value>${system:java.io.tmpdir}/${system:user.name}</value>
<description>Location of Hive run time structured log file</description>
</property>

Replace every occurrence of ${system:java.io.tmpdir}/${system:user.name} in the file, for example:
<property>
<name>hive.querylog.location</name>
<value>/usr/local/hadoop/apache-hive-2.1.1/tmpdir</value>
<description>Location of Hive run time structured log file</description>
</property>
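A sed one-liner can do the substitution in bulk (back up hive-site.xml first; the target path is the tmpdir created above):

cd /usr/local/hadoop/apache-hive-2.1.1
sed -i 's#${system:java.io.tmpdir}/${system:user.name}#/usr/local/hadoop/apache-hive-2.1.1/tmpdir#g' conf/hive-site.xml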

3. Basic Operations

hive> show tables;    # list all tables
hive> dfs -ls /;      # list the HDFS root directory
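The CLI can also run local shell commands and tweak session settings, for example:

hive> !pwd;                              # run a local shell command
hive> set hive.cli.print.header=true;    # print column headers in query output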

4. Installing Hive HWI

wget http://apache.fayea.com/hive/hive-2.1.1/apache-hive-2.1.1-src.tar.gz
tar -xzf apache-hive-2.1.1-src.tar.gz
cd apache-hive-2.1.1-src/hwi
jar cfM hive-hwi-2.1.1.war -C web .
cp hive-hwi-2.1.1.war /usr/local/hadoop/apache-hive-2.1.1/lib
wget https://www.apache.org/dist/ant/binaries/apache-ant-1.10.1-bin.tar.gz
tar -xzf apache-ant-1.10.1-bin.tar.gz
cp apache-ant-1.10.1/lib/ant.jar ${HIVE_HOME}/lib
chmod 777 ${HIVE_HOME}/lib/ant.jar
vim /usr/local/hadoop/apache-hive-2.1.1/conf/hive-site.xml
<property>
<name>hive.hwi.war.file</name>
<value>lib/hive-hwi-2.1.1.war</value>
<description>This sets the path to the HWI war file, relative to ${HIVE_HOME}. </description>
</property>
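The listen address and port can be set explicitly as well; the values below are Hive's defaults:

<property>
<name>hive.hwi.listen.host</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hive.hwi.listen.port</name>
<value>9999</value>
</property>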
ln -s $JAVA_HOME/lib/tools.jar $HIVE_HOME/lib/
cd /usr/local/hadoop/apache-hive-2.1.1/bin
./hive --service hwi 2>/tmp/hwi2.log &
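If the page does not come up, check the log and confirm the port is listening (netstat may require the net-tools package on CentOS 7):

tail /tmp/hwi2.log
netstat -tlnp | grep 9999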

Open:
http://192.10.200.81:9999/hwi
Almost every page needs to be refreshed a few times before it renders.

Additional troubleshooting is covered on the official page:
https://cwiki.apache.org/confluence/display/Hive/HiveWebInterface

