Hive 2.1.0 Installation and Configuration

Gascognya 2022-02-04


Prerequisites:

Download the installation archive: apache-hive-2.1.0-bin.tar.gz

MySQL and Hadoop are already installed.

Extract the archive apache-hive-2.1.0-bin.tar.gz.
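A minimal sketch of unpacking and moving it into place (the target path matches the HIVE_HOME used below; adjust to your own layout):

tar -zxvf apache-hive-2.1.0-bin.tar.gz
# move the unpacked directory to the install location assumed by this article
mv apache-hive-2.1.0-bin /home/liguodong/install/hive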

Set the Hive environment variables (append to ~/.bashrc):

export JAVA_HOME=/home/liguodong/install/jdk
export HADOOP_HOME=/home/liguodong/install/hadoop
export HIVE_HOME=/home/liguodong/install/hive
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$PATH"

Before creating tables in Hive, create on HDFS the directories configured in hive-site.xml: /usr/liguodong/tmp and /user/hive/warehouse (the default value of the hive.metastore.warehouse.dir property), and grant group write permission on them.

hdfs dfs -mkdir -p /usr/liguodong/tmp
hdfs dfs -mkdir -p /user/hive/warehouse
# grant group write permission
hdfs dfs -chmod g+w /usr/liguodong/tmp
hdfs dfs -chmod g+w /user/hive/warehouse
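To confirm the directories exist with the expected permissions (assuming HDFS is already running):

hdfs dfs -ls /usr/liguodong
hdfs dfs -ls /user/hive
# each listing should show drwxrwxr-x (group write set) on tmp and warehouse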

Apply the environment variables:

source .bashrc
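A quick check that the variables took effect:

which hive
hive --version    # should report Hive 2.1.0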

Configure Hive

Copy the template configuration files (in $HIVE_HOME/conf):

cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml
cp hive-log4j2.properties.template hive-log4j2.properties
cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties

Edit hive-env.sh

Because Hive runs on top of Hadoop, the Hadoop installation path must be specified in hive-env.sh:

export JAVA_HOME=/home/liguodong/install/jdk
export HADOOP_HOME=/home/liguodong/install/hadoop
export HIVE_HOME=/home/liguodong/install/hive
export HIVE_CONF_DIR=/home/liguodong/install/hive/conf

Edit hive-site.xml

Replace every occurrence of ${system:java.io.tmpdir} and ${system:user.name} in hive-site.xml with a concrete path and user name.
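One way to do the replacement in bulk, assuming the tmp directory created above and the user liguodong (single quotes keep the shell from expanding ${...}):

sed -i 's#${system:java.io.tmpdir}#/usr/liguodong/tmp#g' hive-site.xml
sed -i 's#${system:user.name}#liguodong#g' hive-site.xml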

HDFS must be running before you launch the hive command; if it is not, start it with start-dfs.sh.
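A quick way to confirm the HDFS daemons are up (jps lists running Java processes):

start-dfs.sh
jps    # expect NameNode, DataNode and SecondaryNameNode in the output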

Starting with Hive 2.1, you must first run the schematool command to initialize the metastore schema.

schematool -dbType mysql -initSchema
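schematool reads the metastore connection settings from hive-site.xml. A minimal sketch, assuming a local MySQL database named hive with user/password hive/hive (illustrative values, not from this article) and the MySQL Connector/J jar copied into $HIVE_HOME/lib:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>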

Enter the Hive command line

hive
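A quick smoke test once the CLI is up (plain HiveQL, nothing installation-specific):

show databases;
create table test(id int);
show tables;
drop table test;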

beeline

Start HiveServer2 in the background, then launch the Beeline client:

./bin/hive --service hiveserver2  &

bin/beeline

Embedded mode:
beeline> !connect jdbc:hive2://
Connecting to jdbc:hive2://
Enter username for jdbc:hive2://:
Enter password for jdbc:hive2://:
Remote mode:
beeline -u jdbc:hive2://192.168.133.147:10000 -n root -p liguodong
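Here -u supplies the JDBC connection URL, -n the username, and -p the password.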

Errors:

beeline> !connect jdbc:hive2://192.168.133.147:10000 scott tiger
Connecting to jdbc:hive2://192.168.133.147:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/liguodong/install/hive/lib/hive-jdbc-2.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/liguodong/install/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/liguodong/install/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Error: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: liguodong is not allowed to impersonate scott (state=,code=0)

The key message: User: liguodong is not allowed to impersonate scott

Solution: add the following proxy-user properties to Hadoop's core-site.xml (liguodong is the user running HiveServer2), then restart Hadoop:

<property>
  <name>hadoop.proxyuser.liguodong.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.liguodong.groups</name>
  <value>*</value>
</property>
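If you would rather not restart the whole cluster, the proxy-user settings can usually be reloaded in place (verify these refresh commands against your Hadoop version):

hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration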
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.) (state=08S01,code=1)

Solution: point clients at a remote metastore by setting hive.metastore.uris in hive-site.xml:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://192.168.133.147:9083</value>
  <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
</property>

Then start the metastore service:

liguodong@gcc-p7-1015cn:~/install/hive$ hive --service metastore
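To keep the metastore running after the terminal closes, it can also be started in the background (nohup is standard; the log file name is an arbitrary choice):

nohup hive --service metastore > metastore.log 2>&1 &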


