1. Background
After the Hive environment is set up, applications that connect to Hive remotely must supply a username and password. By default, HiveServer2 accepts empty credentials, so a custom username/password scheme needs to be configured.
2. Development Steps
2.1 Create a Maven project and add the Hadoop, Hive, and build dependencies to pom.xml:
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.11</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.3.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-service</artifactId>
    <version>3.1.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>3.1.2</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <encoding>UTF-8</encoding>
      </configuration>
    </plugin>
  </plugins>
</build>
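For this authenticator a plain jar is usually sufficient, since HiveServer2 already provides the Hadoop and Hive classes on its own classpath. If you do need to bundle dependencies into a single jar, a maven-assembly-plugin configuration such as the following sketch (plugin version is an assumption; adjust to your build) can be added next to the compiler plugin:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.3.0</version>
  <configuration>
    <descriptorRefs>
      <!-- produces an additional *-jar-with-dependencies.jar on mvn package -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>single</goal></goals>
    </execution>
  </executions>
</plugin>
```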
2.2 Create the custom authentication class:
package org.apache.hadoop.hive.contrib.auth;

import org.apache.commons.lang3.StringUtils;
import org.apache.hive.service.auth.PasswdAuthenticationProvider;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import javax.security.sasl.AuthenticationException;

/**
 * @describe: custom authentication for Hive
 * @author: 容若
 * @created: 2024-03-12 10:13
 */
public class CustomPasswdAuthenticator implements PasswdAuthenticationProvider {

    // Per-user password keys in hive-site.xml, e.g. hive.jdbc_passwd.auth.ext_bd
    private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

    private Configuration conf = null;

    @Override
    public void Authenticate(String userName, String passwd) throws AuthenticationException {
        System.out.println("user: " + userName + " try login.");
        String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
        if (StringUtils.isEmpty(passwdConf)) {
            String message = "No ACL configuration found for user: " + userName;
            System.out.println(message);
            throw new AuthenticationException(message);
        }
        if (!passwd.equals(passwdConf)) {
            String message = "Username and password do not match. user: " + userName;
            throw new AuthenticationException(message);
        }
    }

    public Configuration getConf() {
        if (conf == null) {
            this.conf = new Configuration(new HiveConf());
        }
        return conf;
    }

    public void setConf(Configuration conf) {
        this.conf = conf;
    }
}
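The lookup logic above can be sketched independently of Hive: the authenticator formats the key `hive.jdbc_passwd.auth.<user>` and compares the configured value against the supplied password. A minimal stand-alone sketch, with a plain `Map` standing in for Hadoop's `Configuration` (the class and method names here are illustrative, not part of Hive):

```java
import java.util.HashMap;
import java.util.Map;

// Stand-alone sketch of the password-lookup logic used by the authenticator.
public class PasswdLookupSketch {
    private static final String KEY_FORMAT = "hive.jdbc_passwd.auth.%s";

    static boolean authenticate(Map<String, String> conf, String user, String passwd) {
        // Build the per-user key, e.g. hive.jdbc_passwd.auth.ext_bd
        String configured = conf.get(String.format(KEY_FORMAT, user));
        // Reject unknown users and wrong passwords alike
        return configured != null && configured.equals(passwd);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("hive.jdbc_passwd.auth.ext_bd", "swsc600369");
        System.out.println(authenticate(conf, "ext_bd", "swsc600369")); // true
        System.out.println(authenticate(conf, "ext_bd", "wrong"));      // false
        System.out.println(authenticate(conf, "nobody", "x"));          // false
    }
}
```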
2.3 Package the project as a jar and place it in the lib directory under Hive's root (e.g. /opt/hive/lib). If the jar is placed anywhere else, HiveServer2 cannot find the class and reports an error like this:
2024-03-12T09:42:20,241 ERROR [HiveServer2-Handler-Pool: Thread-41] server.TThreadPoolServer: Error occurred during processing of message. java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator not found at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195) at
........
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator not found at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101) at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193) ... 13 more
2.4 Enable the custom jar
Edit hive-site.xml in Hive's conf directory and add the following properties:
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>
<property>
  <name>hive.jdbc_passwd.auth.ext_bd</name>
  <value>swsc600369</value>
</property>
The last property defines a user/password pair in hive-site.xml: username "ext_bd", password "swsc600369". Add one such property per user.
Note: the class name in hive.server2.custom.authentication.class must exactly match the fully qualified name of the class in your project.

2.5 Grant Java connections access to Hive (may already be configured during Hadoop installation)
Finally, you may need to edit Hadoop's core-site.xml under hadoop/etc/hadoop; otherwise Java connections to Hive are denied. The "hadoop" segment in the property names below is the OS user that runs HiveServer2; replace it with your own if different:
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>
2.6 Restart the services
Restart Hadoop and Hive, then use the beeline tool to check that HiveServer2 is configured correctly: open another terminal, change into hive/bin, and run beeline -u jdbc:hive2://ip:10000 to confirm that HiveServer2 started normally.
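To exercise the custom credentials specifically, rather than an anonymous connection, beeline also accepts a username and password on the command line (substitute your HiveServer2 host for ip):

```shell
# Connect with the user/password pair configured in hive-site.xml
beeline -u jdbc:hive2://ip:10000 -n ext_bd -p swsc600369
```

A wrong password should now be rejected by CustomPasswdAuthenticator, which confirms the custom authentication is active.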
2.7 Test the Hive connection from Java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/**
 * @describe: connection test for the custom Hive authentication
 * @author: 容若
 * @created: 2024-03-12 16:27
 */
public class HiveAuthTest {

    private static final String URLHIVE = "jdbc:hive2://190.150.19.179:10000";

    private static Connection connection = null;

    public static void main(String[] args) throws SQLException {
        String sql1 = "select * from table limit 1";
        PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
        ResultSet rs = pstm.executeQuery(sql1);
        while (rs.next()) {
            System.out.println(rs.getString(2));
        }
        // Close the ResultSet before the statement that created it
        rs.close();
        pstm.close();
    }

    public static Connection getHiveConnection() {
        if (null == connection) {
            synchronized (HiveAuthTest.class) {
                if (null == connection) {
                    try {
                        Class.forName("org.apache.hive.jdbc.HiveDriver");
                        connection = DriverManager.getConnection(URLHIVE, "ext_bd", "swsc600369");
                        System.out.println("Hive connection established.");
                    } catch (SQLException | ClassNotFoundException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        return connection;
    }
}

2.8 Run and debug
If execution fails with an error like the following:
java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=zhangsan, access=EXECUTE, inode="/tmp/hive":root:supergroup:drwx------ at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) at
.........
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982) at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049) at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:415) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
the HDFS directory lacks the necessary permissions. Change into hadoop/bin and run:
[root@master bin]# ./hdfs dfs -chmod -R 777 /tmp
(777 is convenient for testing; in production, restricting the change to /tmp/hive or granting access only to the connecting user is safer.)