
Connecting to Hive from Java via JDBC

Java · 柯广

Hive series articles

  1. Basic operations on Hive tables
  2. Collection data types in Hive
  3. Hive dynamic partitioning explained
  4. Importing data into ORC-format Hive tables
  5. Connecting to Hive from Java via JDBC
  6. Accessing Hive through HiveServer2
  7. Connecting SpringBoot to Hive for self-service data retrieval
  8. Mapping Hive tables to HBase
  9. How to use Hive UDFs
  10. Text segmentation in Hive with a UDF
  11. Usage of the Hive window function row_number
  12. Zipper tables in the data warehouse

Connecting to Hive from Java to run statistical analyses is routine work. This post covers two approaches: a direct Java connection, and integration with MyBatis.

1. Direct Java connection

Connecting directly from Java works the same way as connecting to MySQL; only the driverName differs.

public class HiveJdbcConnect implements Closeable {

    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    //    jdbc:hive2://
    private String url;
    private String user;
    private String password;

    private Connection conn = null;

    public HiveJdbcConnect(String url, String user, String password) {
        this.url = url;
        this.user = user;
        this.password = password;
        try {
            init();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void init() throws Exception {
        Class.forName(driverName);
        conn = DriverManager.getConnection(url, user, password);
    }

    public Connection getConn() {
        return conn;
    }

    // Required by Closeable; releases the underlying JDBC connection.
    @Override
    public void close() throws IOException {
        if (conn != null) {
            try {
                conn.close();
            } catch (SQLException e) {
                throw new IOException(e);
            }
        }
    }
}
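The url field above carries a HiveServer2 JDBC URL (the `jdbc:hive2://` scheme noted in the comment). As a minimal sketch, assuming illustrative host, port, and database values, the URL can be assembled like this:

```java
public class HiveUrlBuilder {

    // Assemble a HiveServer2 JDBC URL from its parts.
    // host, port, and db here are assumptions for illustration.
    public static String buildUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("localhost", 10000, "db"));
    }
}
```

The resulting string matches the url used in the MyBatis configuration later in this post.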

Test case:

@Test
public void test() {
    HiveConfigModel hiveConfigModel = ConfigureContext.getInstance("hive-config.properties")
            .addClass(HiveConfigModel.class)
            .getModelProperties(HiveConfigModel.class);

    try {
        Connection conn = DriverManager.getConnection(hiveConfigModel.getUrl(),
                hiveConfigModel.getUsername(), hiveConfigModel.getPassword());

        String sql = "show tables";
        PreparedStatement preparedStatement = conn.prepareStatement(sql);
        ResultSet rs = preparedStatement.executeQuery();
        List<String> tables = new ArrayList<>();
        while (rs.next()) {
            tables.add(rs.getString(1));
        }

        System.out.println(tables);
    } catch (SQLException e) {
        e.printStackTrace();
    }
}

2. Integrating with MyBatis

The required Maven dependencies; to save space, only the main ones are listed:

<dependency>
    <groupId>org.mybatis</groupId>
    <artifactId>mybatis</artifactId>
    <version>3.2.7</version>
</dependency>
<!-- Hadoop dependency -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.5</version>
</dependency>

<!-- hive-jdbc -->
<!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
        </exclusion>
    </exclusions>
    <version>1.2.1</version>
</dependency>

MyBatis configuration file:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
    <environments default="production">
        <environment id="production">
            <transactionManager type="JDBC"/>
            <dataSource type="POOLED">
                <property name="driver" value="org.apache.hive.jdbc.HiveDriver"/>
                <property name="url" value="jdbc:hive2://localhost:10000/db"/>
                <property name="username" value="hive"/>
                <property name="password" value="hive"/>
            </dataSource>
        </environment>
    </environments>
    <mappers>
        <mapper resource="mapper/hive/hm2/ActiveHelperMapper.xml"/>
    </mappers>
</configuration>

Mapper file:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="***">
    <select id="***" resultType="Integer" useCache="false" parameterType="String">
        select count(1) from tableName where dt = #{dateTime}
    </select>
</mapper>

The sqlSessionFactory part:

public class HiveSqlSessionFactory {

    private static final Logger log = LoggerFactory.getLogger(HiveSqlSessionFactory.class);

    private static Object o = new Object();

    private static HiveSqlSessionFactory hiveSqlSessionFactory;
    private SqlSessionFactory sqlSessionFactory;

    public static HiveSqlSessionFactory getInstance (){
        synchronized (o){
            if (hiveSqlSessionFactory==null){
                hiveSqlSessionFactory = new HiveSqlSessionFactory();
            }
            return hiveSqlSessionFactory;
        }
    }

    private HiveSqlSessionFactory(){
        try {
            init();
        }catch (Exception e){
            log.error("init sqlSessionError->"+e.getMessage());
        }
    }

    public void init () throws IOException {
        SqlSessionFactoryBuilder builder = new SqlSessionFactoryBuilder();
        InputStream inputStream= Resources.getResourceAsStream("sqlConfig.xml");
        SqlSessionFactory sqlSessionFactory = builder.build(inputStream);
        log.info("sqlSession init->sqlConfig.xml");
        this.sqlSessionFactory=sqlSessionFactory;

    }

    public SqlSessionFactory getSqlSessionFactory(){
        synchronized (o) {
            return sqlSessionFactory;
        }
    }
}
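The synchronized lazy singleton above works, but every call to getInstance pays for the lock. A common alternative is the initialization-on-demand holder idiom, sketched here on a simplified stand-in class (the real factory would still load sqlConfig.xml in its constructor):

```java
public class HiveSqlSessionFactoryHolderDemo {

    private HiveSqlSessionFactoryHolderDemo() {
        // The real factory would build its SqlSessionFactory here.
    }

    // The JVM initializes Holder lazily, on first access, and exactly once,
    // so no explicit synchronization is needed.
    private static class Holder {
        private static final HiveSqlSessionFactoryHolderDemo INSTANCE =
                new HiveSqlSessionFactoryHolderDemo();
    }

    public static HiveSqlSessionFactoryHolderDemo getInstance() {
        return Holder.INSTANCE;
    }

    public static void main(String[] args) {
        // Both calls return the same instance.
        System.out.println(getInstance() == getInstance());
    }
}
```

This keeps getInstance lock-free while preserving lazy, thread-safe initialization.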

Mapper layer:

public interface ActiveHelperMapper {
    int getHelperCount(@Param("dateTime") String dateTime);
}

DAO layer:

public class ActiveHelperMapperImpl implements ActiveHelperMapper {

    private static SqlSessionFactory sqlSessionFactory = HiveSqlSessionFactory.getInstance().getSqlSessionFactory();

    @Override
    public int getHelperCount(String dateTime) {
        // try-with-resources closes the session even if the query throws
        try (SqlSession sqlSession = sqlSessionFactory.openSession()) {
            ActiveHelperMapper activeHelperMapper = sqlSession.getMapper(ActiveHelperMapper.class);
            return activeHelperMapper.getHelperCount(dateTime);
        }
    }
}

Test case:

@Test
public void getHelperCountTest(){
    String dateTime = DateUtils.getDayDelayFormat(new Date(), -1, "yyyy-MM-dd");
    ActiveHelperMapperImpl activeHelperMapper = new ActiveHelperMapperImpl();
    System.out.println(activeHelperMapper.getHelperCount(dateTime));
}
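DateUtils.getDayDelayFormat above is the author's own helper. With the standard java.time API, yesterday's date in the same yyyy-MM-dd pattern can be produced like this (a sketch of an equivalent, not the original utility):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class YesterdayDemo {

    // Shift today's date by -1 day and format it as yyyy-MM-dd,
    // mirroring getDayDelayFormat(new Date(), -1, "yyyy-MM-dd").
    public static String yesterday() {
        return LocalDate.now().minusDays(1).format(DateTimeFormatter.ofPattern("yyyy-MM-dd"));
    }

    public static void main(String[] args) {
        System.out.println(yesterday());
    }
}
```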
