Apache Druid article series
1. Introduction to Druid (Imply-3.0.4), deployment (CentOS 6.10) and verification
2. Druid getting-started examples (ingesting data and submitting tasks in three different ways)
3. Druid load-data examples (real-time Kafka data, and offline data from local disk or HDFS)
4. Operating the Druid API from Java
5. Druid configuration files explained, with examples
6. Druid roll-up explained, with examples
This article walks through a simple example of querying Druid data over JDBC. It assumes a running Druid cluster with the data already ingested. The article has two parts: the requirement and the implementation steps.
Druid exposes a JDBC interface, so a project can connect to Druid directly over JDBC for real-time data analysis.
I. Requirement
Query all records in the testdata datasource.
II. Implementation steps
1. Add the dependencies
2. Write JDBC code that connects to Druid and fetches the data
- Load the Druid JDBC driver
- Open a JDBC connection to Druid
- Build the SQL statement
- Create a Statement, execute the SQL, and obtain the result set
- Close the Druid connection
1. Add the dependencies
<dependency>
    <groupId>org.apache.calcite.avatica</groupId>
    <artifactId>avatica</artifactId>
    <version>1.13.0</version>
</dependency>
<dependency>
    <groupId>org.apache.calcite.avatica</groupId>
    <artifactId>avatica-core</artifactId>
    <version>1.13.0</version>
</dependency>
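The JDBC URL in the example below embeds the broker/router address (server3:8888) in Avatica's url= property. If your cluster has basic authentication enabled, the Druid documentation has credentials passed as standard user/password connection properties. A minimal sketch; the host, port, and the druid_user/druid_pass credentials are placeholders, not values from this article:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class AuthConnect {
    // Build the Avatica remote URL; the /druid/v2/sql/avatica/ path is fixed by Druid.
    static String avaticaUrl(String host, int port) {
        return "jdbc:avatica:remote:url=http://" + host + ":" + port + "/druid/v2/sql/avatica/";
    }

    public static void main(String[] args) throws Exception {
        // Load the Avatica remote driver.
        Class.forName("org.apache.calcite.avatica.remote.Driver");
        // Credentials go in connection properties ("druid_user"/"druid_pass" are placeholders).
        Properties props = new Properties();
        props.setProperty("user", "druid_user");
        props.setProperty("password", "druid_pass");
        try (Connection conn = DriverManager.getConnection(avaticaUrl("server3", 8888), props)) {
            System.out.println("connected: " + !conn.isClosed());
        }
    }
}
```

On a cluster without authentication, the no-properties `DriverManager.getConnection(url)` form used in the next section is sufficient.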
2. Fetch the data
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Query Druid over JDBC (Avatica remote driver).
 */
public class App {
    public static void main(String[] args) throws Exception {
        // Load the Avatica remote driver
        Class.forName("org.apache.calcite.avatica.remote.Driver");
        // Connect to the Druid SQL endpoint, then close everything with try-with-resources
        try (Connection connection = DriverManager.getConnection(
                "jdbc:avatica:remote:url=http://server3:8888/druid/v2/sql/avatica/");
             Statement st = connection.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM \"testdata\"")) {
            ResultSetMetaData rsmd = rs.getMetaData();
            List<Map<String, Object>> resultList = new ArrayList<>();
            while (rs.next()) {
                // Collect each row as a columnName -> value map
                Map<String, Object> row = new HashMap<>();
                for (int i = 1; i <= rsmd.getColumnCount(); i++) {
                    row.put(rsmd.getColumnName(i), rs.getObject(i));
                }
                resultList.add(row);
            }
            System.out.println("resultList = " + resultList);
        }
    }
}
3. Verify the results
Output:
resultList = [
{__time=2022-05-08 01:00:00.0, sum_money=20100, areaName=北京, count=3, category=书籍},
{__time=2022-05-08 01:00:00.0, sum_money=6661, areaName=北京, count=1, category=家具},
{__time=2022-05-08 01:00:00.0, sum_money=1550, areaName=北京, count=1, category=家电},
{__time=2022-05-08 01:00:00.0, sum_money=11312, areaName=北京, count=3, category=手机},
{__time=2022-05-08 01:00:00.0, sum_money=9000, areaName=北京, count=1, category=服饰},
{__time=2022-05-08 01:00:00.0, sum_money=9322, areaName=北京, count=3, category=电脑},
{__time=2022-05-08 01:00:00.0, sum_money=27971, areaName=北京, count=4, category=食品},
{__time=2022-05-08 01:00:00.0, sum_money=6660, areaName=天津, count=1, category=电脑},
{__time=2022-05-08 01:00:00.0, sum_money=5600, areaName=天津, count=1, category=食品},
{__time=2022-05-08 01:00:00.0, sum_money=6840, areaName=杭州, count=2, category=服饰},
{__time=2022-05-08 01:00:00.0, sum_money=7101, areaName=杭州, count=2, category=电脑},
{__time=2022-05-08 01:00:00.0, sum_money=1120, areaName=郑州, count=1, category=家具},
{__time=2022-05-08 02:00:00.0, sum_money=7000, areaName=北京, count=1, category=服饰},
{__time=2022-05-08 03:00:00.0, sum_money=4410, areaName=北京, count=1, category=家电},
{__time=2022-05-08 03:00:00.0, sum_money=6660, areaName=杭州, count=1, category=电脑},
{__time=2022-05-08 05:00:00.0, sum_money=1513, areaName=上海, count=1, category=电脑},
{__time=2022-05-08 05:00:00.0, sum_money=1230, areaName=北京, count=1, category=书籍},
{__time=2022-05-08 05:00:00.0, sum_money=1230, areaName=杭州, count=1, category=家具}
]
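Beyond SELECT *, the same connection also accepts parameterized SQL through PreparedStatement (Druid supports ? dynamic parameters in recent versions). A sketch against the testdata datasource from above; the per-category aggregation and the area filter are illustrative, not part of this article's requirement:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ParamQuery {
    // Build a parameterized aggregation over the given datasource.
    static String categorySql(String datasource) {
        return "SELECT \"category\", SUM(\"sum_money\") AS total FROM \""
                + datasource + "\" WHERE \"areaName\" = ? GROUP BY \"category\"";
    }

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.calcite.avatica.remote.Driver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:avatica:remote:url=http://server3:8888/druid/v2/sql/avatica/");
             PreparedStatement ps = conn.prepareStatement(categorySql("testdata"))) {
            // Bind the area name instead of concatenating it into the SQL string
            ps.setString(1, "北京");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("category") + " -> " + rs.getLong("total"));
                }
            }
        }
    }
}
```

Binding values with setString keeps the SQL text constant and avoids escaping issues with user-supplied filters.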