njgzx / dataease · Commit d449e41c

Authored Mar 10, 2021 by taojinlong
Commit message: feat: 抽取数据到hbase (extract data to HBase)
Parent: 27ea12ed

Showing 19 changed files with 348 additions and 80 deletions (+348, −80)
Changed files:
- backend/pom.xml (+34, −11)
- .../main/java/io/dataease/base/domain/DatasetTableField.java (+4, −2)
- .../main/java/io/dataease/commons/constants/DatasetMode.java (+6, −0)
- ...rc/main/java/io/dataease/commons/constants/JobStatus.java (+2, −2)
- backend/src/main/java/io/dataease/config/HbaseConfig.java (+30, −0)
- ...aease/controller/dataset/DataSetTableFieldController.java (+1, −1)
- ...a/io/dataease/datasource/provider/DatasourceProvider.java (+4, −0)
- ...in/java/io/dataease/datasource/provider/JdbcProvider.java (+55, −17)
- ...ava/io/dataease/datasource/request/DatasourceRequest.java (+3, −0)
- ...rc/main/java/io/dataease/job/sechedule/DeScheduleJob.java (+14, −11)
- ...c/main/java/io/dataease/job/sechedule/ExtractDataJob.java (+22, −0)
- .../main/java/io/dataease/job/sechedule/ScheduleManager.java (+3, −3)
- .../src/main/java/io/dataease/listener/AppStartListener.java (+1, −7)
- ...nd/src/main/java/io/dataease/service/ScheduleService.java (+6, −5)
- ...java/io/dataease/service/dataset/DataSetTableService.java (+54, −18)
- .../dataease/service/dataset/DataSetTableTaskLogService.java (+7, −0)
- .../io/dataease/service/dataset/DataSetTableTaskService.java (+7, −1)
- .../java/io/dataease/service/dataset/ExtractDataService.java (+93, −0)
- pom.xml (+2, −2)
backend/pom.xml

```diff
@@ -303,17 +303,6 @@
         <artifactId>reflections8</artifactId>
         <version>0.11.7</version>
     </dependency>
-    <!-- k8s client -->
-    <!--<dependency>
-        <groupId>io.fabric8</groupId>
-        <artifactId>kubernetes-client</artifactId>
-        <version>4.13.0</version>
-    </dependency>
-    <dependency>
-        <groupId>com.github.fge</groupId>
-        <artifactId>json-schema-validator</artifactId>
-        <version>2.2.6</version>
-    </dependency>-->
     <!--开启 cache 缓存 -->
     <dependency>
         <groupId>org.springframework.boot</groupId>
@@ -325,6 +314,40 @@
         <artifactId>ehcache</artifactId>
         <version>2.9.1</version>
     </dependency>
+    <!-- hbase -->
+    <dependency>
+        <groupId>org.apache.hbase</groupId>
+        <artifactId>hbase</artifactId>
+        <version>2.4.1</version>
+        <type>pom</type>
+    </dependency>
+    <dependency>
+        <groupId>org.apache.hbase</groupId>
+        <artifactId>hbase-client</artifactId>
+        <version>2.4.1</version>
+    </dependency>
+    <dependency>
+        <groupId>org.apache.hbase</groupId>
+        <artifactId>hbase-common</artifactId>
+        <version>2.4.1</version>
+    </dependency>
+    <dependency>
+        <groupId>org.apache.hbase</groupId>
+        <artifactId>hbase-server</artifactId>
+        <version>2.4.1</version>
+    </dependency>
+    <dependency>
+        <groupId>org.apache.hbase</groupId>
+        <artifactId>hbase-mapreduce</artifactId>
+        <version>2.4.1</version>
+    </dependency>
+    <dependency>
+        <groupId>org.testng</groupId>
+        <artifactId>testng</artifactId>
+        <version>6.8</version>
+        <scope>test</scope>
+    </dependency>
 </dependencies>
 <build>
```
backend/src/main/java/io/dataease/base/domain/DatasetTableField.java

```diff
 package io.dataease.base.domain;

 import java.io.Serializable;
+import lombok.Builder;
 import lombok.Data;

 @Data
+@Builder
 public class DatasetTableField implements Serializable {
     private String id;
@@ -24,4 +27,4 @@ public class DatasetTableField implements Serializable {
     private Integer deType;

     private static final long serialVersionUID = 1L;
-}
\ No newline at end of file
+}
```
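Adding Lombok's `@Builder` here is most likely why later hunks in this commit replace `new DatasetTableField()` with `DatasetTableField.builder().build()`: `@Builder` generates an all-args constructor, which suppresses the implicit no-arg constructor. A minimal plain-Java sketch of the same pattern (no Lombok; only two of the entity's fields shown, for illustration):

```java
// Plain-Java sketch of the builder shape that Lombok's @Builder generates.
// Field names mirror DatasetTableField; only id and tableId are shown.
class DatasetTableFieldSketch {
    private String id;
    private String tableId;

    // Private all-args constructor: callers must go through the builder.
    private DatasetTableFieldSketch(String id, String tableId) {
        this.id = id;
        this.tableId = tableId;
    }

    static Builder builder() {
        return new Builder();
    }

    String getTableId() {
        return tableId;
    }

    static class Builder {
        private String id;
        private String tableId;

        Builder tableId(String tableId) {
            this.tableId = tableId;
            return this;
        }

        DatasetTableFieldSketch build() {
            return new DatasetTableFieldSketch(id, tableId);
        }
    }
}
```

ExtractDataService later in this commit uses exactly this shape: `DatasetTableField.builder().tableId(...).build()`.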
backend/src/main/java/io/dataease/commons/constants/DatasetMode.java (new file, 0 → 100644)

```java
package io.dataease.commons.constants;

public class DatasetMode {
    public static final String EXTRACT = "1";
    public static final String DIRECT = "0";
}
```
backend/src/main/java/io/dataease/commons/constants/TestPlanStatus.java → backend/src/main/java/io/dataease/commons/constants/JobStatus.java (renamed)

```diff
 package io.dataease.commons.constants;

-public enum TestPlanStatus {
-    Prepare, Underway, Completed
+public enum JobStatus {
+    Prepare, Underway, Completed, Error
 }
```
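The rename also adds an `Error` state, and ExtractDataService below records status into the task log as the enum's `name()` string (it sets `JobStatus.Underway.name()` when a run starts). A small sketch of that usage:

```java
// The enum as introduced by this commit; the task log stores name() strings.
enum JobStatusSketch {
    Prepare, Underway, Completed, Error
}

class JobStatusDemo {
    // Mirrors datasetTableTaskLog.setStatus(JobStatus.Underway.name())
    static String startStatus() {
        return JobStatusSketch.Underway.name();
    }
}
```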
backend/src/main/java/io/dataease/config/HbaseConfig.java (new file, 0 → 100644)

```java
package io.dataease.config;

import com.fit2cloud.autoconfigure.QuartzAutoConfiguration;
import org.springframework.boot.autoconfigure.AutoConfigureBefore;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import org.springframework.core.env.Environment;

import javax.annotation.Resource;

@Configuration
@AutoConfigureBefore(QuartzAutoConfiguration.class)
public class HbaseConfig {

    @Resource
    private Environment env; // 保存了配置文件的信息

    @Bean
    @ConditionalOnMissingBean
    public org.apache.hadoop.conf.Configuration configuration() {
        org.apache.hadoop.conf.Configuration configuration = new org.apache.hadoop.conf.Configuration();
        configuration.set("hbase.zookeeper.quorum", env.getProperty("hbase.zookeeper.quorum"));
        configuration.set("hbase.zookeeper.property.clientPort", env.getProperty("hbase.zookeeper.property.clientPort"));
        configuration.set("hbase.client.retries.number", env.getProperty("hbase.client.retries.number", "1"));
        return configuration;
    }
}
```
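The bean reads three keys from Spring's `Environment`, with a fallback default only for the retry count. A `java.util.Properties`-based stand-in (Spring itself is not used in this sketch) showing the same key-with-default lookup:

```java
import java.util.Properties;

// Properties stand-in for Environment.getProperty(key, default), mirroring
// how HbaseConfig falls back to "1" for hbase.client.retries.number when
// the key is absent from the configuration file.
class HbaseSettingsSketch {
    static String retries(Properties props) {
        return props.getProperty("hbase.client.retries.number", "1");
    }
}
```

Note that the other two keys (`hbase.zookeeper.quorum`, `hbase.zookeeper.property.clientPort`) have no default, so `env.getProperty` returns null if they are missing from the configuration.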
backend/src/main/java/io/dataease/controller/dataset/DataSetTableFieldController.java

```diff
@@ -19,7 +19,7 @@ public class DataSetTableFieldController {
     @PostMapping("list/{tableId}")
     public List<DatasetTableField> list(@PathVariable String tableId) {
-        DatasetTableField datasetTableField = new DatasetTableField();
+        DatasetTableField datasetTableField = DatasetTableField.builder().build();
         datasetTableField.setTableId(tableId);
         return dataSetTableFieldsService.list(datasetTableField);
     }
```
backend/src/main/java/io/dataease/datasource/provider/DatasourceProvider.java

```diff
@@ -23,4 +23,8 @@ public abstract class DatasourceProvider {
         getData(datasourceRequest);
     }

+    abstract public Long count(DatasourceRequest datasourceRequest) throws Exception;
+
+    abstract public List<String[]> getPageData(DatasourceRequest datasourceRequest) throws Exception;
 }
```
backend/src/main/java/io/dataease/datasource/provider/JdbcProvider.java

```diff
@@ -9,6 +9,7 @@ import io.dataease.datasource.request.DatasourceRequest;
 import org.apache.commons.lang3.StringUtils;
 import org.springframework.stereotype.Service;
 import java.sql.*;
+import java.text.MessageFormat;
 import java.util.*;

 @Service("jdbc")
@@ -23,23 +24,24 @@ public class JdbcProvider extends DatasourceProvider{
             Statement stat = connection.createStatement();
             ResultSet rs = stat.executeQuery(datasourceRequest.getQuery())
         ) {
-            ResultSetMetaData metaData = rs.getMetaData();
-            int columnCount = metaData.getColumnCount();
-            while (rs.next()) {
-                String[] row = new String[columnCount];
-                for (int j = 0; j < columnCount; j++) {
-                    int columType = metaData.getColumnType(j + 1);
-                    switch (columType) {
-                        case java.sql.Types.DATE:
-                            row[j] = rs.getDate(j + 1).toString();
-                            break;
-                        default:
-                            row[j] = rs.getString(j + 1);
-                            break;
-                    }
-                }
-                list.add(row);
-            }
+            list = fetchResult(rs);
+        } catch (SQLException e) {
+            throw new Exception("ERROR:" + e.getMessage(), e);
+        } catch (Exception e) {
+            throw new Exception("ERROR:" + e.getMessage(), e);
+        }
+        return list;
+    }
+
+    @Override
+    public List<String[]> getPageData(DatasourceRequest datasourceRequest) throws Exception {
+        List<String[]> list = new LinkedList<>();
+        try (
+                Connection connection = getConnection(datasourceRequest);
+                Statement stat = connection.createStatement();
+                ResultSet rs = stat.executeQuery(datasourceRequest.getQuery() + MessageFormat.format(" LIMIT {0}, {1}", (datasourceRequest.getStartPage() - 1) * datasourceRequest.getPageSize(), datasourceRequest.getPageSize()))
+        ) {
+            list = fetchResult(rs);
         } catch (SQLException e) {
             throw new Exception("ERROR:" + e.getMessage(), e);
         } catch (Exception e) {
             throw new Exception("ERROR:" + e.getMessage(), e);
         }
@@ -48,6 +50,29 @@ public class JdbcProvider extends DatasourceProvider{
         return list;
     }

+    private List<String[]> fetchResult(ResultSet rs) throws Exception {
+        List<String[]> list = new LinkedList<>();
+        ResultSetMetaData metaData = rs.getMetaData();
+        int columnCount = metaData.getColumnCount();
+        while (rs.next()) {
+            String[] row = new String[columnCount];
+            for (int j = 0; j < columnCount; j++) {
+                int columType = metaData.getColumnType(j + 1);
+                switch (columType) {
+                    case java.sql.Types.DATE:
+                        row[j] = rs.getDate(j + 1).toString();
+                        break;
+                    default:
+                        row[j] = rs.getString(j + 1);
+                        break;
+                }
+            }
+            list.add(row);
+        }
+        return list;
+    }
+
     @Override
     public List<String> getTables(DatasourceRequest datasourceRequest) throws Exception {
         List<String> tables = new ArrayList<>();
@@ -106,6 +131,19 @@ public class JdbcProvider extends DatasourceProvider{
         }
     }

+    public Long count(DatasourceRequest datasourceRequest) throws Exception {
+        try (Connection con = getConnection(datasourceRequest); Statement ps = con.createStatement()) {
+            ResultSet resultSet = ps.executeQuery(datasourceRequest.getQuery());
+            while (resultSet.next()) {
+                return resultSet.getLong(1);
+            }
+        } catch (Exception e) {
+            throw new Exception("ERROR: " + e.getMessage(), e);
+        }
+        return 0L;
+    }
+
     private Connection getConnection(DatasourceRequest datasourceRequest) throws Exception {
         String username = null;
         String password = null;
```
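The new `getPageData` builds its `LIMIT` clause as `offset = (startPage − 1) × pageSize`. One caveat worth flagging: `MessageFormat` formats numeric arguments with locale-dependent digit grouping, so a large offset can render as `"10,000"` and break the SQL. A sketch of the same offset arithmetic (helper name is hypothetical) using `String.format`, which avoids that:

```java
// Hypothetical helper mirroring getPageData's offset arithmetic:
// offset = (startPage - 1) * pageSize, appended as " LIMIT offset, pageSize".
// String.format's %d never applies digit grouping, unlike MessageFormat's
// default number formatting (which can turn 10000 into "10,000").
class LimitClauseSketch {
    static String limitClause(long startPage, long pageSize) {
        long offset = (startPage - 1) * pageSize;
        return String.format(" LIMIT %d, %d", offset, pageSize);
    }
}
```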
backend/src/main/java/io/dataease/datasource/request/DatasourceRequest.java

```diff
@@ -11,5 +11,8 @@ public class DatasourceRequest {
     protected String query;
     protected String table;
     protected Datasource datasource;
+
+    private Long pageSize;
+    private Long startPage;
 }
```
backend/src/main/java/io/dataease/job/sechedule/DeScheduleJob.java

```diff
 package io.dataease.job.sechedule;

-import org.quartz.Job;
-import org.quartz.JobExecutionContext;
-import org.quartz.JobExecutionException;
+import io.dataease.commons.utils.LogUtil;
+import org.quartz.*;

 public abstract class DeScheduleJob implements Job {

+    protected String datasetTableId;
+    protected String expression;
+    protected String taskId;
+
     @Override
     public void execute(JobExecutionContext context) throws JobExecutionException {
-        // JobKey jobKey = context.getTrigger().getJobKey();
-        // JobDataMap jobDataMap = context.getJobDetail().getJobDataMap();
-        // this.resourceId = jobDataMap.getString("resourceId");
-        // this.userId = jobDataMap.getString("userId");
-        // this.expression = jobDataMap.getString("expression");
-        //
-        // LogUtil.info(jobKey.getGroup() + " Running: " + resourceId);
-        // LogUtil.info("CronExpression: " + expression);
+        JobKey jobKey = context.getTrigger().getJobKey();
+        JobDataMap jobDataMap = context.getJobDetail().getJobDataMap();
+        this.datasetTableId = jobDataMap.getString("datasetTableId");
+        this.expression = jobDataMap.getString("expression");
+        this.taskId = jobDataMap.getString("taskId");
+
+        LogUtil.info(jobKey.getGroup() + " Running: " + datasetTableId);
+        LogUtil.info(jobKey.getName() + " Running: " + datasetTableId);
+        LogUtil.info("CronExpression: " + expression);

         businessExecute(context);
     }
```
backend/src/main/java/io/dataease/job/sechedule/ExtractDataJob.java (new file, 0 → 100644)

```java
package io.dataease.job.sechedule;

import io.dataease.commons.utils.CommonBeanFactory;
import io.dataease.service.dataset.ExtractDataService;
import org.quartz.JobExecutionContext;
import org.springframework.stereotype.Component;

@Component
public class ExtractDataJob extends DeScheduleJob {

    private ExtractDataService extractDataService;

    public ExtractDataJob() {
        extractDataService = (ExtractDataService) CommonBeanFactory.getBean(ExtractDataService.class);
    }

    @Override
    void businessExecute(JobExecutionContext context) {
        extractDataService.extractData(datasetTableId, taskId);
    }
}
```
backend/src/main/java/io/dataease/job/sechedule/ScheduleManager.java

```diff
@@ -369,11 +369,11 @@ public class ScheduleManager {
         addOrUpdateCronJob(jobKey, triggerKey, jobClass, cron, startTime, endTime, null);
     }

-    public JobDataMap getDefaultJobDataMap(String resourceId, String expression, String userId) {
+    public JobDataMap getDefaultJobDataMap(String resourceId, String expression, String taskId) {
         JobDataMap jobDataMap = new JobDataMap();
-        jobDataMap.put("resourceId", resourceId);
+        jobDataMap.put("datasetTableId", resourceId);
+        jobDataMap.put("taskId", taskId);
         jobDataMap.put("expression", expression);
-        jobDataMap.put("userId", userId);
         return jobDataMap;
     }
```
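The keys written here form a contract with `DeScheduleJob.execute()`, which reads back `"datasetTableId"`, `"taskId"`, and `"expression"`; renaming one side without the other would silently yield nulls at job time. A plain-`Map` stand-in for Quartz's `JobDataMap` illustrating that contract:

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Map stand-in for Quartz's JobDataMap: the keys put here must match
// the keys DeScheduleJob.execute() reads. Note the old "resourceId"/"userId"
// keys from before this commit are gone.
class JobDataMapSketch {
    static Map<String, String> defaultJobDataMap(String resourceId, String expression, String taskId) {
        Map<String, String> map = new HashMap<>();
        map.put("datasetTableId", resourceId);
        map.put("taskId", taskId);
        map.put("expression", expression);
        return map;
    }
}
```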
backend/src/main/java/io/dataease/listener/AppStartListener.java

```diff
@@ -20,13 +20,7 @@ public class AppStartListener implements ApplicationListener<ApplicationReadyEve
     @Override
     public void onApplicationEvent(ApplicationReadyEvent applicationReadyEvent) {
-        System.out.println("================= 应用启动 =================");
-        /* cron schedule */
-        // scheduleManager.addCronJob(new JobKey("abc", "def"), new TriggerKey("abc", "def"), TestJob.class, "*/10 * * * * ?");
-        /* single schedule*/
-        // long timestamp = System.currentTimeMillis() + 90 * 1000;
-        // Date date = new Date(timestamp);
-        // scheduleManager.addSingleJob(new JobKey("abc", "def"), new TriggerKey("abc", "def"), TestJob.class, date);
+        System.out.println("================= Application start =================");
         // 项目启动,从数据库读取任务加入到Quartz
         List<DatasetTableTask> list = dataSetTableTaskService.list(new DatasetTableTask());
         for (DatasetTableTask task : list) {
```
backend/src/main/java/io/dataease/service/ScheduleService.java

```diff
 package io.dataease.service;

 import io.dataease.base.domain.DatasetTableTask;
+import io.dataease.job.sechedule.ExtractDataJob;
 import io.dataease.job.sechedule.ScheduleManager;
-import io.dataease.job.sechedule.TestJob;
 import org.apache.commons.lang3.StringUtils;
 import org.quartz.JobKey;
 import org.quartz.TriggerKey;
@@ -24,8 +24,8 @@ public class ScheduleService {
         if (StringUtils.equalsIgnoreCase(datasetTableTask.getRate(), "0")) {
             scheduleManager.addOrUpdateSingleJob(new JobKey(datasetTableTask.getId(), datasetTableTask.getTableId()),
                     new TriggerKey(datasetTableTask.getId(), datasetTableTask.getTableId()),
-                    TestJob.class,//TODO
-                    new Date(datasetTableTask.getStartTime()));
+                    ExtractDataJob.class,
+                    new Date(datasetTableTask.getStartTime()),
+                    scheduleManager.getDefaultJobDataMap(datasetTableTask.getTableId(), datasetTableTask.getCron(), datasetTableTask.getId()));
         } else if (StringUtils.equalsIgnoreCase(datasetTableTask.getRate(), "1")) {
             Date endTime;
             if (datasetTableTask.getEndTime() == null || datasetTableTask.getEndTime() == 0) {
@@ -36,8 +36,9 @@ public class ScheduleService {
             scheduleManager.addOrUpdateCronJob(new JobKey(datasetTableTask.getId(), datasetTableTask.getTableId()),
                     new TriggerKey(datasetTableTask.getId(), datasetTableTask.getTableId()),
-                    TestJob.class,// TODO
+                    ExtractDataJob.class,
                     datasetTableTask.getCron(),
-                    new Date(datasetTableTask.getStartTime()), endTime);
+                    new Date(datasetTableTask.getStartTime()), endTime,
+                    scheduleManager.getDefaultJobDataMap(datasetTableTask.getTableId(), datasetTableTask.getCron(), datasetTableTask.getId()));
         }
     }
```
backend/src/main/java/io/dataease/service/dataset/DataSetTableService.java

```diff
@@ -71,7 +71,9 @@ public class DataSetTableService {
     public List<DatasetTable> list(DataSetTableRequest dataSetTableRequest) {
         DatasetTableExample datasetTableExample = new DatasetTableExample();
-        datasetTableExample.createCriteria().andSceneIdEqualTo(dataSetTableRequest.getSceneId());
+        if (StringUtils.isNotEmpty(dataSetTableRequest.getSceneId())) {
+            datasetTableExample.createCriteria().andSceneIdEqualTo(dataSetTableRequest.getSceneId());
+        }
         if (StringUtils.isNotEmpty(dataSetTableRequest.getSort())) {
             datasetTableExample.setOrderByClause(dataSetTableRequest.getSort());
         }
@@ -92,7 +94,7 @@ public class DataSetTableService {
     public Map<String, List<DatasetTableField>> getFieldsFromDE(DataSetTableRequest dataSetTableRequest) throws Exception {
-        DatasetTableField datasetTableField = new DatasetTableField();
+        DatasetTableField datasetTableField = DatasetTableField.builder().build();
         datasetTableField.setTableId(dataSetTableRequest.getId());
         datasetTableField.setChecked(Boolean.TRUE);
         List<DatasetTableField> fields = dataSetTableFieldsService.list(datasetTableField);
@@ -121,7 +123,7 @@ public class DataSetTableService {
         datasourceRequest.setDatasource(ds);
         String table = new Gson().fromJson(dataSetTableRequest.getInfo(), DataTableInfoDTO.class).getTable();
-        DatasetTableField datasetTableField = new DatasetTableField();
+        DatasetTableField datasetTableField = DatasetTableField.builder().build();
         datasetTableField.setTableId(dataSetTableRequest.getId());
         datasetTableField.setChecked(Boolean.TRUE);
         List<DatasetTableField> fields = dataSetTableFieldsService.list(datasetTableField);
@@ -137,15 +139,13 @@ public class DataSetTableService {
         DatasourceRequest datasourceRequest = new DatasourceRequest();
         datasourceRequest.setDatasource(ds);
         String table = new Gson().fromJson(dataSetTableRequest.getInfo(), DataTableInfoDTO.class).getTable();
-        // datasourceRequest.setTable(table);
-        DatasetTableField datasetTableField = new DatasetTableField();
+        DatasetTableField datasetTableField = DatasetTableField.builder().build();
         datasetTableField.setTableId(dataSetTableRequest.getId());
         datasetTableField.setChecked(Boolean.TRUE);
         List<DatasetTableField> fields = dataSetTableFieldsService.list(datasetTableField);
         String[] fieldArray = fields.stream().map(DatasetTableField::getOriginName).toArray(String[]::new);
-        // datasourceRequest.setQuery("SELECT " + StringUtils.join(fieldArray, ",") + " FROM " + table + " LIMIT 0,10;");
         datasourceRequest.setQuery(createQuerySQL(ds.getType(), table, fieldArray) + " LIMIT 0,10");
         List<String[]> data = new ArrayList<>();
@@ -154,17 +154,6 @@ public class DataSetTableService {
         } catch (Exception e) {
         }

-        /*JSONArray jsonArray = new JSONArray();
-        if (CollectionUtils.isNotEmpty(data)) {
-            data.forEach(ele -> {
-                JSONObject jsonObject = new JSONObject();
-                for (int i = 0; i < ele.length; i++) {
-                    jsonObject.put(fieldArray[i], ele[i]);
-                }
-                jsonArray.add(jsonObject);
-            });
-        }*/
         List<Map<String, Object>> jsonArray = new ArrayList<>();
         if (CollectionUtils.isNotEmpty(data)) {
             jsonArray = data.stream().map(ele -> {
@@ -184,6 +173,53 @@ public class DataSetTableService {
         return map;
     }

+    public List<String[]> getDataSetData(String datasourceId, String table, List<DatasetTableField> fields) {
+        List<String[]> data = new ArrayList<>();
+        Datasource ds = datasourceMapper.selectByPrimaryKey(datasourceId);
+        DatasourceProvider datasourceProvider = ProviderFactory.getProvider(ds.getType());
+        DatasourceRequest datasourceRequest = new DatasourceRequest();
+        datasourceRequest.setDatasource(ds);
+        String[] fieldArray = fields.stream().map(DatasetTableField::getOriginName).toArray(String[]::new);
+        datasourceRequest.setQuery(createQuerySQL(ds.getType(), table, fieldArray) + " LIMIT 0, 10");
+        try {
+            data.addAll(datasourceProvider.getData(datasourceRequest));
+        } catch (Exception e) {
+        }
+        return data;
+    }
+
+    public Long getDataSetTotalData(String datasourceId, String table) {
+        List<String[]> data = new ArrayList<>();
+        Datasource ds = datasourceMapper.selectByPrimaryKey(datasourceId);
+        DatasourceProvider datasourceProvider = ProviderFactory.getProvider(ds.getType());
+        DatasourceRequest datasourceRequest = new DatasourceRequest();
+        datasourceRequest.setDatasource(ds);
+        datasourceRequest.setQuery("select count(*) from " + table);
+        try {
+            return datasourceProvider.count(datasourceRequest);
+        } catch (Exception e) {
+        }
+        return 0l;
+    }
+
+    public List<String[]> getDataSetPageData(String datasourceId, String table, List<DatasetTableField> fields, Long startPage, Long pageSize) {
+        List<String[]> data = new ArrayList<>();
+        Datasource ds = datasourceMapper.selectByPrimaryKey(datasourceId);
+        DatasourceProvider datasourceProvider = ProviderFactory.getProvider(ds.getType());
+        DatasourceRequest datasourceRequest = new DatasourceRequest();
+        datasourceRequest.setDatasource(ds);
+        String[] fieldArray = fields.stream().map(DatasetTableField::getOriginName).toArray(String[]::new);
+        datasourceRequest.setPageSize(pageSize);
+        datasourceRequest.setStartPage(startPage);
+        datasourceRequest.setQuery(createQuerySQL(ds.getType(), table, fieldArray));
+        try {
+            return datasourceProvider.getData(datasourceRequest);
+        } catch (Exception e) {
+        }
+        return data;
+    }
+
     public void saveTableField(DatasetTable datasetTable) throws Exception {
         Datasource ds = datasourceMapper.selectByPrimaryKey(datasetTable.getDataSourceId());
         DataSetTableRequest dataSetTableRequest = new DataSetTableRequest();
@@ -193,7 +229,7 @@ public class DataSetTableService {
         if (CollectionUtils.isNotEmpty(fields)) {
             for (int i = 0; i < fields.size(); i++) {
                 TableFiled filed = fields.get(i);
-                DatasetTableField datasetTableField = new DatasetTableField();
+                DatasetTableField datasetTableField = DatasetTableField.builder().build();
                 datasetTableField.setTableId(datasetTable.getId());
                 datasetTableField.setOriginName(filed.getFieldName());
                 datasetTableField.setName(filed.getRemarks());
```
backend/src/main/java/io/dataease/service/dataset/DataSetTableTaskLogService.java

```diff
@@ -48,4 +48,11 @@ public class DataSetTableTaskLogService {
         return extDataSetTaskMapper.list(request);
     }

+    public void deleteByTaskId(String taskId) {
+        DatasetTableTaskLogExample datasetTableTaskLogExample = new DatasetTableTaskLogExample();
+        DatasetTableTaskLogExample.Criteria criteria = datasetTableTaskLogExample.createCriteria();
+        criteria.andTaskIdEqualTo(taskId);
+        datasetTableTaskLogMapper.deleteByExample(datasetTableTaskLogExample);
+    }
 }
```
backend/src/main/java/io/dataease/service/dataset/DataSetTableTaskService.java

```diff
@@ -20,7 +20,8 @@ import java.util.UUID;
 public class DataSetTableTaskService {
     @Resource
     private DatasetTableTaskMapper datasetTableTaskMapper;
+    @Resource
+    private DataSetTableTaskLogService dataSetTableTaskLogService;
     @Resource
     private ScheduleService scheduleService;
@@ -46,6 +47,11 @@ public class DataSetTableTaskService {
         DatasetTableTask datasetTableTask = datasetTableTaskMapper.selectByPrimaryKey(id);
         datasetTableTaskMapper.deleteByPrimaryKey(id);
         scheduleService.deleteSchedule(datasetTableTask);
+        dataSetTableTaskLogService.deleteByTaskId(id);
+    }
+
+    public DatasetTableTask get(String id) {
+        return datasetTableTaskMapper.selectByPrimaryKey(id);
     }

     public List<DatasetTableTask> list(DatasetTableTask datasetTableTask) {
```
backend/src/main/java/io/dataease/service/dataset/ExtractDataService.java
0 → 100644
浏览文件 @
d449e41c
package io.dataease.service.dataset;

import com.google.gson.Gson;
import io.dataease.base.domain.DatasetTable;
import io.dataease.base.domain.DatasetTableField;
import io.dataease.base.domain.DatasetTableTaskLog;
import io.dataease.commons.constants.JobStatus;
import io.dataease.commons.utils.CommonBeanFactory;
import io.dataease.dto.dataset.DataTableInfoDTO;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.springframework.stereotype.Service;

import javax.annotation.Resource;
import java.util.List;
import java.util.UUID;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@Service
public class ExtractDataService {
    @Resource
    private DataSetTableService dataSetTableService;
    @Resource
    private DataSetTableFieldsService dataSetTableFieldsService;
    @Resource
    private DataSetTableTaskLogService dataSetTableTaskLogService;

    private Long pageSize = 10000L;
    // Shared thread pool for the HBase connection
    private static ExecutorService pool = Executors.newScheduledThreadPool(50);
    private Connection connection;

    public void extractData(String datasetTableId, String taskId) {
        DatasetTableTaskLog datasetTableTaskLog = new DatasetTableTaskLog();
        try {
            datasetTableTaskLog.setTableId(datasetTableId);
            datasetTableTaskLog.setTaskId(taskId);
            datasetTableTaskLog.setStatus(JobStatus.Underway.name());
            datasetTableTaskLog.setStartTime(System.currentTimeMillis());
            dataSetTableTaskLogService.save(datasetTableTaskLog);

            Admin admin = getConnection().getAdmin();
            DatasetTable datasetTable = dataSetTableService.get(datasetTableId);
            List<DatasetTableField> datasetTableFields = dataSetTableFieldsService.list(DatasetTableField.builder().tableId(datasetTable.getId()).build());
            String table = new Gson().fromJson(datasetTable.getInfo(), DataTableInfoDTO.class).getTable();
            TableName tableName = TableName.valueOf(table + "-" + datasetTable.getDataSourceId());
            if (!admin.tableExists(tableName)) {
                TableDescriptorBuilder descBuilder = TableDescriptorBuilder.newBuilder(tableName);
                ColumnFamilyDescriptor hcd = ColumnFamilyDescriptorBuilder.of("cf");
                descBuilder.setColumnFamily(hcd);
                TableDescriptor desc = descBuilder.build();
                admin.createTable(desc);
            }
            admin.disableTable(tableName);
            admin.truncateTable(tableName, true);

            Table tab = getConnection().getTable(tableName);
            Long total = dataSetTableService.getDataSetTotalData(datasetTable.getDataSourceId(), table);
            Long pageCount = total % pageSize == 0 ? total / pageSize : (total / pageSize) + 1;
            for (Long pageIndex = 1L; pageIndex <= pageCount; pageIndex++) {
                List<String[]> data = dataSetTableService.getDataSetPageData(datasetTable.getDataSourceId(), table, datasetTableFields, pageIndex, pageSize);
                for (String[] d : data) {
                    // One row key per source record, so that every field of the record
                    // lands in the same HBase row (a fresh Put per field would scatter
                    // the record across unrelated random row keys)
                    Put put = new Put(UUID.randomUUID().toString().getBytes());
                    for (int i = 0; i < datasetTableFields.size(); i++) {
                        String value = d[i];
                        if (value == null) {
                            value = "null";
                        }
                        put.addColumn("cf".getBytes(), datasetTableFields.get(i).getOriginName().getBytes(), value.getBytes());
                    }
                    tab.put(put);
                }
            }

            datasetTableTaskLog.setStatus(JobStatus.Completed.name());
            datasetTableTaskLog.setEndTime(System.currentTimeMillis());
            dataSetTableTaskLogService.save(datasetTableTaskLog);
        } catch (Exception e) {
            datasetTableTaskLog.setStatus(JobStatus.Error.name());
            datasetTableTaskLog.setEndTime(System.currentTimeMillis());
            dataSetTableTaskLogService.save(datasetTableTaskLog);
        }
    }

    private synchronized Connection getConnection() throws Exception {
        if (connection == null || connection.isClosed()) {
            Configuration cfg = CommonBeanFactory.getBean(Configuration.class);
            connection = ConnectionFactory.createConnection(cfg, pool);
        }
        return connection;
    }
}
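ExtractDataService fetches the source table page by page before writing to HBase; the page count above is plain ceiling division expressed with a remainder check. A minimal, runnable sketch of that arithmetic (class and method names here are illustrative, not DataEase APIs):

```java
public class PageCountSketch {
    // Number of pages needed to cover `total` rows at `pageSize` rows per page;
    // equivalent to ceil(total / pageSize) without floating-point arithmetic.
    static long pageCount(long total, long pageSize) {
        return total % pageSize == 0 ? total / pageSize : total / pageSize + 1;
    }

    public static void main(String[] args) {
        System.out.println(pageCount(25000L, 10000L)); // 3 (last page holds 5000 rows)
        System.out.println(pageCount(20000L, 10000L)); // 2 (exact multiple)
        System.out.println(pageCount(0L, 10000L));     // 0 (empty table, loop body never runs)
    }
}
```

The same value can be computed branch-free as `(total + pageSize - 1) / pageSize`; the remainder form used in the service reads more explicitly.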
pom.xml
View file @ d449e41c
...
@@ -10,14 +10,14 @@
     <parent>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-parent</artifactId>
-        <!--<version>2.2.6.RELEASE</version>-->
         <version>2.4.3</version>
-        <relativePath/> <!-- lookup parent from repository -->
+        <relativePath/>
     </parent>
     <name>dataease</name>
     <modules>
         <module>backend</module>
+        <module>frontend</module>
     </modules>
 </project>