njgzx / dataease · Commits

Commit 9302d0f0
Authored April 23, 2021 by junjie
Parent: 84f68f1e

    feat(backend):数据抽取,切换Doris
    (feat(backend): data extraction, switch to Doris)

Showing 9 changed files with 117 additions and 143 deletions (+117 −143). Taken together, the changes disable the Spark/HBase extraction path (the SparkSession bean, the SparkCalc references, and the HBase startup listener are commented out or removed) and route chart queries against extracted datasets to Doris through a MySQL-protocol JDBC datasource instead.
Changed files:

  backend/src/main/java/io/dataease/config/CommonConfig.java        +25  -26
  .../io/dataease/listener/AppStartInitDataSourceListener.java      +0   -9
  .../java/io/dataease/listener/AppStartReadHBaseListener.java      +47  -52
  ...main/java/io/dataease/service/chart/ChartViewService.java      +35  -6
  ...java/io/dataease/service/dataset/DataSetTableService.java      +3   -2
  .../java/io/dataease/service/dataset/ExtractDataService.java      +3   -44
  ...nd/src/main/java/io/dataease/service/spark/SparkCalc.java      +0   -0
  frontend/src/lang/zh.js                                            +2   -2
  frontend/src/views/chart/components/drag-item/QuotaItem.vue        +2   -2
backend/src/main/java/io/dataease/config/CommonConfig.java (+25 -26)

The org.apache.spark.sql.SparkSession import is removed and the javaSparkSession() bean is commented out wholesale; config lines that were already commented pick up a second "//":

@@ -2,7 +2,6 @@ package io.dataease.config;
 import com.fit2cloud.autoconfigure.QuartzAutoConfiguration;
 import io.dataease.commons.utils.CommonThreadPool;
-import org.apache.spark.sql.SparkSession;
 import org.pentaho.di.core.KettleEnvironment;
 import org.pentaho.di.repository.filerep.KettleFileRepository;
 import org.pentaho.di.repository.filerep.KettleFileRepositoryMeta;
@@ -32,31 +31,31 @@ public class CommonConfig {
 //        return configuration;
 //    }
-    @Bean
-    @ConditionalOnMissingBean
-    public SparkSession javaSparkSession() {
-        SparkSession spark = SparkSession.builder()
-                .appName(env.getProperty("spark.appName", "DataeaseJob"))
-                .master(env.getProperty("spark.master", "local[*]"))
-                .config("spark.scheduler.mode", env.getProperty("spark.scheduler.mode", "FAIR"))
-//                .config("spark.serializer", env.getProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer"))
-//                .config("spark.executor.cores", env.getProperty("spark.executor.cores", "8"))
-//                .config("spark.executor.memory", env.getProperty("spark.executor.memory", "6442450944b"))
-//                .config("spark.locality.wait", env.getProperty("spark.locality.wait", "600000"))
-//                .config("spark.maxRemoteBlockSizeFetchToMem", env.getProperty("spark.maxRemoteBlockSizeFetchToMem", "2000m"))
-//                .config("spark.shuffle.detectCorrupt", env.getProperty("spark.shuffle.detectCorrupt", "false"))
-//                .config("spark.shuffle.service.enabled", env.getProperty("spark.shuffle.service.enabled", "true"))
-//                .config("spark.sql.adaptive.enabled", env.getProperty("spark.sql.adaptive.enabled", "true"))
-//                .config("spark.sql.adaptive.shuffle.targetPostShuffleInputSize", env.getProperty("spark.sql.adaptive.shuffle.targetPostShuffleInputSize", "200M"))
-//                .config("spark.sql.broadcastTimeout", env.getProperty("spark.sql.broadcastTimeout", "12000"))
-//                .config("spark.sql.retainGroupColumns", env.getProperty("spark.sql.retainGroupColumns", "false"))
-//                .config("spark.sql.sortMergeJoinExec.buffer.in.memory.threshold", env.getProperty("spark.sql.sortMergeJoinExec.buffer.in.memory.threshold", "100000"))
-//                .config("spark.sql.sortMergeJoinExec.buffer.spill.threshold", env.getProperty("spark.sql.sortMergeJoinExec.buffer.spill.threshold", "100000"))
-//                .config("spark.sql.variable.substitute", env.getProperty("spark.sql.variable.substitute", "false"))
-//                .config("spark.temp.expired.time", env.getProperty("spark.temp.expired.time", "3600"))
-                .getOrCreate();
-        return spark;
-    }
+//    @Bean
+//    @ConditionalOnMissingBean
+//    public SparkSession javaSparkSession() {
+//        SparkSession spark = SparkSession.builder()
+//                .appName(env.getProperty("spark.appName", "DataeaseJob"))
+//                .master(env.getProperty("spark.master", "local[*]"))
+//                .config("spark.scheduler.mode", env.getProperty("spark.scheduler.mode", "FAIR"))

(…the fifteen already-commented .config lines above follow here with the extra "//", i.e. "////…", then:)

+//                .getOrCreate();
+//        return spark;
+//    }
 @Bean
 @ConditionalOnMissingBean
 ...
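For illustration only, a tidier alternative to commenting the bean out would be to guard it with a property, so the Spark path could be re-enabled from configuration rather than by editing code. This is a hypothetical sketch, not the repo's approach; the property name spark.enabled and the class name are invented:

package io.dataease.config;

import org.apache.spark.sql.SparkSession;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;

import javax.annotation.Resource;

@Configuration
public class SparkConfigSketch {

    @Resource
    private Environment env;

    // Created only when spark.enabled=true; a Doris-only deployment just
    // leaves the property unset instead of carrying commented-out code.
    @Bean
    @ConditionalOnProperty(name = "spark.enabled", havingValue = "true")
    public SparkSession javaSparkSession() {
        return SparkSession.builder()
                .appName(env.getProperty("spark.appName", "DataeaseJob"))
                .master(env.getProperty("spark.master", "local[*]"))
                .getOrCreate();
    }
}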
backend/src/main/java/io/dataease/listener/AppStartInitDataSourceListener.java (+0 -9)

Only unused imports are deleted (nine lines; the listener body is unchanged):

 package io.dataease.listener;

-import io.dataease.base.domain.DatasetTable;
-import io.dataease.base.domain.DatasetTableExample;
-import io.dataease.base.domain.DatasetTableField;
-import io.dataease.base.mapper.DatasetTableMapper;
-import io.dataease.commons.utils.CommonThreadPool;
 import io.dataease.datasource.service.DatasourceService;
-import io.dataease.service.dataset.DataSetTableFieldsService;
-import io.dataease.service.spark.SparkCalc;
 import org.springframework.boot.context.event.ApplicationReadyEvent;
 import org.springframework.context.ApplicationListener;
 import org.springframework.core.annotation.Order;
-import org.springframework.core.env.Environment;
 import org.springframework.stereotype.Component;

 import javax.annotation.Resource;
-import java.util.List;

 @Component
 @Order(value = 2)
 ...
backend/src/main/java/io/dataease/listener/AppStartReadHBaseListener.java (+47 -52)

The whole listener is retired: every line of the new file is commented out (lines that were already comments go from "//…" to "////…"), and five now-pointless imports (DatasetTable, DatasetTableExample, DatasetTableField, SparkCalc, java.util.List) are dropped entirely. The removed (old) version, reconstructed from the diff:

package io.dataease.listener;

import io.dataease.base.domain.DatasetTable;
import io.dataease.base.domain.DatasetTableExample;
import io.dataease.base.domain.DatasetTableField;
import io.dataease.base.mapper.DatasetTableMapper;
import io.dataease.commons.utils.CommonThreadPool;
import io.dataease.service.dataset.DataSetTableFieldsService;
import io.dataease.service.spark.SparkCalc;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.core.annotation.Order;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;

import javax.annotation.Resource;
import java.util.List;

@Component
@Order(value = 2)
public class AppStartReadHBaseListener implements ApplicationListener<ApplicationReadyEvent> {
    @Resource
    private CommonThreadPool commonThreadPool;
//    @Resource
//    private SparkCalc sparkCalc;
    @Resource
    private Environment env; // 保存了配置文件的信息 (holds the configuration-file settings)
    @Resource
    private DatasetTableMapper datasetTableMapper;
    @Resource
    private DataSetTableFieldsService dataSetTableFieldsService;

    @Override
    public void onApplicationEvent(ApplicationReadyEvent applicationReadyEvent) {
//        System.out.println("================= Read HBase start =================");
//        // 项目启动,从数据集中找到定时抽取的表,从HBase中读取放入缓存
//        // (on startup, find the scheduled-extraction tables among the datasets and read them from HBase into the cache)
//        DatasetTableExample datasetTableExample = new DatasetTableExample();
//        datasetTableExample.createCriteria().andModeEqualTo(1);
//        List<DatasetTable> datasetTables = datasetTableMapper.selectByExampleWithBLOBs(datasetTableExample);
//        for (DatasetTable table : datasetTables) {
////            commonThreadPool.addTask(() -> {
//            try {
//                List<DatasetTableField> fields = dataSetTableFieldsService.getFieldsByTableId(table.getId());
//                sparkCalc.getHBaseDataAndCache(table.getId(), fields);
//            } catch (Exception e) {
//                e.printStackTrace();
//            }
////            });
//        }
    }
}
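For context, the pattern this class (and AppStartInitDataSourceListener above) implements is a Spring bean that runs once the application context is fully started, with @Order sequencing the listeners. A minimal sketch of the same hook; the class name and log text are illustrative, not from the repo:

package io.dataease.listener;

import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;

@Component
@Order(value = 2) // lower values run first when several listeners exist
public class StartupWarmupListenerSketch implements ApplicationListener<ApplicationReadyEvent> {
    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        // Runs once Spring has finished starting; the removed class used
        // this hook to pre-load extracted HBase tables into a cache.
        System.out.println("warm-up hook reached");
    }
}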
backend/src/main/java/io/dataease/service/chart/ChartViewService.java (+35 -6)

The heart of the change: the SparkCalc query path for extracted datasets (mode == 1) is commented out and replaced by a query against Doris, backed by a new hard-coded dorisDatasource() definition:

 package io.dataease.service.chart;

+import com.alibaba.fastjson.JSONObject;
 import com.google.gson.Gson;
 import com.google.gson.reflect.TypeToken;
 import io.dataease.base.domain.*;
 import io.dataease.base.mapper.ChartViewMapper;
 import io.dataease.commons.utils.AuthUtils;
 import io.dataease.commons.utils.BeanUtils;
+import io.dataease.commons.utils.CommonBeanFactory;
 import io.dataease.controller.request.chart.ChartExtFilterRequest;
 import io.dataease.controller.request.chart.ChartExtRequest;
 import io.dataease.controller.request.chart.ChartViewRequest;
@@ -20,7 +22,6 @@ import io.dataease.dto.chart.Series;
 import io.dataease.dto.dataset.DataTableInfoDTO;
 import io.dataease.service.dataset.DataSetTableFieldsService;
 import io.dataease.service.dataset.DataSetTableService;
-import io.dataease.service.spark.SparkCalc;
 import org.apache.commons.collections4.CollectionUtils;
 import org.apache.commons.lang3.ObjectUtils;
 import org.apache.commons.lang3.StringUtils;
@@ -43,8 +44,8 @@ public class ChartViewService {
     private DataSetTableService dataSetTableService;
     @Resource
     private DatasourceService datasourceService;
-    @Resource
-    private SparkCalc sparkCalc;
+//    @Resource
+//    private SparkCalc sparkCalc;
     @Resource
     private DataSetTableFieldsService dataSetTableFieldsService;
@@ -146,8 +147,18 @@ public class ChartViewService {
             data = datasourceProvider.getData(datasourceRequest);
         } else if (table.getMode() == 1) {// 抽取 (extracted)
             // 获取数据集de字段 (get the dataset's fields)
-            List<DatasetTableField> fields = dataSetTableFieldsService.getFieldsByTableId(table.getId());
-            data = sparkCalc.getData(table.getId(), fields, xAxis, yAxis, "tmp_" + view.getId().split("-")[0], extFilterList);
+//            List<DatasetTableField> fields = dataSetTableFieldsService.getFieldsByTableId(table.getId());
+//            data = sparkCalc.getData(table.getId(), fields, xAxis, yAxis, "tmp_" + view.getId().split("-")[0], extFilterList);
+            // 连接doris,构建doris数据源查询 (connect to Doris and build a Doris datasource query)
+            Datasource ds = dorisDatasource();
+            DatasourceProvider datasourceProvider = ProviderFactory.getProvider(ds.getType());
+            DatasourceRequest datasourceRequest = new DatasourceRequest();
+            datasourceRequest.setDatasource(ds);
+            String tableName = "ds_" + table.getId().replaceAll("-", "_");
+            datasourceRequest.setTable(tableName);
+            datasourceRequest.setQuery(getSQL(ds.getType(), tableName, xAxis, yAxis, extFilterList));
+            data = datasourceProvider.getData(datasourceRequest);
         }
         // 图表组件可再扩展 (chart components can be extended further)
@@ -214,6 +225,24 @@ public class ChartViewService {
         return filter.toString();
     }

+    public Datasource dorisDatasource() {
+        JSONObject jsonObject = new JSONObject();
+        jsonObject.put("dataSourceType", "jdbc");
+        jsonObject.put("dataBase", "example_db");
+        jsonObject.put("username", "root");
+        jsonObject.put("password", "dataease");
+        jsonObject.put("host", "59.110.64.159");
+        jsonObject.put("port", "9030");
+
+        Datasource datasource = new Datasource();
+        datasource.setId("doris");
+        datasource.setName("doris");
+        datasource.setDesc("doris");
+        datasource.setType("mysql");
+        datasource.setConfiguration(jsonObject.toJSONString());
+        return datasource;
+    }
+
     public String getSQL(String type, String table, List<ChartViewFieldDTO> xAxis, List<ChartViewFieldDTO> yAxis, List<ChartExtFilterRequest> extFilterRequestList) {
         DatasourceTypes datasourceType = DatasourceTypes.valueOf(type);
         switch (datasourceType) {
@@ -321,7 +350,7 @@ public class ChartViewService {
         return map;
     }

-    public List<ChartView> viewsByIds(List<String> viewIds){
+    public List<ChartView> viewsByIds(List<String> viewIds) {
         ChartViewExample example = new ChartViewExample();
         example.createCriteria().andIdIn(viewIds);
         return chartViewMapper.selectByExample(example);
 ...
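Note that dorisDatasource() registers Doris as a "mysql"-type source: Doris's FE query port (9030 here) speaks the MySQL wire protocol, so any MySQL driver can query it. A minimal JDBC sketch under that assumption; the host, port, and credentials mirror the hard-coded values in the hunk above, while ds_demo_table is a hypothetical dataset table and a MySQL JDBC driver is assumed on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DorisJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Doris is reachable through any MySQL driver/client on the FE query port.
        String url = "jdbc:mysql://59.110.64.159:9030/example_db";
        try (Connection conn = DriverManager.getConnection(url, "root", "dataease");
             Statement stmt = conn.createStatement();
             // "ds_" + dataset id (dashes replaced by underscores) is the
             // extracted table's naming scheme from the hunk above.
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM ds_demo_table")) {
            while (rs.next()) {
                System.out.println("rows: " + rs.getLong(1));
            }
        }
    }
}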
backend/src/main/java/io/dataease/service/dataset/DataSetTableService.java (+3 -2)

A small fix in saveFile(): previously only the base upload directory was created, so writing to path/<username>/<filename> could fail when the per-user subdirectory did not yet exist; now the user directory itself is created before the write:

@@ -637,11 +637,12 @@ public class DataSetTableService {
     private String saveFile(MultipartFile file) throws Exception {
         String filename = file.getOriginalFilename();
-        File p = new File(path);
+        String dirPath = path + AuthUtils.getUser().getUsername() + "/";
+        File p = new File(dirPath);
         if (!p.exists()) {
             p.mkdirs();
         }
-        String filePath = path + AuthUtils.getUser().getUsername() + "/" + filename;
+        String filePath = dirPath + filename;
         File f = new File(filePath);
         FileOutputStream fileOutputStream = new FileOutputStream(f);
         fileOutputStream.write(file.getBytes());
 ...
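A minimal sketch of the same fix using java.nio, for comparison only — not the project's code. SaveFileSketch and the base directory are hypothetical; the AuthUtils call mirrors the diff. Unlike the FileOutputStream in the shown lines (which is never closed there), Files.write opens and closes the stream itself:

import io.dataease.commons.utils.AuthUtils;
import org.springframework.web.multipart.MultipartFile;

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SaveFileSketch {
    private final String path = "/opt/dataease/data/"; // hypothetical base dir

    public String saveFile(MultipartFile file) throws Exception {
        // Create the per-user directory first, as the fixed code now does.
        Path dir = Paths.get(path, AuthUtils.getUser().getUsername());
        Files.createDirectories(dir);
        Path target = dir.resolve(file.getOriginalFilename());
        Files.write(target, file.getBytes());
        return target.toString();
    }
}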
backend/src/main/java/io/dataease/service/dataset/ExtractDataService.java (+3 -44)

Dead imports from the Spark/HBase/Pentaho-cluster era are removed (including obvious auto-import strays such as com.sun…SWITCH, javax.sound.sampled.Line, scala.annotation.meta.field, and a static Mockito mock), the wildcard HBase client import is narrowed to Connection, and the sparkCalc field is commented out:

 package io.dataease.service.dataset;

 import com.google.gson.Gson;
-import com.sun.org.apache.bcel.internal.generic.SWITCH;
 import io.dataease.base.domain.*;
 import io.dataease.base.mapper.DatasourceMapper;
 import io.dataease.commons.constants.JobStatus;
@@ -13,31 +12,15 @@ import io.dataease.datasource.constants.DatasourceTypes;
 import io.dataease.datasource.dto.MysqlConfigrationDTO;
 import io.dataease.dto.dataset.DataSetTaskLogDTO;
 import io.dataease.dto.dataset.DataTableInfoDTO;
-import io.dataease.service.spark.SparkCalc;
 import org.apache.commons.collections4.CollectionUtils;
 import org.apache.commons.io.FileUtils;
 import org.apache.commons.lang3.StringUtils;
-import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.TableName;
-import org.apache.hadoop.hbase.client.*;
+import org.apache.hadoop.hbase.client.Connection;
-import org.pentaho.big.data.api.cluster.NamedCluster;
-import org.pentaho.big.data.api.cluster.NamedClusterService;
-import org.pentaho.big.data.api.cluster.service.locator.NamedClusterServiceLocator;
-import org.pentaho.big.data.api.cluster.service.locator.impl.NamedClusterServiceLocatorImpl;
-import org.pentaho.big.data.api.initializer.ClusterInitializer;
-import org.pentaho.big.data.api.initializer.ClusterInitializerProvider;
-import org.pentaho.big.data.api.initializer.impl.ClusterInitializerImpl;
-import org.pentaho.big.data.impl.cluster.NamedClusterImpl;
-import org.pentaho.big.data.impl.cluster.NamedClusterManager;
-import org.pentaho.big.data.kettle.plugins.hbase.MappingDefinition;
-import org.pentaho.big.data.kettle.plugins.hbase.output.HBaseOutputMeta;
 import org.pentaho.di.cluster.SlaveServer;
-import org.pentaho.di.core.KettleEnvironment;
 import org.pentaho.di.core.database.DatabaseMeta;
 import org.pentaho.di.core.plugins.PluginRegistry;
 import org.pentaho.di.core.plugins.StepPluginType;
-import org.pentaho.di.core.util.EnvUtil;
-import org.pentaho.di.engine.configuration.impl.pentaho.DefaultRunConfiguration;
 import org.pentaho.di.job.Job;
 import org.pentaho.di.job.JobExecutionConfiguration;
 import org.pentaho.di.job.JobHopMeta;
@@ -45,49 +28,25 @@ import org.pentaho.di.job.JobMeta;
 import org.pentaho.di.job.entries.special.JobEntrySpecial;
 import org.pentaho.di.job.entries.success.JobEntrySuccess;
 import org.pentaho.di.job.entries.trans.JobEntryTrans;
-import org.pentaho.di.job.entries.writetolog.JobEntryWriteToLog;
 import org.pentaho.di.job.entry.JobEntryCopy;
 import org.pentaho.di.repository.RepositoryDirectoryInterface;
 import org.pentaho.di.repository.filerep.KettleFileRepository;
-import org.pentaho.di.repository.filerep.KettleFileRepositoryMeta;
-import org.pentaho.di.trans.TransConfiguration;
-import org.pentaho.di.trans.TransExecutionConfiguration;
 import org.pentaho.di.trans.TransHopMeta;
 import org.pentaho.di.trans.TransMeta;
 import org.pentaho.di.trans.step.StepMeta;
 import org.pentaho.di.trans.steps.tableinput.TableInputMeta;
 import org.pentaho.di.trans.steps.textfileoutput.TextFileField;
-import org.pentaho.di.trans.steps.textfileoutput.TextFileOutput;
 import org.pentaho.di.trans.steps.textfileoutput.TextFileOutputMeta;
-import org.pentaho.di.trans.steps.userdefinedjavaclass.InfoStepDefinition;
-import org.pentaho.di.trans.steps.userdefinedjavaclass.UserDefinedJavaClassDef;
-import org.pentaho.di.trans.steps.userdefinedjavaclass.UserDefinedJavaClassMeta;
 import org.pentaho.di.www.SlaveServerJobStatus;
-import org.pentaho.runtime.test.RuntimeTest;
-import org.pentaho.runtime.test.RuntimeTester;
-import org.pentaho.runtime.test.action.RuntimeTestActionHandler;
-import org.pentaho.runtime.test.action.RuntimeTestActionService;
-import org.pentaho.runtime.test.action.impl.RuntimeTestActionServiceImpl;
-import org.pentaho.runtime.test.impl.RuntimeTesterImpl;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.stereotype.Service;
-import org.pentaho.di.core.row.ValueMetaInterface;
-import scala.annotation.meta.field;

 import javax.annotation.Resource;
-import javax.sound.sampled.Line;
 import java.io.File;
-import java.security.MessageDigest;
-import java.sql.ResultSet;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Collection;
 import java.util.List;
 import java.util.concurrent.ExecutorService;
 import java.util.concurrent.Executors;

-import static org.mockito.Mockito.mock;

 @Service
 public class ExtractDataService {
@@ -125,8 +84,8 @@ public class ExtractDataService {
     @Value("${hbase.zookeeper.property.clientPort:2181}")
     private String zkPort;
-    @Resource
-    private SparkCalc sparkCalc;
+//    @Resource
+//    private SparkCalc sparkCalc;

     public void extractData(String datasetTableId, String taskId, String type) {
 ...
backend/src/main/java/io/dataease/service/spark/SparkCalc.java

The diff for this file is collapsed on the commit page, so its contents are not reproduced here (the summary lists it as +0 -0).
frontend/src/lang/zh.js (+2 -2)

Two summary-calculation keys in the Chinese locale are renamed; the labels ('标准差' = standard deviation, '方差' = variance) stay the same:

@@ -600,8 +600,8 @@ export default {
     avg: '平均',
     max: '最大值',
     min: '最小值',
-    std: '标准差',
-    var_samp: '方差',
+    stddev_pop: '标准差',
+    var_pop: '方差',
     quick_calc: '快速计算',
     show_name_set: '显示名设置',
     color: '颜色',
 ...
frontend/src/views/chart/components/drag-item/QuotaItem.vue (+2 -2)

The matching dropdown commands and i18n keys are updated to the new names:

@@ -22,8 +22,8 @@
             <el-dropdown-item :command="beforeSummary('avg')">{{ $t('chart.avg') }}</el-dropdown-item>
             <el-dropdown-item :command="beforeSummary('max')">{{ $t('chart.max') }}</el-dropdown-item>
             <el-dropdown-item :command="beforeSummary('min')">{{ $t('chart.min') }}</el-dropdown-item>
-            <el-dropdown-item :command="beforeSummary('std')">{{ $t('chart.std') }}</el-dropdown-item>
-            <el-dropdown-item :command="beforeSummary('var_samp')">{{ $t('chart.var_samp') }}</el-dropdown-item>
+            <el-dropdown-item :command="beforeSummary('stddev_pop')">{{ $t('chart.stddev_pop') }}</el-dropdown-item>
+            <el-dropdown-item :command="beforeSummary('var_pop')">{{ $t('chart.var_pop') }}</el-dropdown-item>
           </el-dropdown-menu>
         </el-dropdown>
       </el-dropdown-item>
 ...
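The renames in these two frontend files line up the UI's summary keys with the population-statistics aggregates (STDDEV_POP, VAR_POP) that MySQL-compatible engines such as Doris accept, so a key can map straight onto a SQL function. A hypothetical illustration of that mapping, not DataEase's actual code:

public final class AggregateSqlSketch {
    private AggregateSqlSketch() {}

    // Maps a summary key from the dropdown to a SQL aggregate expression.
    public static String wrap(String summary, String column) {
        switch (summary) {
            case "stddev_pop": return "STDDEV_POP(" + column + ")";
            case "var_pop":    return "VAR_POP(" + column + ")";
            case "avg":        return "AVG(" + column + ")";
            case "max":        return "MAX(" + column + ")";
            case "min":        return "MIN(" + column + ")";
            default:           return "SUM(" + column + ")";
        }
    }
}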