seatunnel-web · Commits · 373502c0

Commit 373502c0, authored April 12, 2024 by 宋勇 (Song Yong)

    datasource-jdbc-demeng datasource-jdbc-access datasource-http datasource-xml datasource-csv datasource-excel added

Parent: f06ee4a0

Showing 36 changed files with 2442 additions and 1 deletion (+2442, −1)
...ource/seatunnel-datasource-plugins/datasource-csv/pom.xml (+147 −0)
...he/seatunnel/datasource/plugin/csv/CSVAConfiguration.java (+75 −0)
...che/seatunnel/datasource/plugin/csv/CSVClientService.java (+42 −0)
...seatunnel/datasource/plugin/csv/CSVDataSourceFactory.java (+57 −0)
...seatunnel/datasource/plugin/csv/CSVDatasourceChannel.java (+0 −0)
...apache/seatunnel/datasource/plugin/csv/CSVOptionRule.java (+168 −0)
...rce/seatunnel-datasource-plugins/datasource-excel/pom.xml (+147 −0)
...eatunnel/datasource/plugin/excel/ExcelAConfiguration.java (+75 −0)
...seatunnel/datasource/plugin/excel/ExcelClientService.java (+42 −0)
...unnel/datasource/plugin/excel/ExcelDataSourceFactory.java (+57 −0)
...unnel/datasource/plugin/excel/ExcelDatasourceChannel.java (+0 −0)
...he/seatunnel/datasource/plugin/excel/ExcelOptionRule.java (+164 −0)
...urce/seatunnel-datasource-plugins/datasource-http/pom.xml (+53 −0)
.../seatunnel/datasource/plugin/http/HttpAConfiguration.java (+43 −0)
...e/seatunnel/datasource/plugin/http/HttpClientService.java (+21 −0)
...e/seatunnel/datasource/plugin/http/HttpConfiguration.java (+53 −0)
...atunnel/datasource/plugin/http/HttpDataSourceFactory.java (+58 −0)
...atunnel/datasource/plugin/http/HttpDatasourceChannel.java (+213 −0)
...ache/seatunnel/datasource/plugin/http/HttpOptionRule.java (+73 −0)
...atunnel-datasource-plugins/datasource-jdbc-access/pom.xml (+62 −0)
...datasource/plugin/access/jdbc/AccessDataSourceConfig.java (+51 −0)
...ource/plugin/access/jdbc/AccessJdbcDataSourceChannel.java (+0 −0)
...ource/plugin/access/jdbc/AccessJdbcDataSourceFactory.java (+48 −0)
...unnel/datasource/plugin/access/jdbc/AccessOptionRule.java (+69 −0)
...atunnel-datasource-plugins/datasource-jdbc-demeng/pom.xml (+62 −0)
...datasource/plugin/demeng/jdbc/DemengDataSourceConfig.java (+51 −0)
...ource/plugin/demeng/jdbc/DemengJdbcDataSourceChannel.java (+0 −0)
...ource/plugin/demeng/jdbc/DemengJdbcDataSourceFactory.java (+48 −0)
...unnel/datasource/plugin/demeng/jdbc/DemengOptionRule.java (+69 −0)
...ource/seatunnel-datasource-plugins/datasource-xml/pom.xml (+147 −0)
...he/seatunnel/datasource/plugin/xml/XMLAConfiguration.java (+75 −0)
...che/seatunnel/datasource/plugin/xml/XMLClientService.java (+42 −0)
...seatunnel/datasource/plugin/xml/XMLDataSourceFactory.java (+57 −0)
...seatunnel/datasource/plugin/xml/XMLDatasourceChannel.java (+0 −0)
...apache/seatunnel/datasource/plugin/xml/XMLOptionRule.java (+167 −0)
seatunnel-datasource/seatunnel-datasource-plugins/pom.xml (+6 −1)
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/pom.xml (new file, mode 100644)
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.seatunnel</groupId>
        <artifactId>seatunnel-datasource-plugins</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>
    <artifactId>datasource-csv</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>datasource-plugins-api</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>seatunnel-hadoop3-3.1.4-uber</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>ch.qos.logback</groupId>
                    <artifactId>logback-classic</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-slf4j-impl</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>jcl-over-slf4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-1.2-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-1.2-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-common</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>io.quarkiverse.minio</groupId>
            <artifactId>minio-client</artifactId>
            <version>0.2.0</version>
        </dependency>
        <!-- <dependency>-->
        <!--     <groupId>org.slf4j</groupId>-->
        <!--     <artifactId>slf4j-reload4j</artifactId>-->
        <!--     <version>1.7.35</version>-->
        <!--     <scope>test</scope>-->
        <!-- </dependency>-->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.3.5</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.3.5</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-aws</artifactId>
            <version>3.3.5</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bundle</artifactId>
        </dependency>
    </dependencies>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/src/main/java/org/apache/seatunnel/datasource/plugin/csv/CSVAConfiguration.java (new file, mode 100644)
package org.apache.seatunnel.datasource.plugin.csv;

import lombok.extern.slf4j.Slf4j;

import org.apache.hadoop.conf.Configuration;

import org.apache.seatunnel.shade.com.typesafe.config.Config;
import org.apache.seatunnel.shade.com.typesafe.config.ConfigFactory;

import java.util.Map;

@Slf4j
public class CSVAConfiguration {

    /* S3 constants */
    private static final String S3A_SCHEMA = "s3a";
    private static final String HDFS_S3N_IMPL = "org.apache.hadoop.fs.s3native.NativeS3FileSystem";
    private static final String HDFS_S3A_IMPL = "org.apache.hadoop.fs.s3a.S3AFileSystem";
    private static final String S3A_PROTOCOL = "s3a";
    private static final String DEFAULT_PROTOCOL = "s3n";
    private static final String S3_FORMAT_KEY = "fs.%s.%s";
    private static final String HDFS_IMPL_KEY = "impl";

    public static Configuration getConfiguration(Map<String, String> s3Options) {
        if (!s3Options.containsKey(CSVOptionRule.BUCKET.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource bucket is null, please check your config");
        }
        if (!s3Options.containsKey(CSVOptionRule.FS_S3A_ENDPOINT.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource endpoint is null, please check your config");
        }
        String bucket = s3Options.get(CSVOptionRule.BUCKET.key());

        String protocol = DEFAULT_PROTOCOL;
        if (bucket.startsWith(S3A_PROTOCOL)) {
            protocol = S3A_PROTOCOL;
        }
        String fsImpl = protocol.equals(S3A_PROTOCOL) ? HDFS_S3A_IMPL : HDFS_S3N_IMPL;
        Configuration hadoopConf = new Configuration();
        hadoopConf.set("fs.defaut.name", bucket);
        hadoopConf.set(
                CSVOptionRule.FS_S3A_ENDPOINT.key(),
                s3Options.get(CSVOptionRule.FS_S3A_ENDPOINT.key()));
        hadoopConf.set(formatKey(protocol, HDFS_IMPL_KEY), fsImpl);
        if (s3Options.containsKey(CSVOptionRule.HADOOP_S3_PROPERTIES.key())) {
            Config configObject =
                    ConfigFactory.parseString(
                            s3Options.get(CSVOptionRule.HADOOP_S3_PROPERTIES.key()));
            configObject
                    .entrySet()
                    .forEach(
                            entry -> {
                                hadoopConf.set(
                                        entry.getKey(),
                                        entry.getValue().unwrapped().toString());
                            });
        }
        if (CSVOptionRule.S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider
                .getProvider()
                .equals(s3Options.get(CSVOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()))) {
            hadoopConf.set(
                    CSVOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(CSVOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
            hadoopConf.set("fs.s3a.access.key", s3Options.get(CSVOptionRule.ACCESS_KEY.key()));
            hadoopConf.set("fs.s3a.secret.key", s3Options.get(CSVOptionRule.SECRET_KEY.key()));
        } else {
            hadoopConf.set(
                    CSVOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(CSVOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
        }
        return hadoopConf;
    }

    private static String formatKey(String protocol, String key) {
        return String.format(S3_FORMAT_KEY, protocol, key);
    }
}
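The bucket string drives which Hadoop FileSystem implementation getConfiguration wires in: URIs starting with "s3a" get S3AFileSystem, everything else falls back to the s3n native implementation. A minimal JDK-only sketch of that selection logic (class and method names here are illustrative, not part of the plugin):

```java
// Standalone sketch of the protocol/impl selection in getConfiguration above.
// Constants mirror the ones in CSVAConfiguration; S3ProtocolSketch itself is hypothetical.
public class S3ProtocolSketch {
    static final String S3A_PROTOCOL = "s3a";
    static final String DEFAULT_PROTOCOL = "s3n";
    static final String HDFS_S3A_IMPL = "org.apache.hadoop.fs.s3a.S3AFileSystem";
    static final String HDFS_S3N_IMPL = "org.apache.hadoop.fs.s3native.NativeS3FileSystem";

    /** Protocol implied by the bucket URI: s3a if it starts with "s3a", else s3n. */
    static String chooseProtocol(String bucket) {
        return bucket.startsWith(S3A_PROTOCOL) ? S3A_PROTOCOL : DEFAULT_PROTOCOL;
    }

    /** Hadoop FileSystem implementation class for the chosen protocol. */
    static String chooseFsImpl(String bucket) {
        return chooseProtocol(bucket).equals(S3A_PROTOCOL) ? HDFS_S3A_IMPL : HDFS_S3N_IMPL;
    }

    /** Mirrors formatKey(protocol, key): builds "fs.<protocol>.<key>". */
    static String formatKey(String protocol, String key) {
        return String.format("fs.%s.%s", protocol, key);
    }

    public static void main(String[] args) {
        System.out.println(chooseFsImpl("s3a://my-bucket")); // prints org.apache.hadoop.fs.s3a.S3AFileSystem
        System.out.println(formatKey(chooseProtocol("s3a://my-bucket"), "impl")); // prints fs.s3a.impl
    }
}
```

The same pattern recurs verbatim in the Excel, XML, and HTTP configuration classes in this commit.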
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/src/main/java/org/apache/seatunnel/datasource/plugin/csv/CSVClientService.java (new file, mode 100644)
package org.apache.seatunnel.datasource.plugin.csv;

import io.minio.MinioClient;
import io.minio.errors.MinioException;

public class CSVClientService {
    private String ENDPOINT;
    private String PROVIDER;
    private String USERNAME;
    private String PASSWORD;
    private String BUCKET;
    private Integer PORT;
    private final String clientId = "Client" + (int) (Math.random() * 100000000);
    private MinioClient minioClient;

    public CSVClientService(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        this.ENDPOINT = endpoint;
        this.PROVIDER = provider;
        this.USERNAME = username;
        this.PASSWORD = password;
        this.PORT = port;
        setMinioClient(endpoint, provider, username, password, port);
    }

    public MinioClient getMinioClient() {
        return minioClient;
    }

    public void setMinioClient(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        minioClient =
                new MinioClient.Builder()
                        .endpoint(endpoint, port, false)
                        .credentials(username, password)
                        .build();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/src/main/java/org/apache/seatunnel/datasource/plugin/csv/CSVDataSourceFactory.java (new file, mode 100644)
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.csv;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import java.util.Set;

@AutoService(DataSourceFactory.class)
public class CSVDataSourceFactory implements DataSourceFactory {

    private static final String PLUGIN_NAME = "S3";

    @Override
    public String factoryIdentifier() {
        return PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        DataSourcePluginInfo s3DatasourcePluginInfo =
                DataSourcePluginInfo.builder()
                        .name(PLUGIN_NAME)
                        .type(DatasourcePluginTypeEnum.FILE.getCode())
                        .version("1.0.0")
                        .supportVirtualTables(false)
                        .icon("S3File")
                        .build();
        return Sets.newHashSet(s3DatasourcePluginInfo);
    }

    @Override
    public DataSourceChannel createChannel() {
        return CSVDatasourceChannel.getInstance();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/src/main/java/org/apache/seatunnel/datasource/plugin/csv/CSVDatasourceChannel.java (new file, mode 100644)
(Diff collapsed; contents not shown.)
seatunnel-datasource/seatunnel-datasource-plugins/datasource-csv/src/main/java/org/apache/seatunnel/datasource/plugin/csv/CSVOptionRule.java (new file, mode 100644)
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.csv;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;
import org.apache.seatunnel.api.configuration.util.OptionRule;

import java.util.Arrays;
import java.util.Map;

public class CSVOptionRule {

    public static final Option<String> ACCESS_KEY =
            Options.key("access_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 access key");

    public static final Option<String> SECRET_KEY =
            Options.key("secret_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 secret key");

    public static final Option<String> BUCKET =
            Options.key("bucket").stringType().noDefaultValue().withDescription("S3 bucket name");

    public static final Option<String> FS_S3A_ENDPOINT =
            Options.key("fs.s3a.endpoint")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("fs s3a endpoint");

    public static final Option<S3aAwsCredentialsProvider> S3A_AWS_CREDENTIALS_PROVIDER =
            Options.key("fs.s3a.aws.credentials.provider")
                    .enumType(S3aAwsCredentialsProvider.class)
                    .defaultValue(S3aAwsCredentialsProvider.InstanceProfileCredentialsProvider)
                    .withDescription("s3a aws credentials provider");

    public static final Option<Map<String, String>> HADOOP_S3_PROPERTIES =
            Options.key("hadoop_s3_properties")
                    .mapType()
                    .noDefaultValue()
                    .withDescription(
                            "{\n"
                                    + "fs.s3a.buffer.dir=/data/st_test/s3a\n"
                                    + "fs.s3a.fast.upload.buffer=disk\n"
                                    + "}");

    public static OptionRule optionRule() {
        return OptionRule.builder()
                .required(BUCKET, FS_S3A_ENDPOINT, S3A_AWS_CREDENTIALS_PROVIDER)
                .optional(HADOOP_S3_PROPERTIES)
                .conditional(
                        S3A_AWS_CREDENTIALS_PROVIDER,
                        S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider,
                        ACCESS_KEY,
                        SECRET_KEY)
                .build();
    }

    public static final Option<String> PATH =
            Options.key("path").stringType().noDefaultValue().withDescription("S3 write path");

    public static final Option<String> TYPE =
            Options.key("file_format_type")
                    .stringType()
                    .defaultValue("csv")
                    .withDescription("S3 write type");

    public static final Option<String> DELIMITER =
            Options.key("delimiter")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write delimiter");

    public static final Option<Map<String, String>> SCHEMA =
            Options.key("schema").mapType().noDefaultValue().withDescription("SeaTunnel Schema");

    public static final Option<Boolean> PARSE_PARSE_PARTITION_FROM_PATH =
            Options.key("parse_partition_from_path")
                    .booleanType()
                    .noDefaultValue()
                    .withDescription("S3 write parse_partition_from_path");

    public static final Option<String> DATE_FORMAT =
            Options.key("date_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write date_format");

    public static final Option<String> DATETIME_FORMAT =
            Options.key("time_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write time_format");

    public static final Option<String> TIME_FORMAT =
            Options.key("datetime_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write datetime_format");

    public static OptionRule metadataRule() {
        return OptionRule.builder()
                .required(PATH, TYPE)
                .conditional(TYPE, FileFormat.CSV.type, DELIMITER)
                .conditional(TYPE, FileFormat.CSV.type, SCHEMA)
                .optional(PARSE_PARSE_PARTITION_FROM_PATH)
                .optional(DATE_FORMAT)
                .optional(DATETIME_FORMAT)
                .optional(TIME_FORMAT)
                .build();
    }

    public enum S3aAwsCredentialsProvider {
        SimpleAWSCredentialsProvider("org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"),
        InstanceProfileCredentialsProvider("com.amazonaws.auth.InstanceProfileCredentialsProvider");

        private String provider;

        S3aAwsCredentialsProvider(String provider) {
            this.provider = provider;
        }

        public String getProvider() {
            return provider;
        }

        @Override
        public String toString() {
            return provider;
        }
    }

    public enum FileFormat {
        CSV("csv"),
        TEXT("txt"),
        PARQUET("parquet"),
        ORC("orc"),
        JSON("json"),
        XML("xml");

        private final String type;

        FileFormat(String type) {
            this.type = type;
        }
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/pom.xml (new file, mode 100644)
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.seatunnel</groupId>
        <artifactId>seatunnel-datasource-plugins</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>
    <artifactId>datasource-excel</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>datasource-plugins-api</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>seatunnel-hadoop3-3.1.4-uber</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>ch.qos.logback</groupId>
                    <artifactId>logback-classic</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-slf4j-impl</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>jcl-over-slf4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-1.2-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-1.2-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-common</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>io.quarkiverse.minio</groupId>
            <artifactId>minio-client</artifactId>
            <version>0.2.0</version>
        </dependency>
        <!-- <dependency>-->
        <!--     <groupId>org.slf4j</groupId>-->
        <!--     <artifactId>slf4j-reload4j</artifactId>-->
        <!--     <version>1.7.35</version>-->
        <!--     <scope>test</scope>-->
        <!-- </dependency>-->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.3.5</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.3.5</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-reload4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-aws</artifactId>
            <version>3.3.5</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bundle</artifactId>
        </dependency>
    </dependencies>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/src/main/java/org/apache/seatunnel/datasource/plugin/excel/ExcelAConfiguration.java (new file, mode 100644)
package org.apache.seatunnel.datasource.plugin.excel;

import lombok.extern.slf4j.Slf4j;

import org.apache.hadoop.conf.Configuration;

import org.apache.seatunnel.shade.com.typesafe.config.Config;
import org.apache.seatunnel.shade.com.typesafe.config.ConfigFactory;

import java.util.Map;

@Slf4j
public class ExcelAConfiguration {

    /* S3 constants */
    private static final String S3A_SCHEMA = "s3a";
    private static final String HDFS_S3N_IMPL = "org.apache.hadoop.fs.s3native.NativeS3FileSystem";
    private static final String HDFS_S3A_IMPL = "org.apache.hadoop.fs.s3a.S3AFileSystem";
    private static final String S3A_PROTOCOL = "s3a";
    private static final String DEFAULT_PROTOCOL = "s3n";
    private static final String S3_FORMAT_KEY = "fs.%s.%s";
    private static final String HDFS_IMPL_KEY = "impl";

    public static Configuration getConfiguration(Map<String, String> s3Options) {
        if (!s3Options.containsKey(ExcelOptionRule.BUCKET.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource bucket is null, please check your config");
        }
        if (!s3Options.containsKey(ExcelOptionRule.FS_S3A_ENDPOINT.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource endpoint is null, please check your config");
        }
        String bucket = s3Options.get(ExcelOptionRule.BUCKET.key());

        String protocol = DEFAULT_PROTOCOL;
        if (bucket.startsWith(S3A_PROTOCOL)) {
            protocol = S3A_PROTOCOL;
        }
        String fsImpl = protocol.equals(S3A_PROTOCOL) ? HDFS_S3A_IMPL : HDFS_S3N_IMPL;
        Configuration hadoopConf = new Configuration();
        hadoopConf.set("fs.defaut.name", bucket);
        hadoopConf.set(
                ExcelOptionRule.FS_S3A_ENDPOINT.key(),
                s3Options.get(ExcelOptionRule.FS_S3A_ENDPOINT.key()));
        hadoopConf.set(formatKey(protocol, HDFS_IMPL_KEY), fsImpl);
        if (s3Options.containsKey(ExcelOptionRule.HADOOP_S3_PROPERTIES.key())) {
            Config configObject =
                    ConfigFactory.parseString(
                            s3Options.get(ExcelOptionRule.HADOOP_S3_PROPERTIES.key()));
            configObject
                    .entrySet()
                    .forEach(
                            entry -> {
                                hadoopConf.set(
                                        entry.getKey(),
                                        entry.getValue().unwrapped().toString());
                            });
        }
        if (ExcelOptionRule.S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider
                .getProvider()
                .equals(s3Options.get(ExcelOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()))) {
            hadoopConf.set(
                    ExcelOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(ExcelOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
            hadoopConf.set("fs.s3a.access.key", s3Options.get(ExcelOptionRule.ACCESS_KEY.key()));
            hadoopConf.set("fs.s3a.secret.key", s3Options.get(ExcelOptionRule.SECRET_KEY.key()));
        } else {
            hadoopConf.set(
                    ExcelOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(ExcelOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
        }
        return hadoopConf;
    }

    private static String formatKey(String protocol, String key) {
        return String.format(S3_FORMAT_KEY, protocol, key);
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/src/main/java/org/apache/seatunnel/datasource/plugin/excel/ExcelClientService.java (new file, mode 100644)
package org.apache.seatunnel.datasource.plugin.excel;

import io.minio.MinioClient;
import io.minio.errors.MinioException;

public class ExcelClientService {
    private String ENDPOINT;
    private String PROVIDER;
    private String USERNAME;
    private String PASSWORD;
    private String BUCKET;
    private Integer PORT;
    private final String clientId = "Client" + (int) (Math.random() * 100000000);
    private MinioClient minioClient;

    public ExcelClientService(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        this.ENDPOINT = endpoint;
        this.PROVIDER = provider;
        this.USERNAME = username;
        this.PASSWORD = password;
        this.PORT = port;
        setMinioClient(endpoint, provider, username, password, port);
    }

    public MinioClient getMinioClient() {
        return minioClient;
    }

    public void setMinioClient(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        minioClient =
                new MinioClient.Builder()
                        .endpoint(endpoint, port, false)
                        .credentials(username, password)
                        .build();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/src/main/java/org/apache/seatunnel/datasource/plugin/excel/ExcelDataSourceFactory.java (new file, mode 100644)
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.excel;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import java.util.Set;

@AutoService(DataSourceFactory.class)
public class ExcelDataSourceFactory implements DataSourceFactory {

    private static final String PLUGIN_NAME = "S3";

    @Override
    public String factoryIdentifier() {
        return PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        DataSourcePluginInfo s3DatasourcePluginInfo =
                DataSourcePluginInfo.builder()
                        .name(PLUGIN_NAME)
                        .type(DatasourcePluginTypeEnum.FILE.getCode())
                        .version("1.0.0")
                        .supportVirtualTables(false)
                        .icon("S3File")
                        .build();
        return Sets.newHashSet(s3DatasourcePluginInfo);
    }

    @Override
    public DataSourceChannel createChannel() {
        return ExcelDatasourceChannel.getInstance();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/src/main/java/org/apache/seatunnel/datasource/plugin/excel/ExcelDatasourceChannel.java (new file, mode 100644)
(Diff collapsed; contents not shown.)
seatunnel-datasource/seatunnel-datasource-plugins/datasource-excel/src/main/java/org/apache/seatunnel/datasource/plugin/excel/ExcelOptionRule.java (new file, mode 100644)
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.excel;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;
import org.apache.seatunnel.api.configuration.util.OptionRule;

import java.util.Arrays;
import java.util.Map;

public class ExcelOptionRule {

    public static final Option<String> ACCESS_KEY =
            Options.key("access_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 access key");

    public static final Option<String> SECRET_KEY =
            Options.key("secret_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 secret key");

    public static final Option<String> BUCKET =
            Options.key("bucket").stringType().noDefaultValue().withDescription("S3 bucket name");

    public static final Option<String> FS_S3A_ENDPOINT =
            Options.key("fs.s3a.endpoint")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("fs s3a endpoint");

    public static final Option<S3aAwsCredentialsProvider> S3A_AWS_CREDENTIALS_PROVIDER =
            Options.key("fs.s3a.aws.credentials.provider")
                    .enumType(S3aAwsCredentialsProvider.class)
                    .defaultValue(S3aAwsCredentialsProvider.InstanceProfileCredentialsProvider)
                    .withDescription("s3a aws credentials provider");

    public static final Option<Map<String, String>> HADOOP_S3_PROPERTIES =
            Options.key("hadoop_s3_properties")
                    .mapType()
                    .noDefaultValue()
                    .withDescription(
                            "{\n"
                                    + "fs.s3a.buffer.dir=/data/st_test/s3a\n"
                                    + "fs.s3a.fast.upload.buffer=disk\n"
                                    + "}");

    public static OptionRule optionRule() {
        return OptionRule.builder()
                .required(BUCKET, FS_S3A_ENDPOINT, S3A_AWS_CREDENTIALS_PROVIDER)
                .optional(HADOOP_S3_PROPERTIES)
                .conditional(
                        S3A_AWS_CREDENTIALS_PROVIDER,
                        S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider,
                        ACCESS_KEY,
                        SECRET_KEY)
                .build();
    }

    public static final Option<String> PATH =
            Options.key("path").stringType().noDefaultValue().withDescription("S3 write path");

    public static final Option<FileFormat> TYPE =
            Options.key("file_format_type")
                    .enumType(FileFormat.class)
                    .noDefaultValue()
                    .withDescription("S3 write type");

    public static final Option<String> DELIMITER =
            Options.key("delimiter")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write delimiter");

    public static final Option<Map<String, String>> SCHEMA =
            Options.key("schema").mapType().noDefaultValue().withDescription("SeaTunnel Schema");

    public static final Option<Boolean> PARSE_PARSE_PARTITION_FROM_PATH =
            Options.key("parse_partition_from_path")
                    .booleanType()
                    .noDefaultValue()
                    .withDescription("S3 write parse_partition_from_path");
    public static final Option<String> DATE_FORMAT =
            Options.key("date_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write date_format");

    public static final Option<String> DATETIME_FORMAT =
            Options.key("datetime_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write datetime_format");

    public static final Option<String> TIME_FORMAT =
            Options.key("time_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write time_format");
    public static OptionRule metadataRule() {
        return OptionRule.builder()
                .required(PATH, TYPE)
                .conditional(TYPE, Arrays.asList(FileFormat.XLSX, FileFormat.XLS), DELIMITER)
                .conditional(TYPE, Arrays.asList(FileFormat.XLSX, FileFormat.XLS), SCHEMA)
                .optional(PARSE_PARSE_PARTITION_FROM_PATH)
                .optional(DATE_FORMAT)
                .optional(DATETIME_FORMAT)
                .optional(TIME_FORMAT)
                .build();
    }

    public enum S3aAwsCredentialsProvider {
        SimpleAWSCredentialsProvider("org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"),

        InstanceProfileCredentialsProvider("com.amazonaws.auth.InstanceProfileCredentialsProvider");

        private String provider;

        S3aAwsCredentialsProvider(String provider) {
            this.provider = provider;
        }

        public String getProvider() {
            return provider;
        }

        @Override
        public String toString() {
            return provider;
        }
    }

    public enum FileFormat {
        XLSX("xlsx"),
        XLS("xls");

        private final String type;

        FileFormat(String type) {
            this.type = type;
        }
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.seatunnel</groupId>
        <artifactId>seatunnel-datasource-plugins</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>

    <artifactId>datasource-http</artifactId>

    <dependencies>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>datasource-plugins-api</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.14</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>11</source>
                    <target>11</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpAConfiguration.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.http;

import lombok.extern.slf4j.Slf4j;

import java.util.Map;

@Slf4j
public class HttpAConfiguration {

    public static HttpConfiguration getConfiguration(Map<String, String> httpOption) {
        if (!httpOption.containsKey(HttpOptionRule.URL.key())) {
            throw new IllegalArgumentException("url is null, please check your config");
        }
        HttpConfiguration httpConfiguration = new HttpConfiguration();
        // read the configured values from the map, not the option keys themselves
        httpConfiguration.setUrl(httpOption.get(HttpOptionRule.URL.key()));
        httpConfiguration.setMethod(httpOption.get(HttpOptionRule.METHOD.key()));
        httpConfiguration.setToken(httpOption.get(HttpOptionRule.TOKEN.key()));
        httpConfiguration.setRequest_params(httpOption.get(HttpOptionRule.REQUEST_PARAMS.key()));
        return httpConfiguration;
    }
}
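A self-contained sketch of the validation contract implemented by `getConfiguration` above: the map must contain a `url` key, and the configured value (not the option key) is what ends up on the configuration object. `ConfigValidationSketch` and `SimpleConfig` are illustrative stand-ins, not plugin classes.

```java
import java.util.Map;

// Illustrative stand-in for HttpAConfiguration.getConfiguration:
// validate the presence of "url", then copy the configured value.
public class ConfigValidationSketch {

    static class SimpleConfig {
        String url;
    }

    public static SimpleConfig fromMap(Map<String, String> option) {
        if (!option.containsKey("url")) {
            // fail fast when the required key is missing
            throw new IllegalArgumentException("url is null, please check your config");
        }
        SimpleConfig conf = new SimpleConfig();
        conf.url = option.get("url"); // read the value, not the key
        return conf;
    }
}
```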
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpClientService.java
0 → 100644
package org.apache.seatunnel.datasource.plugin.http;

import org.apache.http.client.HttpClient;
import org.apache.http.impl.client.HttpClients;

public class HttpClientService {

    public static HttpClient connect(HttpConfiguration conf) throws Exception {
        // Create an HttpClient instance
        HttpClient client = HttpClients.createDefault();
        return client;
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpConfiguration.java
0 → 100644
package org.apache.seatunnel.datasource.plugin.http;

public class HttpConfiguration {

    private String url;
    private String token;
    private String method;
    private String request_params;

    public HttpConfiguration() {}

    public HttpConfiguration(String url, String method, String token, String request_params) {
        this.url = url;
        this.token = token;
        this.method = method;
        this.request_params = request_params;
    }

    public String getUrl() {
        return url;
    }

    public void setUrl(String url) {
        this.url = url;
    }

    public String getMethod() {
        return method;
    }

    public void setMethod(String method) {
        this.method = method;
    }

    public String getToken() {
        return token;
    }

    public void setToken(String token) {
        this.token = token;
    }

    public String getRequest_params() {
        return request_params;
    }

    public void setRequest_params(String request_params) {
        this.request_params = request_params;
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpDataSourceFactory.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.http;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import java.util.Set;

@AutoService(DataSourceFactory.class)
public class HttpDataSourceFactory implements DataSourceFactory {

    private static final String PLUGIN_NAME = "Http";

    @Override
    public String factoryIdentifier() {
        return PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        DataSourcePluginInfo httpDatasourcePluginInfo =
                DataSourcePluginInfo.builder()
                        .name(PLUGIN_NAME)
                        .type(DatasourcePluginTypeEnum.FILE.getCode())
                        .version("1.0.0")
                        .supportVirtualTables(false)
                        .icon("FtpFile")
                        .build();
        return Sets.newHashSet(httpDatasourcePluginInfo);
    }

    @Override
    public DataSourceChannel createChannel() {
        return new HttpDatasourceChannel();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpDatasourceChannel.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.http;

import org.apache.commons.lang3.StringUtils;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.*;
import org.apache.http.entity.StringEntity;
import org.apache.http.util.EntityUtils;
import org.apache.seatunnel.api.configuration.util.OptionRule;
import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginException;
import org.apache.seatunnel.datasource.plugin.api.model.TableField;

import lombok.NonNull;

import java.net.URI;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class HttpDatasourceChannel implements DataSourceChannel {

    @Override
    public OptionRule getDataSourceOptions(@NonNull String pluginName) {
        return HttpOptionRule.optionRule();
    }

    @Override
    public OptionRule getDatasourceMetadataFieldsByDataSourceName(@NonNull String pluginName) {
        return null;
    }

    @Override
    public boolean checkDataSourceConnectivity(
            @NonNull String pluginName, @NonNull Map<String, String> requestParams) {
        HttpConfiguration conf = HttpAConfiguration.getConfiguration(requestParams);
        if (Objects.isNull(conf)) {
            throw new DataSourcePluginException(
                    String.format("check http connectivity failed, config is: %s", requestParams));
        }
        try {
            HttpClient httpClient = HttpClientService.connect(conf);
            String url = conf.getUrl();
            String method = conf.getMethod();
            String token = conf.getToken();
            String params = conf.getRequest_params();
            // Target URI; also validates the configured url
            URI uri = new URI(url);
            HttpResponse response = null;
            // A blank method defaults to GET
            if (StringUtils.isBlank(method) || "GET".equals(method.toUpperCase())) {
                if (StringUtils.isNotBlank(params)) {
                    url = url + "?" + params;
                }
                HttpGet httpGet = new HttpGet(url);
                if (StringUtils.isNotBlank(token)) {
                    httpGet.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                // Execute the request and get the response
                response = httpClient.execute(httpGet);
            } else if ("POST".equals(method.toUpperCase())) {
                HttpPost httpPost = new HttpPost(url);
                if (StringUtils.isNotBlank(token)) {
                    httpPost.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                // Set the request body (e.g. JSON data)
                StringEntity requestEntity = new StringEntity(params);
                httpPost.setEntity(requestEntity);
                response = httpClient.execute(httpPost);
            } else if ("PUT".equals(method.toUpperCase())) {
                HttpPut httpPut = new HttpPut(url);
                if (StringUtils.isNotBlank(token)) {
                    httpPut.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                StringEntity requestEntity = new StringEntity(params);
                httpPut.setEntity(requestEntity);
                response = httpClient.execute(httpPut);
            } else if ("DELETE".equals(method.toUpperCase())) {
                if (StringUtils.isNotBlank(params)) {
                    url = url + "?" + params;
                }
                HttpDelete httpDelete = new HttpDelete(url);
                if (StringUtils.isNotBlank(token)) {
                    httpDelete.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                response = httpClient.execute(httpDelete);
            } else if ("PATCH".equals(method.toUpperCase())) {
                HttpPatch httpPatch = new HttpPatch(url);
                if (StringUtils.isNotBlank(token)) {
                    httpPatch.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                StringEntity requestEntity = new StringEntity(params);
                httpPatch.setEntity(requestEntity);
                response = httpClient.execute(httpPatch);
            } else if ("OPTIONS".equals(method.toUpperCase())) {
                if (StringUtils.isNotBlank(params)) {
                    url = url + "?" + params;
                }
                HttpOptions httpOptions = new HttpOptions(url);
                if (StringUtils.isNotBlank(token)) {
                    httpOptions.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                response = httpClient.execute(httpOptions);
            } else if ("HEAD".equals(method.toUpperCase())) {
                if (StringUtils.isNotBlank(params)) {
                    url = url + "?" + params;
                }
                HttpHead httpHead = new HttpHead(url);
                if (StringUtils.isNotBlank(token)) {
                    httpHead.setHeader(
                            "Authorization", "Bearer " + token.replace("Bearer ", "").trim());
                }
                response = httpClient.execute(httpHead);
            }
            if (Objects.isNull(response)) {
                throw new DataSourcePluginException(
                        String.format("unsupported http method: %s", method));
            }
            int statusCode = response.getStatusLine().getStatusCode();
            // Consume the response body
            String responseBody = EntityUtils.toString(response.getEntity());
            if (statusCode == 200) {
                return true;
            } else {
                throw new DataSourcePluginException(
                        String.format(
                                "check http connectivity failed, config is: %s", requestParams));
            }
        } catch (Exception e) {
            throw new DataSourcePluginException(
                    String.format("check http connectivity failed, config is: %s", requestParams));
        }
    }

    @Override
    public List<String> getTables(
            @NonNull String pluginName,
            Map<String, String> requestParams,
            String database,
            Map<String, String> options) {
        throw new UnsupportedOperationException("getTables is not supported for Http datasource");
    }

    @Override
    public List<String> getDatabases(
            @NonNull String pluginName, @NonNull Map<String, String> requestParams) {
        throw new UnsupportedOperationException(
                "getDatabases is not supported for Http datasource");
    }

    @Override
    public List<TableField> getTableFields(
            @NonNull String pluginName,
            @NonNull Map<String, String> requestParams,
            @NonNull String database,
            @NonNull String table) {
        throw new UnsupportedOperationException(
                "getTableFields is not supported for Http datasource");
    }

    @Override
    public Map<String, List<TableField>> getTableFields(
            @NonNull String pluginName,
            @NonNull Map<String, String> requestParams,
            @NonNull String database,
            @NonNull List<String> tables) {
        throw new UnsupportedOperationException(
                "getTableFields is not supported for Http datasource");
    }
}
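The connectivity check above repeats the same query-string and Authorization-header handling in every method branch; a dependency-free sketch of just that shared logic (class and method names here are illustrative, not part of the plugin):

```java
// Minimal sketch of two helpers the connectivity check repeats per branch:
// appending the configured request params and normalizing the bearer token.
public class HttpRequestSketch {

    // Append request params as a query string, mirroring url + "?" + params;
    // a blank params value leaves the url untouched
    public static String buildUrl(String url, String params) {
        if (params == null || params.trim().isEmpty()) {
            return url;
        }
        return url + "?" + params;
    }

    // Build the Authorization header value; a "Bearer " prefix already present
    // in the configured token is stripped before re-prefixing, so both plain
    // and prefixed tokens yield the same header
    public static String bearerHeader(String token) {
        return "Bearer " + token.replace("Bearer ", "").trim();
    }
}
```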
seatunnel-datasource/seatunnel-datasource-plugins/datasource-http/src/main/java/org/apache/seatunnel/datasource/plugin/http/HttpOptionRule.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.http;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;
import org.apache.seatunnel.api.configuration.util.OptionRule;

public class HttpOptionRule {

    public static final Option<String> URL =
            Options.key("url")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("the http url to use for connections");

    public static final Option<String> METHOD =
            Options.key("method")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("the http request method, e.g. GET or POST");

    public static final Option<String> REQUEST_PARAMS =
            Options.key("requestparams")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("the http request params");

    public static final Option<String> TOKEN =
            Options.key("token")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("the http user token to use when connecting");

    public static OptionRule optionRule() {
        return OptionRule.builder().required(URL, METHOD).optional(TOKEN, REQUEST_PARAMS).build();
    }

    public static OptionRule metadataRule() {
        return null;
    }

    public enum FileFormat {
        JSON("json"),
        ;

        private final String type;

        FileFormat(String type) {
            this.type = type;
        }
    }
}
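Under the rule above, the flat request-params map for this datasource carries the four option keys; a sketch of what a caller might supply (the URL and token values are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative request-params map using the keys declared in HttpOptionRule
public class HttpParamsExample {

    public static Map<String, String> sampleParams() {
        Map<String, String> params = new HashMap<>();
        params.put("url", "http://localhost:8080/api/health"); // required
        params.put("method", "GET"); // required
        params.put("token", "abc123"); // optional
        params.put("requestparams", "key=value"); // optional
        return params;
    }
}
```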
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-access/pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.seatunnel</groupId>
        <artifactId>seatunnel-datasource-plugins</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>

    <artifactId>datasource-jdbc-Access</artifactId>

    <properties>
        <mysql-connector.version>8.0.28</mysql-connector.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>datasource-plugins-api</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.google.auto.service/auto-service -->
        <dependency>
            <groupId>com.google.auto.service</groupId>
            <artifactId>auto-service</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>seatunnel-api</artifactId>
            <scope>provided</scope>
        </dependency>
        <!-- driver -->
        <dependency>
            <groupId>net.sf.ucanaccess</groupId>
            <artifactId>ucanaccess</artifactId>
            <version>5.0.1</version>
        </dependency>
    </dependencies>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-access/src/main/java/org/apache/seatunnel/datasource/plugin/access/jdbc/AccessDataSourceConfig.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.access.jdbc;

import org.apache.seatunnel.api.configuration.util.OptionRule;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import com.google.common.collect.Sets;

import java.util.Set;

public class AccessDataSourceConfig {

    public static final String PLUGIN_NAME = "JDBC-Access";

    public static final DataSourcePluginInfo MYSQL_DATASOURCE_PLUGIN_INFO =
            DataSourcePluginInfo.builder()
                    .name(PLUGIN_NAME)
                    .icon(PLUGIN_NAME)
                    .version("1.0.0")
                    .type(DatasourcePluginTypeEnum.DATABASE.getCode())
                    .build();

    public static final Set<String> MYSQL_SYSTEM_DATABASES = Sets.newHashSet("SYSTEM", "ROLL");

    public static final OptionRule OPTION_RULE =
            OptionRule.builder()
                    .required(AccessOptionRule.URL, AccessOptionRule.DRIVER)
                    .optional(AccessOptionRule.USER, AccessOptionRule.PASSWORD)
                    .build();

    // public static final OptionRule METADATA_RULE =
    //         OptionRule.builder().required(org.apache.seatunnel.datasource.plugin.demeng.jdbc.DemengOptionRule.DATABASE, org.apache.seatunnel.datasource.plugin.demeng.jdbc.DemengOptionRule.TABLE).build();
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-access/src/main/java/org/apache/seatunnel/datasource/plugin/access/jdbc/AccessJdbcDataSourceChannel.java
0 → 100644
差异被折叠。
点击展开。
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-access/src/main/java/org/apache/seatunnel/datasource/plugin/access/jdbc/AccessJdbcDataSourceFactory.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.access.jdbc;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import lombok.extern.slf4j.Slf4j;

import java.util.Set;

@Slf4j
@AutoService(DataSourceFactory.class)
public class AccessJdbcDataSourceFactory implements DataSourceFactory {

    @Override
    public String factoryIdentifier() {
        return AccessDataSourceConfig.PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        return Sets.newHashSet(AccessDataSourceConfig.MYSQL_DATASOURCE_PLUGIN_INFO);
    }

    @Override
    public DataSourceChannel createChannel() {
        return new AccessJdbcDataSourceChannel();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-access/src/main/java/org/apache/seatunnel/datasource/plugin/access/jdbc/AccessOptionRule.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.access.jdbc;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;

public class AccessOptionRule {

    public static final Option<String> URL =
            Options.key("url")
                    .stringType()
                    .noDefaultValue()
                    .withDescription(
                            "jdbc url, eg:" + " http://localhost:9000/bucket/filename.mdb");

    public static final Option<String> USER =
            Options.key("user").stringType().noDefaultValue().withDescription("jdbc user");

    public static final Option<String> PASSWORD =
            Options.key("password").stringType().noDefaultValue().withDescription("jdbc password");

    // public static final Option<String> DATABASE =
    //         Options.key("database").stringType().noDefaultValue().withDescription("jdbc database");

    // public static final Option<String> TABLE =
    //         Options.key("table").stringType().noDefaultValue().withDescription("jdbc table");

    public static final Option<DriverType> DRIVER =
            Options.key("driver")
                    .enumType(DriverType.class)
                    .defaultValue(DriverType.DEMENG)
                    .withDescription("driver");

    public enum DriverType {
        DEMENG("net.ucanaccess.jdbc.UcanaccessDriver"),
        ;

        private final String driverClassName;

        DriverType(String driverClassName) {
            this.driverClassName = driverClassName;
        }

        public String getDriverClassName() {
            return driverClassName;
        }

        @Override
        public String toString() {
            return driverClassName;
        }
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-demeng/pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.seatunnel</groupId>
        <artifactId>seatunnel-datasource-plugins</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>

    <artifactId>datasource-jdbc-demeng</artifactId>

    <properties>
        <mysql-connector.version>8.0.28</mysql-connector.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>datasource-plugins-api</artifactId>
            <version>${project.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.google.auto.service/auto-service -->
        <dependency>
            <groupId>com.google.auto.service</groupId>
            <artifactId>auto-service</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.seatunnel</groupId>
            <artifactId>seatunnel-api</artifactId>
            <scope>provided</scope>
        </dependency>
        <!-- driver -->
        <dependency>
            <groupId>com.dameng</groupId>
            <artifactId>Dm8JdbcDriver18</artifactId>
            <version>8.1.1.49</version>
        </dependency>
    </dependencies>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-demeng/src/main/java/org/apache/seatunnel/datasource/plugin/demeng/jdbc/DemengDataSourceConfig.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.demeng.jdbc;

import org.apache.seatunnel.api.configuration.util.OptionRule;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import com.google.common.collect.Sets;

import java.util.Set;

public class DemengDataSourceConfig {

    public static final String PLUGIN_NAME = "JDBC-Demeng";

    public static final DataSourcePluginInfo MYSQL_DATASOURCE_PLUGIN_INFO =
            DataSourcePluginInfo.builder()
                    .name(PLUGIN_NAME)
                    .icon(PLUGIN_NAME)
                    .version("1.0.0")
                    .type(DatasourcePluginTypeEnum.DATABASE.getCode())
                    .build();

    public static final Set<String> MYSQL_SYSTEM_DATABASES = Sets.newHashSet("SYSTEM", "ROLL");

    public static final OptionRule OPTION_RULE =
            OptionRule.builder()
                    .required(DemengOptionRule.URL, DemengOptionRule.DRIVER)
                    .optional(DemengOptionRule.USER, DemengOptionRule.PASSWORD)
                    .build();

    public static final OptionRule METADATA_RULE =
            OptionRule.builder()
                    .required(DemengOptionRule.DATABASE, DemengOptionRule.TABLE)
                    .build();
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-demeng/src/main/java/org/apache/seatunnel/datasource/plugin/demeng/jdbc/DemengJdbcDataSourceChannel.java
0 → 100644
Diff collapsed.
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-demeng/src/main/java/org/apache/seatunnel/datasource/plugin/demeng/jdbc/DemengJdbcDataSourceFactory.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.demeng.jdbc;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import lombok.extern.slf4j.Slf4j;

import java.util.Set;

@Slf4j
@AutoService(DataSourceFactory.class)
public class DemengJdbcDataSourceFactory implements DataSourceFactory {

    @Override
    public String factoryIdentifier() {
        return DemengDataSourceConfig.PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        return Sets.newHashSet(DemengDataSourceConfig.MYSQL_DATASOURCE_PLUGIN_INFO);
    }

    @Override
    public DataSourceChannel createChannel() {
        return new DemengJdbcDataSourceChannel();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-jdbc-demeng/src/main/java/org/apache/seatunnel/datasource/plugin/demeng/jdbc/DemengOptionRule.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.demeng.jdbc;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;

public class DemengOptionRule {

    public static final Option<String> URL =
            Options.key("url")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("jdbc url, eg: jdbc:dm://localhost:5236");

    public static final Option<String> USER =
            Options.key("user").stringType().noDefaultValue().withDescription("jdbc user");

    public static final Option<String> PASSWORD =
            Options.key("password").stringType().noDefaultValue().withDescription("jdbc password");

    public static final Option<String> DATABASE =
            Options.key("database").stringType().noDefaultValue().withDescription("jdbc database");

    public static final Option<String> TABLE =
            Options.key("table").stringType().noDefaultValue().withDescription("jdbc table");

    public static final Option<DriverType> DRIVER =
            Options.key("driver")
                    .enumType(DriverType.class)
                    .defaultValue(DriverType.DEMENG)
                    .withDescription("driver");

    public enum DriverType {
        DEMENG("dm.jdbc.driver.DmDriver");

        private final String driverClassName;

        DriverType(String driverClassName) {
            this.driverClassName = driverClassName;
        }

        public String getDriverClassName() {
            return driverClassName;
        }

        @Override
        public String toString() {
            return driverClassName;
        }
    }
}
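The `DriverType` enum above ties the `driver` config value to Dameng's JDBC driver class via `toString()`. Below is a self-contained sketch of the same pattern; the class name `DriverTypeDemo` is hypothetical and not part of the plugin:

```java
// Sketch of an enum-backed driver option: each constant carries the fully
// qualified JDBC driver class name, and toString() returns it so the config
// layer can render the configured value directly as the driver class.
public class DriverTypeDemo {

    public enum DriverType {
        DEMENG("dm.jdbc.driver.DmDriver");

        private final String driverClassName;

        DriverType(String driverClassName) {
            this.driverClassName = driverClassName;
        }

        public String getDriverClassName() {
            return driverClassName;
        }

        @Override
        public String toString() {
            return driverClassName;
        }
    }

    public static void main(String[] args) {
        // prints dm.jdbc.driver.DmDriver
        System.out.println(DriverType.DEMENG.getDriverClassName());
    }
}
```

This keeps the user-facing config value (`DEMENG`) decoupled from the driver class string, so a driver upgrade only touches the enum.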
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project
xmlns=
"http://maven.apache.org/POM/4.0.0"
xmlns:xsi=
"http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation=
"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
>
<modelVersion>
4.0.0
</modelVersion>
<parent>
<groupId>
org.apache.seatunnel
</groupId>
<artifactId>
seatunnel-datasource-plugins
</artifactId>
<version>
1.0.0-SNAPSHOT
</version>
</parent>
<artifactId>
datasource-xml
</artifactId>
<dependencies>
<dependency>
<groupId>
org.apache.seatunnel
</groupId>
<artifactId>
datasource-plugins-api
</artifactId>
<version>
${project.version}
</version>
<scope>
provided
</scope>
</dependency>
<dependency>
<groupId>
org.apache.seatunnel
</groupId>
<artifactId>
seatunnel-hadoop3-3.1.4-uber
</artifactId>
<exclusions>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
slf4j-api
</artifactId>
</exclusion>
<exclusion>
<groupId>
ch.qos.logback
</groupId>
<artifactId>
logback-classic
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.logging.log4j
</groupId>
<artifactId>
log4j-slf4j-impl
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.logging.log4j
</groupId>
<artifactId>
log4j-api
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
jcl-over-slf4j
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.logging.log4j
</groupId>
<artifactId>
log4j-core
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.logging.log4j
</groupId>
<artifactId>
log4j-1.2-api
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.logging.log4j
</groupId>
<artifactId>
log4j-1.2-api
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.hadoop
</groupId>
<artifactId>
hadoop-common
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.apache.hadoop
</groupId>
<artifactId>
hadoop-client
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
log4j-over-slf4j
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
slf4j-reload4j
</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>
io.quarkiverse.minio
</groupId>
<artifactId>
minio-client
</artifactId>
<version>
0.2.0
</version>
</dependency>
<!-- <dependency>-->
<!-- <groupId>org.slf4j</groupId>-->
<!-- <artifactId>slf4j-reload4j</artifactId>-->
<!-- <version>1.7.35</version>-->
<!-- <scope>test</scope>-->
<!-- </dependency>-->
<dependency>
<groupId>
org.apache.hadoop
</groupId>
<artifactId>
hadoop-common
</artifactId>
<version>
3.3.5
</version>
<exclusions>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
slf4j-reload4j
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
log4j-over-slf4j
</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>
org.apache.hadoop
</groupId>
<artifactId>
hadoop-client
</artifactId>
<version>
3.3.5
</version>
<exclusions>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
slf4j-reload4j
</artifactId>
</exclusion>
<exclusion>
<groupId>
org.slf4j
</groupId>
<artifactId>
log4j-over-slf4j
</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>
org.apache.hadoop
</groupId>
<artifactId>
hadoop-aws
</artifactId>
<version>
3.3.5
</version>
</dependency>
<dependency>
<groupId>
com.amazonaws
</groupId>
<artifactId>
aws-java-sdk-bundle
</artifactId>
</dependency>
</dependencies>
</project>
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/src/main/java/org/apache/seatunnel/datasource/plugin/xml/XMLAConfiguration.java
0 → 100644
package org.apache.seatunnel.datasource.plugin.xml;

import org.apache.seatunnel.shade.com.typesafe.config.Config;
import org.apache.seatunnel.shade.com.typesafe.config.ConfigFactory;

import org.apache.hadoop.conf.Configuration;

import lombok.extern.slf4j.Slf4j;

import java.util.Map;

@Slf4j
public class XMLAConfiguration {

    /* S3 constants */
    private static final String S3A_SCHEMA = "s3a";
    private static final String HDFS_S3N_IMPL = "org.apache.hadoop.fs.s3native.NativeS3FileSystem";
    private static final String HDFS_S3A_IMPL = "org.apache.hadoop.fs.s3a.S3AFileSystem";
    private static final String S3A_PROTOCOL = "s3a";
    private static final String DEFAULT_PROTOCOL = "s3n";
    private static final String S3_FORMAT_KEY = "fs.%s.%s";
    private static final String HDFS_IMPL_KEY = "impl";

    public static Configuration getConfiguration(Map<String, String> s3Options) {
        if (!s3Options.containsKey(XMLOptionRule.BUCKET.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource bucket is null, please check your config");
        }
        if (!s3Options.containsKey(XMLOptionRule.FS_S3A_ENDPOINT.key())) {
            throw new IllegalArgumentException(
                    "S3 datasource endpoint is null, please check your config");
        }
        String bucket = s3Options.get(XMLOptionRule.BUCKET.key());

        // Pick the FileSystem implementation from the bucket URI scheme.
        String protocol = DEFAULT_PROTOCOL;
        if (bucket.startsWith(S3A_PROTOCOL)) {
            protocol = S3A_PROTOCOL;
        }
        String fsImpl = protocol.equals(S3A_PROTOCOL) ? HDFS_S3A_IMPL : HDFS_S3N_IMPL;

        Configuration hadoopConf = new Configuration();
        // "fs.default.name" is the deprecated alias of "fs.defaultFS".
        hadoopConf.set("fs.default.name", bucket);
        hadoopConf.set(
                XMLOptionRule.FS_S3A_ENDPOINT.key(),
                s3Options.get(XMLOptionRule.FS_S3A_ENDPOINT.key()));
        hadoopConf.set(formatKey(protocol, HDFS_IMPL_KEY), fsImpl);
        if (s3Options.containsKey(XMLOptionRule.HADOOP_S3_PROPERTIES.key())) {
            Config configObject =
                    ConfigFactory.parseString(
                            s3Options.get(XMLOptionRule.HADOOP_S3_PROPERTIES.key()));
            configObject
                    .entrySet()
                    .forEach(
                            entry ->
                                    hadoopConf.set(
                                            entry.getKey(),
                                            entry.getValue().unwrapped().toString()));
        }
        if (XMLOptionRule.S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider
                .getProvider()
                .equals(s3Options.get(XMLOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()))) {
            hadoopConf.set(
                    XMLOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(XMLOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
            hadoopConf.set("fs.s3a.access.key", s3Options.get(XMLOptionRule.ACCESS_KEY.key()));
            hadoopConf.set("fs.s3a.secret.key", s3Options.get(XMLOptionRule.SECRET_KEY.key()));
        } else {
            hadoopConf.set(
                    XMLOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key(),
                    s3Options.get(XMLOptionRule.S3A_AWS_CREDENTIALS_PROVIDER.key()));
        }
        return hadoopConf;
    }

    private static String formatKey(String protocol, String key) {
        return String.format(S3_FORMAT_KEY, protocol, key);
    }
}
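`getConfiguration(...)` above derives the Hadoop `FileSystem` implementation and its config key from the bucket's URI scheme. A standalone sketch of that selection logic (`S3ConfDemo` and its helper method names are hypothetical, introduced only for illustration):

```java
// Sketch of the scheme-selection logic: buckets addressed as s3a://... get
// the S3AFileSystem implementation; anything else falls back to the native
// s3n implementation, and the impl key is formatted as "fs.<scheme>.impl".
public class S3ConfDemo {
    private static final String S3A_PROTOCOL = "s3a";
    private static final String DEFAULT_PROTOCOL = "s3n";
    private static final String HDFS_S3A_IMPL = "org.apache.hadoop.fs.s3a.S3AFileSystem";
    private static final String HDFS_S3N_IMPL = "org.apache.hadoop.fs.s3native.NativeS3FileSystem";

    static String protocolFor(String bucket) {
        return bucket.startsWith(S3A_PROTOCOL) ? S3A_PROTOCOL : DEFAULT_PROTOCOL;
    }

    static String implKeyFor(String protocol) {
        return String.format("fs.%s.%s", protocol, "impl");
    }

    static String implFor(String protocol) {
        return S3A_PROTOCOL.equals(protocol) ? HDFS_S3A_IMPL : HDFS_S3N_IMPL;
    }

    public static void main(String[] args) {
        String protocol = protocolFor("s3a://my-bucket");
        System.out.println(implKeyFor(protocol)); // prints fs.s3a.impl
        System.out.println(implFor(protocol));    // prints org.apache.hadoop.fs.s3a.S3AFileSystem
    }
}
```

Keying the impl property off the scheme means the same code path serves both `s3a://` and `s3n://` buckets without branching elsewhere.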
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/src/main/java/org/apache/seatunnel/datasource/plugin/xml/XMLClientService.java
0 → 100644
package org.apache.seatunnel.datasource.plugin.xml;

import io.minio.MinioClient;
import io.minio.errors.MinioException;

public class XMLClientService {

    private String ENDPOINT;
    private String PROVIDER;
    private String USERNAME;
    private String PASSWORD;
    private String BUCKET;
    private Integer PORT;

    private final String clientId = "Client" + (int) (Math.random() * 100000000);

    private MinioClient minioClient;

    public XMLClientService(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        this.ENDPOINT = endpoint;
        this.PROVIDER = provider;
        this.USERNAME = username;
        this.PASSWORD = password;
        this.PORT = port;
        setMinioClient(endpoint, provider, username, password, port);
    }

    public MinioClient getMinioClient() {
        return minioClient;
    }

    public void setMinioClient(
            String endpoint, String provider, String username, String password, Integer port)
            throws MinioException {
        minioClient =
                new MinioClient.Builder()
                        .endpoint(endpoint, port, false)
                        .credentials(username, password)
                        .build();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/src/main/java/org/apache/seatunnel/datasource/plugin/xml/XMLDataSourceFactory.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.xml;

import org.apache.seatunnel.datasource.plugin.api.DataSourceChannel;
import org.apache.seatunnel.datasource.plugin.api.DataSourceFactory;
import org.apache.seatunnel.datasource.plugin.api.DataSourcePluginInfo;
import org.apache.seatunnel.datasource.plugin.api.DatasourcePluginTypeEnum;

import com.google.auto.service.AutoService;
import com.google.common.collect.Sets;

import java.util.Set;

@AutoService(DataSourceFactory.class)
public class XMLDataSourceFactory implements DataSourceFactory {

    private static final String PLUGIN_NAME = "S3";

    @Override
    public String factoryIdentifier() {
        return PLUGIN_NAME;
    }

    @Override
    public Set<DataSourcePluginInfo> supportedDataSources() {
        DataSourcePluginInfo s3DatasourcePluginInfo =
                DataSourcePluginInfo.builder()
                        .name(PLUGIN_NAME)
                        .type(DatasourcePluginTypeEnum.FILE.getCode())
                        .version("1.0.0")
                        .supportVirtualTables(false)
                        .icon("S3File")
                        .build();
        return Sets.newHashSet(s3DatasourcePluginInfo);
    }

    @Override
    public DataSourceChannel createChannel() {
        return XMLDatasourceChannel.getInstance();
    }
}
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/src/main/java/org/apache/seatunnel/datasource/plugin/xml/XMLDatasourceChannel.java
0 → 100644
Diff collapsed.
seatunnel-datasource/seatunnel-datasource-plugins/datasource-xml/src/main/java/org/apache/seatunnel/datasource/plugin/xml/XMLOptionRule.java
0 → 100644
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.seatunnel.datasource.plugin.xml;

import org.apache.seatunnel.api.configuration.Option;
import org.apache.seatunnel.api.configuration.Options;
import org.apache.seatunnel.api.configuration.util.OptionRule;

import java.util.Map;

public class XMLOptionRule {

    public static final Option<String> ACCESS_KEY =
            Options.key("access_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 access key");

    public static final Option<String> SECRET_KEY =
            Options.key("secret_key")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 secret key");

    public static final Option<String> BUCKET =
            Options.key("bucket").stringType().noDefaultValue().withDescription("S3 bucket name");

    public static final Option<String> FS_S3A_ENDPOINT =
            Options.key("fs.s3a.endpoint")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("fs s3a endpoint");

    public static final Option<S3aAwsCredentialsProvider> S3A_AWS_CREDENTIALS_PROVIDER =
            Options.key("fs.s3a.aws.credentials.provider")
                    .enumType(S3aAwsCredentialsProvider.class)
                    .defaultValue(S3aAwsCredentialsProvider.InstanceProfileCredentialsProvider)
                    .withDescription("s3a aws credentials provider");

    public static final Option<Map<String, String>> HADOOP_S3_PROPERTIES =
            Options.key("hadoop_s3_properties")
                    .mapType()
                    .noDefaultValue()
                    .withDescription(
                            "{\n"
                                    + "fs.s3a.buffer.dir=/data/st_test/s3a\n"
                                    + "fs.s3a.fast.upload.buffer=disk\n"
                                    + "}");

    public static OptionRule optionRule() {
        return OptionRule.builder()
                .required(BUCKET, FS_S3A_ENDPOINT, S3A_AWS_CREDENTIALS_PROVIDER)
                .optional(HADOOP_S3_PROPERTIES)
                .conditional(
                        S3A_AWS_CREDENTIALS_PROVIDER,
                        S3aAwsCredentialsProvider.SimpleAWSCredentialsProvider,
                        ACCESS_KEY,
                        SECRET_KEY)
                .build();
    }

    public static final Option<String> PATH =
            Options.key("path").stringType().noDefaultValue().withDescription("S3 write path");

    public static final Option<String> TYPE =
            Options.key("file_format_type")
                    .stringType()
                    .defaultValue("xml")
                    .withDescription("S3 write type");

    public static final Option<String> DELIMITER =
            Options.key("delimiter")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write delimiter");

    public static final Option<Map<String, String>> SCHEMA =
            Options.key("schema").mapType().noDefaultValue().withDescription("SeaTunnel Schema");

    public static final Option<Boolean> PARSE_PARTITION_FROM_PATH =
            Options.key("parse_partition_from_path")
                    .booleanType()
                    .noDefaultValue()
                    .withDescription("S3 write parse_partition_from_path");

    public static final Option<String> DATE_FORMAT =
            Options.key("date_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write date_format");

    public static final Option<String> DATETIME_FORMAT =
            Options.key("datetime_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write datetime_format");

    public static final Option<String> TIME_FORMAT =
            Options.key("time_format")
                    .stringType()
                    .noDefaultValue()
                    .withDescription("S3 write time_format");

    public static OptionRule metadataRule() {
        return OptionRule.builder()
                .required(PATH, TYPE)
                .conditional(TYPE, FileFormat.XML.type, DELIMITER)
                .conditional(TYPE, FileFormat.XML.type, SCHEMA)
                .optional(PARSE_PARTITION_FROM_PATH)
                .optional(DATE_FORMAT)
                .optional(DATETIME_FORMAT)
                .optional(TIME_FORMAT)
                .build();
    }

    public enum S3aAwsCredentialsProvider {
        SimpleAWSCredentialsProvider("org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"),
        InstanceProfileCredentialsProvider("com.amazonaws.auth.InstanceProfileCredentialsProvider");

        private String provider;

        S3aAwsCredentialsProvider(String provider) {
            this.provider = provider;
        }

        public String getProvider() {
            return provider;
        }

        @Override
        public String toString() {
            return provider;
        }
    }

    public enum FileFormat {
        CSV("csv"),
        TEXT("txt"),
        PARQUET("parquet"),
        ORC("orc"),
        JSON("json"),
        XML("xml");

        private final String type;

        FileFormat(String type) {
            this.type = type;
        }
    }
}
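`metadataRule()` above marks `path` and `file_format_type` as always required, while `delimiter` and `schema` only become required when `file_format_type` is `xml`. A minimal map-based sketch of that conditional check (`MetadataRuleDemo` and `isValid` are hypothetical names, not the OptionRule API):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the conditional-requirement semantics: base keys are always
// required; format-specific keys are required only for the matching format.
public class MetadataRuleDemo {

    static boolean isValid(Map<String, String> conf) {
        // path and file_format_type are unconditionally required
        if (!conf.containsKey("path") || !conf.containsKey("file_format_type")) {
            return false;
        }
        // delimiter and schema are required only for the xml format
        if ("xml".equals(conf.get("file_format_type"))) {
            return conf.containsKey("delimiter") && conf.containsKey("schema");
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("path", "/data/in");
        conf.put("file_format_type", "xml");
        System.out.println(isValid(conf)); // prints false: delimiter and schema missing
        conf.put("delimiter", ",");
        conf.put("schema", "{}");
        System.out.println(isValid(conf)); // prints true
    }
}
```

Encoding the dependency between options in one rule keeps validation declarative: the UI can render only the fields the chosen format actually needs.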
seatunnel-datasource/seatunnel-datasource-plugins/pom.xml
...
@@ -50,7 +50,12 @@
        <module>datasource-redis</module>
        <module>datasource-rabbitmq</module>
        <module>datasource-ftp</module>
        <module>datasource-jdbc-demeng</module>
        <module>datasource-jdbc-access</module>
        <module>datasource-http</module>
        <module>datasource-xml</module>
        <module>datasource-csv</module>
        <module>datasource-excel</module>
    </modules>
    <build>
...