├── .gitignore
├── blink-job
│   ├── src
│   │   ├── test
│   │   │   └── resources
│   │   │       ├── demo.json
│   │   │       └── test-data-flat.orc
│   │   └── main
│   │       └── java
│   │           └── ambition
│   │               └── blink
│   │                   └── yarn
│   │                       └── job
│   │                           └── YarnBase.java
│   └── pom.xml
├── doc
│   ├── v3.0.0.md
│   ├── v2.0.1.md
│   ├── v2.0.0.md
│   └── v1.0.0.md
├── blink-libraries
│   ├── mysql
│   │   ├── mysql-side
│   │   │   └── pom.xml
│   │   ├── mysql-sink
│   │   │   └── pom.xml
│   │   └── pom.xml
│   └── pom.xml
├── blink-common
│   ├── src
│   │   └── main
│   │       └── java
│   │           └── ambition
│   │               └── blink
│   │                   └── common
│   │                       ├── exception
│   │                       │   ├── FlinkSqlException.java
│   │                       │   └── FlinkJobException.java
│   │                       ├── job
│   │                       │   └── JobParameter.java
│   │                       ├── api
│   │                       │   └── Result.java
│   │                       └── utils
│   │                           └── BlinkStringUtils.java
│   └── pom.xml
├── blink-client
│   ├── src
│   │   └── main
│   │       └── java
│   │           └── ambition
│   │               └── blink
│   │                   ├── sql
│   │                   │   ├── SqlConvertService.java
│   │                   │   └── SqlParserService.java
│   │                   └── job
│   │                       └── JobClient.java
│   └── pom.xml
├── blink-sql
│   ├── src
│   │   ├── main
│   │   │   └── java
│   │   │       └── ambition
│   │   │           └── blink
│   │   │               └── sql
│   │   │                   ├── SqlConstant.java
│   │   │                   ├── SqlUtils.java
│   │   │                   └── impl
│   │   │                       ├── SqlConvertServiceImpl.java
│   │   │                       └── SqlParserServiceImpl.java
│   │   └── test
│   │       └── java
│   │           └── ambition
│   │               └── blink
│   │                   └── sql
│   │                       ├── SqlUtilsTest.java
│   │                       └── impl
│   │                           ├── SqlParserServiceImplTest.java
│   │                           └── SqlConvertServiceImplTest.java
│   └── pom.xml
├── blink-stream
│   ├── src
│   │   ├── main
│   │   │   └── java
│   │   │       └── ambition
│   │   │           └── blink
│   │   │               └── stream
│   │   │                   ├── StreamTableUtils.java
│   │   │                   └── job
│   │   │                       └── impl
│   │   │                           └── StreamJobClientImpl.java
│   │   └── test
│   │       └── java
│   │           └── ambition
│   │               └── blink
│   │                   └── stream
│   │                       └── sql
│   │                           └── SqlContent.java
│   └── pom.xml
├── README.md
├── LICENSE
└── pom.xml
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea
2 | target
3 |
--------------------------------------------------------------------------------
/blink-job/src/test/resources/demo.json:
--------------------------------------------------------------------------------
1 | {"id":1001,"name":"zs","date":"2010-10-08"}
2 | {"id":1002,"name":"ls","date":"2019-10-08","age":24}
3 |
--------------------------------------------------------------------------------
/blink-job/src/test/resources/test-data-flat.orc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ambition119/FlinkSQL/HEAD/blink-job/src/test/resources/test-data-flat.orc
--------------------------------------------------------------------------------
/doc/v3.0.0.md:
--------------------------------------------------------------------------------
1 | 1. Built on Flink 1.10.
2 | Before 1.10, Flink's built-in SQL parsing was incomplete (it could not parse functions, watermarks, etc.), so it added little value and was not worth swapping in for the previously developed parsing layer.
3 |
4 |
5 | 2. SqlParserServiceImpl
6 |
7 |
8 | 3. JobClientImpl provides JobGraph generation.
9 |
10 | 4. Job submission; see the examples under the test directory.
11 |
12 | 5. WeChat / DingTalk alerting.
13 |
--------------------------------------------------------------------------------
/doc/v2.0.1.md:
--------------------------------------------------------------------------------
1 | 1. In v2.0.1 the SQL syntax uses the standard CREATE TABLE form; the WITH clause adds the type and connect.type parameters to mark the table type and the connector type.
2 |
3 |
4 | 2. SqlServiceImpl provides SQL parsing.
5 |
6 |
7 | 3. BatchJobClientImpl provides JobGraph and Plan generation.
8 |
9 |
10 | 4. StreamJobClientImpl provides StreamGraph generation.
11 |
12 |
13 | 5. Job submission; see the examples under the test directory.
14 |
--------------------------------------------------------------------------------
/doc/v2.0.0.md:
--------------------------------------------------------------------------------
1 | 1. The SQL syntax follows the Flink issue and its accompanying doc:
2 | [SQL DDL](https://issues.apache.org/jira/browse/FLINK-8039),
3 | [SQL DDL DOC](https://docs.google.com/document/d/1TTP-GCC8wSsibJaSUyFZ_5NBAHYEB1FVmPpP7RgDGBA/edit?usp=sharing).
4 |
5 |
6 | 2. SqlServiceImpl provides SQL parsing.
7 |
8 |
9 | 3. BatchJobClientImpl provides JobGraph and Plan generation.
10 |
11 |
12 | 4. StreamJobClientImpl provides StreamGraph generation.
13 |
14 |
15 | 5. Job submission; see the examples under the test directory.
16 |
16 |
--------------------------------------------------------------------------------
/blink-libraries/mysql/mysql-side/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>mysql</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>mysql-side</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 | </project>
16 |
--------------------------------------------------------------------------------
/blink-libraries/mysql/mysql-sink/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>mysql</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>mysql-sink</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 | </project>
16 |
--------------------------------------------------------------------------------
/doc/v1.0.0.md:
--------------------------------------------------------------------------------
1 | 1. SqlConvertService classifies the user's SQL statements by type: source, sink, view, dml (currently only INSERT INTO).
2 |
3 | SqlParserImpl: SQL parsing
4 |
5 | Validator: validation
6 |
7 | Planner: plan resolution
8 |
9 | 2. FlinkJobImpl implements the Flink sources and sinks and builds the JobGraph.
10 |
11 | 3. JobGraph submission and execution, e.g. StandaloneClusterClient.submitJob, or remote YARN submission via YarnClusterClient.runDetached.
12 |
13 | 4. CREATE FUNCTION parsing is implemented by modifying the Calcite source; see [CALCITE-2663](https://issues.apache.org/jira/browse/CALCITE-2663).
14 |
15 | 5. The SQL CEP implementation follows the modified source under [flink-oraginalCode](https://github.com/ambition119/FlinkSQL/commit/32d7ed123f8d3f0669b430f68a14852ff3819bca).
16 |
--------------------------------------------------------------------------------
/blink-libraries/mysql/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink-libraries</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>mysql</artifactId>
13 |   <packaging>pom</packaging>
14 |   <version>v3.0.0</version>
15 |   <modules>
16 |     <module>mysql-side</module>
17 |     <module>mysql-sink</module>
18 |   </modules>
19 |
20 | </project>
21 |
--------------------------------------------------------------------------------
/blink-common/src/main/java/ambition/blink/common/exception/FlinkSqlException.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.common.exception;
20 |
21 | public class FlinkSqlException extends Exception {
22 | public FlinkSqlException() {
23 | super();
24 | }
25 |
26 | public FlinkSqlException(String message) {
27 | super(message);
28 | }
29 | }
30 |
--------------------------------------------------------------------------------
/blink-common/src/main/java/ambition/blink/common/exception/FlinkJobException.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.common.exception;
20 |
21 | public class FlinkJobException extends Exception {
22 |
23 | public FlinkJobException() {
24 | super();
25 | }
26 |
27 | public FlinkJobException(String message) {
28 | super(message);
29 | }
30 | }
31 |
--------------------------------------------------------------------------------
/blink-client/src/main/java/ambition/blink/sql/SqlConvertService.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql;
20 |
21 | import java.util.List;
22 | import java.util.Map;
23 |
24 | public interface SqlConvertService {
25 | /**
26 |  * @param sqlContext the full SQL script (one or more statements)
27 |  * @return the statements grouped by category
28 |  * @throws Exception
29 |  */
30 | Map<String, List<String>> sqlConvert(String sqlContext) throws Exception;
31 | }
32 |
--------------------------------------------------------------------------------
/blink-client/src/main/java/ambition/blink/job/JobClient.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.job;
20 |
21 | import org.apache.flink.api.common.Plan;
22 | import org.apache.flink.runtime.jobgraph.JobGraph;
23 | import ambition.blink.common.job.JobParameter;
24 |
25 | import java.util.Map;
26 |
27 | public interface JobClient {
28 |
29 | Plan getJobPlan(JobParameter jobParameter, Map<String, String> extParams) throws Exception;
30 |
31 | JobGraph getJobGraph(JobParameter jobParameter, Map<String, String> extParams) throws Exception;
32 | }
33 |
--------------------------------------------------------------------------------
/blink-sql/src/main/java/ambition/blink/sql/SqlConstant.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql;
20 |
21 | public interface SqlConstant {
22 | String CREATE_TABLE="(?i)^CREATE\\s+TABLE";
23 |
24 | String CREATE_TMP_FUNCTION="(?i)^CREATE\\s+TEMPORARY\\s+FUNCTION";
25 | String CREATE_FUNCTION="(?i)^CREATE\\s+FUNCTION";
26 |
27 | String CREATE_VIEW="(?i)^CREATE\\s+VIEW";
28 |
29 | String INSERT_INTO="(?i)^INSERT\\s+INTO";
30 |
31 | String FUNCTION="FUNCTION";
32 | String TABLE ="TABLE";
33 | String VIEW ="VIEW";
34 | String INSERT ="INSERT";
35 | String SQL_END_FLAG=";";
36 | }
37 |
--------------------------------------------------------------------------------
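The regex constants in SqlConstant.java above are evidently used to bucket statements by kind before parsing. As a rough, self-contained sketch of that classification step (the `SqlClassifier` class and its `classify` method are hypothetical illustrations, not part of this repository; the real splitting logic lives in the SqlConvertServiceImpl shown in the tree):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

public class SqlClassifier {
    // Same patterns as SqlConstant; TEMPORARY is folded into one FUNCTION pattern here.
    static final Pattern TABLE = Pattern.compile("(?i)^CREATE\\s+TABLE");
    static final Pattern FUNCTION = Pattern.compile("(?i)^CREATE\\s+(TEMPORARY\\s+)?FUNCTION");
    static final Pattern VIEW = Pattern.compile("(?i)^CREATE\\s+VIEW");
    static final Pattern INSERT = Pattern.compile("(?i)^INSERT\\s+INTO");

    // Split the script on ';' (SQL_END_FLAG) and bucket each statement by the first matching pattern.
    public static Map<String, List<String>> classify(String script) {
        Map<String, List<String>> buckets = new LinkedHashMap<>();
        for (String raw : script.split(";")) {
            String stmt = raw.trim();
            if (stmt.isEmpty()) {
                continue;
            }
            String key = TABLE.matcher(stmt).find() ? "TABLE"
                    : FUNCTION.matcher(stmt).find() ? "FUNCTION"
                    : VIEW.matcher(stmt).find() ? "VIEW"
                    : INSERT.matcher(stmt).find() ? "INSERT"
                    : "OTHER";
            buckets.computeIfAbsent(key, k -> new ArrayList<>()).add(stmt);
        }
        return buckets;
    }

    public static void main(String[] args) {
        System.out.println(classify("CREATE TABLE t (a INT); INSERT INTO t SELECT 1;"));
    }
}
```

Note that the naive `split(";")` would break on semicolons inside string literals; a production splitter would have to track quoting.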
/blink-common/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>blink-common</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 | </project>
16 |
--------------------------------------------------------------------------------
/blink-libraries/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>blink-libraries</artifactId>
13 |   <packaging>pom</packaging>
14 |   <version>v3.0.0</version>
15 |   <modules>
16 |     <module>mysql</module>
17 |   </modules>
18 | </project>
19 |
--------------------------------------------------------------------------------
/blink-common/src/main/java/ambition/blink/common/job/JobParameter.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.common.job;
20 |
21 | import java.util.List;
22 | import java.util.Map;
23 |
24 | public class JobParameter {
25 | private String jobName;
26 | private Map<String, List<String>> sqls;
27 |
28 | public JobParameter() {
29 | }
30 |
31 | public JobParameter(String jobName, Map<String, List<String>> sqls) {
32 | this.jobName = jobName;
33 | this.sqls = sqls;
34 | }
35 |
36 | public String getJobName() {
37 | return jobName;
38 | }
39 |
40 | public void setJobName(String jobName) {
41 | this.jobName = jobName;
42 | }
43 |
44 | public Map<String, List<String>> getSqls() {
45 | return sqls;
46 | }
47 |
48 | public void setSqls(Map<String, List<String>> sqls) {
49 | this.sqls = sqls;
50 | }
51 | }
52 |
--------------------------------------------------------------------------------
/blink-common/src/main/java/ambition/blink/common/api/Result.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.common.api;
20 |
21 | public class Result<T> {
22 | private String message;
23 | private String code;
24 | private T result;
25 |
26 | public Result() {
27 | }
28 |
29 | public Result(String message, String code, T result) {
30 | this.message = message;
31 | this.code = code;
32 | this.result = result;
33 | }
34 |
35 | public T getResult() {
36 | return result;
37 | }
38 |
39 | public void setResult(T result) {
40 | this.result = result;
41 | }
42 |
43 | public String getMessage() {
44 | return message;
45 | }
46 |
47 | public void setMessage(String message) {
48 | this.message = message;
49 | }
50 |
51 | public String getCode() {
52 | return code;
53 | }
54 |
55 | public void setCode(String code) {
56 | this.code = code;
57 | }
58 | }
59 |
--------------------------------------------------------------------------------
/blink-stream/src/main/java/ambition/blink/stream/StreamTableUtils.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.stream;
20 |
21 | import java.util.Map;
22 | import org.apache.calcite.sql.SqlNode;
23 | import org.apache.calcite.sql.SqlNodeList;
24 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
25 | import org.apache.flink.table.sinks.TableSink;
26 | import org.apache.flink.table.sources.TableSource;
27 |
28 | public class StreamTableUtils {
29 | public static TableSource<?> getTableSource(SqlCreateTable sqlCreateTable, Map<String, String> extParams) {
30 | TableSource<?> result = null;
31 |
32 | SqlNodeList columnList = sqlCreateTable.getColumnList();
33 | SqlNodeList propertyList = sqlCreateTable.getPropertyList();
34 |
35 | // TODO: map the WITH properties onto a concrete TableSource implementation
36 | for (SqlNode prop : propertyList) {
37 | }
38 |
39 | return result;
40 | }
41 |
42 | public static TableSink<?> getTableSink(SqlCreateTable sqlCreateTable, Map<String, String> extParams) {
43 | TableSink<?> result = null;
44 |
45 | // TODO: map the WITH properties onto a concrete TableSink implementation
46 | return result;
47 | }
48 | }
49 |
--------------------------------------------------------------------------------
/blink-client/src/main/java/ambition/blink/sql/SqlParserService.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql;
20 |
21 | import java.util.List;
22 | import org.apache.flink.sql.parser.ddl.SqlCreateFunction;
23 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
24 | import org.apache.flink.sql.parser.ddl.SqlCreateView;
25 | import org.apache.flink.sql.parser.dml.RichSqlInsert;
26 |
27 | /**
28 | * @Date: 2020/1/2
29 | */
30 | public interface SqlParserService {
31 | /**
32 |  * Parses CREATE FUNCTION statements.
33 |  * @param funSql
34 |  * @return
35 |  * @throws Exception
36 |  */
37 | List<SqlCreateFunction> sqlFunctionParser(String funSql) throws Exception;
38 |
39 | /**
40 |  * Parses DDL (CREATE TABLE) statements.
41 |  * @param ddlSql
42 |  * @return
43 |  * @throws Exception
44 |  */
45 | List<SqlCreateTable> sqlTableParser(String ddlSql) throws Exception;
46 |
47 | /**
48 |  * Parses the CREATE VIEW statements of the script.
49 |  * @param viewSql
50 |  * @return the parsed view statements
51 |  * @throws Exception
52 |  */
53 | List<SqlCreateView> sqlViewParser(String viewSql) throws Exception;
54 |
55 | /**
56 |  * Parses the INSERT INTO statements of the script.
57 |  * @param insertSql
58 |  * @return
59 |  * @throws Exception
60 |  */
61 | List<RichSqlInsert> sqlDmlParser(String insertSql) throws Exception;
62 | }
63 |
--------------------------------------------------------------------------------
/blink-client/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>blink-client</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 |   <dependencies>
16 |     <dependency>
17 |       <groupId>ambition</groupId>
18 |       <artifactId>blink-common</artifactId>
19 |       <version>${project.version}</version>
20 |     </dependency>
21 |     <dependency>
22 |       <groupId>org.apache.calcite</groupId>
23 |       <artifactId>calcite-core</artifactId>
24 |     </dependency>
25 |     <dependency>
26 |       <groupId>org.apache.flink</groupId>
27 |       <artifactId>flink-runtime_${scala.binary.version}</artifactId>
28 |     </dependency>
29 |     <dependency>
30 |       <groupId>org.apache.flink</groupId>
31 |       <artifactId>flink-sql-parser</artifactId>
32 |     </dependency>
33 |   </dependencies>
34 | </project>
35 |
--------------------------------------------------------------------------------
/blink-job/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>blink-job</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 |   <dependencies>
16 |     <dependency>
17 |       <groupId>org.apache.flink</groupId>
18 |       <artifactId>flink-core</artifactId>
19 |     </dependency>
20 |     <dependency>
21 |       <groupId>org.apache.flink</groupId>
22 |       <artifactId>flink-runtime_${scala.binary.version}</artifactId>
23 |     </dependency>
24 |     <dependency>
25 |       <groupId>org.apache.flink</groupId>
26 |       <artifactId>flink-optimizer_${scala.binary.version}</artifactId>
27 |     </dependency>
28 |     <dependency>
29 |       <groupId>org.apache.flink</groupId>
30 |       <artifactId>flink-java</artifactId>
31 |     </dependency>
32 |     <dependency>
33 |       <groupId>org.apache.flink</groupId>
34 |       <artifactId>flink-yarn_${scala.binary.version}</artifactId>
35 |     </dependency>
36 |     <dependency>
37 |       <groupId>org.apache.flink</groupId>
38 |       <artifactId>flink-clients_${scala.binary.version}</artifactId>
39 |     </dependency>
40 |   </dependencies>
41 | </project>
42 |
--------------------------------------------------------------------------------
/blink-sql/src/main/java/ambition/blink/sql/SqlUtils.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql;
20 |
21 | import java.io.ByteArrayInputStream;
22 | import java.io.InputStream;
23 | import java.io.InputStreamReader;
24 | import java.io.Reader;
25 | import java.nio.charset.Charset;
26 | import org.apache.calcite.avatica.util.Casing;
27 | import org.apache.calcite.avatica.util.Quoting;
28 | import org.apache.calcite.sql.SqlNodeList;
29 | import org.apache.calcite.sql.parser.SqlParser;
30 | import org.apache.flink.sql.parser.impl.FlinkSqlParserImpl;
31 | import org.apache.flink.sql.parser.validate.FlinkSqlConformance;
32 |
33 | /**
34 | * @Date: 2020/1/2
35 | */
36 | public class SqlUtils {
37 |
38 | public static SqlNodeList parseSql(String sql) throws Exception {
39 | InputStream stream = new ByteArrayInputStream(sql.getBytes());
40 | SqlParser sqlParser = getSqlParser(stream);
41 |
42 |     // parseStmt() handles only a single statement such as a query; feeding it DDL fails with Encountered ""
43 |     // SqlNode sqlNode = sqlParser.parseStmt();
44 |     // parseStmtList() accepts a wider range of statements
45 | SqlNodeList sqlNodeList = sqlParser.parseStmtList();
46 | return sqlNodeList;
47 | }
48 |
49 | public static SqlParser getSqlParser(InputStream stream) {
50 |     Reader source = new InputStreamReader(stream, Charset.defaultCharset());
51 |
52 |     // FlinkSqlConformance.HIVE: use this SQL conformance when the DDL involves partitions
53 |
54 | return SqlParser.create(source,
55 | SqlParser.configBuilder()
56 | .setParserFactory(FlinkSqlParserImpl.FACTORY)
57 | .setQuoting(Quoting.BACK_TICK)
58 | .setUnquotedCasing(Casing.UNCHANGED)
59 | .setQuotedCasing(Casing.UNCHANGED)
60 | .setConformance(FlinkSqlConformance.HIVE)
61 | .build());
62 | }
63 |
64 | }
65 |
--------------------------------------------------------------------------------
/blink-sql/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <project xmlns="http://maven.apache.org/POM/4.0.0"
3 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
4 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
5 |   <parent>
6 |     <artifactId>blink</artifactId>
7 |     <groupId>ambition</groupId>
8 |     <version>v3.0.0</version>
9 |   </parent>
10 |   <modelVersion>4.0.0</modelVersion>
11 |
12 |   <artifactId>blink-sql</artifactId>
13 |   <version>v3.0.0</version>
14 |
15 |   <dependencies>
16 |     <dependency>
17 |       <groupId>ambition</groupId>
18 |       <artifactId>blink-client</artifactId>
19 |       <version>${project.version}</version>
20 |     </dependency>
21 |     <dependency>
22 |       <groupId>ambition</groupId>
23 |       <artifactId>blink-common</artifactId>
24 |       <version>${project.version}</version>
25 |     </dependency>
26 |     <dependency>
27 |       <groupId>org.apache.commons</groupId>
28 |       <artifactId>commons-lang3</artifactId>
29 |     </dependency>
30 |     <dependency>
31 |       <groupId>commons-collections</groupId>
32 |       <artifactId>commons-collections</artifactId>
33 |     </dependency>
34 |     <dependency>
35 |       <groupId>org.apache.calcite</groupId>
36 |       <artifactId>calcite-core</artifactId>
37 |     </dependency>
38 |     <dependency>
39 |       <groupId>org.apache.flink</groupId>
40 |       <artifactId>flink-sql-parser</artifactId>
41 |     </dependency>
42 |     <dependency>
43 |       <groupId>junit</groupId>
44 |       <artifactId>junit</artifactId>
45 |       <scope>test</scope>
46 |     </dependency>
47 |   </dependencies>
48 | </project>
49 |
--------------------------------------------------------------------------------
/blink-stream/src/test/java/ambition/blink/stream/sql/SqlContent.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.stream.sql;
20 |
21 | public interface SqlContent {
22 | String sqls = "CREATE FUNCTION demouf AS \n"
23 | + " 'ambition.api.sql.function.DemoUDF' \n"
24 | + "LIBRARY 'hdfs://flink/udf/jedis.jar','hdfs://flink/udf/customudf.jar';\n"
25 | + " \n"
26 | + "CREATE SOURCE TABLE kafka_source (\n"
27 | + " `date` varchar,\n"
28 | + " amount float, \n"
29 | + " proctime timestamp\n"
30 | + " ) \n"
31 | + "with (\n"
32 | + " type=kafka,\n"
33 | + " 'flink.parallelism'=1,\n"
34 | + " 'kafka.topic'=topic,\n"
35 | + " 'kafka.group.id'=flinks,\n"
36 | + " 'kafka.enable.auto.commit'=true,\n"
37 | + " 'kafka.bootstrap.servers'='localhost:9092'\n"
38 | + ");\n"
39 | + "\n"
40 | + "CREATE SINK TABLE mysql_sink (\n"
41 | + " `date` varchar, \n"
42 | + " amount float, \n"
43 | + " PRIMARY KEY (`date`)\n"
44 | + " ) \n"
45 | + "with (\n"
46 | + " type=mysql,\n"
47 | + " 'mysql.connection'='localhost:3306',\n"
48 | + " 'mysql.db.name'=flink,\n"
49 | + " 'mysql.batch.size'=0,\n"
50 | + " 'mysql.table.name'=flink_table,\n"
51 | + " 'mysql.user'=root,\n"
52 | + " 'mysql.pass'=root\n"
53 | + ");\n"
54 | + "\n"
55 | + "CREATE VIEW view_select AS \n"
56 | + " SELECT `date`, \n"
57 | + " amount \n"
58 | + " FROM kafka_source \n"
59 | + " GROUP BY \n"
60 | + " `date`,\n"
61 | + " amount\n"
62 | + " ;\n"
63 | + "\n"
64 | + "\n"
65 | + "INSERT INTO mysql_sink \n"
66 | + " SELECT \n"
67 | + " `date`, \n"
68 | + " sum(amount) \n"
69 | + " FROM view_select \n"
70 | + " GROUP BY \n"
71 | + " `date`\n"
72 | + " ;";
73 | }
74 |
--------------------------------------------------------------------------------
/blink-sql/src/test/java/ambition/blink/sql/SqlUtilsTest.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql;
20 |
21 | import org.apache.calcite.sql.SqlNode;
22 | import org.apache.calcite.sql.SqlNodeList;
23 | import org.junit.Assert;
24 | import org.junit.Test;
25 |
26 | /**
27 | * @Date: 2020/1/2
28 | */
29 | public class SqlUtilsTest {
30 |
31 | @Test
32 | public void run(){
33 | System.out.println(System.currentTimeMillis());
34 | }
35 |
36 | @Test
37 | public void parseSqlTest() throws Exception {
38 | String table = "CREATE TABLE tbl1 (\n" +
39 | " a bigint,\n" +
40 | " h varchar, \n" +
41 | " g as 2 * (a + 1), \n" +
42 | " ts as toTimestamp(b, 'yyyy-MM-dd HH:mm:ss'), \n" +
43 | " b varchar,\n" +
44 | " proc as PROCTIME(), \n" +
45 | " PRIMARY KEY (a, b)\n" +
46 | ")\n" +
47 | "with (\n" +
48 | " 'connector' = 'kafka', \n" +
49 | " 'kafka.topic' = 'log.test'\n" +
50 | ")\n";
51 | SqlNodeList nodeList = SqlUtils.parseSql(table);
52 | for(SqlNode sqlNode : nodeList){
53 | Assert.assertNotNull(sqlNode);
54 | }
55 |
56 | String watermark = "CREATE TABLE tbl1 (\n" +
57 | " ts timestamp(3),\n" +
58 | " id varchar, \n" +
59 | " watermark FOR ts AS ts - interval '3' second\n" +
60 | ")\n" +
61 | " with (\n" +
62 | " 'connector' = 'kafka', \n" +
63 | " 'kafka.topic' = 'log.test'\n" +
64 | ")\n";
65 | SqlNodeList nodeList1 = SqlUtils.parseSql(watermark);
66 | for(SqlNode sqlNode : nodeList1){
67 | Assert.assertNotNull(sqlNode);
68 | }
69 |
70 | String fun = "create function function1 as 'org.apache.fink.function.function1' language java";
71 | SqlNodeList nodeList2 = SqlUtils.parseSql(fun);
72 | for(SqlNode sqlNode : nodeList2){
73 | Assert.assertNotNull(sqlNode);
74 | }
75 |
76 | String insert = "insert into emp(x,y) partition (x='ab', y='bc') select * from emp";
77 | SqlNodeList nodeList3 = SqlUtils.parseSql(insert);
78 | for(SqlNode sqlNode : nodeList3){
79 | Assert.assertNotNull(sqlNode);
80 | }
81 | }
82 | }
83 |
--------------------------------------------------------------------------------
/blink-sql/src/test/java/ambition/blink/sql/impl/SqlParserServiceImplTest.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql.impl;
20 |
21 | import ambition.blink.sql.SqlParserService;
22 | import java.util.List;
23 | import org.apache.flink.sql.parser.ddl.SqlCreateFunction;
24 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
25 | import org.apache.flink.sql.parser.ddl.SqlCreateView;
26 | import org.apache.flink.sql.parser.dml.RichSqlInsert;
27 | import org.junit.Assert;
28 | import org.junit.Test;
29 |
30 | /**
31 | * @Date: 2020/1/2
32 | */
33 | public class SqlParserServiceImplTest {
34 | private SqlParserService service = new SqlParserServiceImpl();
35 |
36 | @Test
37 | public void sqlFunctionParserTest() throws Exception {
38 |     String fun = "create temporary function function1 as 'org.apache.flink.function.function1' language java";
39 |     List<SqlCreateFunction> list = service.sqlFunctionParser(fun);
40 |     Assert.assertEquals(1, list.size());
41 | }
42 |
43 | @Test
44 | public void sqlTableParserTest() throws Exception {
45 | String table = "CREATE TABLE tbl1 (\n" +
46 | " `a` bigint,\n" +
47 | " `date` date,\n" +
48 | " h varchar, \n" +
49 | " g as 2 * (a + 1), \n" +
50 | " ts as toTimestamp(b, 'yyyy-MM-dd HH:mm:ss'), \n" +
51 | " b varchar,\n" +
52 | " proc as PROCTIME(), \n" +
53 | " PRIMARY KEY (a, b)\n" +
54 | ")\n" +
55 | "with (\n" +
56 | " 'connector' = 'kafka', \n" +
57 | " 'kafka.topic' = 'log.test'\n" +
58 | ")\n";
59 |     List<SqlCreateTable> list = service.sqlTableParser(table);
60 |     Assert.assertEquals(1, list.size());
61 | }
62 |
63 | @Test
64 | public void sqlViewParserTest() throws Exception {
65 | String view = "create view view_select as " +
66 | "SELECT " +
67 | "`date`, " +
68 | "amount " +
69 | "FROM " +
70 | "kafak_source " +
71 | "group by `date`,amount;";
72 |
73 |     List<SqlCreateView> list = service.sqlViewParser(view);
74 |     Assert.assertEquals(1, list.size());
75 | }
76 |
77 | @Test
78 | public void sqlDmlParserTest() throws Exception {
79 | String insert = "insert into emp(x,y) partition (x='ab', y='bc') select * from emp";
80 |     List<RichSqlInsert> list = service.sqlDmlParser(insert);
81 |     Assert.assertEquals(1, list.size());
82 | }
83 |
84 | }
85 |
--------------------------------------------------------------------------------
/blink-stream/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 | <!--
3 | Licensed to the Apache Software Foundation (ASF) under one
4 | or more contributor license agreements.  See the NOTICE file
5 | distributed with this work for additional information
6 | regarding copyright ownership.  The ASF licenses this file
7 | to you under the Apache License, Version 2.0 (the
8 | "License"); you may not use this file except in compliance
9 | with the License.  You may obtain a copy of the License at
10 |
11 |   http://www.apache.org/licenses/LICENSE-2.0
12 |
13 | Unless required by applicable law or agreed to in writing, software
14 | distributed under the License is distributed on an "AS IS" BASIS,
15 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
16 | See the License for the specific language governing permissions and
17 | limitations under the License.
18 | -->
19 | <project xmlns="http://maven.apache.org/POM/4.0.0"
20 |          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
21 |          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
22 |   <parent>
23 |     <artifactId>blink</artifactId>
24 |     <groupId>ambition</groupId>
25 |     <version>v3.0.0</version>
26 |   </parent>
27 |   <modelVersion>4.0.0</modelVersion>
28 |
29 |   <artifactId>blink-stream</artifactId>
30 |   <version>v3.0.0</version>
31 |
32 |   <dependencies>
33 |     <dependency>
34 |       <groupId>junit</groupId>
35 |       <artifactId>junit</artifactId>
36 |       <scope>test</scope>
37 |     </dependency>
38 |
39 |     <dependency>
40 |       <groupId>ambition</groupId>
41 |       <artifactId>blink-client</artifactId>
42 |       <version>${project.version}</version>
43 |     </dependency>
44 |
45 |     <dependency>
46 |       <groupId>ambition</groupId>
47 |       <artifactId>blink-sql</artifactId>
48 |       <version>${project.version}</version>
49 |     </dependency>
50 |
51 |     <dependency>
52 |       <groupId>org.apache.flink</groupId>
53 |       <artifactId>flink-core</artifactId>
54 |     </dependency>
55 |
56 |     <dependency>
57 |       <groupId>org.apache.flink</groupId>
58 |       <artifactId>flink-runtime_${scala.binary.version}</artifactId>
59 |     </dependency>
60 |
61 |     <dependency>
62 |       <groupId>org.apache.flink</groupId>
63 |       <artifactId>flink-table-common</artifactId>
64 |     </dependency>
65 |
66 |     <dependency>
67 |       <groupId>org.apache.flink</groupId>
68 |       <artifactId>flink-table-api-java</artifactId>
69 |     </dependency>
70 |
71 |     <dependency>
72 |       <groupId>org.apache.flink</groupId>
73 |       <artifactId>flink-table-api-scala_${scala.binary.version}</artifactId>
74 |     </dependency>
75 |
76 |     <dependency>
77 |       <groupId>org.apache.flink</groupId>
78 |       <artifactId>flink-scala_${scala.binary.version}</artifactId>
79 |     </dependency>
80 |
81 |     <dependency>
82 |       <groupId>org.apache.flink</groupId>
83 |       <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
84 |     </dependency>
85 |
86 |     <dependency>
87 |       <groupId>org.apache.flink</groupId>
88 |       <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
89 |     </dependency>
90 |
91 |     <dependency>
92 |       <groupId>org.apache.flink</groupId>
93 |       <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
94 |     </dependency>
95 |   </dependencies>
96 | </project>
--------------------------------------------------------------------------------
/blink-sql/src/main/java/ambition/blink/sql/impl/SqlConvertServiceImpl.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql.impl;
20 |
21 | import ambition.blink.sql.SqlConstant;
22 | import ambition.blink.sql.SqlConvertService;
23 | import java.util.HashMap;
24 | import java.util.LinkedList;
25 | import java.util.List;
26 | import java.util.Map;
27 | import org.apache.commons.collections.CollectionUtils;
28 | import org.apache.commons.lang3.StringUtils;
29 | import org.slf4j.Logger;
30 | import org.slf4j.LoggerFactory;
31 | import ambition.blink.common.exception.FlinkSqlException;
32 | import ambition.blink.common.utils.BlinkStringUtils;
33 |
34 | public class SqlConvertServiceImpl implements SqlConvertService {
35 |
36 | private static final Logger LOG = LoggerFactory.getLogger(SqlConvertServiceImpl.class);
37 |
38 | @Override
39 |   public Map<String, List<String>> sqlConvert(String sqlContext) throws Exception {
40 |     if (StringUtils.isBlank(sqlContext)){
41 |       LOG.warn("input sql is null or empty");
42 |       throw new FlinkSqlException("input sql must not be null or empty");
43 | }
44 |
45 |     List<String> list = BlinkStringUtils.splitSemiColon(sqlContext);
46 |     if (CollectionUtils.isEmpty(list)) {
47 |       LOG.warn("failed to split sql statements; check quoting and semicolons");
48 |       throw new FlinkSqlException("failed to split sql statements; check quoting and semicolons");
49 | }
50 |
51 |     if (BlinkStringUtils.isChinese(sqlContext)){
52 |       LOG.warn("sql contains Chinese characters {} ", sqlContext);
53 |       throw new FlinkSqlException("sql contains Chinese characters");
54 | }
55 |
56 |     // the statements were already split above; reuse that result
57 |     List<String> sqls = list;
58 |     Map<String, List<String>> result = new HashMap<>();
59 |
60 |     // keep statements in their original order
61 |     List<String> funList = new LinkedList<>();
62 |     List<String> tableList = new LinkedList<>();
63 |     List<String> viewList = new LinkedList<>();
64 |     List<String> insertList = new LinkedList<>();
65 |
66 | for (String sql: sqls) {
67 | if (StringUtils.isNotBlank(sql)){
68 | if (BlinkStringUtils.isContain(SqlConstant.CREATE_FUNCTION, sql)) {
69 | funList.add(sql);
70 | }
71 |
72 | if (BlinkStringUtils.isContain(SqlConstant.CREATE_TABLE, sql)) {
73 | tableList.add(sql);
74 | }
75 |
76 | if (BlinkStringUtils.isContain(SqlConstant.CREATE_VIEW, sql)) {
77 | viewList.add(sql);
78 | }
79 |
80 | if (BlinkStringUtils.isContain(SqlConstant.INSERT_INTO, sql)) {
81 | insertList.add(sql);
82 | }
83 | }
84 | }
85 |
86 | if (CollectionUtils.isEmpty(tableList)) {
87 |       LOG.warn("sql does not contain a create table statement");
88 |       throw new FlinkSqlException("sql does not contain a create table statement");
89 | }
90 | result.put(SqlConstant.TABLE, tableList);
91 |
92 | if (CollectionUtils.isEmpty(insertList)) {
93 |       LOG.warn("sql does not contain an insert into statement");
94 |       throw new FlinkSqlException("sql does not contain an insert into statement");
95 | }
96 | result.put(SqlConstant.INSERT, insertList);
97 |
98 | if (CollectionUtils.isNotEmpty(funList)){
99 | result.put(SqlConstant.FUNCTION, funList);
100 | }
101 |
102 | if (CollectionUtils.isNotEmpty(viewList)){
103 | result.put(SqlConstant.VIEW, viewList);
104 | }
105 |
106 | return result;
107 | }
108 |
109 | }
110 |
--------------------------------------------------------------------------------
/blink-sql/src/main/java/ambition/blink/sql/impl/SqlParserServiceImpl.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql.impl;
20 |
21 | import ambition.blink.sql.SqlParserService;
22 | import ambition.blink.sql.SqlUtils;
23 | import java.util.ArrayList;
24 | import java.util.LinkedHashMap;
25 | import java.util.LinkedList;
26 | import java.util.List;
27 | import java.util.Map;
28 | import org.apache.calcite.sql.SqlBasicCall;
29 | import org.apache.calcite.sql.SqlDataTypeSpec;
30 | import org.apache.calcite.sql.SqlIdentifier;
31 | import org.apache.calcite.sql.SqlNode;
32 | import org.apache.calcite.sql.SqlNodeList;
33 | import org.apache.flink.sql.parser.ddl.SqlCreateFunction;
34 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
35 | import org.apache.flink.sql.parser.ddl.SqlCreateView;
36 | import org.apache.flink.sql.parser.ddl.SqlTableColumn;
37 | import org.apache.flink.sql.parser.dml.RichSqlInsert;
38 |
39 | /**
40 | * @Date: 2020/1/2
41 | */
42 | public class SqlParserServiceImpl implements SqlParserService {
43 |
44 | @Override
45 |   public List<SqlCreateFunction> sqlFunctionParser(String funSql) throws Exception {
46 |     List<SqlCreateFunction> result = new LinkedList<>();
47 | SqlNodeList nodeList = SqlUtils.parseSql(funSql);
48 | for (SqlNode sqlNode : nodeList) {
49 | if (sqlNode instanceof SqlCreateFunction) {
50 | result.add((SqlCreateFunction) sqlNode);
51 | }
52 | }
53 | return result;
54 | }
55 |
56 | @Override
57 |   public List<SqlCreateTable> sqlTableParser(String ddlSql) throws Exception {
58 |     List<SqlCreateTable> result = new LinkedList<>();
59 | SqlNodeList nodeList = SqlUtils.parseSql(ddlSql);
60 | for (SqlNode sqlNode : nodeList) {
61 | if (sqlNode instanceof SqlCreateTable) {
62 | result.add((SqlCreateTable) sqlNode);
63 |
64 | SqlNodeList columnList = ((SqlCreateTable) sqlNode).getColumnList();
65 |
66 |         List<SqlTableColumn> tableColumnList = new ArrayList<>();
67 |         List<SqlBasicCall> basicCallList = new ArrayList<>();
68 |         Map<String, String> tableColumnInfo = new LinkedHashMap<>();
69 |         List<String> basicCallInfo = new ArrayList<>();
70 |
71 | for (SqlNode sqlNode1: columnList) {
72 | if (sqlNode1 instanceof SqlTableColumn) {
73 | tableColumnList.add((SqlTableColumn) sqlNode1);
74 |
75 | SqlIdentifier name = ((SqlTableColumn) sqlNode1).getName();
76 | SqlDataTypeSpec type = ((SqlTableColumn) sqlNode1).getType();
77 | tableColumnInfo.put(name.toString(), type.toString());
78 | }
79 |
80 |           if (sqlNode1 instanceof SqlBasicCall) {
81 |             basicCallList.add((SqlBasicCall) sqlNode1);
82 |             // computed columns and watermarks appear as SqlBasicCall nodes
83 |             basicCallInfo.add(sqlNode1.toString());
84 |           }
85 | }
86 | }
87 | }
88 | return result;
89 | }
90 |
91 | @Override
92 |   public List<SqlCreateView> sqlViewParser(String viewSql) throws Exception {
93 |     List<SqlCreateView> result = new LinkedList<>();
94 | SqlNodeList nodeList = SqlUtils.parseSql(viewSql);
95 | for (SqlNode sqlNode : nodeList) {
96 | if (sqlNode instanceof SqlCreateView) {
97 | result.add((SqlCreateView) sqlNode);
98 | }
99 | }
100 | return result;
101 | }
102 |
103 | @Override
104 |   public List<RichSqlInsert> sqlDmlParser(String insertSql) throws Exception {
105 |     List<RichSqlInsert> result = new LinkedList<>();
106 | SqlNodeList nodeList = SqlUtils.parseSql(insertSql);
107 | for (SqlNode sqlNode : nodeList) {
108 | if (sqlNode instanceof RichSqlInsert) {
109 | result.add((RichSqlInsert) sqlNode);
110 | }
111 | }
112 | return result;
113 | }
114 |
115 | }
116 |
--------------------------------------------------------------------------------
/blink-common/src/main/java/ambition/blink/common/utils/BlinkStringUtils.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.common.utils;
20 |
21 | import java.util.ArrayList;
22 | import java.util.List;
23 | import java.util.regex.Matcher;
24 | import java.util.regex.Pattern;
25 |
26 | public class BlinkStringUtils {
27 |
28 |   public static String[] splitIgnoreQuotaBrackets(String str, String delimiter){
29 |     String splitPatternStr = delimiter + "(?![^()]*+\\))(?![^{}]*+})(?![^\\[\\]]*+\\])(?=(?:[^\"]|\"[^\"]*\")*$)";
30 | return str.split(splitPatternStr);
31 | }
32 |
33 |   public static List<String> splitIgnoreQuota(String str, char delimiter){
34 |     List<String> tokensList = new ArrayList<>();
35 | boolean inQuotes = false;
36 | boolean inSingleQuotes = false;
37 | StringBuilder b = new StringBuilder();
38 | for (char c : str.toCharArray()) {
39 | if(c == delimiter){
40 | if (inQuotes) {
41 | b.append(c);
42 | } else if(inSingleQuotes){
43 | b.append(c);
44 | }else {
45 | tokensList.add(b.toString());
46 | b = new StringBuilder();
47 | }
48 | }else if(c == '\"'){
49 | inQuotes = !inQuotes;
50 | b.append(c);
51 | }else if(c == '\''){
52 | inSingleQuotes = !inSingleQuotes;
53 | b.append(c);
54 | }else{
55 | b.append(c);
56 | }
57 | }
58 |
59 | tokensList.add(b.toString());
60 |
61 | return tokensList;
62 | }
63 |
64 |   public static List<String> splitSemiColon(String str) {
65 | boolean inQuotes = false;
66 | boolean escape = false;
67 |
68 |     List<String> ret = new ArrayList<>();
69 |
70 | char quoteChar = '"';
71 | int beginIndex = 0;
72 | for (int index = 0; index < str.length(); index++) {
73 | char c = str.charAt(index);
74 | switch (c) {
75 | case ';':
76 | if (!inQuotes) {
77 | ret.add(str.substring(beginIndex, index));
78 | beginIndex = index + 1;
79 | }
80 | break;
81 | case '"':
82 | case '\'':
83 | if (!escape) {
84 | if (!inQuotes) {
85 | quoteChar = c;
86 | inQuotes = !inQuotes;
87 | } else {
88 | if (c == quoteChar) {
89 | inQuotes = !inQuotes;
90 | }
91 | }
92 | }
93 | break;
94 | default:
95 | break;
96 | }
97 |
98 | if (escape) {
99 | escape = false;
100 | } else if (c == '\\') {
101 | escape = true;
102 | }
103 | }
104 |
105 | if (beginIndex < str.length()) {
106 | ret.add(str.substring(beginIndex));
107 | }
108 |
109 | return ret;
110 | }
111 |
112 |   public static boolean isContain(String regex, String sql){
113 |     Pattern pattern = Pattern.compile(regex);
114 |     return pattern.matcher(sql).find();
115 |   }
120 |
121 | public static boolean isChinese(String str) {
122 | if (str == null) return false;
123 | for (char c : str.toCharArray()) {
124 | if (isChinese(c)) return true;
125 | }
126 | return false;
127 | }
128 |
129 |   private static boolean isChinese(char c) {
130 |     Character.UnicodeBlock ub = Character.UnicodeBlock.of(c);
131 |     return ub == Character.UnicodeBlock.CJK_UNIFIED_IDEOGRAPHS
132 |         || ub == Character.UnicodeBlock.CJK_COMPATIBILITY_IDEOGRAPHS
133 |         || ub == Character.UnicodeBlock.CJK_UNIFIED_IDEOGRAPHS_EXTENSION_A
134 |         || ub == Character.UnicodeBlock.GENERAL_PUNCTUATION
135 |         || ub == Character.UnicodeBlock.CJK_SYMBOLS_AND_PUNCTUATION
136 |         || ub == Character.UnicodeBlock.HALFWIDTH_AND_FULLWIDTH_FORMS;
137 |   }
147 | }
148 |
--------------------------------------------------------------------------------
/blink-sql/src/test/java/ambition/blink/sql/impl/SqlConvertServiceImplTest.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.sql.impl;
20 |
21 | import ambition.blink.sql.SqlConstant;
22 | import ambition.blink.sql.SqlConvertService;
23 | import ambition.blink.sql.SqlParserService;
24 | import com.google.common.collect.MapDifference;
25 | import com.google.common.collect.Maps;
26 | import java.util.HashMap;
27 | import java.util.List;
28 | import java.util.Map;
29 | import org.apache.calcite.sql.SqlCall;
30 | import org.apache.calcite.sql.SqlNode;
31 | import org.apache.calcite.sql.SqlNodeList;
32 | import org.apache.commons.collections.CollectionUtils;
33 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
34 | import org.apache.flink.sql.parser.dml.RichSqlInsert;
35 | import org.junit.Assert;
36 | import org.junit.Test;
37 |
38 | public class SqlConvertServiceImplTest {
39 | private SqlConvertService sqlService = new SqlConvertServiceImpl();
40 | private SqlParserService sqlParserService = new SqlParserServiceImpl();
41 |
42 | @Test
43 | public void sqlConvertTest() throws Exception {
44 | String sqls = "CREATE FUNCTION " +
45 | "demouf " +
46 | "AS " +
47 | "'ambition.api.sql.function.DemoUDF';" +
48 |
49 | "CREATE TABLE kafak_source (" +
50 | "name varchar, " +
51 | "amount float, " +
52 | "`date` date," +
53 | "watermark for `date` AS withOffset(`date`,1000) " +
54 | ") " +
55 | "with (" +
56 | " 'connector' = 'kafka',\n" +
57 | " 'kafka.topic' = 'log.test'\n" +
58 | ");" +
59 |
60 | "CREATE TABLE mysql_sink (" +
61 | "`date` date, " +
62 | "amount float, " +
63 | "PRIMARY KEY (`date`,amount)) " +
64 | "with (" +
65 | " 'connector' = 'mysql',\n" +
66 | " 'kafka.topic' = 'log.test'\n" +
67 | ");" +
68 |
69 | "create view view_select as " +
70 | "SELECT " +
71 | "`date`, " +
72 | "amount " +
73 | "FROM " +
74 | "kafak_source " +
75 | "group by `date`,amount;" +
76 |
77 | "insert into mysql_sink " +
78 | "SELECT " +
79 | "`date`, " +
80 | "sum(amount) " +
81 | "FROM " +
82 | "view_select " +
83 | "group by `date`;";
84 |
85 |     Map<String, List<String>> map = sqlService.sqlConvert(sqls);
86 |     Assert.assertEquals(4, map.size());
87 |
88 | // insert into
89 |     List<String> insertList = map.get(SqlConstant.INSERT);
90 |     Map<String, SqlCall> dmlSqlNodes = new HashMap<>();
91 | for (String sql : insertList) {
92 |       List<RichSqlInsert> richSqlInserts = sqlParserService.sqlDmlParser(sql);
93 | if (CollectionUtils.isNotEmpty(richSqlInserts)) {
94 | for (RichSqlInsert richSqlInsert: richSqlInserts) {
95 | SqlNode table = richSqlInsert.getTargetTable();
96 | dmlSqlNodes.put(table.toString(), richSqlInsert);
97 | }
98 | }
99 | }
100 |
101 | //table
102 |     List<String> tableList = map.get(SqlConstant.TABLE);
103 |     Map<String, SqlCall> tableSqlNodes = new HashMap<>();
104 | for (String sql : tableList) {
105 |       List<SqlCreateTable> sqlCreateTables = sqlParserService.sqlTableParser(sql);
106 | if (CollectionUtils.isNotEmpty(sqlCreateTables)) {
107 | for (SqlCreateTable sqlCreateTable : sqlCreateTables) {
108 | tableSqlNodes.put(sqlCreateTable.getTableName().toString(), sqlCreateTable);
109 | }
110 | }
111 | }
112 |
113 |     MapDifference<String, SqlCall> difference = Maps
114 |         .difference(dmlSqlNodes, tableSqlNodes);
115 |
116 | System.out.println(difference);
117 |
118 |     Map<String, SqlCall> sourceMap = difference.entriesOnlyOnRight();
119 |     MapDifference<String, SqlCall> tableDifference = Maps.difference(sourceMap, tableSqlNodes);
120 |     Map<String, SqlCall> sinkMap = tableDifference.entriesOnlyOnRight();
121 |
122 |     // handle the source tables
123 | for (String sourceTableName: sourceMap.keySet()) {
124 | SqlCall sqlCall = sourceMap.get(sourceTableName);
125 | if (sqlCall instanceof SqlCreateTable) {
126 | SqlNodeList columnList = ((SqlCreateTable) sqlCall).getColumnList();
127 | System.out.println(columnList);
128 | System.out.println(sqlCall.toString());
129 | }
130 | }
131 | }
132 |
133 | }
134 |
--------------------------------------------------------------------------------
/blink-job/src/main/java/ambition/blink/yarn/job/YarnBase.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.yarn.job;
20 |
21 | import java.io.File;
22 | import java.io.FilenameFilter;
23 | import java.util.Collections;
24 | import javax.annotation.Nonnull;
25 | import javax.annotation.Nullable;
26 | import org.apache.flink.configuration.Configuration;
27 | import org.apache.flink.yarn.YarnClusterDescriptor;
28 | import org.apache.flink.yarn.cli.FlinkYarnSessionCli;
29 | import org.apache.hadoop.fs.Path;
30 | import org.apache.hadoop.yarn.api.records.ApplicationReport;
31 | import org.apache.hadoop.yarn.api.records.YarnApplicationState;
32 | import org.apache.hadoop.yarn.client.api.YarnClient;
33 | import org.apache.hadoop.yarn.conf.YarnConfiguration;
34 |
35 | public abstract class YarnBase {
36 |
37 | private YarnClient yarnClient = null;
38 |
39 |
40 | protected static final YarnConfiguration YARN_CONFIGURATION;
41 |
42 | private static org.apache.flink.configuration.Configuration globalConfiguration;
43 |
44 |   protected org.apache.flink.configuration.Configuration flinkConfiguration;
45 |
46 |   // file locations used by createYarnClusterDescriptor; concrete subclasses are expected to set these
47 |   protected File tempConfPathForSecureRun;
48 |   protected File flinkUberjar;
49 |   protected File flinkLibFolder;
50 |
46 | static {
47 | YARN_CONFIGURATION = new YarnConfiguration();
48 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_SCHEDULER_MINIMUM_ALLOCATION_MB, 32);
49 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_SCHEDULER_MAXIMUM_ALLOCATION_MB, 4096); // 4096 is the available memory anyways
50 | YARN_CONFIGURATION.setBoolean(YarnConfiguration.RM_SCHEDULER_INCLUDE_PORT_IN_NODE_NAME, true);
51 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_AM_MAX_ATTEMPTS, 2);
52 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_MAX_COMPLETED_APPLICATIONS, 2);
53 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_SCHEDULER_MAXIMUM_ALLOCATION_VCORES, 4);
54 | YARN_CONFIGURATION.setInt(YarnConfiguration.DEBUG_NM_DELETE_DELAY_SEC, 3600);
55 | YARN_CONFIGURATION.setBoolean(YarnConfiguration.LOG_AGGREGATION_ENABLED, false);
56 | YARN_CONFIGURATION.setInt(YarnConfiguration.NM_VCORES, 666); // memory is overwritten in the MiniYARNCluster.
57 | // so we have to change the number of cores for testing.
58 | YARN_CONFIGURATION.setInt(YarnConfiguration.RM_AM_EXPIRY_INTERVAL_MS, 20000); // 20 seconds expiry (to ensure we properly heartbeat with YARN).
59 | }
60 |
61 | protected static YarnConfiguration getYarnConfiguration() {
62 | return YARN_CONFIGURATION;
63 | }
64 |
65 |   private static boolean isApplicationRunning(ApplicationReport app) {
66 |     final YarnApplicationState yarnApplicationState = app.getYarnApplicationState();
67 |     return yarnApplicationState != YarnApplicationState.FINISHED
68 |         && yarnApplicationState != YarnApplicationState.KILLED
69 |         && yarnApplicationState != YarnApplicationState.FAILED;
70 |   }
71 |
72 |
73 | public void setupYarnClient() {
74 | if (yarnClient == null) {
75 | yarnClient = YarnClient.createYarnClient();
76 | yarnClient.init(getYarnConfiguration());
77 | yarnClient.start();
78 | }
79 |
80 |     // load flink-conf.yaml lazily so the configuration is never null here
81 |     if (globalConfiguration == null) {
82 |       globalConfiguration = org.apache.flink.configuration.GlobalConfiguration.loadConfiguration();
83 |     }
84 |     flinkConfiguration = new org.apache.flink.configuration.Configuration(globalConfiguration);
81 | }
82 |
83 |   public void shutdownYarnClient() {
84 |     if (yarnClient != null) {
85 |       yarnClient.stop();
86 |       yarnClient = null;
87 |     }
88 |   }
86 |
87 | @Nullable
88 | protected YarnClient getYarnClient() {
89 | return yarnClient;
90 | }
91 |
92 | public static YarnClusterDescriptor createClusterDescriptorWithLogging(
93 | final String flinkConfDir,
94 | final Configuration flinkConfiguration,
95 | final YarnConfiguration yarnConfiguration,
96 | final YarnClient yarnClient,
97 | final boolean sharedYarnClient) {
98 | final Configuration effectiveConfiguration = FlinkYarnSessionCli
99 | .setLogConfigFileInConfig(flinkConfiguration, flinkConfDir);
100 | return new YarnClusterDescriptor(effectiveConfiguration, yarnConfiguration, yarnClient, sharedYarnClient);
101 | }
102 |
103 | @Nonnull
104 | YarnClusterDescriptor createYarnClusterDescriptor(org.apache.flink.configuration.Configuration flinkConfiguration) {
105 |     final YarnClusterDescriptor yarnClusterDescriptor = createClusterDescriptorWithLogging(
106 | tempConfPathForSecureRun.getAbsolutePath(),
107 | flinkConfiguration,
108 | YARN_CONFIGURATION,
109 | yarnClient,
110 | true);
111 | yarnClusterDescriptor.setLocalJarPath(new Path(flinkUberjar.toURI()));
112 | yarnClusterDescriptor.addShipFiles(Collections.singletonList(flinkLibFolder));
113 | return yarnClusterDescriptor;
114 | }
115 |
116 |
120 | /**
121 | * Locate a file or directory.
122 | */
123 | public static File findFile(String startAt, FilenameFilter fnf) {
124 | File root = new File(startAt);
125 | String[] files = root.list();
126 | if (files == null) {
127 | return null;
128 | }
129 | for (String file : files) {
130 | File f = new File(startAt + File.separator + file);
131 | if (f.isDirectory()) {
132 | File r = findFile(f.getAbsolutePath(), fnf);
133 | if (r != null) {
134 | return r;
135 | }
136 | } else if (fnf.accept(f.getParentFile(), f.getName())) {
137 | return f;
138 | }
139 | }
140 | return null;
141 | }
142 |
143 | }
144 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | 1. Background
2 |
3 | While working at Alibaba I used Blink for stream data processing, writing SQL to implement Blink jobs; development was simple and efficient, and the product easy to use.
4 | This project attempts a similar productization of Flink, with SQL as the unified development interface. SQL's advantages: it is declarative, easy to understand, stable and reliable, and automatically optimized.
5 | With API-based development, the biggest problem is that job tuning depends on programmer experience and is difficult, and the API approach is too invasive (data security, cluster security, etc.); SQL can be tuned automatically and avoids these problems.
6 |
7 | 2. Implementation approach:
8 |
9 | User SQL (DDL, query, DML) -> DDL statements map to Flink sources and sinks
10 |
11 | -> query/DML insert into statements define the data processing and computation
12 |
13 | --> wrapped into the corresponding Flink job: env.sqlQuery / env.sqlUpdate
14 |
15 | --> JobGraph built and the job submitted via ClusterClient.submitJob or ClusterDescriptor.deployJobCluster
19 |
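The first stage of the pipeline above (splitting the user's script on top-level semicolons and routing each statement to its category) can be sketched with plain Java, mirroring what `SqlConvertServiceImpl` and `BlinkStringUtils` do before handing statements to the parser. The class and key names here (`SqlScriptRouter`, `"table"`, `"insert"`, ...) are illustrative, not the project's actual constants:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

public class SqlScriptRouter {

    // Split on ';' while ignoring semicolons inside single- or double-quoted strings.
    static List<String> splitStatements(String script) {
        List<String> out = new ArrayList<>();
        boolean inQuotes = false;
        char quoteChar = 0;
        int begin = 0;
        for (int i = 0; i < script.length(); i++) {
            char c = script.charAt(i);
            if (c == '\'' || c == '"') {
                if (!inQuotes) { inQuotes = true; quoteChar = c; }
                else if (c == quoteChar) { inQuotes = false; }
            } else if (c == ';' && !inQuotes) {
                out.add(script.substring(begin, i).trim());
                begin = i + 1;
            }
        }
        if (begin < script.length()) {
            String tail = script.substring(begin).trim();
            if (!tail.isEmpty()) out.add(tail);
        }
        return out;
    }

    // Route each statement to a category, preserving the order statements first appear.
    static Map<String, List<String>> route(String script) {
        Map<String, List<String>> result = new LinkedHashMap<>();
        for (String sql : splitStatements(script)) {
            String key;
            if (Pattern.compile("(?i)create\\s+(temporary\\s+)?function").matcher(sql).find()) key = "function";
            else if (Pattern.compile("(?i)create\\s+table").matcher(sql).find()) key = "table";
            else if (Pattern.compile("(?i)create\\s+view").matcher(sql).find()) key = "view";
            else if (Pattern.compile("(?i)insert\\s+into").matcher(sql).find()) key = "insert";
            else continue;
            result.computeIfAbsent(key, k -> new ArrayList<>()).add(sql);
        }
        return result;
    }

    public static void main(String[] args) {
        String script =
            "CREATE TABLE src (a varchar) with ('connector' = 'kafka');" +
            "insert into snk select a from src;";
        System.out.println(route(script).keySet()); // prints [table, insert]
    }
}
```

The quote tracking matters: a `;` inside a `WITH ('...')` property value must not terminate the statement, which is why a naive `split(";")` is not enough.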
20 | 3. Releases:
21 |
22 | [v3.0.0](https://github.com/ambition119/FlinkSQL/tree/v3.0.0) January 2020
23 |
24 | 1. Built on Flink 1.10.
25 |    The SQL parsing shipped before 1.10 was incomplete (e.g. functions and watermarks were not parsed), so it added little value over the previously developed parsing layer.
26 | 2. Jobs are submitted through the new ClusterClient.submitJob interface.
27 |
28 |
29 | [New features](/doc/v3.0.0.md)
30 |
31 | 1. Flink's built-in SQL parsing
32 | 2. New job submission interface
33 | 3. Unified stream and batch processing
34 | 4. DingTalk/WeChat alert notifications
37 |
38 |
39 | [v2.0.1](https://github.com/ambition119/FlinkSQL/tree/v2.0.1)
40 | [v2.0.0](https://github.com/ambition119/FlinkSQL/tree/v2.0.0) April 2019
41 |
42 |     blink-client      interface definitions
43 |     blink-sql/calcite SQL parsing for stream and batch tables
44 |     blink-libraries   custom source, sink, and side (dimension) table development
45 |     blink-batch       BatchTableSource and BatchTableSink
46 |     blink-stream      StreamTableSource and StreamTableSink
47 |     blink-job         batch/stream job submission
48 |
49 | [v2.0.1 new features](/doc/v2.0.1.md)
50 | [v2.0.0 new features](/doc/v2.0.0.md)
51 |
52 | 1. SQL layer extracted for shared use by stream and batch; the SQL follows Flink issues and the corresponding docs
53 | 2. Batch processing added
54 | 3. Dimension (side) table support added
55 | 4. Flink upgraded to 1.7.x
56 |
57 | [v1.0.0](https://github.com/ambition119/FlinkSQL/tree/v1.0.0) July 2018
58 |
59 |     blink-client    interface definitions
60 |     blink-sqlserver SQL parsing for stream tables
61 |     blink-job       wrapping as a stream job
62 |
63 | [New features](/doc/v1.0.0.md)
64 |
65 | 1. create function implemented
66 | 2. Stream processing jobs developed in SQL
67 | 3. Source modified to implement SQL CEP
68 |
69 | 4. Examples
70 |
71 | #### v3.0.0 SQL stream job example (the statements exercised by SqlConvertServiceImplTest):
72 |
73 | ```sql
74 | CREATE FUNCTION demouf AS 'ambition.api.sql.function.DemoUDF';
75 |
76 | CREATE TABLE kafak_source (
77 |   name varchar,
78 |   amount float,
79 |   `date` date,
80 |   watermark for `date` AS withOffset(`date`, 1000)
81 | )
82 | with (
83 |   'connector' = 'kafka',
84 |   'kafka.topic' = 'log.test'
85 | );
86 |
87 | CREATE TABLE mysql_sink (
88 |   `date` date,
89 |   amount float,
90 |   PRIMARY KEY (`date`, amount)
91 | )
92 | with (
93 |   'connector' = 'mysql',
94 |   'kafka.topic' = 'log.test'
95 | );
96 |
97 | create view view_select as
98 | SELECT `date`, amount FROM kafak_source group by `date`, amount;
99 |
100 | insert into mysql_sink
101 | SELECT `date`, sum(amount) FROM view_select group by `date`;
102 | ```
76 |
77 | #### v2.0.1 SQL job examples:
78 | Batch SQL example:
79 | ```sql
80 | CREATE FUNCTION demouf AS 'ambition.api.sql.function.DemoUDF'
81 | LIBRARY 'hdfs://flink/udf/jedis.jar','hdfs://flink/udf/customudf.jar';
82 |
83 | CREATE TABLE csv_source (
84 | id int,
85 | name varchar,
86 | `date` date ,
87 | age int
88 | )
89 | with (
90 | type=source,
91 | connect.type=json,
92 | 'file.path'='file:///FlinkSQL/blink-job/src/test/resources/demo.json'
93 | );
94 |
95 | CREATE TABLE csv_sink (
96 | `date` date,
97 | age int,
98 | PRIMARY KEY (`date`)
99 | )
100 | with (
101 | type=sink,
102 | connect.type=csv,
103 | 'file.path'='file:///FlinkSQL/blink-job/src/test/resources/demo_out.csv'
104 | );
105 |
106 | create view view_select as
107 | SELECT `date`, age FROM csv_source group by `date`,age
108 | ;
109 |
110 | INSERT INTO csv_sink
111 | SELECT `date`, sum(age) FROM view_select group by `date`
112 | ;
113 |
114 | ```
115 | Stream SQL example:
116 | ```sql
117 | CREATE FUNCTION demouf AS
118 | 'ambition.api.sql.function.DemoUDF'
119 | LIBRARY 'hdfs://flink/udf/jedis.jar','hdfs://flink/udf/customudf.jar';
120 |
121 | CREATE TABLE kafka_source (
122 | `date` varchar,
123 | amount float,
124 | proctime timestamp
125 | )
126 | with (
127 | type=source,
128 | 'connect.type'=kafka,
129 | 'flink.parallelism'=1,
130 | 'kafka.topic'=topic,
131 | 'kafka.group.id'=flinks,
132 | 'kafka.enable.auto.commit'=true,
133 | 'kafka.bootstrap.servers'='localhost:9092'
134 | );
135 |
136 | CREATE TABLE mysql_sink (
137 | `date` varchar,
138 | total_amount float,
139 | PRIMARY KEY (`date`)
140 | )
141 | with (
142 | type=mysql,
143 | 'connect.type'=mysql,
144 | 'mysql.connection'='localhost:3306',
145 | 'mysql.db.name'=flink,
146 | 'mysql.batch.size'=10,
147 | 'mysql.table.name'=flink_table,
148 | 'mysql.user'=root,
149 | 'mysql.pass'=root
150 | );
151 |
152 | CREATE VIEW view_select AS
153 | SELECT `date`,
154 | amount
155 | FROM kafka_source
156 | GROUP BY
157 | `date`,
158 | amount
159 | ;
160 |
161 | INSERT INTO mysql_sink
162 | SELECT
163 | `date`,
164 | sum(amount) as total_amount
165 | FROM view_select
166 | GROUP BY
167 | `date`
168 | ;
169 | ```
170 |
171 | #### v2.0.0 SQL job examples:
172 | Batch SQL example:
173 | ```sql
174 | CREATE FUNCTION demouf AS 'ambition.api.sql.function.DemoUDF'
175 | LIBRARY 'hdfs://flink/udf/jedis.jar','hdfs://flink/udf/customudf.jar';
176 |
177 | CREATE SOURCE TABLE json_source (
178 | id int,
179 | name varchar,
180 | `date` date ,
181 | age int
182 | )
183 | with (
184 | type=json,
185 | 'file.path'='file:///FlinkSQL/blink-job/src/test/resources/demo.json'
186 | );
187 |
188 | CREATE SINK TABLE csv_sink (
189 | `date` date,
190 | total_age int
191 | )
192 | with (
193 | type=csv,
194 | 'file.path'='file:///FlinkSQL/blink-job/src/test/resources/demo_out.csv'
195 | );
196 |
197 | CREATE VIEW view_select as
198 | SELECT `date`, age
199 | FROM json_source
200 | GROUP BY `date`,age;
201 |
202 | INSERT INTO csv_sink
203 | SELECT `date`, sum(age) as total_age
204 | FROM view_select
205 | GROUP BY `date`;
206 |
207 | ```
208 | stream sql example:
209 | ```sql
210 | CREATE FUNCTION demouf AS
211 | 'ambition.api.sql.function.DemoUDF'
212 | LIBRARY 'hdfs://flink/udf/jedis.jar','hdfs://flink/udf/customudf.jar';
213 |
214 | CREATE SOURCE TABLE kafka_source (
215 | `date` varchar,
216 | amount float,
217 | proctime timestamp
218 | )
219 | with (
220 | type=kafka,
221 | 'flink.parallelism'=1,
222 | 'kafka.topic'=topic,
223 | 'kafka.group.id'=flinks,
224 | 'kafka.enable.auto.commit'=true,
225 | 'kafka.bootstrap.servers'='localhost:9092'
226 | );
227 |
228 | CREATE SINK TABLE mysql_sink (
229 | `date` varchar,
230 | total_amount float,
231 | PRIMARY KEY (`date`)
232 | )
233 | with (
234 | type=mysql,
235 | 'mysql.connection'='localhost:3306',
236 | 'mysql.db.name'=flink,
237 | 'mysql.batch.size'=10,
238 | 'mysql.table.name'=flink_table,
239 | 'mysql.user'=root,
240 | 'mysql.pass'=root
241 | );
242 |
243 | CREATE VIEW view_select AS
244 | SELECT `date`,
245 | amount
246 | FROM kafka_source
247 | GROUP BY
248 | `date`,
249 | amount
250 | ;
251 |
252 |
253 | INSERT INTO mysql_sink
254 | SELECT
255 | `date`,
256 | sum(amount) as total_amount
257 | FROM view_select
258 | GROUP BY
259 | `date`
260 | ;
261 | ```
262 |
263 |
264 | #### v1.0.0 SQL job development example:
265 | ```sql
266 | CREATE FUNCTION demouf AS
267 | 'ambition.api.sql.function.DemoUDF'
268 | USING JAR 'hdfs://flink/udf/jedis.jar',
269 | JAR 'hdfs://flink/udf/customudf.jar';
270 |
271 |
272 | CREATE TABLE kafka_source (
273 | `date` string,
274 | amount float,
275 | proctime timestamp
276 | )
277 | with (
278 | type=kafka,
279 | 'flink.parallelism'=1,
280 | 'kafka.topic'=topic,
281 | 'kafka.group.id'=flinks,
282 | 'kafka.enable.auto.commit'=true,
283 | 'kafka.bootstrap.servers'='localhost:9092'
284 | );
285 | CREATE TABLE mysql_sink (
286 | `date` string,
287 | amount float,
288 | PRIMARY KEY (`date`,amount)
289 | )
290 | with (
291 | type=mysql,
292 | 'mysql.connection'='localhost:3306',
293 | 'mysql.db.name'=flink,
294 | 'mysql.batch.size'=0,
295 | 'mysql.table.name'=flink_table,
296 | 'mysql.user'=root,
297 | 'mysql.pass'=root
298 | );
299 | CREATE VIEW view_select AS
300 | SELECT `date`,
301 | amount
302 | FROM kafka_source
303 | GROUP BY
304 | `date`,
305 | amount
306 | ;
307 | INSERT INTO mysql_sink
308 | SELECT
309 | `date`,
310 | sum(amount)
311 | FROM view_select
312 | GROUP BY
313 | `date`
314 | ;
315 | ```
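In these scripts the job never names its sources and sinks at submission time; the client infers the roles by comparing the declared tables against the INSERT INTO target tables (in `StreamJobClientImpl` this is done with Guava's `Maps.difference`). A simplified, Guava-free sketch of that classification, using illustrative names:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of source/sink classification: a declared table that is the target
// of an INSERT INTO statement is a sink; every other declared table is a source.
public class TableClassifier {

  public static Map<String, Set<String>> classify(Set<String> declaredTables,
                                                  Set<String> insertTargets) {
    Set<String> sinks = new HashSet<>(declaredTables);
    sinks.retainAll(insertTargets);   // tables written to by INSERT INTO

    Set<String> sources = new HashSet<>(declaredTables);
    sources.removeAll(insertTargets); // everything else is read from

    Map<String, Set<String>> result = new HashMap<>();
    result.put("sources", sources);
    result.put("sinks", sinks);
    return result;
  }

  public static void main(String[] args) {
    Set<String> tables = new HashSet<>();
    tables.add("kafka_source");
    tables.add("mysql_sink");
    Set<String> targets = new HashSet<>();
    targets.add("mysql_sink");
    Map<String, Set<String>> r = classify(tables, targets);
    System.out.println(r.get("sources")); // [kafka_source]
    System.out.println(r.get("sinks"));   // [mysql_sink]
  }
}
```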
316 |
317 |
318 | 5. Reference projects
319 |
320 | [apache flink](https://github.com/apache/flink)
321 |
322 |
323 | [apache calcite](https://github.com/apache/calcite)
324 |
325 |
326 | [uber AthenaX](https://github.com/uber/AthenaX)
327 |
328 |
329 | [DTStack flinkStreamSQL](https://github.com/DTStack/flinkStreamSQL)
--------------------------------------------------------------------------------
/blink-stream/src/main/java/ambition/blink/stream/job/impl/StreamJobClientImpl.java:
--------------------------------------------------------------------------------
1 | /**
2 | * Licensed to the Apache Software Foundation (ASF) under one
3 | * or more contributor license agreements. See the NOTICE file
4 | * distributed with this work for additional information
5 | * regarding copyright ownership. The ASF licenses this file
6 | * to you under the Apache License, Version 2.0 (the
7 | * "License"); you may not use this file except in compliance
8 | * with the License. You may obtain a copy of the License at
9 | *
10 | * http://www.apache.org/licenses/LICENSE-2.0
11 | *
12 | * Unless required by applicable law or agreed to in writing, software
13 | * distributed under the License is distributed on an "AS IS" BASIS,
14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
15 | * See the License for the specific language governing permissions and
16 | * limitations under the License.
17 | */
18 |
19 | package ambition.blink.stream.job.impl;
20 |
21 | import ambition.blink.common.job.JobParameter;
22 | import ambition.blink.job.JobClient;
23 | import ambition.blink.sql.SqlConstant;
24 | import ambition.blink.sql.SqlParserService;
25 | import ambition.blink.sql.impl.SqlParserServiceImpl;
26 | import com.google.common.collect.MapDifference;
27 | import com.google.common.collect.Maps;
28 | import java.util.HashMap;
29 | import java.util.List;
30 | import java.util.Map;
31 | import org.apache.calcite.sql.SqlCall;
32 | import org.apache.calcite.sql.SqlNode;
33 | import org.apache.commons.collections.CollectionUtils;
34 | import org.apache.flink.api.common.Plan;
35 | import org.apache.flink.runtime.jobgraph.JobGraph;
36 | import org.apache.flink.sql.parser.ddl.SqlCreateFunction;
37 | import org.apache.flink.sql.parser.ddl.SqlCreateTable;
38 | import org.apache.flink.sql.parser.ddl.SqlCreateView;
39 | import org.apache.flink.sql.parser.dml.RichSqlInsert;
40 | import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
41 | import org.apache.flink.table.api.EnvironmentSettings;
42 | import org.apache.flink.table.api.Table;
43 | import org.apache.flink.table.api.TableConfig;
44 | import org.apache.flink.table.api.java.StreamTableEnvironment;
45 | import org.slf4j.Logger;
46 | import org.slf4j.LoggerFactory;
47 |
48 | public class StreamJobClientImpl implements JobClient {
49 | private static final Logger LOG = LoggerFactory.getLogger(StreamJobClientImpl.class);
50 |
51 | private SqlParserService sqlParserService = new SqlParserServiceImpl();
52 |
53 | @Override
54 |   public Plan getJobPlan(JobParameter jobParameter, Map<String, Object> extParams)
55 | throws Exception {
56 | return null;
57 | }
58 |
59 | @Override
60 |   public JobGraph getJobGraph(JobParameter jobParameter, Map<String, Object> extParams)
61 | throws Exception {
62 | StreamExecutionEnvironment env = getStreamLocalEnvironmentInfo(
63 | jobParameter, extParams);
64 |
65 | return env.getStreamGraph(jobParameter.getJobName()).getJobGraph();
66 | }
67 |
68 |   private StreamExecutionEnvironment getStreamLocalEnvironmentInfo(JobParameter jobParameter, Map<String, Object> extParams) throws Exception {
69 | StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
70 | StreamTableEnvironment tEnv = StreamTableEnvironment.create(
71 | env,
72 | EnvironmentSettings.newInstance()
73 | // watermark is only supported in blink planner
74 | .useBlinkPlanner()
75 | .inStreamingMode()
76 | .build()
77 | );
78 |
79 | //config
80 | TableConfig config = tEnv.getConfig();
81 | // config.addConfiguration();
82 | // config.addJobParameter();
83 |
84 | // sqls
85 |     Map<String, List<String>> sqls = jobParameter.getSqls();
86 |
87 | // function
88 | if (sqls.containsKey(SqlConstant.FUNCTION)) {
89 |       List<String> funList = sqls.get(SqlConstant.FUNCTION);
90 | if (CollectionUtils.isNotEmpty(funList)) {
91 | for (String sql: funList) {
92 |           List<SqlCreateFunction> functions = sqlParserService.sqlFunctionParser(sql);
93 | for (SqlCreateFunction fun: functions) {
94 | tEnv.sqlUpdate(fun.toString());
95 | }
96 | }
97 | }
98 | }
99 |
100 | // insert into
101 |     List<String> insertList = sqls.get(SqlConstant.INSERT);
102 | if (CollectionUtils.isEmpty(insertList)) {
103 |       throw new IllegalArgumentException("insert into sql must not be empty");
104 | }
105 |
106 |     Map<String, RichSqlInsert> dmlSqlNodes = new HashMap<>();
107 | for (String sql : insertList) {
108 |       List<RichSqlInsert> richSqlInserts = sqlParserService.sqlDmlParser(sql);
109 | if (CollectionUtils.isNotEmpty(richSqlInserts)) {
110 | for (RichSqlInsert richSqlInsert: richSqlInserts) {
111 | SqlNode table = richSqlInsert.getTargetTable();
112 | dmlSqlNodes.put(table.toString(), richSqlInsert);
113 | }
114 | }
115 | }
116 |
117 | //table
118 |     List<String> tableList = sqls.get(SqlConstant.TABLE);
119 | if (CollectionUtils.isEmpty(tableList)) {
120 |       throw new IllegalArgumentException("create table sql must not be empty");
121 | }
122 |
123 |     Map<String, SqlCall> tableSqlNodes = new HashMap<>();
124 | for (String sql : tableList) {
125 |       List<SqlCreateTable> sqlCreateTables = sqlParserService.sqlTableParser(sql);
126 | if (CollectionUtils.isNotEmpty(sqlCreateTables)) {
127 | for (SqlCreateTable sqlCreateTable : sqlCreateTables) {
128 | tableSqlNodes.put(sqlCreateTable.getTableName().toString(), sqlCreateTable);
129 | }
130 | }
131 | }
132 |
133 |     // derive the source and sink tables from the INSERT INTO targets
134 |     MapDifference<String, SqlCall> difference = Maps.difference(dmlSqlNodes, tableSqlNodes);
135 |     //source
136 |     Map<String, SqlCall> sourceMap = difference.entriesOnlyOnRight();
137 |     //sink
138 |     MapDifference<String, SqlCall> tableDifference = Maps.difference(sourceMap, tableSqlNodes);
139 |     Map<String, SqlCall> sinkMap = tableDifference.entriesOnlyOnRight();
140 |
141 |     // register source tables
142 | for (String sourceTableName: sourceMap.keySet()) {
143 | SqlCall sqlCall = sourceMap.get(sourceTableName);
144 | if (sqlCall instanceof SqlCreateTable) {
145 | tEnv.sqlUpdate(sqlCall.toString());
146 | }
147 | }
148 |     // register sink tables
149 | for (String sinkTableName: sinkMap.keySet()) {
150 | SqlCall sqlCall = sinkMap.get(sinkTableName);
151 | if (sqlCall instanceof SqlCreateTable) {
152 | tEnv.sqlUpdate(sqlCall.toString());
153 | }
154 | }
155 |
156 | // view
157 | if (sqls.containsKey(SqlConstant.VIEW)) {
158 |       List<String> viewList = sqls.get(SqlConstant.VIEW);
159 | if (CollectionUtils.isNotEmpty(viewList)) {
160 | for (String sql : viewList) {
161 |           List<SqlCreateView> views = sqlParserService.sqlViewParser(sql);
162 | if (CollectionUtils.isNotEmpty(views)) {
163 | for (SqlCreateView view : views) {
164 | Table table = tEnv.sqlQuery(view.getQuery().toString());
165 | tEnv.createTemporaryView(view.getViewName().toString(), table);
166 | }
167 | }
168 | }
169 | }
170 | }
171 |
172 | for (String dml : dmlSqlNodes.keySet()) {
173 | RichSqlInsert richSqlInsert = dmlSqlNodes.get(dml);
174 | tEnv.sqlUpdate(richSqlInsert.toString());
175 | }
176 |
177 | return env;
178 | }
179 |
180 |   private StreamExecutionEnvironment getStreamLocalEnvironmentInfoV2(JobParameter jobParameter, Map<String, Object> extParams) throws Exception {
181 | StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
182 | StreamTableEnvironment tEnv = StreamTableEnvironment.create(
183 | env,
184 | EnvironmentSettings.newInstance()
185 | // watermark is only supported in blink planner
186 | .useBlinkPlanner()
187 | .inStreamingMode()
188 | .build()
189 | );
190 |
191 | //config
192 | TableConfig config = tEnv.getConfig();
193 | // config.addConfiguration();
194 | // config.addJobParameter();
195 |
196 | // sqls
197 |     Map<String, List<String>> sqls = jobParameter.getSqls();
198 | // function
199 |     if (sqls.containsKey(SqlConstant.FUNCTION)) {
200 |       // TODO: register functions, as in getStreamLocalEnvironmentInfo
201 |     }
202 |
203 | //table
204 |     List<String> tableList = sqls.get(SqlConstant.TABLE);
205 | if (CollectionUtils.isEmpty(tableList)) {
206 |       throw new IllegalArgumentException("create table sql must not be empty");
207 | }
208 |
209 | for (String sql : tableList) {
210 |       List<SqlCreateTable> sqlCreateTables = sqlParserService.sqlTableParser(sql);
211 | if (CollectionUtils.isNotEmpty(sqlCreateTables)) {
212 | for (SqlCreateTable sqlCreateTable : sqlCreateTables) {
213 | tEnv.sqlUpdate(sqlCreateTable.toString());
214 | }
215 | }
216 | }
217 |
218 | // view
219 | if (sqls.containsKey(SqlConstant.VIEW)) {
220 |       List<String> viewList = sqls.get(SqlConstant.VIEW);
221 | if (CollectionUtils.isNotEmpty(viewList)) {
222 | for (String sql : viewList) {
223 |           List<SqlCreateView> views = sqlParserService.sqlViewParser(sql);
224 | if (CollectionUtils.isNotEmpty(views)) {
225 | for (SqlCreateView view : views) {
226 | Table table = tEnv.sqlQuery(view.getQuery().toString());
227 | tEnv.createTemporaryView(view.getViewName().toString(), table);
228 | }
229 | }
230 | }
231 | }
232 | }
233 |
234 | // insert into
235 |     List<String> insertList = sqls.get(SqlConstant.INSERT);
236 | if (CollectionUtils.isEmpty(insertList)) {
237 |       throw new IllegalArgumentException("insert into sql must not be empty");
238 | }
239 |
240 |     Map<String, RichSqlInsert> dmlSqlNodes = new HashMap<>();
241 | for (String sql : insertList) {
242 |       List<RichSqlInsert> richSqlInserts = sqlParserService.sqlDmlParser(sql);
243 | if (CollectionUtils.isNotEmpty(richSqlInserts)) {
244 | for (RichSqlInsert richSqlInsert: richSqlInserts) {
245 | tEnv.sqlUpdate(richSqlInsert.toString());
246 | }
247 | }
248 | }
249 |
250 | return env;
251 | }
252 |
253 | }
254 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 | APPENDIX: How to apply the Apache License to your work.
179 |
180 | To apply the Apache License to your work, attach the following
181 | boilerplate notice, with the fields enclosed by brackets "[]"
182 | replaced with your own identifying information. (Don't include
183 | the brackets!) The text should be enclosed in the appropriate
184 | comment syntax for the file format. We also recommend that a
185 | file or class name and description of purpose be included on the
186 | same "printed page" as the copyright notice for easier
187 | identification within third-party archives.
188 |
189 | Copyright [yyyy] [name of copyright owner]
190 |
191 | Licensed under the Apache License, Version 2.0 (the "License");
192 | you may not use this file except in compliance with the License.
193 | You may obtain a copy of the License at
194 |
195 | http://www.apache.org/licenses/LICENSE-2.0
196 |
197 | Unless required by applicable law or agreed to in writing, software
198 | distributed under the License is distributed on an "AS IS" BASIS,
199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 | See the License for the specific language governing permissions and
201 | limitations under the License.
202 |
--------------------------------------------------------------------------------
/pom.xml:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?>
2 |
18 | <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
19 |   <modelVersion>4.0.0</modelVersion>
20 |
21 |   <groupId>ambition</groupId>
22 |   <artifactId>blink</artifactId>
23 |   <version>v3.0.0</version>
24 |   <modules>
25 |     <module>blink-common</module>
26 |     <module>blink-client</module>
27 |     <module>blink-sql</module>
28 |     <module>blink-stream</module>
29 |     <module>blink-job</module>
30 |     <module>blink-libraries</module>
31 |   </modules>
32 |   <packaging>pom</packaging>
33 |
34 |   <properties>
35 |     <calcite.version>1.21.0</calcite.version>
36 |     <elasticsearch.version>6.3.1</elasticsearch.version>
37 |     <flink.version>1.10.0</flink.version>
38 |     <flink.shaded.version>9.0</flink.shaded.version>
39 |     <hadoop.version>2.7.5</hadoop.version>
40 |     <scala.binary.version>2.11</scala.binary.version>
41 |     <scala.version>2.11.11</scala.version>
42 |
43 |
44 |     <collections.version>3.2.2</collections.version>
45 |     <lang3.version>3.3.2</lang3.version>
46 |   </properties>
47 |
48 |   <dependencyManagement>
49 |     <dependencies>
50 |       <dependency>
51 |         <groupId>junit</groupId>
52 |         <artifactId>junit</artifactId>
53 |         <version>4.12</version>
54 |         <scope>test</scope>
55 |       </dependency>
56 |
57 |       <dependency>
58 |         <groupId>org.apache.commons</groupId>
59 |         <artifactId>commons-lang3</artifactId>
60 |         <version>${lang3.version}</version>
61 |       </dependency>
62 |
63 |       <dependency>
64 |         <groupId>commons-collections</groupId>
65 |         <artifactId>commons-collections</artifactId>
66 |         <version>${collections.version}</version>
67 |       </dependency>
68 |
69 |
70 |       <dependency>
71 |         <groupId>org.apache.flink</groupId>
72 |         <artifactId>flink-core</artifactId>
73 |         <version>${flink.version}</version>
74 |       </dependency>
75 |
76 |       <dependency>
77 |         <groupId>org.apache.flink</groupId>
78 |         <artifactId>flink-runtime_${scala.binary.version}</artifactId>
79 |         <version>${flink.version}</version>
80 |       </dependency>
81 |
82 |       <dependency>
83 |         <groupId>org.apache.flink</groupId>
84 |         <artifactId>flink-table-common</artifactId>
85 |         <version>${flink.version}</version>
86 |       </dependency>
87 |
88 |       <dependency>
89 |         <groupId>org.apache.flink</groupId>
90 |         <artifactId>flink-table-api-java</artifactId>
91 |         <version>${flink.version}</version>
92 |       </dependency>
93 |
94 |       <dependency>
95 |         <groupId>org.apache.flink</groupId>
96 |         <artifactId>flink-table-api-scala_${scala.binary.version}</artifactId>
97 |         <version>${flink.version}</version>
98 |       </dependency>
99 |
100 |       <dependency>
101 |         <groupId>org.apache.flink</groupId>
102 |         <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
103 |         <version>${flink.version}</version>
104 |       </dependency>
105 |
106 |       <dependency>
107 |         <groupId>org.apache.flink</groupId>
108 |         <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
109 |         <version>${flink.version}</version>
110 |       </dependency>
111 |
112 |       <dependency>
113 |         <groupId>org.apache.flink</groupId>
114 |         <artifactId>flink-sql-parser</artifactId>
115 |         <version>${flink.version}</version>
116 |         <exclusions>
117 |           <exclusion>
118 |             <groupId>org.apache.calcite</groupId>
119 |             <artifactId>calcite-core</artifactId>
120 |           </exclusion>
121 |         </exclusions>
122 |       </dependency>
123 |
124 |       <dependency>
125 |         <groupId>org.apache.flink</groupId>
126 |         <artifactId>flink-table-runtime-blink_${scala.binary.version}</artifactId>
127 |         <version>${flink.version}</version>
128 |       </dependency>
129 |
130 |       <dependency>
131 |         <groupId>org.apache.flink</groupId>
132 |         <artifactId>flink-scala_${scala.binary.version}</artifactId>
133 |         <version>${flink.version}</version>
134 |       </dependency>
135 |
136 |       <dependency>
137 |         <groupId>org.apache.flink</groupId>
138 |         <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
139 |         <version>${flink.version}</version>
140 |       </dependency>
141 |
142 |       <dependency>
143 |         <groupId>org.apache.flink</groupId>
144 |         <artifactId>flink-cep_${scala.binary.version}</artifactId>
145 |         <version>${flink.version}</version>
146 |       </dependency>
147 |
148 |       <dependency>
149 |         <groupId>org.apache.flink</groupId>
150 |         <artifactId>flink-shaded-hadoop-2</artifactId>
151 |         <version>${hadoop.version}-${flink.shaded.version}</version>
152 |       </dependency>
153 |
154 |       <dependency>
155 |         <groupId>org.apache.flink</groupId>
156 |         <artifactId>flink-optimizer_${scala.binary.version}</artifactId>
157 |         <version>${flink.version}</version>
158 |       </dependency>
159 |
160 |       <dependency>
161 |         <groupId>org.apache.flink</groupId>
162 |         <artifactId>flink-java</artifactId>
163 |         <version>${flink.version}</version>
164 |       </dependency>
165 |
166 |       <dependency>
167 |         <groupId>org.apache.flink</groupId>
168 |         <artifactId>flink-yarn_${scala.binary.version}</artifactId>
169 |         <version>${flink.version}</version>
170 |       </dependency>
171 |
172 |       <dependency>
173 |         <groupId>org.apache.flink</groupId>
174 |         <artifactId>flink-clients_${scala.binary.version}</artifactId>
175 |         <version>${flink.version}</version>
176 |       </dependency>
177 |
178 |
179 |       <dependency>
180 |         <groupId>org.codehaus.janino</groupId>
181 |         <artifactId>janino</artifactId>
182 |       </dependency>
183 |
184 |
185 |       <dependency>
186 |         <groupId>org.apache.calcite</groupId>
187 |         <artifactId>calcite-core</artifactId>
188 |
189 |         <version>${calcite.version}</version>
190 |         <exclusions>
191 |
206 |           <exclusion>
207 |             <groupId>org.apache.calcite.avatica</groupId>
208 |             <artifactId>avatica-metrics</artifactId>
209 |           </exclusion>
210 |           <exclusion>
211 |             <groupId>com.google.protobuf</groupId>
212 |             <artifactId>protobuf-java</artifactId>
213 |           </exclusion>
214 |           <exclusion>
215 |             <groupId>org.apache.httpcomponents</groupId>
216 |             <artifactId>httpclient</artifactId>
217 |           </exclusion>
218 |           <exclusion>
219 |             <groupId>org.apache.httpcomponents</groupId>
220 |             <artifactId>httpcore</artifactId>
221 |           </exclusion>
222 |           <exclusion>
223 |             <groupId>org.apache.commons</groupId>
224 |             <artifactId>commons-dbcp2</artifactId>
225 |           </exclusion>
226 |           <exclusion>
227 |             <groupId>com.esri.geometry</groupId>
228 |             <artifactId>esri-geometry-api</artifactId>
229 |           </exclusion>
230 |           <exclusion>
231 |             <groupId>com.fasterxml.jackson.dataformat</groupId>
232 |             <artifactId>jackson-dataformat-yaml</artifactId>
233 |           </exclusion>
234 |           <exclusion>
235 |             <groupId>com.yahoo.datasketches</groupId>
236 |             <artifactId>sketches-core</artifactId>
237 |           </exclusion>
238 |           <exclusion>
239 |             <groupId>net.hydromatic</groupId>
240 |             <artifactId>aggdesigner-algorithm</artifactId>
241 |           </exclusion>
242 |         </exclusions>
243 |       </dependency>
244 |
245 |
246 |
247 |       <dependency>
248 |         <groupId>org.elasticsearch</groupId>
249 |         <artifactId>elasticsearch</artifactId>
250 |         <version>${elasticsearch.version}</version>
251 |       </dependency>
252 |
253 |       <dependency>
254 |         <groupId>org.elasticsearch.client</groupId>
255 |         <artifactId>transport</artifactId>
256 |         <version>${elasticsearch.version}</version>
257 |       </dependency>
258 |
259 |     </dependencies>
260 |   </dependencyManagement>
261 |
262 |   <build>
263 |     <plugins>
264 |       <plugin>
265 |         <groupId>org.apache.maven.plugins</groupId>
266 |         <artifactId>maven-compiler-plugin</artifactId>
267 |         <version>3.3</version>
268 |         <configuration>
269 |           <source>1.8</source>
270 |           <target>1.8</target>
271 |         </configuration>
272 |       </plugin>
273 |     </plugins>
274 |   </build>
275 | </project>
--------------------------------------------------------------------------------