Starting from a demo, this post walks through the full process of setting up a Paimon/Flink project and records the pitfalls along the way.
Creating a Flink project
Create the Flink project in IDEA. Since there is no official Flink archetype, the project has to be set up by hand.
Reference: "Quickly creating a Flink project in IDEA" (idea快速创建flink项目). With that done, the Flink project skeleton is in place.
Note: part of the pom file must be commented out, otherwise running the project fails with:
Error: A JNI error has occurred, please check your installation and try again

Setting up a Flink pseudo-cluster
From the Flink download page, pick the matching version and download the archive.
After extraction, the contents look like this:

In the bin directory, run the start-cluster.bat script, then open localhost:8081 in a browser to see the Flink web UI.

Newer Flink releases no longer ship the .bat scripts; see "Fix for new Flink versions without bat startup files" (flink新版本无bat启动文件的解决办法).
Filling in the missing dependencies
With the Flink skeleton ready, follow "Apache Paimon, the new-generation data-lake storage: an introductory demo" (新一代数据湖存储技术Apache Paimon入门Demo) and write a simple Paimon program. Along the way, several missing POM dependencies must be filled in. They cause no compile errors, but as soon as the program runs, all sorts of exceptions are thrown:
java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
Unable to create catalog xxx
Unsupported SQL query! executeSql()
Here are all the required pom dependencies:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge</artifactId>
    <version>1.18.0</version>
</dependency>
<dependency>
    <groupId>org.apache.paimon</groupId>
    <artifactId>paimon-flink-1.18</artifactId>
    <version>0.6.0-incubating</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-loader</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-runtime</artifactId>
    <version>1.18.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-base</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j-impl</artifactId>
    <version>${log4j.version}</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>${log4j.version}</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>${log4j.version}</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.3</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>3.2.3</version>
</dependency>
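The `${flink.version}` and `${log4j.version}` placeholders assume a `<properties>` block in the same pom. A sketch, with `flink.version` matching the hard-coded `1.18.0` above and the log4j value being an assumption you should adjust to your setup:

```xml
<properties>
    <flink.version>1.18.0</flink.version>
    <log4j.version>2.17.1</log4j.version>
</properties>
```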
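With these dependencies in place, a minimal Paimon demo in the spirit of the referenced article might look like the sketch below. The catalog name, warehouse path, and table name are illustrative assumptions, not taken from the original article:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PaimonDemo {
    public static void main(String[] args) {
        // flink-table-planner-loader and flink-table-runtime must be on the
        // classpath, otherwise executeSql() fails with "Unsupported SQL query!".
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Creating a Paimon catalog needs paimon-flink-1.18 plus the Hadoop jars;
        // without them you get ClassNotFoundException for
        // org.apache.hadoop.conf.Configuration and "Unable to create catalog".
        // The local warehouse path is an assumption.
        tEnv.executeSql("CREATE CATALOG paimon_catalog WITH ("
                + "'type'='paimon',"
                + "'warehouse'='file:/tmp/paimon')");
        tEnv.executeSql("USE CATALOG paimon_catalog");

        tEnv.executeSql("CREATE TABLE IF NOT EXISTS word_count ("
                + "word STRING PRIMARY KEY NOT ENFORCED,"
                + "cnt BIGINT)");
    }
}
```

If any of the three runtime errors listed earlier appear at this point, the corresponding dependency above is the usual culprit.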