• Resolving the guava version conflict between Flink CDC and Flink when developing locally in IDEA


    1. Cause of the conflict

    When using Flink CDC 2.2.0, the job fails with an error saying the ThreadFactoryBuilder class cannot be found, as shown below:

    java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava18/com/google/common/util/concurrent/ThreadFactoryBuilder
    

    This happens because Flink CDC depends on flink-shaded-guava version 18.0-13.0, as shown below:

            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-shaded-guava</artifactId>
                <version>18.0-13.0</version>
            </dependency>

    Flink 1.14.4, on the other hand, depends on flink-shaded-guava version 30.1.1-jre-14.0, as shown below:

            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-shaded-guava</artifactId>
                <version>30.1.1-jre-14.0</version>
            </dependency>

    The project therefore resolves the 30.1.1-jre-14.0 version of guava, so the classes shaded under the guava 18.0-13.0 package cannot be found, which is why the error is thrown.

    Now that the cause has been identified, we can work out a solution.
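
    If you want to confirm which flink-shaded-guava version your own project actually resolves before changing anything, you can filter the Maven dependency tree. This is just a quick check, not part of the original steps, and the exact output depends on your project:

    mvn dependency:tree -Dincludes=org.apache.flink:flink-shaded-guava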

    2. Solution

    We need to compile the Flink CDC source code ourselves. The build process is as follows:

    1. Download the source package and extract it, as shown below
    [root@bigdata001 ~]# wget https://github.com/ververica/flink-cdc-connectors/archive/refs/tags/release-2.2.0.tar.gz
    [root@bigdata001 ~]#
    [root@bigdata001 ~]# tar -zxvf release-2.2.0.tar.gz
    [root@bigdata001 ~]#
    [root@bigdata001 ~]# cd flink-cdc-connectors-release-2.2.0/
    [root@bigdata001 flink-cdc-connectors-release-2.2.0]# 
    
    2. Modify pom.xml so that the flink-shaded-guava version becomes 30.1.1-jre-14.0, as shown below
    [root@bigdata001 flink-cdc-connectors-release-2.2.0]# cat pom.xml
    ......(omitted)......
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-shaded-guava</artifactId>
                <version>30.1.1-jre-14.0</version>
            </dependency>
    ......(omitted)......
    [root@bigdata001 flink-cdc-connectors-release-2.2.0]#
    
    3. Modify the source code: copy the flink-cdc-connectors-release-2.2.0 directory to your local machine, open it in IDEA, press Ctrl + Shift + R to replace every occurrence of guava18 with guava30, then copy the modified directory back over the flink-cdc-connectors-release-2.2.0 directory on the CentOS 7 server.
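
    If you would rather do the replacement directly on the server instead of through IDEA, a command-line equivalent is sketched below. This is an alternative I am assuming, not part of the original steps, so review the changed files before building:

    [root@bigdata001 flink-cdc-connectors-release-2.2.0]# grep -rl 'guava18' . | xargs sed -i 's/guava18/guava30/g'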

    4. Compile the source code, as shown below

    [root@bigdata001 flink-cdc-connectors-release-2.2.0]# mvn clean install -Dmaven.test.skip=true
    [INFO] Scanning for projects...
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Build Order:
    [INFO] 
    [INFO] flink-cdc-connectors                                               [pom]
    [INFO] flink-connector-debezium                                           [jar]
    [INFO] flink-cdc-base                                                     [jar]
    [INFO] flink-connector-test-util                                          [jar]
    [INFO] flink-connector-mysql-cdc                                          [jar]
    [INFO] flink-connector-postgres-cdc                                       [jar]
    [INFO] flink-connector-oracle-cdc                                         [jar]
    [INFO] flink-connector-mongodb-cdc                                        [jar]
    [INFO] flink-connector-oceanbase-cdc                                      [jar]
    [INFO] flink-connector-sqlserver-cdc                                      [jar]
    [INFO] flink-connector-tidb-cdc                                           [jar]
    [INFO] flink-sql-connector-mysql-cdc                                      [jar]
    [INFO] flink-sql-connector-postgres-cdc                                   [jar]
    [INFO] flink-sql-connector-mongodb-cdc                                    [jar]
    [INFO] flink-sql-connector-oracle-cdc                                     [jar]
    [INFO] flink-sql-connector-oceanbase-cdc                                  [jar]
    [INFO] flink-sql-connector-sqlserver-cdc                                  [jar]
    [INFO] flink-sql-connector-tidb-cdc                                       [jar]
    [INFO] flink-cdc-e2e-tests                                                [jar]
    [INFO] 
    [INFO] -----------------< com.ververica:flink-cdc-connectors >-----------------
    [INFO] Building flink-cdc-connectors 2.2.0                               [1/19]
    ......(output omitted)......
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary for flink-cdc-connectors 2.2.0:
    [INFO] 
    [INFO] flink-cdc-connectors ............................... SUCCESS [  2.666 s]
    [INFO] flink-connector-debezium ........................... SUCCESS [  4.044 s]
    [INFO] flink-cdc-base ..................................... SUCCESS [  2.823 s]
    [INFO] flink-connector-test-util .......................... SUCCESS [  1.013 s]
    [INFO] flink-connector-mysql-cdc .......................... SUCCESS [  3.430 s]
    [INFO] flink-connector-postgres-cdc ....................... SUCCESS [  1.242 s]
    [INFO] flink-connector-oracle-cdc ......................... SUCCESS [  1.192 s]
    [INFO] flink-connector-mongodb-cdc ........................ SUCCESS [  1.806 s]
    [INFO] flink-connector-oceanbase-cdc ...................... SUCCESS [  1.285 s]
    [INFO] flink-connector-sqlserver-cdc ...................... SUCCESS [  0.747 s]
    [INFO] flink-connector-tidb-cdc ........................... SUCCESS [ 36.752 s]
    [INFO] flink-sql-connector-mysql-cdc ...................... SUCCESS [  8.316 s]
    [INFO] flink-sql-connector-postgres-cdc ................... SUCCESS [  5.348 s]
    [INFO] flink-sql-connector-mongodb-cdc .................... SUCCESS [  5.176 s]
    [INFO] flink-sql-connector-oracle-cdc ..................... SUCCESS [  7.375 s]
    [INFO] flink-sql-connector-oceanbase-cdc .................. SUCCESS [  5.118 s]
    [INFO] flink-sql-connector-sqlserver-cdc .................. SUCCESS [  4.721 s]
    [INFO] flink-sql-connector-tidb-cdc ....................... SUCCESS [ 18.166 s]
    [INFO] flink-cdc-e2e-tests ................................ SUCCESS [  2.886 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time:  01:54 min
    [INFO] Finished at: 2022-06-09T09:08:57+08:00
    [INFO] ------------------------------------------------------------------------
    [root@bigdata001 flink-cdc-connectors-release-2.2.0]#
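
    As an optional sanity check (not part of the original steps), you can list the contents of the rebuilt uber jar and confirm that the relocated guava 30 classes are bundled, roughly like this:

    [root@bigdata001 flink-cdc-connectors-release-2.2.0]# jar tf flink-sql-connector-mysql-cdc/target/flink-sql-connector-mysql-cdc-2.2.0.jar | grep 'shaded/guava30' | head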
    
    5. Copy flink-sql-connector-mysql-cdc/target/flink-sql-connector-mysql-cdc-2.2.0.jar to your local machine and reference it from the project, and comment out the original flink-connector-mysql-cdc dependency in pom.xml; after that the project can be run locally (a pom.xml sketch follows after these steps).

    6. If you are not fully comfortable with this, you can use the self-compiled flink-sql-connector-mysql-cdc-2.2.0.jar only for local development and use the jar provided by the Flink CDC project on GitHub in the production environment.
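
    For step 5, here is a minimal sketch of what the relevant part of the project's pom.xml could look like. It assumes the copied jar has been installed into the local Maven repository (for example with mvn install:install-file), or that the project is built on the machine where mvn clean install was run; the coordinates follow the build log above, so adjust them to your setup:

    <!-- original dependency, commented out: its shaded guava18 classes conflict with Flink 1.14.4 -->
    <!--
    <dependency>
        <groupId>com.ververica</groupId>
        <artifactId>flink-connector-mysql-cdc</artifactId>
        <version>2.2.0</version>
    </dependency>
    -->

    <!-- rebuilt uber jar, resolved from the local Maven repository -->
    <dependency>
        <groupId>com.ververica</groupId>
        <artifactId>flink-sql-connector-mysql-cdc</artifactId>
        <version>2.2.0</version>
    </dependency>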

  • Original article: https://blog.csdn.net/yy8623977/article/details/125186258