• Flink CDC MySQL2Doris case study


    Overview

    Use Flink CDC to synchronize multiple MySQL databases and tables into Doris at the same time.

    Versions

    Flink 1.14.4

    Doris 1.1.0

    Dependencies

     <properties>
            <maven.compiler.source>8</maven.compiler.source>
            <maven.compiler.target>8</maven.compiler.target>
            <scala.version>2.12</scala.version>
            <java.version>1.8</java.version>
            <flink.version>1.14.4</flink.version>
            <fastjson.version>1.2.62</fastjson.version>
            <hadoop.version>2.8.3</hadoop.version>
            <scope.mode>compile</scope.mode>
            <slf4j.version>1.7.30</slf4j.version>
        </properties>

        <dependencies>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-table-api-scala-bridge_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-table-planner_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-streaming-scala_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-clients_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-jdbc_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-kafka_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>com.alibaba</groupId>
                <artifactId>fastjson</artifactId>
                <version>${fastjson.version}</version>
            </dependency>

            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>${slf4j.version}</version>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <version>${slf4j.version}</version>
            </dependency>

            <dependency>
                <groupId>org.apache.doris</groupId>
                <artifactId>flink-doris-connector-1.14_2.12</artifactId>
                <version>1.1.0</version>
            </dependency>
            <dependency>
                <groupId>mysql</groupId>
                <artifactId>mysql-connector-java</artifactId>
                <version>8.0.12</version>
            </dependency>
            <dependency>
                <groupId>com.ververica</groupId>
                <artifactId>flink-connector-mysql-cdc</artifactId>
                <version>2.2.0</version>
                <exclusions>
                    <exclusion>
                        <artifactId>flink-shaded-guava</artifactId>
                        <groupId>org.apache.flink</groupId>
                    </exclusion>
                </exclusions>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-runtime-web_${scala.version}</artifactId>
                <version>${flink.version}</version>
            </dependency>
        </dependencies>
        <build>
            <plugins>
                <plugin>
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>3.2.1</version>
                    <executions>
                        <execution>
                            <id>scala-compile-first</id>
                            <phase>process-resources</phase>
                            <goals>
                                <goal>compile</goal>
                            </goals>
                        </execution>
                        <execution>
                            <id>scala-test-compile</id>
                            <phase>process-test-resources</phase>
                            <goals>
                                <goal>testCompile</goal>
                            </goals>
                        </execution>
                    </executions>
                    <configuration>
                        <args>
                            <arg>-feature</arg>
                        </args>
                    </configuration>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.8.1</version>
                    <configuration>
                        <source>8</source>
                        <target>8</target>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    

    flink-connector-mysql-cdc 2.2.1 consistently throws the following exception:

    java.lang.NoClassDefFoundError: org/apache/flink/shaded/guava18/com/google/common/util/concurrent/ThreadFactoryBuilder

    As a workaround, download flink-sql-connector-mysql-cdc-2.2.0 from the official site and add the jar locally.
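    The error means the connector expects Flink's shaded guava18 package, which is not on the classpath. A quick plain-JDK diagnostic (a sketch; the class name is copied verbatim from the stack trace above) can confirm whether the shaded class is visible before submitting the job:

    ```java
    public class GuavaCheck {
        static boolean shadedGuavaPresent() {
            try {
                // Class name taken from the NoClassDefFoundError above.
                Class.forName("org.apache.flink.shaded.guava18.com.google.common.util.concurrent.ThreadFactoryBuilder");
                return true;
            } catch (ClassNotFoundException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            System.out.println(shadedGuavaPresent()
                    ? "shaded guava18 present"
                    : "shaded guava18 missing - expect NoClassDefFoundError at runtime");
        }
    }
    ```

    On a classpath without the downloaded flink-sql-connector-mysql-cdc jar, this reports the class as missing.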

    Prepare the MySQL data

    CREATE DATABASE emp_1;
     USE emp_1;
    CREATE TABLE employees_1 (
        emp_no      INT             NOT NULL,
        birth_date  DATE            NOT NULL,
        first_name  VARCHAR(14)     NOT NULL,
        last_name   VARCHAR(16)     NOT NULL,
        gender      ENUM ('M','F')  NOT NULL,    
        hire_date   DATE            NOT NULL,
        PRIMARY KEY (emp_no)
    );
    
    INSERT INTO `employees_1` VALUES (10001,'1953-09-02','Georgi','Facello','M','1986-06-26'),
    (10002,'1964-06-02','Bezalel','Simmel','F','1985-11-21'),
    (10003,'1959-12-03','Parto','Bamford','M','1986-08-28'),
    (10004,'1954-05-01','Chirstian','Koblick','M','1986-12-01'),
    (10005,'1955-01-21','Kyoichi','Maliniak','M','1989-09-12'),
    (10006,'1953-04-20','Anneke','Preusig','F','1989-06-02'),
    (10007,'1957-05-23','Tzvetan','Zielinski','F','1989-02-10'),
    (10008,'1958-02-19','Saniya','Kalloufi','M','1994-09-15'),
    (10009,'1952-04-19','Sumant','Peac','F','1985-02-18'),
    (10010,'1963-06-01','Duangkaew','Piveteau','F','1989-08-24'),
    (10011,'1953-11-07','Mary','Sluis','F','1990-01-22'),
    (10012,'1960-10-04','Patricio','Bridgland','M','1992-12-18'),
    (10013,'1963-06-07','Eberhardt','Terkki','M','1985-10-20'),
    (10014,'1956-02-12','Berni','Genin','M','1987-03-11'),
    (10015,'1959-08-19','Guoxiang','Nooteboom','M','1987-07-02'),
    (10016,'1961-05-02','Kazuhito','Cappelletti','M','1995-01-27'),
    (10017,'1958-07-06','Cristinel','Bouloucos','F','1993-08-03'),
    (10018,'1954-06-19','Kazuhide','Peha','F','1987-04-03'),
    (10019,'1953-01-23','Lillian','Haddadi','M','1999-04-30'),
    (10020,'1952-12-24','Mayuko','Warwick','M','1991-01-26'),
    (10021,'1960-02-20','Ramzi','Erde','M','1988-02-10'),
    (10022,'1952-07-08','Shahaf','Famili','M','1995-08-22'),
    (10023,'1953-09-29','Bojan','Montemayor','F','1989-12-17'),
    (10024,'1958-09-05','Suzette','Pettey','F','1997-05-19'),
    (10025,'1958-10-31','Prasadram','Heyers','M','1987-08-17'),
    (10026,'1953-04-03','Yongqiao','Berztiss','M','1995-03-20'),
    (10027,'1962-07-10','Divier','Reistad','F','1989-07-07'),
    (10028,'1963-11-26','Domenick','Tempesti','M','1991-10-22'),
    (10029,'1956-12-13','Otmar','Herbst','M','1985-11-20'),
    (10030,'1958-07-14','Elvis','Demeyer','M','1994-02-17'),
    (10031,'1959-01-27','Karsten','Joslin','M','1991-09-01'),
    (10032,'1960-08-09','Jeong','Reistad','F','1990-06-20'),
    (10033,'1956-11-14','Arif','Merlo','M','1987-03-18'),
    (10034,'1962-12-29','Bader','Swan','M','1988-09-21'),
    (10035,'1953-02-08','Alain','Chappelet','M','1988-09-05'),
    (10036,'1959-08-10','Adamantios','Portugali','M','1992-01-03');
    
    CREATE TABLE employees_2 (
        emp_no      INT             NOT NULL,
        birth_date  DATE            NOT NULL,
        first_name  VARCHAR(14)     NOT NULL,
        last_name   VARCHAR(16)     NOT NULL,
        gender      ENUM ('M','F')  NOT NULL,    
        hire_date   DATE            NOT NULL,
        PRIMARY KEY (emp_no)
    );
    
    INSERT INTO `employees_2` VALUES (10037,'1963-07-22','Pradeep','Makrucki','M','1990-12-05'),
    (10038,'1960-07-20','Huan','Lortz','M','1989-09-20'),
    (10039,'1959-10-01','Alejandro','Brender','M','1988-01-19'),
    (10040,'1959-09-13','Weiyi','Meriste','F','1993-02-14'),
    (10041,'1959-08-27','Uri','Lenart','F','1989-11-12'),
    (10042,'1956-02-26','Magy','Stamatiou','F','1993-03-21'),
    (10043,'1960-09-19','Yishay','Tzvieli','M','1990-10-20'),
    (10044,'1961-09-21','Mingsen','Casley','F','1994-05-21'),
    (10045,'1957-08-14','Moss','Shanbhogue','M','1989-09-02'),
    (10046,'1960-07-23','Lucien','Rosenbaum','M','1992-06-20'),
    (10047,'1952-06-29','Zvonko','Nyanchama','M','1989-03-31'),
    (10048,'1963-07-11','Florian','Syrotiuk','M','1985-02-24'),
    (10049,'1961-04-24','Basil','Tramer','F','1992-05-04'),
    (10050,'1958-05-21','Yinghua','Dredge','M','1990-12-25'),
    (10051,'1953-07-28','Hidefumi','Caine','M','1992-10-15'),
    (10052,'1961-02-26','Heping','Nitsch','M','1988-05-21'),
    (10053,'1954-09-13','Sanjiv','Zschoche','F','1986-02-04'),
    (10054,'1957-04-04','Mayumi','Schueller','M','1995-03-13');
    
    
    CREATE DATABASE emp_2;
    
    USE emp_2;
    
    CREATE TABLE employees_1 (
        emp_no      INT             NOT NULL,
        birth_date  DATE            NOT NULL,
        first_name  VARCHAR(14)     NOT NULL,
        last_name   VARCHAR(16)     NOT NULL,
        gender      ENUM ('M','F')  NOT NULL,    
        hire_date   DATE            NOT NULL,
        PRIMARY KEY (emp_no)
    );
    
    
    INSERT INTO `employees_1` VALUES  (10055,'1956-06-06','Georgy','Dredge','M','1992-04-27'),
    (10056,'1961-09-01','Brendon','Bernini','F','1990-02-01'),
    (10057,'1954-05-30','Ebbe','Callaway','F','1992-01-15'),
    (10058,'1954-10-01','Berhard','McFarlin','M','1987-04-13'),
    (10059,'1953-09-19','Alejandro','McAlpine','F','1991-06-26'),
    (10060,'1961-10-15','Breannda','Billingsley','M','1987-11-02'),
    (10061,'1962-10-19','Tse','Herber','M','1985-09-17'),
    (10062,'1961-11-02','Anoosh','Peyn','M','1991-08-30'),
    (10063,'1952-08-06','Gino','Leonhardt','F','1989-04-08'),
    (10064,'1959-04-07','Udi','Jansch','M','1985-11-20'),
    (10065,'1963-04-14','Satosi','Awdeh','M','1988-05-18'),
    (10066,'1952-11-13','Kwee','Schusler','M','1986-02-26'),
    (10067,'1953-01-07','Claudi','Stavenow','M','1987-03-04'),
    (10068,'1962-11-26','Charlene','Brattka','M','1987-08-07'),
    (10069,'1960-09-06','Margareta','Bierman','F','1989-11-05'),
    (10070,'1955-08-20','Reuven','Garigliano','M','1985-10-14'),
    (10071,'1958-01-21','Hisao','Lipner','M','1987-10-01'),
    (10072,'1952-05-15','Hironoby','Sidou','F','1988-07-21'),
    (10073,'1954-02-23','Shir','McClurg','M','1991-12-01'),
    (10074,'1955-08-28','Mokhtar','Bernatsky','F','1990-08-13'),
    (10075,'1960-03-09','Gao','Dolinsky','F','1987-03-19'),
    (10076,'1952-06-13','Erez','Ritzmann','F','1985-07-09'),
    (10077,'1964-04-18','Mona','Azuma','M','1990-03-02'),
    (10078,'1959-12-25','Danel','Mondadori','F','1987-05-26'),
    (10079,'1961-10-05','Kshitij','Gils','F','1986-03-27'),
    (10080,'1957-12-03','Premal','Baek','M','1985-11-19'),
    (10081,'1960-12-17','Zhongwei','Rosen','M','1986-10-30'),
    (10082,'1963-09-09','Parviz','Lortz','M','1990-01-03'),
    (10083,'1959-07-23','Vishv','Zockler','M','1987-03-31'),
    (10084,'1960-05-25','Tuval','Kalloufi','M','1995-12-15');
    
    
    CREATE TABLE employees_2(
        emp_no      INT             NOT NULL,
        birth_date  DATE            NOT NULL,
        first_name  VARCHAR(14)     NOT NULL,
        last_name   VARCHAR(16)     NOT NULL,
        gender      ENUM ('M','F')  NOT NULL,    
        hire_date   DATE            NOT NULL,
        PRIMARY KEY (emp_no)
    );
    
    INSERT INTO `employees_2` VALUES (10085,'1962-11-07','Kenroku','Malabarba','M','1994-04-09'),
    (10086,'1962-11-19','Somnath','Foote','M','1990-02-16'),
    (10087,'1959-07-23','Xinglin','Eugenio','F','1986-09-08'),
    (10088,'1954-02-25','Jungsoon','Syrzycki','F','1988-09-02'),
    (10089,'1963-03-21','Sudharsan','Flasterstein','F','1986-08-12'),
    (10090,'1961-05-30','Kendra','Hofting','M','1986-03-14'),
    (10091,'1955-10-04','Amabile','Gomatam','M','1992-11-18'),
    (10092,'1964-10-18','Valdiodio','Niizuma','F','1989-09-22'),
    (10093,'1964-06-11','Sailaja','Desikan','M','1996-11-05'),
    (10094,'1957-05-25','Arumugam','Ossenbruggen','F','1987-04-18'),
    (10095,'1965-01-03','Hilari','Morton','M','1986-07-15'),
    (10096,'1954-09-16','Jayson','Mandell','M','1990-01-14'),
    (10097,'1952-02-27','Remzi','Waschkowski','M','1990-09-15'),
    (10098,'1961-09-23','Sreekrishna','Servieres','F','1985-05-13'),
    (10099,'1956-05-25','Valter','Sullins','F','1988-10-18'),
    (10100,'1953-04-21','Hironobu','Haraldson','F','1987-09-21'),
    (10101,'1952-04-15','Perla','Heyers','F','1992-12-28'),
    (10102,'1959-11-04','Paraskevi','Luby','F','1994-01-26'),
    (10103,'1953-11-26','Akemi','Birch','M','1986-12-02'),
    (10104,'1961-11-19','Xinyu','Warwick','M','1987-04-16'),
    (10105,'1962-02-05','Hironoby','Piveteau','M','1999-03-23'),
    (10106,'1952-08-29','Eben','Aingworth','M','1990-12-19'),
    (10107,'1956-06-13','Dung','Baca','F','1994-03-22'),
    (10108,'1952-04-07','Lunjin','Giveon','M','1986-10-02'),
    (10109,'1958-11-25','Mariusz','Prampolini','F','1993-06-16'),
    (10110,'1957-03-07','Xuejia','Ullian','F','1986-08-22'),
    (10111,'1963-08-29','Hugo','Rosis','F','1988-06-19'),
    (10112,'1963-08-13','Yuichiro','Swick','F','1985-10-08'),
    (10113,'1963-11-13','Jaewon','Syrzycki','M','1989-12-24'),
    (10114,'1957-02-16','Munir','Demeyer','F','1992-07-17'),
    (10115,'1964-12-25','Chikara','Rissland','M','1986-01-23'),
    (10116,'1955-08-26','Dayanand','Czap','F','1985-05-28');
    

    Prepare the Doris table

    create database test_db;
    use test_db;
    CREATE TABLE all_employees_info (
        emp_no       int NOT NULL,
        birth_date   date,
        first_name   varchar(20),
        last_name    varchar(20),
        gender       char(2),
        hire_date    date,
        database_name varchar(50),
        table_name    varchar(200)
    )
    UNIQUE KEY(`emp_no`, `birth_date`)
    DISTRIBUTED BY HASH(`birth_date`) BUCKETS 1
    PROPERTIES (
    "replication_allocation" = "tag.location.default: 1"
    );
    

    Because the Doris table uses UNIQUE KEY(emp_no, birth_date), updating either of those two columns in MySQL leaves an extra row in Doris: the updated row arrives with a new key and is stored alongside the old one instead of replacing it.
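    The effect can be illustrated with a minimal plain-Java sketch (not Doris code): model the UNIQUE KEY table as a map keyed by (emp_no, birth_date). An update that changes birth_date carries a different key, so it lands as a second entry rather than replacing the first.

    ```java
    import java.util.*;

    public class UniqueKeyDemo {
        // Model a Doris UNIQUE KEY table as a map keyed by (emp_no, birth_date).
        static Map<List<Object>, String> table = new HashMap<>();

        static void upsert(int empNo, String birthDate, String name) {
            table.put(Arrays.asList(empNo, birthDate), name);
        }

        public static void main(String[] args) {
            upsert(10001, "1953-09-02", "Georgi Facello"); // initial CDC insert

            // MySQL-side UPDATE that changes birth_date: the row now carries a
            // different key, so it is stored as a second entry instead of
            // replacing the first - the "extra row" effect described above.
            upsert(10001, "1999-01-01", "Georgi Facello");

            System.out.println(table.size()); // 2 entries for the same emp_no
        }
    }
    ```

    Keying the table on emp_no alone would avoid this, at the cost of changing the Doris data model.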

    Code example

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
    import java.util.UUID;

    final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            env.enableCheckpointing(10000);
            env.setParallelism(1);
            final StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
            // register a table in the catalog
            tEnv.executeSql(
                    "CREATE TABLE cdc_test_source (\n" +
                            "  emp_no INT,\n" +
                            "  birth_date DATE,\n" +
                            "  first_name STRING,\n" +
                            "  last_name STRING,\n" +
                            "  gender STRING,\n" +
                            "  hire_date  STRING,\n" +
                            "  database_name STRING METADATA VIRTUAL,\n" +
                            "  table_name STRING METADATA VIRTUAL,\n" +
                            "  PRIMARY KEY (`emp_no`) NOT ENFORCED  \n" +
                            ") WITH (\n" +
                            "  'connector' = 'mysql-cdc',\n" +
                            "  'hostname' = '192.168.22.xxx',\n" +
                            "  'port' = '3306',\n" +
                            "  'username' = 'xxx',\n" +
                            "  'password' = 'xxx',\n" +
                            "  'database-name' = 'emp_[0-9]+',\n" +
                            "  'table-name' = 'employees_[0-9]+'\n" +
                            ")");
    
        String label = UUID.randomUUID().toString();
            //doris table
            tEnv.executeSql(
                    "CREATE TABLE doris_test_sink (" +
                            "  emp_no INT,\n" +
                            "  birth_date STRING,\n" +
                            "  first_name STRING,\n" +
                            "  last_name STRING,\n" +
                            "  gender STRING,\n" +
                            "  hire_date  STRING\n" +
                            ") " +
                            "WITH (\n" +
                            "  'connector' = 'doris',\n" +
                            "  'fenodes' = '172.8.10.xxx:8030',\n" +
                            "  'table.identifier' = 'test_db.all_employees_info',\n" +
                            "  'username' = 'xxx',\n" +
                            "  'password' = 'xxx',\n" +
                    /* doris stream load label, In the exactly-once scenario,
                       the label is globally unique and must be restarted from the latest checkpoint when restarting.
                       Exactly-once semantics can be turned off via sink.enable-2pc. */
                            "  'sink.label-prefix' ='" + label + "',\n" +
                            "  'sink.properties.format' = 'json',\n" +       //json data format
                            "  'sink.properties.read_json_by_line' = 'true'\n" +
                            ")");
    
        // stream rows from the MySQL CDC source into the Doris sink
        tEnv.executeSql("INSERT INTO doris_test_sink SELECT emp_no, CAST(birth_date AS STRING) AS birth_date, first_name, last_name, gender, CAST(hire_date AS STRING) AS hire_date FROM cdc_test_source");
    

    Once the job is running, insert and update some rows in MySQL to verify the synchronization.

    Notes

    Multi-database, multi-table matching comes from:

    "  'database-name' = 'emp_[0-9]+',\n" +
    "  'table-name' = 'employees_[0-9]+'\n" +

    Both options accept regular expressions, so every matching database and table is picked up automatically.
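    The patterns match against the full database/table name, comparable to Java's Pattern.matches (a plain-JDK sketch; "emp_backup" is a hypothetical non-matching name used for contrast):

    ```java
    import java.util.regex.Pattern;

    public class CdcPatternCheck {
        static final Pattern DB  = Pattern.compile("emp_[0-9]+");
        static final Pattern TBL = Pattern.compile("employees_[0-9]+");

        public static void main(String[] args) {
            System.out.println(DB.matcher("emp_1").matches());        // true
            System.out.println(DB.matcher("emp_2").matches());        // true
            System.out.println(DB.matcher("emp_backup").matches());   // false: suffix not numeric
            System.out.println(TBL.matcher("employees_2").matches()); // true
        }
    }
    ```

    Both emp_1 and emp_2 (and any future emp_<digits> database) are captured without changing the job.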

    The source database and table names come from the metadata columns:
    "  database_name STRING METADATA VIRTUAL,\n" +
    "  table_name STRING METADATA VIRTUAL,\n" +
    Without these declarations, the database and table name of each MySQL record cannot be read. (Note that the sink table in the code example above does not include these two columns; to persist them into all_employees_info, add them to the sink DDL and the INSERT as well.)

    For the Doris/Flink column type mapping, see the official documentation:

    Flink Doris Connector - Apache Doris

  • Original post: https://blog.csdn.net/weixin_45399602/article/details/127921908