• Compiling and Packaging Hadoop from Source


    Preface

    This post records the process of compiling and packaging Hadoop from source. Based on the documentation in the source tree, I initially thought packaging was only supported on Unix and Mac, not Windows, so I built on my CentOS 7 virtual machine. Later I noticed that a later part of the documentation also covers building on Windows, though that requires installing Visual Studio 2010, which is probably no simpler than building in a VM. If you want to try building on Windows, see the "Building on Windows" section of BUILDING.txt in the source tree.

    Getting the code

    Since I had not downloaded the Hadoop source before, the first step is to clone it:

    git clone https://github.com/apache/hadoop.git
    

    While cloning with git, you may hit errors like the following:

    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/java/org/apache/hadoop/yarn/server/resourcemanager/monitor/capacity/mockframework/ProportionalCapacityPreemptionPolicyMockFramework.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-documentstore/src/main/java/org/apache/hadoop/yarn/server/timelineservice/documentstore/collection/document/flowactivity/FlowActivityDocument.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/reader/filter/TimelineFilterUtils.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/reader/filter/package-info.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/storage/HBaseStorageMonitor.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/storage/HBaseTimelineReaderImpl.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/storage/HBaseTimelineSchemaCreator.java: Filename too long
    error: unable to create file hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/storage/HBaseTimelineWriterImpl.java: Filename too long
    fatal: cannot create directory at 'hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client/src/main/java/org/apache/hadoop/yarn/server/timelineservice/storage/application': Filename too long
    

    The files cannot be created because the paths are too long. The fix is a single git setting:

    git config --global core.longpaths true
    

    Delete the files left over from the failed clone and clone again. Then switch to the branch you want to build; I used branch-3.3.1 for version 3.3.1.

    It is best to clone the code directly on the VM. If you clone on Windows and then upload the tree to the VM, the build will later fail because the shell scripts end up with Windows line endings; the fix is described further below.
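The steps above can be sketched as one sequence (branch name branch-3.3.1 taken from this post; core.longpaths is only needed if you actually hit the "Filename too long" error):

```shell
# Raise git's path-length limit, then clone and switch to the release branch.
git config --global core.longpaths true
git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout branch-3.3.1
```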

    Environment

    BUILDING.txt

    BUILDING.txt in the source tree describes the required dependencies:

    ----------------------------------------------------------------------------------
    Requirements:
    
    * Unix System
    * JDK 1.8
    * Maven 3.3 or later
    * Protocol Buffers 3.7.1 (if compiling native code)
    * CMake 3.1 or newer (if compiling native code)
    * Zlib devel (if compiling native code)
    * Cyrus SASL devel (if compiling native code)
    * One of the compilers that support thread_local storage: GCC 4.8.1 or later, Visual Studio,
      Clang (community version), Clang (version for iOS 9 and later) (if compiling native code)
    * openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
    * Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
    * Doxygen ( if compiling libhdfspp and generating the documents )
    * Internet connection for first build (to fetch all Maven and Hadoop dependencies)
    * python (for releasedocs)
    * bats (for shell code testing)
    * Node.js / bower / Ember-cli (for YARN UI v2 building)
    
    ----------------------------------------------------------------------------------
    

    Dependencies already installed

    • Unix System: CentOS 7
    • JDK 1.8 (1.8.0_45), already used for development
    • Maven 3.3 or later (3.8.1), already used for development
    • git (for cloning the code)
    • Python 3.8.0 (installed earlier for other work; not required)
    • Node v12.16.3

    Native libraries

    As the list shows, building the Hadoop native libraries requires quite a few extra dependencies; skipping them makes the build much simpler. I chose to build them. If you are unfamiliar with the Hadoop native libraries, it is worth looking them up.

    The Hadoop documentation provides install commands for Ubuntu 14.04. Since this machine runs CentOS, I tried simply swapping apt-get for yum.
    (It turns out commands for CentOS are also provided, but for CentOS 8. I do not know how different CentOS 8 and 7 are, and I did not try them, since by then my dependencies were installed and the build had succeeded.)

    yum install -y   build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev libsasl2-dev
    

    The result: autoconf and automake were already installed, while build-essential, zlib1g-dev, pkg-config, libssl-dev, and libsasl2-dev could not be found (those are Debian/Ubuntu package names), so only cmake was actually installed:

    No package build-essential available.
    Package autoconf-2.69-11.el7.noarch already installed and latest version
    Package automake-1.13.4-3.el7.noarch already installed and latest version
    No package zlib1g-dev available.
    No package pkg-config available.
    No package libssl-dev available.
    No package libsasl2-dev available
    Installed:
      cmake.x86_64 0:2.8.12.2-2.el7
    Dependency Installed:
      libarchive.x86_64 0:3.1.2-14.el7_7
    Updated:
      libtool.x86_64 0:2.4.2-22.el7_3
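The packages yum could not find are Debian/Ubuntu names. A sketch of an equivalent command using what I believe are the standard CentOS 7 package names (not from the original post; verify against your repos):

```shell
# CentOS 7 equivalents of the Ubuntu names in BUILDING.txt:
#   build-essential -> gcc gcc-c++ make
#   zlib1g-dev      -> zlib-devel
#   pkg-config      -> pkgconfig
#   libssl-dev      -> openssl-devel
#   libsasl2-dev    -> cyrus-sasl-devel
yum install -y gcc gcc-c++ make autoconf automake libtool cmake \
  zlib-devel pkgconfig openssl-devel cyrus-sasl-devel
```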
    

    However, the installed cmake is version 2.8:

    $ cmake -version
    cmake version 2.8.12.2
    

    The required version is >= 3.1, so this does not qualify and an upgrade is needed. The install process below is based on what I found online:
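A quick way to check a requirement like ">= 3.1" from a script is version-aware sorting with GNU sort -V (a sketch, not from the original post):

```shell
required="3.1"
# First line of `cmake --version` looks like: cmake version 2.8.12.2
have="$(cmake --version | awk 'NR==1{print $3}')"
# sort -V orders version strings numerically; if the smaller of the two
# is the requirement itself, the installed version is new enough.
if [ "$(printf '%s\n' "$required" "$have" | sort -V | head -n1)" = "$required" ]; then
  echo "cmake $have satisfies >= $required"
else
  echo "cmake $have is too old; need >= $required"
fi
```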

    Upgrading cmake

    First, remove the old 2.8 cmake:

    yum -y remove cmake
    

    Download the cmake source tarball: https://cmake.org/files/v3.23/cmake-3.23.0-rc1.tar.gz

    Extract it:

    tar -zxvf  cmake-3.23.0-rc1.tar.gz
    

    Configure the build:

    cd cmake-3.23.0-rc1
    ./configure
    

    The configure step failed with:

    -- Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the system variable OPENSSL_ROOT_DIR (missing: OPENSSL_CRYPTO_LIBRARY OPENSSL_INCLUDE_DIR)
    

    OpenSSL is missing. It is also listed in the Requirements above, so install it first:

    yum install openssl openssl-devel -y
    

    After installing OpenSSL, rerun ./configure for cmake.

    Once configuration succeeds, build and install:

    make -j$(nproc)    # slow step
    sudo make install
    

    After installation, verify the cmake version:

    $ cmake -version
    cmake version 3.23.0-rc1
    
    CMake suite maintained and supported by Kitware (kitware.com/cmake).
    

    Reference: https://blog.csdn.net/qq_22938603/article/details/122964218

    zlib

    Since many packages from the documented command failed to install, I tried them individually. It turns out zlib-devel was already installed:

    $ yum list installed | grep zlib-devel
    zlib-devel.x86_64                1.2.7-18.el7                          @base
    

    openssl

    openssl-devel was already installed during the cmake upgrade:

    $ yum list installed | grep openssl-devel
    openssl-devel.x86_64             1:1.0.2k-25.el7_9                     @updates
    

    Installing Protocol Buffers 3.7.1

    curl -L -s -S https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protobuf-java-3.7.1.tar.gz -o protobuf-3.7.1.tar.gz
    mkdir -p protobuf-3.7-src    # tar -C needs the target directory to exist
    tar xzf protobuf-3.7.1.tar.gz --strip-components 1 -C protobuf-3.7-src && cd protobuf-3.7-src
    ./configure
    make -j$(nproc)    # slow step
    sudo make install
    

    If the curl download is slow, you can also fetch the tarball in a browser and upload it. After installation, verify the protoc version:

    $ protoc --version
    libprotoc 3.7.1
    

    Installing fuse

    I am not sure what fuse_dfs does and did not end up using it, but I tried installing fuse anyway:

    $ yum list installed | grep fuse
    # not installed yet; search the repos for fuse packages
    $ yum list | grep fuse
    https://sbt.bintray.com/rpm/repodata/repomd.xml: [Errno 14] HTTPS Error 502 - Bad Gateway
    Trying other mirror.
    fuse.x86_64                               2.9.2-11.el7                 base
    fuse-devel.i686                           2.9.2-11.el7                 base
    fuse-devel.x86_64                         2.9.2-11.el7                 base
    fuse-libs.i686                            2.9.2-11.el7                 base
    fuse-libs.x86_64                          2.9.2-11.el7                 base
    fuse-overlayfs.x86_64                     0.7.2-6.el7_8                extras
    fuse3.x86_64                              3.6.1-4.el7                  extras
    fuse3-devel.x86_64                        3.6.1-4.el7                  extras
    fuse3-libs.x86_64                         3.6.1-4.el7                  extras
    fuseiso.x86_64                            20070708-15.el7              base
    fusesource-pom.noarch                     1.9-7.el7                    base
    glusterfs-fuse.x86_64                     6.0-61.el7                   updates
    gvfs-fuse.x86_64                          1.36.2-5.el7_9               updates
    ostree-fuse.x86_64                        2019.1-2.el7                 extras
    
    yum install -y fuse.x86_64
    

    Packaging

    At this point all the dependencies I needed seemed to be in place, so I gave the build a try. The first build downloads a large number of Maven dependencies and is slow; how slow depends on your network.

    The build commands, copied from the documentation in the source tree:

    ----------------------------------------------------------------------------------
    Building distributions:
    
    Create binary distribution without native code and without documentation:
    
      $ mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true
    
    Create binary distribution with native code and with documentation:
    
      $ mvn package -Pdist,native,docs -DskipTests -Dtar
    
    Create source distribution:
    
      $ mvn package -Psrc -DskipTests
    
    Create source and binary distributions with native code and documentation:
    
      $ mvn package -Pdist,native,docs,src -DskipTests -Dtar
    
    Create a local staging version of the website (in /tmp/hadoop-site)
    
      $ mvn clean site -Preleasedocs; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
    
    Note that the site needs to be built in a second pass after other artifacts.
    
    ----------------------------------------------------------------------------------
    

    Based on my needs, I chose the following:

    mvn clean package -Pdist,native -DskipTests -Dtar
    

    Problem 1

    /root/workspace/hadoop/hadoop-project/../dev-support/bin/dist-copynativelibs: line 16: $'\r': command not found
    : invalid option namep/hadoop-project/../dev-support/bin/dist-copynativelibs: line 17: set: pipefail
    /root/workspace/hadoop/hadoop-project/../dev-support/bin/dist-copynativelibs: line 18: $'\r': command not found
    /root/workspace/hadoop/hadoop-project/../dev-support/bin/dist-copynativelibs: line 21: syntax error near unexpected token `$'\r''
    'root/workspace/hadoop/hadoop-project/../dev-support/bin/dist-copynativelibs: line 21: `function bundle_native_lib()
    

    Reference: https://blog.csdn.net/heihaozi/article/details/113602205

    The scripts carry Windows (CRLF) line endings. Solution:

    yum install -y dos2unix
    dos2unix  dev-support/bin/*
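To see what dos2unix is fixing here, a minimal reproduction of the CRLF problem (tr -d '\r' does essentially the same conversion):

```shell
# A script saved with Windows (CRLF) line endings: the trailing \r
# becomes part of every command, producing the $'\r' errors above.
printf '#!/bin/bash\r\necho ok\r\n' > /tmp/crlf-demo.sh
bash /tmp/crlf-demo.sh        # output ends with a stray carriage return

# Stripping the carriage returns (what dos2unix does) repairs it:
tr -d '\r' < /tmp/crlf-demo.sh > /tmp/lf-demo.sh
bash /tmp/lf-demo.sh          # prints: ok
```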
    

    Problem 2

    [INFO] --- frontend-maven-plugin:1.11.2:yarn (yarn install) @ hadoop-yarn-applications-catalog-webapp ---
    [INFO] Running 'yarn ' in /root/workspace/hadoop/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/target
    [INFO] yarn install v1.7.0
    [INFO] info No lockfile found.
    [INFO] [1/4] Resolving packages...
    [INFO] warning angular-route@1.6.10: For the actively supported Angular, see https://www.npmjs.com/package/@angular/core. AngularJS support has officially ended. For extended AngularJS support options, see https://goo.gle/angularjs-path-forward.
    [INFO] warning angular@1.6.10: For the actively supported Angular, see https://www.npmjs.com/package/@angular/core. AngularJS support has officially ended. For extended AngularJS support options, see https://goo.gle/angularjs-path-forward.
    [INFO] info There appears to be trouble with your network connection. Retrying...
    [INFO] [2/4] Fetching packages...
    [INFO] error winston@3.7.2: The engine "node" is incompatible with this module. Expected version ">= 12.0.0".
    [INFO] error Found incompatible module
    [INFO] info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
    [INFO] ------------------------------------------------------------------------
    
    [ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:1.11.2:yarn (yarn install) on project hadoop-yarn-applications-catalog-webapp: Failed to run task: 'yarn ' failed. org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <args> -rf :hadoop-yarn-applications-catalog-webapp
    

    The error says the node version is incompatible and ">= 12.0.0" is required. But my local node is v12.16.3, and I was not building for YARN UI v2, so node should not even be needed. Looking into hadoop-yarn-applications-catalog-webapp/target/node, however, there is a bundled yarn and node, so the documentation is slightly misleading: building hadoop-yarn-applications-catalog-webapp also uses node, just not my local copy but one downloaded as a build dependency. Checking that module's pom confirms it declares yarn and node versions. Changing <nodeVersion>v8.11.3</nodeVersion> to <nodeVersion>v12.16.3</nodeVersion> and rebuilding resolved the error and the build succeeded. (I am not sure whether there is a way to fix this without modifying the code.)
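The pom edit can be scripted with sed. The path below is inferred from the module name in the Maven log above and should be verified against your checkout:

```shell
# Bump the bundled node from v8.11.3 to v12.16.3 in the webapp module's pom.
POM=hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/pom.xml
if [ -f "$POM" ]; then
  sed -i 's#<nodeVersion>v8.11.3</nodeVersion>#<nodeVersion>v12.16.3</nodeVersion>#' "$POM"
else
  echo "pom not found: $POM"
fi
```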

    Build success

    The build is a long process: it resolves dependencies for 110 modules, and downloads can stall along the way. When that happened I killed the command and restarted the build; after a few attempts it completed successfully.
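Rather than restarting from scratch after a failure, Maven can resume from the failed module, as its own error output above suggests; a sketch with the module name from Problem 2:

```shell
# Resume the reactor build from the failed module, keeping the same
# profiles and flags as the original command.
mvn package -Pdist,native -DskipTests -Dtar \
  -rf :hadoop-yarn-applications-catalog-webapp
```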

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary for Apache Hadoop Main 3.3.1:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [  6.283 s]
    [INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 18.271 s]
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [  6.532 s]
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [ 10.608 s]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.749 s]
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  7.622 s]
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 19.506 s]
    [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  7.987 s]
    [INFO] Apache Hadoop Auth ................................. SUCCESS [ 23.571 s]
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 10.436 s]
    [INFO] Apache Hadoop Common ............................... SUCCESS [04:31 min]
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 15.377 s]
    [INFO] Apache Hadoop KMS .................................. SUCCESS [ 11.759 s]
    [INFO] Apache Hadoop Registry ............................. SUCCESS [ 14.969 s]
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.206 s]
    [INFO] Apache Hadoop HDFS Client .......................... SUCCESS [01:35 min]
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [03:40 min]
    [INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  5.246 s]
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 16.972 s]
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  9.375 s]
    [INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [01:13 min]
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.184 s]
    [INFO] Apache Hadoop YARN ................................. SUCCESS [  0.141 s]
    [INFO] Apache Hadoop YARN API ............................. SUCCESS [ 48.902 s]
    [INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:51 min]
    [INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.190 s]
    [INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 33.105 s]
    [INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 45.157 s]
    [INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  7.523 s]
    [INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 12.931 s]
    [INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 11.880 s]
    [INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [01:07 min]
    [INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  3.489 s]
    [INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 16.235 s]
    [INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  8.134 s]
    [INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  7.360 s]
    [INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  0.123 s]
    [INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [ 12.865 s]
    [INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [ 11.617 s]
    [INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [  0.151 s]
    [INFO] Apache Hadoop YARN TimelineService HBase Server 1.2  SUCCESS [ 10.463 s]
    [INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [  6.213 s]
    [INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 10.472 s]
    [INFO] Apache Hadoop YARN TimelineService DocumentStore ... SUCCESS [  7.838 s]
    [INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.129 s]
    [INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  6.458 s]
    [INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  5.032 s]
    [INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.474 s]
    [INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 13.449 s]
    [INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 18.252 s]
    [INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  6.455 s]
    [INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 20.143 s]
    [INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 13.208 s]
    [INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 15.821 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.586 s]
    [INFO] Apache Hadoop YARN Services ........................ SUCCESS [  0.095 s]
    [INFO] Apache Hadoop YARN Services Core ................... SUCCESS [  8.248 s]
    [INFO] Apache Hadoop YARN Services API .................... SUCCESS [  2.692 s]
    [INFO] Apache Hadoop YARN Application Catalog ............. SUCCESS [  0.147 s]
    [INFO] Apache Hadoop YARN Application Catalog Webapp ...... SUCCESS [ 27.034 s]
    [INFO] Apache Hadoop YARN Application Catalog Docker Image  SUCCESS [  0.247 s]
    [INFO] Apache Hadoop YARN Application MaWo ................ SUCCESS [  0.138 s]
    [INFO] Apache Hadoop YARN Application MaWo Core ........... SUCCESS [  6.439 s]
    [INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.164 s]
    [INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  0.849 s]
    [INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.125 s]
    [INFO] Apache Hadoop YARN CSI ............................. SUCCESS [ 20.487 s]
    [INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 24.698 s]
    [INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  5.055 s]
    [INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [ 20.908 s]
    [INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [  4.863 s]
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 11.329 s]
    [INFO] Apache Hadoop MapReduce ............................ SUCCESS [  6.357 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 11.950 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 12.106 s]
    [INFO] Apache Hadoop Client Aggregator .................... SUCCESS [  6.302 s]
    [INFO] Apache Hadoop Dynamometer Workload Simulator ....... SUCCESS [  9.199 s]
    [INFO] Apache Hadoop Dynamometer Cluster Simulator ........ SUCCESS [  9.159 s]
    [INFO] Apache Hadoop Dynamometer Block Listing Generator .. SUCCESS [  5.495 s]
    [INFO] Apache Hadoop Dynamometer Dist ..................... SUCCESS [ 11.781 s]
    [INFO] Apache Hadoop Dynamometer .......................... SUCCESS [  0.093 s]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [  5.436 s]
    [INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  5.062 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 12.155 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [  9.009 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [  5.023 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [  4.804 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [  1.855 s]
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [  8.414 s]
    [INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 35.861 s]
    [INFO] Apache Hadoop Kafka Library support ................ SUCCESS [  4.439 s]
    [INFO] Apache Hadoop Azure support ........................ SUCCESS [ 22.346 s]
    [INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [  6.827 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 12.319 s]
    [INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [  9.552 s]
    [INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  6.276 s]
    [INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [  8.871 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 25.044 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [  0.075 s]
    [INFO] Apache Hadoop Client API ........................... SUCCESS [03:06 min]
    [INFO] Apache Hadoop Client Runtime ....................... SUCCESS [03:26 min]
    [INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [  1.225 s]
    [INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [05:37 min]
    [INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [  0.509 s]
    [INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [ 27.535 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [01:52 min]
    [INFO] Apache Hadoop Client Modules ....................... SUCCESS [  0.228 s]
    [INFO] Apache Hadoop Tencent COS Support .................. SUCCESS [01:03 min]
    [INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  2.805 s]
    [INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.134 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time:  45:20 min
    [INFO] Finished at: 2022-06-21T20:45:46+08:00
    [INFO] ------------------------------------------------------------------------
    
    

    Screenshot of the successful build: (image not reproduced here)

  • Original article: https://blog.csdn.net/dkl12/article/details/125473505