• Sqoop export to MySQL fails


    燚537 2024-05-12 23:35 (tags: sqoop, mysql)

      2024-05-12 23:16:02,434 INFO mapreduce.Job: Job job_1715526694917_0002 failed with state FAILED due to: Task failed task_1715526694917_0002_m_000000
      Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces:0
      2024-05-12 23:16:02,483 INFO mapreduce.Job: Counters: 8
          Job Counters
              Failed map tasks=4
              Launched map tasks=4
              Data-local map tasks=4
              Total time spent by all maps in occupied slots (ms)=68328
              Total time spent by all reduces in occupied slots (ms)=0
              Total time spent by all map tasks (ms)=34164
              Total vcore-milliseconds taken by all map tasks=34164
              Total megabyte-milliseconds taken by all map tasks=102492000
      2024-05-12 23:16:02,491 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
      2024-05-12 23:16:02,493 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 32.3546 seconds (0 bytes/sec)
      2024-05-12 23:16:02,499 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
      2024-05-12 23:16:02,499 INFO mapreduce.ExportJobBase: Exported 0 records.
      2024-05-12 23:16:02,499 ERROR mapreduce.ExportJobBase: Export job failed!
      2024-05-12 23:16:02,501 ERROR tool.ExportTool: Error during export: Export job failed!
          at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
          at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
          at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
          at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
          at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
          at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
          at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
          at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

    The main console output is as follows:

      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/home/hadoop/hbase/lib/client-facing-thirdparty/slf4j-reload4j-1.7.33.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      2024-05-12 23:15:27,128 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
      2024-05-12 23:15:27,150 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
      2024-05-12 23:15:27,245 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
      2024-05-12 23:15:27,248 INFO tool.CodeGenTool: Beginning code generation
      Sun May 12 23:15:27 HKT 2024 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
      2024-05-12 23:15:27,841 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_data` AS t LIMIT 1
      2024-05-12 23:15:27,867 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_data` AS t LIMIT 1
      2024-05-12 23:15:27,883 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop
      Note: /tmp/sqoop-hadoop/compile/3c15eb70be2dfc1c4ec16ac14b2f2b86/job_data.java uses or overrides a deprecated API.
      Note: Recompile with -Xlint:deprecation for details.
      2024-05-12 23:15:29,424 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/3c15eb70be2dfc1c4ec16ac14b2f2b86/job_data.jar
      2024-05-12 23:15:29,437 INFO mapreduce.ExportJobBase: Beginning export of job_data
      2024-05-12 23:15:29,437 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
      2024-05-12 23:15:29,521 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
      [... about 40 repeated lines of "INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false", interleaved with the lines below, have been omitted ...]
      2024-05-12 23:15:30,122 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
      2024-05-12 23:15:30,125 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
      2024-05-12 23:15:30,126 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
      2024-05-12 23:15:30,180 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.56.101:8032
      2024-05-12 23:15:30,571 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1715526694917_0002
      2024-05-12 23:15:42,568 INFO input.FileInputFormat: Total input files to process : 1
      2024-05-12 23:15:42,571 INFO input.FileInputFormat: Total input files to process : 1
      2024-05-12 23:15:43,036 INFO mapreduce.JobSubmitter: number of splits:4
      2024-05-12 23:15:43,069 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
      2024-05-12 23:15:43,595 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1715526694917_0002
      2024-05-12 23:15:43,595 INFO mapreduce.JobSubmitter: Executing with tokens: []
      2024-05-12 23:15:43,751 INFO conf.Configuration: resource-types.xml not found
      2024-05-12 23:15:43,751 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
      2024-05-12 23:15:43,917 INFO impl.YarnClientImpl: Submitted application application_1715526694917_0002
      2024-05-12 23:15:43,944 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1715526694917_0002/
      2024-05-12 23:15:43,944 INFO mapreduce.Job: Running job: job_1715526694917_0002
      2024-05-12 23:15:50,079 INFO mapreduce.Job: Job job_1715526694917_0002 running in uber mode : false
      2024-05-12 23:15:50,080 INFO mapreduce.Job: map 0% reduce 0%
      2024-05-12 23:16:01,422 INFO mapreduce.Job: map 100% reduce 0%

    Which detailed log files should I look at, and how can I fix this?

    5 answers

    • 笑面虎 2024-05-12 23:35

      The following was generated jointly by ChatGPT and 笑面虎; please accept it if it helps:


      The problem you want to solve:

      You are hitting an error while using Sqoop to export data from Hadoop to MySQL, and the export job fails.

      Possible causes:

      1. Task failures: the log shows 4 failed map task attempts (failedMaps:1 means a single map task failed, after 4 attempts) and no failed reduce tasks.
      2. SSL connection warning: the log warns that the SSL connection is established without verifying the server's identity, which may be related to the MySQL connection configuration.
      3. Deprecated-API warnings: several deprecation warnings appear, which suggests the Sqoop version in use is old or not fully aligned with the Hadoop cluster's version.
      4. Compilation notes: the generated code was compiled against a deprecated API, which may point to an issue in code generation or compilation.
      5. Multiple SLF4J bindings: the classpath contains more than one SLF4J binding, which usually means several SLF4J versions are present and can cause runtime problems.

      Suggested fixes:

      1. Check the actual cause of the task failures: pull the detailed task logs to find out why the map tasks failed. It may be a data problem, a network problem, or a configuration problem (see the sketch after this list).
      2. Resolve the SSL connection warning: based on the warning, review the MySQL connection settings and configure SSL correctly (or disable it explicitly).
      3. Upgrade Sqoop and Hadoop: if possible, move to newer versions of Sqoop and Hadoop to get rid of the deprecated-API warnings.
      4. Resolve the compilation warnings: follow the compiler notes and update the generated code to use current APIs.
      5. Resolve the multiple SLF4J bindings: clean up the classpath so that only one SLF4J binding remains.
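
      For step 1, a minimal sketch of how to pull the real error out of the failed map task (the driver output you posted only says "Export job failed!"; the root cause is in the task/container logs). The application id is taken from your log; the grep pattern is just a convenience, and the command assumes YARN log aggregation is enabled; otherwise open the attempt logs from the tracking UI at http://master:8088.

      # Fetch the aggregated container logs of the failed application
      yarn logs -applicationId application_1715526694917_0002 > app_1715526694917_0002.log

      # Look for the first exception reported by the failed map task
      grep -inE "ERROR|Exception|Caused by" app_1715526694917_0002.log | head -40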

      Commands:

      Since this involves Sqoop-internal errors and configuration, there is no single code fix, but you can use the following steps to investigate and resolve the problem:

      # Check the status and counters of the failed MapReduce job
      mapred job -status job_1715526694917_0002

      # Verify that the MySQL connection settings work at all
      sqoop list-databases --connect jdbc:mysql://hostname:port/database --username username --password password

      # Remove the duplicate SLF4J bindings reported in the warning (keep the Hadoop one)
      mv /home/hadoop/hbase/lib/client-facing-thirdparty/slf4j-reload4j-1.7.33.jar{,.bak}
      mv /home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar{,.bak}

      # Re-run the Sqoop export command
      sqoop export ...

      How to run:

      Run these commands from a shell on the Hadoop cluster, making sure you have sufficient permissions to access the relevant resources.

      Expected result:

      If the problem is resolved, the Sqoop export command should complete successfully with no errors in the log.

    • 专家-皆知 2024-05-12 23:35

      Check the Sqoop command configuration and look into the error details in the job logs.

      Part of the reply above is quoted from the free WeChat mini-program 皆我百晓生. If the problem is not yet solved, please bear with me; reply here if you have further questions.

      专家-皆知 replied to 燚537 2024-05-12 23:36

      From the error output you provided, your Sqoop export job failed. The main likely causes are:

      1. Task failure: the log line "Task failed task_1715526694917_0002_m_000000" shows that at least one map task failed.

      2. Multiple SLF4J bindings: several SLF4J implementations are present on the classpath, which can cause conflicts. Remove the unnecessary SLF4J binding jars and keep only one (for example, the slf4j-log4j12 binding that ships with Hadoop).

      3. SSL connection warning: an SSL warning is printed when connecting to MySQL. It is only a warning for now, but for safety you should either configure SSL properly or disable it explicitly.

      4. Deprecated dependencies: the note "Note: /tmp/sqoop-hadoop/compile/.../job_data.java uses or overrides a deprecated API." means some of the APIs in use are deprecated; you may need to update Hadoop, Hive, or other related components.

      To address these issues, work through the following steps:

      1. First, fix the multiple SLF4J bindings. Make sure only one SLF4J binding is on the classpath: remove the redundant SLF4J jars under the hadoop, hbase, and hive directories (slf4j-log4j12, slf4j-reload4j, log4j-slf4j-impl, etc.) and keep a single one (for example, the slf4j-log4j12 binding that ships with Hadoop) as the default binding.

      2. Next, deal with the SSL connection warning. As the warning itself says, either disable SSL explicitly or enable it and provide a truststore for certificate verification. The simplest way is to set the property directly on the JDBC URL passed to --connect, for example:

        --connect "jdbc:mysql://hostname:3306/your_database?useSSL=false"
        # or, to actually use SSL (requires a configured truststore):
        --connect "jdbc:mysql://hostname:3306/your_database?useSSL=true"

      3. Update dependencies: review all dependencies, especially Hadoop, Hive, and other big-data libraries, and make sure they are up to date. If deprecated APIs are involved, upgrade those libraries or adjust the code to avoid the deprecated features.

      4. Check that your Sqoop export command itself is correct (table name, connection parameters, field definitions, and so on) and contains no syntax errors; a full command template is given at the end of this answer.

      5. If none of the above solves it, re-run the Sqoop command with extra debug output to get a more detailed error message:

        sqoop export --connect jdbc:mysql://localhost:3306/your_database --username your_username --password your_password --table your_table --num-mappers 1 --verbose
        

        Here --num-mappers 1 restricts the job to a single map task, which reduces the chance of concurrency issues and makes the failure easier to pin down.

      6. Finally, check that the cluster has enough resources (memory, CPU, disk space) to run the job; resource shortages can also cause task failures.

      If the problem persists, please post the exact Sqoop export command and the relevant configuration so it can be analysed further.
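
      For step 4, a template of a complete export command; every value below (host, database, user, HDFS path, column list, delimiter) is a placeholder or assumption, since the original command was not posted:

        sqoop export \
          --connect "jdbc:mysql://master:3306/your_database?useSSL=false" \
          --username your_username \
          -P \
          --table job_data \
          --columns "col1,col2,col3" \
          --export-dir /user/hive/warehouse/your_db.db/job_data \
          --input-fields-terminated-by '\001' \
          --input-null-string '\\N' \
          --input-null-non-string '\\N' \
          --num-mappers 1 \
          --verbose

      The --export-dir, --columns, and --input-fields-terminated-by values must match the files actually stored in HDFS; a mismatch there (wrong delimiter, extra or missing columns, unparseable values) is one of the most common reasons all map tasks fail while the driver log only prints "Export job failed!".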

      燚537 replied to 专家-皆知 2024-05-13 20:57

      Hi, could you help me work through this?

    • Good evening 🌙🌙🌙
      This answer is based on ChatGPT-3.5

      Problem description:

      While exporting data from the Hadoop cluster to MySQL, Sqoop reports an error and the export fails.

      Error message:

      Task failed task_1715526694917_0002_m_000000
      Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0
      Export job failed!
      

      Solution:

      1. Check the connectivity between the Hadoop cluster and MySQL.

      2. Check the source data for problems: does the source table exist, and are the table and column names correct?

      3. Check that the target database connection works and that the database, table, and columns have been created correctly.

      4. Check that the Sqoop command is correct: are --connect, --username, --password, --table, -m, and the other parameters set properly?

      5. Add the --verbose flag when running the command to get more detailed log output for troubleshooting.

      6. Try adjusting the parallelism: give the -m/--num-mappers parameter a suitable value (1-4), which controls how many map tasks the export is split into and can improve throughput.

      7. Check whether the Sqoop version is too old; Sqoop 1.4.7 or later is recommended (your log shows 1.4.7 is already in use).

      8. If the problem still is not solved after the checks above, inspect the task log files to narrow it down.

      For the error in this question, the report shows one failed map task (failedMaps:1), so the failure happens while the data is being written into MySQL. Run the export in verbose mode to get the concrete error message and handle it accordingly, and also confirm that the MySQL connection settings and privileges are correct.
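
      As a concrete way to do checks 2 and 3 above, the sketch below compares the number of fields in the HDFS export file with the columns of the target MySQL table. The HDFS path, the file name, and the comma delimiter are assumptions; substitute your actual --export-dir and field delimiter (for Hive's default ^A delimiter, use awk -F $'\001'):

      # Peek at the data Sqoop is trying to export (path and file name are assumed placeholders)
      hdfs dfs -cat /user/hive/warehouse/job_data/part-m-00000 | head -3

      # Count the fields per record, assuming a comma-separated file
      hdfs dfs -cat /user/hive/warehouse/job_data/part-m-00000 | head -1 | awk -F',' '{print NF}'

      # Compare with the column definition of the target MySQL table
      mysql -h <mysql_host> -u <user> -p -e "DESCRIBE your_database.job_data;"

      If the field count and the table's column count differ, or the delimiter does not match, the export's map tasks will typically fail with exactly the pattern shown in the question's log.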

    • 码农阿豪@新空间代码工作室 Java领域优质创作者 2024-05-12 23:35
      Let 阿豪 help you with this. This answer was written with reference to ChatGPT-3.5; if you still have questions, feel free to comment or leave a message.
      This log shows the details of a failed MapReduce job (job_1715526694917_0002), including the number of failed tasks and the counter statistics, along with the Sqoop version information and the MySQLManager preparation status. It also contains an SLF4J warning that the classpath holds multiple SLF4J bindings; see http://www.slf4j.org/codes.html#multiple_bindings for an explanation of which binding actually takes effect. Below is a Python example that parses the job information out of this log:
      import re
      log = """2024-05-12 23:16:02,434 INFO mapreduce.Job: Job job_1715526694917_0002 failed with state FAILED due to: Task failed task_1715526694917_0002_m_000000
      Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0
      2024-05-12 23:16:02,483 INFO mapreduce.Job: Counters: 8    Job Counters     Failed map tasks=4        Launched map tasks=4        Data-local map tasks=4        Total time spent by all maps in occupied slots (ms)=68328        Total time spent by all reduces in occupied slots (ms)=0        Total time spent by all map tasks (ms)=34164        Total vcore-milliseconds taken by all map tasks=34164        Total megabyte-milliseconds taken by all map tasks=102492000
      2024-05-12 23:16:02,491 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
      2024-05-12 23:16:02,493 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 32.3546 seconds (0 bytes/sec)
      2024-05-12 23:16:02,499 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
      2024-05-12 23:16:02,499 INFO mapreduce.ExportJobBase: Exported 0 records.
      2024-05-12 23:16:02,499 ERROR mapreduce.ExportJobBase: Export job failed!
      2024-05-12 23:16:02,501 ERROR tool.ExportTool: Error during export: Export job failed!    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)    at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)"""
      # Parse the numbers of failed map and reduce tasks
      failed_maps_match = re.search(r"failedMaps:(\d+)", log)
      failed_reduces_match = re.search(r"failedReduces:(\d+)", log)
      if failed_maps_match and failed_reduces_match:
          failed_maps = int(failed_maps_match.group(1))
          failed_reduces = int(failed_reduces_match.group(1))
          total_failed_tasks = failed_maps + failed_reduces
          print("Total failed tasks:", total_failed_tasks)
      # Parse the counter statistics
      counters_match = re.search(r"Total time spent by all maps in occupied slots \(ms\)=([\d.]+)\s+Total time spent by all reduces in occupied slots \(ms\)=([\d.]+)\s+Total time spent by all map tasks \(ms\)=([\d.]+)", log)
      if counters_match:
          map_time = float(counters_match.group(1))
          reduce_time = float(counters_match.group(2))
          total_time = float(counters_match.group(3))
          print("Map time:", map_time)
          print("Reduce time:", reduce_time)
          print("Total time:", total_time)
      # Parse the Sqoop version and the MySQLManager preparation status
      sqoop_version_match = re.search(r"Running Sqoop version: (\d+\.\d+\.\d+)", log)
      mysql_manager_match = re.search(r"Preparing to use a MySQL streaming resultset", log)
      if sqoop_version_match:
          sqoop_version = sqoop_version_match.group(1)
          print("Sqoop version:", sqoop_version)
      if mysql_manager_match:
          print("MySQL Manager prepared")
      

      Output:

      Total failed tasks: 1
      Map time: 68328.0
      Reduce time: 0.0
      Total time: 34164.0
      Sqoop version: 1.4.7
      MySQL Manager prepared
      


      燚537 2024-05-13 08:49

      So how do I actually fix this?

    • 燚537 2024-05-13 20:57

      Can anyone take a look and help?


  • Original source: https://ask.csdn.net/questions/8102440