• Sqoop Import into Hive with Hive HA Enabled


    Symptom: the Sqoop import to Hive hangs on the JDBC connection to Hive and never executes.

    Cause: Sqoop is connecting to a Hive deployment running in HA mode (HiveServer2 service discovery through ZooKeeper).

    Go to Hive's configuration directory: $HIVE_HOME/conf
    Create a new configuration file there: beeline-hs2-connection.xml


    <configuration>
     <property>
      <name>beeline.hs2.connection.user</name>
      <value>hive</value>
     </property>
     <property>
      <name>beeline.hs2.connection.password</name>
      <value>hive</value>
     </property>
    </configuration>


    beeline.hs2.connection.user: the username Beeline uses to connect to HiveServer2
    beeline.hs2.connection.password: the password for that user
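With those credentials in place, the import can be retried. For reference, a minimal sketch of such a Sqoop Hive import; the JDBC URL, source database, table names, and credential paths below are placeholders, not from the original post:

```shell
# Hypothetical Sqoop import into Hive; all connection details are placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/source_db \
  --username etl \
  --password-file /user/etl/.db-password \
  --table src_table \
  --hive-import \
  --hive-table hive_db.hive_01 \
  --fields-terminated-by '\001' \
  -m 1
```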

    Re-running the import then surfaces a new error:

    Error message:

    23/11/02 13:57:23 INFO hive.HiveImport: Error: Error while compiling statement: FAILED: SemanticException Unable to load data to destination table. Error: The file that you are trying to load does not match the file format of the destination table. (state=42000,code=40000)
    23/11/02 13:57:23 INFO hive.HiveImport: Closing: 0: jdbc:hive2://hdp3.node1:2181,hdp3.node2:2181,hdp3.node3:2181/default;password=hive;serviceDiscoveryMode=zooKeeper;user=hive;zooKeeperNamespace=hiveserver2
    23/11/02 13:57:23 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive exited with status 2
    	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:253)
    	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:206)
    	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:273)
    	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:564)
    	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:655)
    	at org.apache.sqoop.Sqoop.run(Sqoop.java:151)
    	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
    	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:187)
    	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241)
    	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:250)
    	at org.apache.sqoop.Sqoop.main(Sqoop.java:259)
    

    Solution:

    1. Create a temporary staging table stored as textfile:

    create table hive_db.hive_01( id string comment 'Id') 
    row format delimited fields terminated by '\001' 
    stored as textFile;
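A side note on the '\001' separator above: it is the Ctrl-A byte, Hive's default field delimiter for textfile tables. A quick shell check of how a row splits on it:

```shell
# '\001' (Ctrl-A) is Hive's default field delimiter for text tables.
# Emit a two-field row and split it on that byte with awk.
printf 'a\001b\n' | awk -F '\001' '{print $2}'
```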
    

    2. Run the Sqoop import into the staging table.
    3. Copy the data into the target table with an insert-select:

    insert into hive_db.hive_table select * from hive_db.hive_01
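If this last step is scripted, the same statement can be run non-interactively with beeline; the ZooKeeper connection string below mirrors the one in the error log and may differ in your environment:

```shell
# Run the staging-to-target copy via beeline, using the HA/ZooKeeper
# service-discovery URL seen in the log above.
beeline -u "jdbc:hive2://hdp3.node1:2181,hdp3.node2:2181,hdp3.node3:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" \
  -e "insert into hive_db.hive_table select * from hive_db.hive_01"
```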
    
  • Original article: https://blog.csdn.net/Han_Lin_/article/details/134181041