• Sqoop Installation and Deployment


    Today I redeployed Sqoop on another machine, so I'm taking the opportunity to write the steps down again.

    Official site: sqoop.apache.org

    Note: Sqoop has two major versions, 1 and 2, and they are completely different. Sqoop 2 has stayed at 1.99.x and never reached an actual 2.x release, so we go with version 1.4.6 (the latest is currently 1.4.7; we pick the release just before the latest).

    Download and install

    $ tar xzvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /home/hadoop/local/

    $ cd /home/hadoop/local

    $ ln -s sqoop-1.4.6.bin__hadoop-2.0.4-alpha sqoop

    Sqoop is an import tool and does not need to be installed on every server in the cluster; installing it on a single server is enough. I installed it on the ns1 server.

    Configuration

    $ cd sqoop/conf

    $ mv sqoop-env-template.sh sqoop-env.sh

    $ vim sqoop-env.sh

    Add the following at the bottom:

    export HADOOP_COMMON_HOME=/home/hadoop/local/hadoop
    export HADOOP_MAPRED_HOME=/home/hadoop/local/hadoop
    export HIVE_HOME=/home/hadoop/local/hive
    export ZOOKEEPER_HOME=/home/hadoop/local/zookeeper
    export ZOOCFGDIR=/home/hadoop/local/zookeeper/conf

    The ZooKeeper settings here are only needed when Sqoop is used together with HBase.
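    If HBase is deployed as well, HBASE_HOME can be added here too so that HBase imports work and the corresponding startup warning disappears. The path below merely assumes the same layout as the other components:

    export HBASE_HOME=/home/hadoop/local/hbase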

    Add environment variables

    $ sudo vim /etc/profile.d/my_env.sh

    SQOOP_HOME=/home/hadoop/local/sqoop
    PATH=$PATH:/home/hadoop/bin:$SQOOP_HOME/bin
    export SQOOP_HOME PATH

    $ source /etc/profile
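    A quick sanity check that the new variables took effect (the expected path below assumes the layout above):

    $ which sqoop
    /home/hadoop/local/sqoop/bin/sqoop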

    Copy the JDBC driver

    $ cp mysql-connector-java-8.0.11.jar /home/hadoop/local/sqoop/lib/
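    Check that the driver jar actually landed in Sqoop's lib directory:

    $ ls /home/hadoop/local/sqoop/lib/ | grep mysql
    mysql-connector-java-8.0.11.jar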

    Verification

    $ sqoop version

    Warning: /home/hadoop/local/sqoop/../hbase does not exist! HBase imports will fail.
    Please set $HBASE_HOME to the root of your HBase installation.
    Warning: /home/hadoop/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
    Please set $HCAT_HOME to the root of your HCatalog installation.
    Warning: /home/hadoop/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    2022-09-07 17:14:12,041 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
    Sqoop 1.4.6
    git commit id c0c5a81723759fa575844a0a1eae8f510fa32c25
    Compiled by root on Mon Apr 27 14:38:36 CST 2015

    The warnings can be ignored;
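    They only mean that the optional HBase, HCatalog and Accumulo integrations were not found. If you prefer a quieter output, a common workaround is to comment out the corresponding checks in $SQOOP_HOME/bin/configure-sqoop; the exact wording differs slightly between releases, but the HBase block looks roughly like this (the HCatalog and Accumulo blocks are handled the same way):

    #if [ ! -d "${HBASE_HOME}" ]; then
    #  echo "Warning: $HBASE_HOME does not exist! HBase imports will fail."
    #  echo 'Please set $HBASE_HOME to the root of your HBase installation.'
    #fi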

    $ sqoop help

    usage: sqoop COMMAND [ARGS]
    Available commands:
      codegen            Generate code to interact with database records
      create-hive-table  Import a table definition into Hive
      eval               Evaluate a SQL statement and display the results
      export             Export an HDFS directory to a database table
      help               List available commands
      import             Import a table from a database to HDFS
      import-all-tables  Import tables from a database to HDFS
      import-mainframe   Import datasets from a mainframe server to HDFS
      job                Work with saved jobs
      list-databases     List available databases on a server
      list-tables        List available tables in a database
      merge              Merge results of incremental imports
      metastore          Run a standalone Sqoop metastore
      version            Display version information
    See 'sqoop help COMMAND' for information on a specific command.
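    For the detailed options of a single tool, ask for its help page, for example:

    $ sqoop help import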

    $ sqoop list-databases --connect jdbc:mysql://ns1:3306 --username root --password 123456

    mysql
    information_schema
    performance_schema
    sys
    activiti2
    logistics
    activiti
    zsoft

    This lists all the databases in the MySQL instance on ns1;
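    With connectivity confirmed, a typical table import looks like the sketch below. The logistics database comes from the list above, but the orders table and the HDFS target directory are placeholders chosen for illustration:

    $ sqoop import \
        --connect jdbc:mysql://ns1:3306/logistics \
        --username root \
        --password 123456 \
        --table orders \
        --target-dir /user/hadoop/logistics/orders \
        --num-mappers 1 \
        --fields-terminated-by '\t'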

  • Original article: https://blog.csdn.net/zhy0414/article/details/126750516