Apache Knox 2.0.0 Usage


    Contents

    Introduction

    Usage

    gateway-site.xml

    users.ldif

    my_hdfs.xml

    my_yarn.xml

    Other Notes


    Introduction

            The Apache Knox Gateway is a system that provides a single point of authentication and access for Apache Hadoop services in a cluster. The goal is to simplify Hadoop security for both users (i.e. who access the cluster data and execute jobs) and operators (i.e. who control access and manage the cluster). The gateway runs as a server (or cluster of servers) that provides centralized access to one or more Hadoop clusters. In general the goals of the gateway are as follows:

    • Provide perimeter security for Hadoop REST APIs to make Hadoop security easier to set up and use
      • Provide authentication and token verification at the perimeter
      • Enable authentication integration with enterprise and cloud identity management systems
      • Provide service level authorization at the perimeter
    • Expose a single URL hierarchy that aggregates REST APIs of a Hadoop cluster
      • Limit the network endpoints (and therefore firewall holes) required to access a Hadoop cluster
      • Hide the internal Hadoop cluster topology from potential attackers

    Usage

    After unpacking the distribution, enter the conf directory. The files to modify are described below.

    gateway-site.xml

    This is the gateway configuration file. Modify it as follows:

    <property>
        <name>gateway.port</name>
        <value>18483</value>
        <description>The HTTP port for the Gateway.</description>
    </property>
    <property>
        <name>gateway.path</name>
        <value>my/mimi</value>
        <description>The default context path for the gateway.</description>
    </property>
    <property>
        <name>gateway.dispatch.whitelist</name>
        <value>.*$</value>
    </property>
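
    These three properties change the gateway's listening port to 18483 and move the context path from the default gateway to my/mimi, so every topology ends up at https://<gateway-host>:18483/my/mimi/<topology-name>/. A quick smoke test once the gateway is running (a sketch; -k accepts the self-signed certificate Knox generates by default, 192.168.200.11 is the gateway host used later in this post, and my_hdfs is one of the topologies defined below):

    # An unauthenticated request should get a 401 challenge from the
    # ShiroProvider, proving the port and context path are live.
    curl -ik "https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs"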

    users.ldif

    This is the user definition file. The full contents are as follows:

    # Licensed to the Apache Software Foundation (ASF) under one
    # or more contributor license agreements. See the NOTICE file
    # distributed with this work for additional information
    # regarding copyright ownership. The ASF licenses this file
    # to you under the Apache License, Version 2.0 (the
    # "License"); you may not use this file except in compliance
    # with the License. You may obtain a copy of the License at
    #
    # http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.

    version: 1

    # Please replace with site specific values
    dn: dc=hadoop,dc=apache,dc=org
    objectclass: organization
    objectclass: dcObject
    o: Hadoop
    dc: hadoop

    # Entry for a sample people container
    # Please replace with site specific values
    dn: ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:organizationalUnit
    ou: people

    dn: ou=myhadoop,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:organizationalUnit
    ou: myhadoop

    # Entry for a sample end user
    # Please replace with site specific values
    dn: uid=guest,ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: Guest
    sn: User
    uid: guest
    userPassword:123456

    dn: uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: Myclient
    sn: Client
    uid: myclient
    userPassword:qwe

    dn: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: myhdfs
    sn: myhdfs
    uid: myhdfs
    userPassword:123456

    dn: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: myyarn
    sn: myyarn
    uid: myyarn
    userPassword:123

    # entry for sample user admin
    dn: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: Admin
    sn: Admin
    uid: admin
    userPassword:123456

    dn: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: Admin
    sn: Admin
    uid: root
    userPassword:123456

    # entry for sample user sam
    dn: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: sam
    sn: sam
    uid: sam
    userPassword:sam-password

    # entry for sample user tom
    dn: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:person
    objectclass:organizationalPerson
    objectclass:inetOrgPerson
    cn: tom
    sn: tom
    uid: tom
    userPassword:tom-password

    # create FIRST Level groups branch
    dn: ou=groups,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass:organizationalUnit
    ou: groups
    description: generic groups branch

    # create the analyst group under groups
    dn: cn=analyst,ou=groups,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass: groupofnames
    cn: analyst
    description:analyst group
    member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
    member: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
    member: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
    member: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org

    # create the scientist group under groups
    dn: cn=scientist,ou=groups,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass: groupofnames
    cn: scientist
    description: scientist group
    member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org

    # create the admin group under groups
    dn: cn=admin,ou=groups,dc=hadoop,dc=apache,dc=org
    objectclass:top
    objectclass: groupofnames
    cn: admin
    description: admin group
    member: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
    member: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org

    Note that I added two extra users here: myhdfs and myyarn.
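
    Once the demo LDAP server is started (the ldap.sh step below), the new entries can be verified with the standard ldapsearch client. A minimal sketch, assuming the OpenLDAP client tools are installed and the demo LDAP listens on its default port 33389:

    # Bind as the myclient account defined above and look up myhdfs under
    # ou=myhadoop; exactly one entry should come back.
    ldapsearch -H ldap://localhost:33389 \
      -D "uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org" -w qwe \
      -b "ou=myhadoop,dc=hadoop,dc=apache,dc=org" "(uid=myhdfs)" dn uid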

    In the knox-2.0.0/conf/topologies directory you will find a sandbox.xml file. It is a template; copy it to a new name when using it. Mine are as follows.

    my_hdfs.xml

    1. "1.0" encoding="UTF-8"?>
    2. <topology>
    3. <name>my_hdfsname>
    4. <gateway>
    5. <provider>
    6. <role>authenticationrole>
    7. <name>ShiroProvidername>
    8. <enabled>trueenabled>
    9. <param>
    10. <name>sessionTimeoutname>
    11. <value>30value>
    12. param>
    13. <param>
    14. <name>main.ldapRealmname>
    15. <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealmvalue>
    16. param>
    17. <param>
    18. <name>main.ldapContextFactoryname>
    19. <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactoryvalue>
    20. param>
    21. <param>
    22. <name>main.ldapRealm.contextFactoryname>
    23. <value>$ldapContextFactoryvalue>
    24. param>
    25. <param>
    26. <name>main.ldapRealm.contextFactory.systemUsernamename>
    27. <value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=orgvalue>
    28. param>
    29. <param>
    30. <name>main.ldapRealm.contextFactory.systemPasswordname>
    31. <value>${ALIAS=ldcSystemPassword}value>
    32. param>
    33. <param>
    34. <name>main.ldapRealm.userSearchBasename>
    35. <value>ou=myhadoop,dc=hadoop,dc=apache,dc=orgvalue>
    36. param>
    37. <param>
    38. <name>main.ldapRealm.userSearchAttributeNamename>
    39. <value>uidvalue>
    40. param>
    41. <param>
    42. <name>main.ldapRealm.userSearchFiltername>
    43. <value>(&(objectclass=person)(uid={0})(uid=myhdfs))value>
    44. param>
    45. <param>
    46. <name>main.ldapRealm.contextFactory.urlname>
    47. <value>ldap://localhost:33389value>
    48. param>
    49. <param>
    50. <name>main.ldapRealm.contextFactory.authenticationMechanismname>
    51. <value>simplevalue>
    52. param>
    53. <param>
    54. <name>urls./**name>
    55. <value>authcBasicvalue>
    56. param>
    57. provider>
    58. <provider>
    59. <role>identity-assertionrole>
    60. <name>Defaultname>
    61. <enabled>trueenabled>
    62. provider>
    63. <provider>
    64. <role>hostmaprole>
    65. <name>staticname>
    66. <enabled>trueenabled>
    67. <param>
    68. <name>localhostname>
    69. <value>my_hdfsvalue>
    70. param>
    71. provider>
    72. gateway>
    73. <service>
    74. <role>HDFSUIrole>
    75. <url>http://hadoop02:50070url>
    76. <version>2.7.0version>
    77. service>
    78. topology>
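
    The userSearchFilter is what ties this topology to a single user: the LDAP search must match both the name typed at login ({0}) and the literal uid=myhdfs, so every other account fails authentication here. A quick command-line check (a sketch; -k accepts the gateway's self-signed certificate):

    # myhdfs can reach the HDFS UI through this topology...
    curl -iku myhdfs:123456 "https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs"
    # ...while a valid user outside the filter, e.g. myyarn, should get HTTP 401.
    curl -iku myyarn:123 "https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs"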

    my_yarn.xml

    1. "1.0" encoding="UTF-8"?>
    2. <topology>
    3. <name>my_yarnname>
    4. <gateway>
    5. <provider>
    6. <role>authenticationrole>
    7. <name>ShiroProvidername>
    8. <enabled>trueenabled>
    9. <param>
    10. <name>sessionTimeoutname>
    11. <value>30value>
    12. param>
    13. <param>
    14. <name>main.ldapRealmname>
    15. <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealmvalue>
    16. param>
    17. <param>
    18. <name>main.ldapContextFactoryname>
    19. <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactoryvalue>
    20. param>
    21. <param>
    22. <name>main.ldapRealm.contextFactoryname>
    23. <value>$ldapContextFactoryvalue>
    24. param>
    25. <param>
    26. <name>main.ldapRealm.contextFactory.systemUsernamename>
    27. <value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=orgvalue>
    28. param>
    29. <param>
    30. <name>main.ldapRealm.contextFactory.systemPasswordname>
    31. <value>${ALIAS=ldcSystemPassword}value>
    32. param>
    33. <param>
    34. <name>main.ldapRealm.userSearchBasename>
    35. <value>ou=myhadoop,dc=hadoop,dc=apache,dc=orgvalue>
    36. param>
    37. <param>
    38. <name>main.ldapRealm.userSearchAttributeNamename>
    39. <value>uidvalue>
    40. param>
    41. <param>
    42. <name>main.ldapRealm.userSearchFiltername>
    43. <value>(&(objectclass=person)(uid={0})(uid=myyarn))value>
    44. param>
    45. <param><name>csrf.enabledname><value>truevalue>param>
    46. <param><name>csrf.customHeadername><value>X-XSRF-Headervalue>param>
    47. <param><name>csrf.methodsToIgnorename><value>GET,OPTIONS,HEADvalue>param>
    48. <param><name>cors.enabledname><value>falsevalue>param>
    49. <param><name>xframe.options.enabledname><value>truevalue>param>
    50. <param><name>xss.protection.enabledname><value>truevalue>param>
    51. <param><name>strict.transport.enabledname><value>truevalue>param>
    52. <param><name>rate.limiting.enabledname><value>truevalue>param>
    53. <param>
    54. <name>main.ldapRealm.contextFactory.urlname>
    55. <value>ldap://localhost:33389value>
    56. param>
    57. <param>
    58. <name>main.ldapRealm.contextFactory.authenticationMechanismname>
    59. <value>simplevalue>
    60. param>
    61. <param>
    62. <name>urls./**name>
    63. <value>authcBasicvalue>
    64. param>
    65. provider>
    66. <provider>
    67. <role>identity-assertionrole>
    68. <name>Defaultname>
    69. <enabled>trueenabled>
    70. provider>
    71. <provider>
    72. <role>hostmaprole>
    73. <name>staticname>
    74. <enabled>trueenabled>
    75. <param>
    76. <name>localhostname>
    77. <value>my_yarnvalue>
    78. param>
    79. provider>
    80. gateway>
    81. <service>
    82. <role>YARNUIrole>
    83. <url>http://hadoop03:8088url>
    84. service>
    85. <service>
    86. <role>JOBHISTORYUIrole>
    87. <url>http://hadoop03:19888url>
    88. service>
    89. topology>
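
    Compared with my_hdfs.xml, this topology also switches on the CSRF, X-Frame-Options, XSS, strict-transport, and rate-limiting protections. With csrf.enabled set to true, any method not listed in csrf.methodsToIgnore (GET, OPTIONS, HEAD) must carry the configured X-XSRF-Header or the gateway rejects it. A hedged illustration (the POST path is a placeholder, not a real endpoint of this topology):

    # GET is in csrf.methodsToIgnore, so normal browsing needs no extra header.
    curl -iku myyarn:123 "https://192.168.200.11:18483/my/mimi/my_yarn/yarn"
    # A state-changing request would need the custom header, e.g.:
    curl -iku myyarn:123 -X POST -H "X-XSRF-Header: anyvalue" \
      "https://192.168.200.11:18483/my/mimi/my_yarn/<some-write-endpoint>"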

    Note: the ${ALIAS=ldcSystemPassword} in the configuration is a generated encrypted alias. If the test does not succeed, replace the alias with the actual plaintext password and test again.
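
    The alias can be created with knoxcli once the master secret exists (the create-master step below). A sketch, assuming the stored value is the bind password qwe of the myclient system user from users.ldif, created once per topology since aliases are scoped to the cluster/topology name:

    # Store the LDAP system password under the alias the topologies reference.
    ./knoxcli.sh create-alias ldcSystemPassword --cluster my_hdfs --value qwe
    ./knoxcli.sh create-alias ldcSystemPassword --cluster my_yarn --value qwe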

    Then go into knox-2.0.0/bin and run the following (note: do not run them as the root user):

    ./ldap.sh start
    ./knoxcli.sh create-master
    ./gateway.sh start
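
    To confirm everything came up, check the process status and the gateway log (a sketch, assuming the default logs directory of the Knox distribution):

    ./gateway.sh status              # reports whether the gateway process is running
    tail -n 20 ../logs/gateway.log   # should show my_hdfs and my_yarn being deployed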

    Browser access:

    https://192.168.200.11:18483/my/mimi/my_yarn/yarn

    Account: myyarn, password: 123

    https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs

    Account: myhdfs, password: 123456

    Other Notes

    After repeated testing, I successfully got different users to access different pages. A few observations:

    1. A separate topology XML is required for each user in order to give different users access to different pages.

    2. I only tested HDFS, YARN, and HBase successfully. I then tried Hue and could not get it to work no matter what.

    Note: the single-user topology XMLs in this post can be copied directly and then modified. Many of the variants described in the official documentation failed in my tests.
