The Apache Knox Gateway is a system that provides a single point of authentication and access for Apache Hadoop services in a cluster. The goal is to simplify Hadoop security for both users (i.e. those who access the cluster data and execute jobs) and operators (i.e. those who control access and manage the cluster). The gateway runs as a server (or a cluster of servers) that provides centralized access to one or more Hadoop clusters.
After unpacking, the directory layout is as follows:
Enter the conf directory:
File descriptions:
The gateway configuration file; my modifications are as follows:
```xml
<property>
    <name>gateway.port</name>
    <value>18483</value>
    <description>The HTTP port for the Gateway.</description>
</property>
<property>
    <name>gateway.path</name>
    <value>my/mimi</value>
    <description>The default context path for the gateway.</description>
</property>
<property>
    <name>gateway.dispatch.whitelist</name>
    <value>.*$</value>
</property>
```
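With `gateway.port` set to 18483 and `gateway.path` set to `my/mimi`, every service URL exposed by Knox follows the pattern `https://host:port/path/topology/service`. A minimal Python sketch of how these pieces combine (the host and topology names are simply the values used later in this article):

```python
# Sketch: how gateway.port and gateway.path combine into Knox service URLs.
# Host, topology, and service names are the ones used in this article.

def knox_url(host: str, port: int, path: str, topology: str, service: str) -> str:
    """Build the externally visible URL for a service proxied by Knox."""
    return f"https://{host}:{port}/{path}/{topology}/{service}"

url = knox_url("192.168.200.11", 18483, "my/mimi", "my_yarn", "yarn")
print(url)  # https://192.168.200.11:18483/my/mimi/my_yarn/yarn
```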
The users file, in full:
```
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

version: 1

# Please replace with site specific values
dn: dc=hadoop,dc=apache,dc=org
objectclass: organization
objectclass: dcObject
o: Hadoop
dc: hadoop

# Entry for a sample people container
# Please replace with site specific values
dn: ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: people

dn: ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: myhadoop

# Entry for a sample end user
# Please replace with site specific values
dn: uid=guest,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Guest
sn: User
uid: guest
userPassword:123456

dn: uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Myclient
sn: Client
uid: myclient
userPassword:qwe

dn: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: myhdfs
sn: myhdfs
uid: myhdfs
userPassword:123456

dn: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: myyarn
sn: myyarn
uid: myyarn
userPassword:123

# entry for sample user admin
dn: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Admin
sn: Admin
uid: admin
userPassword:123456

dn: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: Admin
sn: Admin
uid: root
userPassword:123456

# entry for sample user sam
dn: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: sam
sn: sam
uid: sam
userPassword:sam-password

# entry for sample user tom
dn: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:person
objectclass:organizationalPerson
objectclass:inetOrgPerson
cn: tom
sn: tom
uid: tom
userPassword:tom-password

# create FIRST Level groups branch
dn: ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass:organizationalUnit
ou: groups
description: generic groups branch

# create the analyst group under groups
dn: cn=analyst,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: analyst
description: analyst group
member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=tom,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
member: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org

# create the scientist group under groups
dn: cn=scientist,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: scientist
description: scientist group
member: uid=sam,ou=people,dc=hadoop,dc=apache,dc=org

# create the admin group under groups
dn: cn=admin,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass:top
objectclass: groupofnames
cn: admin
description: admin group
member: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=root,ou=myhadoop,dc=hadoop,dc=apache,dc=org
```
Note that I added two users here: myhdfs and myyarn.
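The structure of those entries is easy to check mechanically. A small Python sketch that parses `dn:` lines like the ones in the users file above (only a three-line subset is embedded here) and lists each uid with the organizational unit it lives under:

```python
# Sketch: extract uid/ou pairs from LDIF "dn:" lines, as in the users file above.

ldif = """\
dn: uid=guest,ou=people,dc=hadoop,dc=apache,dc=org
dn: uid=myhdfs,ou=myhadoop,dc=hadoop,dc=apache,dc=org
dn: uid=myyarn,ou=myhadoop,dc=hadoop,dc=apache,dc=org
"""

def parse_dn(dn: str) -> dict:
    """Split a DN like 'uid=x,ou=y,...' into a dict keeping the first value per key."""
    parts = {}
    for rdn in dn.split(","):
        key, _, value = rdn.partition("=")
        parts.setdefault(key.strip(), value.strip())
    return parts

users = [parse_dn(line[len("dn: "):]) for line in ldif.splitlines() if line.startswith("dn: ")]
for u in users:
    print(u["uid"], u["ou"])
# guest people
# myhdfs myhadoop
# myyarn myhadoop
```

The two added accounts sit under `ou=myhadoop`, which is exactly the subtree the topologies below use as their `userSearchBase`.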
Go into the knox-2.0.0/conf/topologies directory and you will find a sandbox.xml file.
It is a template; copy it under a new name when you use it. Mine are as follows:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<topology>
    <name>my_hdfs</name>
    <gateway>
        <provider>
            <role>authentication</role>
            <name>ShiroProvider</name>
            <enabled>true</enabled>
            <param>
                <name>sessionTimeout</name>
                <value>30</value>
            </param>
            <param>
                <name>main.ldapRealm</name>
                <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealm</value>
            </param>
            <param>
                <name>main.ldapContextFactory</name>
                <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactory</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory</name>
                <value>$ldapContextFactory</value>
            </param>

            <param>
                <name>main.ldapRealm.contextFactory.systemUsername</name>
                <value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory.systemPassword</name>
                <value>${ALIAS=ldcSystemPassword}</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchBase</name>
                <value>ou=myhadoop,dc=hadoop,dc=apache,dc=org</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchAttributeName</name>
                <value>uid</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchFilter</name>
                <value>(&amp;(objectclass=person)(uid={0})(uid=myhdfs))</value>
            </param>

            <param>
                <name>main.ldapRealm.contextFactory.url</name>
                <value>ldap://localhost:33389</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
                <value>simple</value>
            </param>
            <param>
                <name>urls./**</name>
                <value>authcBasic</value>
            </param>
        </provider>
        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
        </provider>
        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>true</enabled>
            <param>
                <name>localhost</name>
                <value>my_hdfs</value>
            </param>
        </provider>
    </gateway>

    <service>
        <role>HDFSUI</role>
        <url>http://hadoop02:50070</url>
        <version>2.7.0</version>
    </service>
</topology>
```
```xml
<?xml version="1.0" encoding="UTF-8"?>
<topology>
    <name>my_yarn</name>
    <gateway>
        <provider>
            <role>authentication</role>
            <name>ShiroProvider</name>
            <enabled>true</enabled>
            <param>
                <name>sessionTimeout</name>
                <value>30</value>
            </param>
            <param>
                <name>main.ldapRealm</name>
                <value>org.apache.knox.gateway.shirorealm.KnoxLdapRealm</value>
            </param>
            <param>
                <name>main.ldapContextFactory</name>
                <value>org.apache.knox.gateway.shirorealm.KnoxLdapContextFactory</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory</name>
                <value>$ldapContextFactory</value>
            </param>

            <param>
                <name>main.ldapRealm.contextFactory.systemUsername</name>
                <value>uid=myclient,ou=people,dc=hadoop,dc=apache,dc=org</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory.systemPassword</name>
                <value>${ALIAS=ldcSystemPassword}</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchBase</name>
                <value>ou=myhadoop,dc=hadoop,dc=apache,dc=org</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchAttributeName</name>
                <value>uid</value>
            </param>
            <param>
                <name>main.ldapRealm.userSearchFilter</name>
                <value>(&amp;(objectclass=person)(uid={0})(uid=myyarn))</value>
            </param>

            <param><name>csrf.enabled</name><value>true</value></param>
            <param><name>csrf.customHeader</name><value>X-XSRF-Header</value></param>
            <param><name>csrf.methodsToIgnore</name><value>GET,OPTIONS,HEAD</value></param>
            <param><name>cors.enabled</name><value>false</value></param>
            <param><name>xframe.options.enabled</name><value>true</value></param>
            <param><name>xss.protection.enabled</name><value>true</value></param>
            <param><name>strict.transport.enabled</name><value>true</value></param>
            <param><name>rate.limiting.enabled</name><value>true</value></param>

            <param>
                <name>main.ldapRealm.contextFactory.url</name>
                <value>ldap://localhost:33389</value>
            </param>
            <param>
                <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
                <value>simple</value>
            </param>
            <param>
                <name>urls./**</name>
                <value>authcBasic</value>
            </param>
        </provider>
        <provider>
            <role>identity-assertion</role>
            <name>Default</name>
            <enabled>true</enabled>
        </provider>
        <provider>
            <role>hostmap</role>
            <name>static</name>
            <enabled>true</enabled>
            <param>
                <name>localhost</name>
                <value>my_yarn</value>
            </param>
        </provider>
    </gateway>
    <service>
        <role>YARNUI</role>
        <url>http://hadoop03:8088</url>
    </service>
    <service>
        <role>JOBHISTORYUI</role>
        <url>http://hadoop03:19888</url>
    </service>
</topology>
```
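The `userSearchFilter` is what pins each topology to a single account: `{0}` is replaced with the login name, and the extra `(uid=myhdfs)` (or `(uid=myyarn)`) clause means the AND filter can only match when the login name is exactly that user. A rough Python illustration of that AND logic (not a real LDAP evaluator):

```python
# Sketch: why (&(objectclass=person)(uid={0})(uid=myhdfs)) matches only one user.
# This mimics the AND semantics of the filter; it is not a real LDAP library.

def filter_matches(login_uid: str, pinned_uid: str = "myhdfs") -> bool:
    """(uid={0}) substitutes the login name, so both uid clauses hold
    only when the login name equals the pinned uid."""
    return login_uid == pinned_uid

print(filter_matches("myhdfs"))  # True  -> myhdfs can log in to my_hdfs
print(filter_matches("myyarn"))  # False -> every other user is rejected
```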
Note: the ${ALIAS=ldcSystemPassword} in the configuration is a generated ciphertext for the LDAP system password.
Reference:
If the test does not succeed with the alias, substitute the actual password and test again.
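For reference, Knox's CLI can create that alias per topology. A sketch, assuming the system user's password is the one given for myclient in the users file above (qwe):

```shell
# Store the LDAP system password under the alias referenced by
# ${ALIAS=ldcSystemPassword}, once per topology that uses it.
./knoxcli.sh create-alias ldcSystemPassword --cluster my_hdfs --value qwe
./knoxcli.sh create-alias ldcSystemPassword --cluster my_yarn --value qwe
```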
Then go into knox-2.0.0/bin
and run the following (note: do not run as the root user):
```shell
./ldap.sh start
./knoxcli.sh create-master
./gateway.sh start
```
Browser access:
https://192.168.200.11:18483/my/mimi/my_yarn/yarn
Account: myyarn, password: 123
https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs
Account: myhdfs, password: 123456
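The browser prompts for these credentials; from a script, the same login is just an HTTP Basic Authorization header. A stdlib-only Python sketch that builds (but does not send) such a request for the myhdfs account:

```python
import base64
import urllib.request

# Sketch: construct a Basic-auth request for the Knox URL above.
# Only builds the request object; it does not contact the gateway.
url = "https://192.168.200.11:18483/my/mimi/my_hdfs/hdfs"
credentials = base64.b64encode(b"myhdfs:123456").decode("ascii")

request = urllib.request.Request(url)
request.add_header("Authorization", f"Basic {credentials}")
print(request.get_header("Authorization"))  # Basic bXloZGZzOjEyMzQ1Ng==
```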
Through repeated testing I successfully got different users to reach different pages. A few remaining issues:
1. Separate topology XML files are required; that was the only way I found to give different users access to different pages.
2. I only succeeded with hdfs, yarn, and hbase. I then tried Hue and could not get it to work at all.
Note: the single-user topology XML in this article can be copied directly and then adapted. Many of the setups described in the official documentation failed in my tests.