Apache Knox Setup, and Accessing HDFS and Hive Through Knox

Apache Knox:

The Apache Knox™ Gateway is an Application Gateway for interacting with the REST APIs and UIs of Apache Hadoop deployments.

Set Up Knox:
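The heart of a Knox setup is the topology file, which the credentials below come from. This is a minimal sketch of what such a file (e.g. conf/topologies/default.xml) typically looks like; the hostnames, ports, and provider parameters are assumptions and must be adjusted for your cluster:

```xml
<!-- Minimal sketch of a Knox topology file (assumed paths/hosts). -->
<topology>
  <gateway>
    <!-- Authenticates users (e.g. guest/guest-password) via Shiro. -->
    <provider>
      <role>authentication</role>
      <name>ShiroProvider</name>
      <enabled>true</enabled>
    </provider>
  </gateway>
  <!-- Backend services Knox proxies; URLs are assumptions for an HDP-style cluster. -->
  <service>
    <role>WEBHDFS</role>
    <url>http://hadoopmaster.com:50070/webhdfs</url>
  </service>
  <service>
    <role>HIVE</role>
    <url>http://hadoopmaster.com:10001/cliservice</url>
  </service>
</topology>
```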

Access the Hive:

  • In Ambari, change the Hive config hive.server2.transport.mode from binary to http
  • Then, from a Linux shell, run the following commands:
  • beeline
  • !connect jdbc:hive2://hadoopmaster.com:8443/;ssl=true;sslTrustStore=/var/lib/knox/data-2.6.1.0-129/security/keystores/gateway.jks;trustStorePassword=password@123?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive
  • hadoopmaster.com is your Knox host address
  • 8443 is the Knox gateway port
  • The sslTrustStore path changes according to the Knox version
  • trustStorePassword is the password used during setup
  • Enter username guest
  • Enter password guest-password
  • (The above username and password come from the Advanced topology file in the Knox config)
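The connect string above is easier to adapt when assembled from its parts. A minimal sketch, assuming the same host, port, truststore path, and password shown in the steps above (all placeholders to substitute for your own setup):

```shell
# Placeholders -- replace with your Knox host, port, and keystore details.
KNOX_HOST=hadoopmaster.com
KNOX_PORT=8443
TRUSTSTORE=/var/lib/knox/data-2.6.1.0-129/security/keystores/gateway.jks
TRUSTSTORE_PASS='password@123'

# Assemble the JDBC URL used by the !connect command inside beeline.
JDBC_URL="jdbc:hive2://${KNOX_HOST}:${KNOX_PORT}/;ssl=true;sslTrustStore=${TRUSTSTORE};trustStorePassword=${TRUSTSTORE_PASS}?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive"

# The same connection can be made non-interactively; echoed here as a sketch
# rather than executed, since it needs a live Knox/Hive endpoint:
echo beeline -u "\"${JDBC_URL}\"" -n guest -p guest-password
```

The `-n`/`-p` flags pass the username and password on the command line instead of at the interactive prompts.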
Access the HDFS:

  • In Linux:
  • curl -i -k -u guest:guest-password -X GET 'https://<KnoxHostAddress>:8443/gateway/<ClusterName>/webhdfs/v1/?op=LISTSTATUS'
  • In a browser:
  • https://hadoopmaster.pratian.com:8443/gateway/default/webhdfs/v1/?op=GETFILESTATUS
  • Provide the same credentials (guest / guest-password) when prompted
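The WebHDFS calls above share a common base URL, so they can be parameterized. A minimal sketch, assuming the host and cluster name ("default") from this document; the curl commands are echoed rather than executed, since they need a live Knox gateway:

```shell
# Placeholders -- replace with your Knox address and topology (cluster) name.
KNOX_HOST=hadoopmaster.com
KNOX_PORT=8443
CLUSTER=default
BASE="https://${KNOX_HOST}:${KNOX_PORT}/gateway/${CLUSTER}/webhdfs/v1"

# List the HDFS root directory. -k skips TLS certificate verification,
# which is acceptable only for a self-signed test setup.
echo curl -i -k -u guest:guest-password -X GET "${BASE}/?op=LISTSTATUS"

# Fetch the status of the root path (same call the browser URL makes).
echo curl -i -k -u guest:guest-password -X GET "${BASE}/?op=GETFILESTATUS"
```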

Use this link for a detailed view:

https://www.slideshare.net/AbhishekMallick9/apache-knox-setup-and-hive-and-hdfs-access-using-knox