Hadoop Connection and Access Information List
Overview
Identifies what information is needed to configure and connect to your cluster.
After your Hadoop cluster has been configured, users need some information and permissions to connect to the cluster and access its services.
Pentaho
You'll need read access to the following Pentaho directories:
- Pentaho Server, Spoon, PRD, and PME shim directories
- Pentaho Log Directories
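As a quick sanity check, a short script can verify that the account running Pentaho can actually read these directories. The paths below are placeholders; substitute your real shim and log locations.

```python
import os

# Placeholder paths -- substitute your actual Pentaho shim and log directories.
paths_to_check = [
    "/opt/pentaho/pentaho-server/pentaho-solutions/system/kettle/plugins",
    "/opt/pentaho/pentaho-server/tomcat/logs",
]

def unreadable(paths):
    """Return the subset of paths that are missing or not readable by the current user."""
    return [p for p in paths if not (os.path.isdir(p) and os.access(p, os.R_OK))]

for p in unreadable(paths_to_check):
    print(f"No read access (or missing): {p}")
```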
Hadoop Cluster
You'll need to know the following information about your Hadoop cluster. You can get this information from your Hadoop Administrator or the Hadoop Cluster Management tool.
- Installed version
- Hostname and IP Address for each cluster node, including YARN servers
- If your cluster is enabled for High Availability, you will also need the name of the nameservice (the logical name clients use in place of a single NameNode hostname)
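On an HA-enabled cluster, the nameservice is defined in hdfs-site.xml. An illustrative fragment, where `mycluster`, `nn1`, `nn2`, and the hostnames are all placeholders:

```xml
<!-- Illustrative HA fragment; mycluster, nn1/nn2, and hostnames are placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
```

Clients then use `hdfs://mycluster` rather than a specific NameNode host.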
If you are using MapR, you'll also need:
- Version and location of MapR clients
- Hostnames, IP addresses, and port numbers for the CLDB (Container Location Database) nodes. The default port is 7222, and it is configured in the MapR client.
Optional Services
If you are going to use the following services, you'll need this information.
HDFS
Your Hadoop Administrator should be able to provide this information:
- Hostname/IP Address, NameNode Port, and NameNode Web Console Port
- Paths to directories that will be used
- Owners for the various data sets in HDFS
- If S3 is used, you'll need the access key and secret key
You will also need permission to access the directories you plan to use.
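If S3 is accessed through the s3a filesystem, the keys are typically supplied in core-site.xml (or, preferably, a Hadoop credential provider). An illustrative fragment:

```xml
<!-- Illustrative s3a credentials; prefer a Hadoop credential provider over plain text -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```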
Hive2 and Impala
Check with your Hadoop Administrator or the Hadoop Cluster Management Console for:
- Username (and password) the service will run under
- Hostname, IP Address, and Port
- JDBC URL (you must use the Thrift interface)
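Once you have the hostname, port, and credentials, the Thrift-based JDBC URL follows a predictable shape. A small sketch; the default HiveServer2 port (10000), database name, and principal format are assumptions to adjust for your cluster:

```python
def hive2_jdbc_url(host, port=10000, database="default", principal=None):
    """Build a HiveServer2 JDBC URL (Thrift interface).

    On a Kerberized cluster a service principal is appended;
    the principal format shown in the usage note is illustrative.
    """
    url = f"jdbc:hive2://{host}:{port}/{database}"
    if principal:
        url += f";principal={principal}"
    return url

print(hive2_jdbc_url("hiveserver.example.com"))
# -> jdbc:hive2://hiveserver.example.com:10000/default
```

For Impala, the same Thrift-based JDBC approach applies against Impala's own daemon port (commonly 21050).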
HBase
See your Hadoop Administrator or the Hadoop Cluster Management Console to get information on:
- ZooKeeper quorum hostname(s)
- ZooKeeper client port
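These two values map directly onto hbase-site.xml. An illustrative fragment; the quorum hosts are placeholders and 2181 is the standard ZooKeeper client port:

```xml
<!-- Illustrative values; quorum hosts and port come from your Hadoop Administrator -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zk1.example.com,zk2.example.com,zk3.example.com</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>
```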
Oozie
Check your Hadoop Administrator or the Hadoop Cluster Management Console to get the:
- URL to Oozie web interface
- JobTracker Hostname/IP and Port number (or ResourceManager Hostname/IP and Port number)
- NameNode Hostname/IP and Port number
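These values map directly onto an Oozie job.properties file. An illustrative sketch; hosts, ports, and the workflow path are placeholders:

```properties
# Illustrative job.properties; hosts, ports, and paths are placeholders
nameNode=hdfs://namenode.example.com:8020
jobTracker=resourcemanager.example.com:8032
oozie.wf.application.path=${nameNode}/user/pentaho/workflows/my-workflow
```

Note that Oozie's `jobTracker` property is also used for the ResourceManager address on YARN clusters.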
Pig
Get this information from the PDI Developer.
- Username used to access HDFS
- JobTracker Hostname/IP and Port number (or ResourceManager Hostname/IP and Port number)
- Hostname/IP Address, NameNode Port, and NameNode Web Console Port
Pentaho MapReduce (PMR)
See your Hadoop Administrator or the Hadoop Cluster Management Console to get information on:
- Job History Server IP and Port
- JobTracker Hostname/IP and Port number (or ResourceManager Hostname/IP and Port number)
- Hostname/IP Address, NameNode Port, and NameNode Web Console Port
Sqoop
Get this information from the IT Administrator.
- JDBC Connection details for target or source databases
- JDBC Drivers
- JobTracker Hostname/IP and Port number (or ResourceManager Hostname/IP and Port number)
- Hostname/IP Address, NameNode Port, and NameNode Web Console Port
Get this information from the PDI Developer or Database Administrator.
- Username used to access HDFS
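To see how these details fit together, here is a sketch that assembles a `sqoop import` command line from them. Every value in the usage example (URL, table, user, paths) is a placeholder; the flags themselves are standard Sqoop options.

```python
def sqoop_import_cmd(jdbc_url, username, table, target_dir, password_file=None):
    """Assemble a `sqoop import` command from the connection details listed above.

    All argument values are placeholders supplied by the caller.
    """
    cmd = [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", username,
        "--table", table,
        "--target-dir", target_dir,
    ]
    if password_file:
        # Safer than --password on the command line
        cmd += ["--password-file", password_file]
    return cmd

print(" ".join(sqoop_import_cmd(
    "jdbc:mysql://db.example.com:3306/sales",  # source database (placeholder)
    "etl_user",
    "orders",
    "/user/pentaho/orders",
)))
```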