Use Secure Impersonation to Access a Cloudera Cluster
This article explains how to configure the Pentaho Server to connect to a Cloudera Hadoop 5.9 cluster using secure impersonation. For an overview of secure impersonation, refer to Setting Up Big Data Security. The following sections guide you through the setup and configuration process:
- Prerequisites
- Parameter Configuration
- Configuring MapReduce Jobs (Windows-only)
- Connecting to a Cloudera Impala Database
- Next Steps
Prerequisites
The following requirements must be met to use secure impersonation:
- The cluster must be secured with Kerberos, and the Kerberos server used by the cluster must be accessible to the Pentaho Server.
- The Pentaho computer must have Kerberos installed and configured as explained in Set Up Kerberos for Pentaho.
If your system has version 8 of the Java Runtime Environment (JRE) or the Java Development Kit (JDK) installed, you do not need to install the Kerberos client, since it is included in the Java installation. You will still need to modify the Kerberos configuration file, krb5.conf, as specified in the Set Up Kerberos for Pentaho topic (a sample krb5.conf sketch appears at the end of this section).
- Pentaho shims for client and server must be configured for each component as explained in Edit Secured Cluster Configuration Properties.
Follow the instructions below for editing the config.properties file instead of the instructions in the Edit config.properties (Secured Clusters) section of the Set up Pentaho to Connect to a Cloudera Cluster article.
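For reference, a minimal krb5.conf sketch is shown below. The realm EXAMPLE.COM and the host kdc.example.com are placeholders rather than values from this article; substitute the realm and KDC host used by your cluster, and treat the Set Up Kerberos for Pentaho topic as the authoritative source for these settings.

    # Minimal example only; all realm and host names are placeholders.
    [libdefaults]
      default_realm = EXAMPLE.COM

    [realms]
      EXAMPLE.COM = {
        kdc = kdc.example.com
        admin_server = kdc.example.com
      }

    [domain_realm]
      .example.com = EXAMPLE.COM
      example.com = EXAMPLE.COM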
Parameter Configuration
The mapping type value in the config.properties file turns secure impersonation on or off. The mapping types supported by the Pentaho Server are disabled and simple. When the value is set to disabled or left blank, the Pentaho Server does not use authentication. When it is set to simple, Pentaho users can connect to the Hadoop cluster as a proxy user.
To configure the cluster for secure impersonation, stop the Pentaho Server and complete the following steps:
- Navigate to the pentaho-server\pentaho-solutions\system\kettle\plugins\pentaho-big-data-plugin\hadoop-configurations\cdh59 folder and open the config.properties file with a text editor.
- Modify the config.properties file with the following parameter values (a sample excerpt appears at the end of this section):
  - pentaho.authentication.default.kerberos.principal: exampleUser@EXAMPLE.COM
  - pentaho.authentication.default.kerberos.keytabLocation: Set the location of the Kerberos keytab file. You only need to set the password or the keytab, not both.
  - pentaho.authentication.default.kerberos.password: Set the Kerberos password. You only need to set the password or the keytab, not both.
  - pentaho.authentication.default.mapping.impersonation.type: simple
  - pentaho.authentication.default.mapping.server.credentials.kerberos.principal: exampleUser@EXAMPLE.COM
  - pentaho.authentication.default.mapping.server.credentials.kerberos.keytabLocation: Set the location of the Kerberos keytab file. You only need to set the password or the keytab, not both.
  - pentaho.authentication.default.mapping.server.credentials.kerberos.password: Set the Kerberos password. You only need to set the password or the keytab, not both.
  - pentaho.oozie.proxy.user: Add the proxy user's name if you plan to access the Oozie service through a proxy. Otherwise, leave it set to oozie.
If your existing config.properties file contains settings that are not security related, merge those settings into the new file.
- Save and close the config.properties file.
- Copy the config.properties file to the following folders:
  - design-tools/report-designer/plugins/pentaho-big-data-plugin/hadoop-configurations/cdh59/config.properties
  - design-tools/metadata-editor/plugins/pentaho-big-data-plugin/hadoop-configurations/cdh59/config.properties
  - design-tools/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/cdh59/config.properties
- Restart the Pentaho Server.
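For illustration, the security-related entries in config.properties might look like the following sketch. It assumes keytab-based authentication with the placeholder principal exampleUser@EXAMPLE.COM and a hypothetical keytab path of /opt/pentaho/keytabs/exampleUser.keytab; replace both with your own values.

    # Placeholder principal and hypothetical keytab path; replace with your own.
    pentaho.authentication.default.kerberos.principal=exampleUser@EXAMPLE.COM
    pentaho.authentication.default.kerberos.keytabLocation=/opt/pentaho/keytabs/exampleUser.keytab
    # Leave the password blank when a keytab is supplied.
    pentaho.authentication.default.kerberos.password=

    # Turn on simple impersonation mapping.
    pentaho.authentication.default.mapping.impersonation.type=simple
    pentaho.authentication.default.mapping.server.credentials.kerberos.principal=exampleUser@EXAMPLE.COM
    pentaho.authentication.default.mapping.server.credentials.kerberos.keytabLocation=/opt/pentaho/keytabs/exampleUser.keytab
    pentaho.authentication.default.mapping.server.credentials.kerberos.password=

    # Leave set to oozie unless you access Oozie through a different proxy user.
    pentaho.oozie.proxy.user=oozie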
Configuring MapReduce Jobs (Windows-only)
For Windows systems, you must modify the mapred-site.xml files to run MapReduce jobs with secure impersonation. Complete the following steps to modify the files:
- Navigate to the design-tools\data-integration\plugins\pentaho-big-data-plugin\hadoop-configurations\cdh59 folder and open the mapred-site.xml file with a text editor.
- Navigate to the pentaho-server\pentaho-solutions\system\kettle\plugins\pentaho-big-data-plugin\hadoop-configurations\cdh59 folder and open the mapred-site.xml file with a text editor.
- Add the following two properties to both mapred-site.xml files:

      <property>
        <name>mapreduce.app-submission.cross-platform</name>
        <value>true</value>
      </property>
      <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
      </property>
- Save and close the files.
Connecting to a Cloudera Impala Database
Complete the following steps to connect to a secure Cloudera Impala database:
- Download the Cloudera Impala JDBC driver for your operating system from the Cloudera website: http://www.cloudera.com/downloads/connectors/impala/jdbc/2-5-29.html.
Secure impersonation with Impala is only supported with the Cloudera Impala JDBC driver. You may have to create an account with Cloudera to download the driver file.
- Extract the ImpalaJDBC41.jar file from the downloaded zip file into the pentaho-server/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/hadoop-configurations/cdh59/lib folder. The ImpalaJDBC41.jar file is the only file you need to extract from the download.
- Connect to a secure CDH cluster. If you have not set one up, complete the procedure in the article Set up Pentaho to Connect to a Cloudera Cluster.
- Start the PDI Client (Spoon) and choose File > New > Transformation to add a new transformation.
- Click the View tab, then right-click Database Connections and choose New.
- In the Database Connection dialog box, enter the following values:
  - Connection Name: A user-defined name
  - Connection Type: Cloudera Impala
  - Host Name: The hostname of the Impala server
  - Database Name: default
  - Port Number: 21050
- Click Options in the left pane of the Database Connection dialog box and enter the following parameter values (a standalone JDBC sketch using these parameters appears after these steps):
  - KrbHostFQDN: The fully qualified domain name of the Impala host
  - KrbServiceName: The service principal name of the Impala server
  - KrbRealm: The Kerberos realm used by the cluster
- Click Test when your settings are entered. A success message appears if everything was entered correctly.
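To verify the driver and Kerberos settings outside of the PDI client, a minimal Java sketch such as the following can be used. It assumes the Cloudera Impala JDBC 4.1 driver (class com.cloudera.impala.jdbc41.Driver from ImpalaJDBC41.jar) is on the classpath, that a valid Kerberos ticket is already available, and that AuthMech=1 selects Kerberos authentication in this driver; the host, FQDN, and realm values are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ImpalaKerberosTest {
        public static void main(String[] args) throws Exception {
            // Load the Cloudera Impala JDBC 4.1 driver (from ImpalaJDBC41.jar).
            Class.forName("com.cloudera.impala.jdbc41.Driver");

            // AuthMech=1 selects Kerberos; the Krb* properties mirror the values
            // entered on the Options pane of the Database Connection dialog box.
            // All host and realm values below are placeholders.
            String url = "jdbc:impala://impala-host.example.com:21050/default;"
                    + "AuthMech=1;"
                    + "KrbHostFQDN=impala-host.example.com;"
                    + "KrbServiceName=impala;"
                    + "KrbRealm=EXAMPLE.COM";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }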
Next Steps
When you have saved your changes to the repository and your Hadoop cluster is connected to the Pentaho Server, you are ready to use secure impersonation to run your transformations and jobs from the Pentaho Server.
Secure impersonation from the PDI client is not currently supported.
If you have not yet connected your Hadoop cluster to the Pentaho Server, continue to the Edit hbase-site.xml section in Step 4: Edit the Shim Configuration Files.