Hitachi Vantara Lumada and Pentaho Documentation

Use Kerberos with Spark Submit

This article explains how to execute Spark Submit jobs on secure Cloudera CDP clusters using Kerberos authentication. Spark jobs can be submitted to secure clusters by adding keytab and principal utility parameter values to the job. These values enable Kerberos authentication for Spark.


The following prerequisites must be completed before running the Spark jobs:

  • A Spark client must be installed. Refer to the article Spark Submit for information on installing and configuring the Spark client.
  • The cluster must be secured with Kerberos, and the Kerberos server used by the cluster must be accessible to the Pentaho Server.
  • The Pentaho computer must have Kerberos installed and configured as explained in Set Up Kerberos for Pentaho.
Note: A valid Kerberos ticket must already be in the ticket cache on your client machine before you launch and submit the Spark Submit job.
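The ticket-cache prerequisite above can be checked from the client machine before launching the job. This is a hedged sketch: the keytab path and principal below are placeholder assumptions, not values from this article.

```shell
# Hypothetical sketch: obtain a Kerberos ticket and confirm it is in
# the ticket cache before launching the Spark Submit job.
# KEYTAB and PRINCIPAL are placeholder values -- substitute your own.
KEYTAB=/etc/security/keytabs/pentaho.keytab
PRINCIPAL=pentaho@EXAMPLE.COM

if command -v kinit >/dev/null 2>&1; then
  kinit -kt "$KEYTAB" "$PRINCIPAL"   # non-interactive login with the keytab
  klist                              # list cached tickets to verify
else
  echo "Kerberos client tools not found; install the krb5 client first"
fi
```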

Spark Submit entry properties

Spark Submit dialog box

Configure your job setup with the parameters in the following table. For additional details, see Spark Submit.

Entry Name: Specify the name of the entry. You can customize this, or leave it as the default.

Spark Submit Utility: Specify the name of the script that launches the Spark job, which is the batch or shell file name of the underlying spark-submit tool. For example, spark2-submit.

Master URL: Choose a master URL for the cluster from the drop-down list:
  • yarn-cluster: Runs the driver program as a thread of the YARN application master (one of the node managers in the cluster). This option is similar to the way MapReduce works.
  • yarn-client: Runs the driver program on the YARN client. Tasks still execute in the node managers of the YARN cluster.
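As a point of reference, the two options above correspond to spark-submit's YARN deploy modes; in Spark 2.x the single yarn-cluster and yarn-client values are expressed with separate --master and --deploy-mode flags. A minimal sketch:

```shell
# Sketch: mapping the Master URL choice onto spark-submit flags.
# In Spark 2.x, yarn-cluster / yarn-client are written as
# "--master yarn" plus an explicit deploy mode.
MODE=cluster   # use "client" for the yarn-client behavior

FLAGS="--master yarn --deploy-mode ${MODE}"
printf '%s\n' "$FLAGS"
```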

Select the file type of the Spark job you want to submit. Your job can be written in Java, Scala, or Python. The fields displayed on the Files tab depend on the language option you select.

Python support on Windows requires Spark version 2.3.x or higher.

Utility Parameters

Specify the Name and Value of optional Spark configuration parameters associated with the spark-defaults.conf file. Add the following name and value pairs:

  • Name: spark.yarn.keytab

    Value: File path of the Kerberos keytab file

  • Name: spark.yarn.principal

    Value: Kerberos principal used to authenticate to the cluster
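With these two parameters in place, the underlying spark-submit invocation would look roughly like the following. This is a hedged sketch: the utility name, paths, principal, class, and jar are placeholder assumptions, not values defined in this article.

```shell
# Hypothetical sketch of the command the Spark Submit entry assembles
# when the keytab/principal utility parameters are set.
# All paths, the principal, the class, and the jar are placeholders.
KEYTAB=/etc/security/keytabs/pentaho.keytab
PRINCIPAL=pentaho@EXAMPLE.COM

CMD="spark2-submit \
  --master yarn --deploy-mode cluster \
  --conf spark.yarn.keytab=${KEYTAB} \
  --conf spark.yarn.principal=${PRINCIPAL} \
  --class org.example.MySparkJob \
  /opt/jobs/my-spark-job.jar"

printf '%s\n' "$CMD"
```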

Enable Blocking: Select Enable Blocking to have the Spark Submit entry wait until the Spark job finishes running. If this option is not selected, the Spark Submit entry proceeds with its execution once the Spark job is submitted. Blocking is enabled by default.

Note: Authentication via password is not supported.