Embed Pentaho Data Integration
You can get the accompanying sample project from the kettle-sdk-embedding-samples folder of the sample code package. The sample project is bundled with a minimal set of dependencies. In a real-world implementation, projects require the complete set of PDI dependencies that include all .jar files from data-integration/lib.
For each embedding scenario, there is a sample class that can be executed as a stand-alone Java application. You can execute the classes manually or through the Ant targets provided in build/build.xml.
Pentaho License File
Before initializing the Kettle environment, you must install the PDI Enterprise Edition license file for each user account. Then, to ensure that the Pentaho Server uses the same location to store and retrieve your Pentaho license, you must also create a PENTAHO_INSTALLED_LICENSE_PATH system environment variable for each account. The location of your license path must be available to the user accounts that run the Pentaho Server. For information about installing the license and setting the variable path, see Manage Licenses using the Command Line Interface.
Kettle Plugins
Before initializing the Kettle environment, you must make the Kettle plugins available to your application. Set the KETTLE_PLUGIN_BASE_FOLDERS system property to point to the PDI plugins that you want to use. For example: System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "c:\\plugins1,c:\\plugins2").
Run Transformations
The org.pentaho.di.sdk.samples.embedding.RunningTransformations class is an example of how to run a PDI transformation from Java code in a stand-alone application. This class sets the parameters and executes the transformation in etl/parametrized_transformation.ktr. The transformation can be run from the .ktr file using runTransformationFromFileSystem() or from a PDI repository using runTransformationFromRepository(). Important considerations:
- Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
- Prepare the transformation: The definition of a PDI transformation is represented by a TransMeta object. You can load this object from a .ktr file, a PDI repository, or you can generate it dynamically. To query the declared parameters of the transformation definition use listParameters(), and to set the assigned values use setParameterValue().
- Execute the transformation: An executable Trans object is derived from the TransMeta object that is passed to the constructor. The Trans object starts and then executes asynchronously. To ensure that all steps of the Trans object have completed, call waitUntilFinished().
- Evaluate the transformation outcome: After the Trans object completes, you can access the result using getResult(). The Result object can be queried for success by evaluating getNrErrors(). This method returns zero (0) on success and a non-zero value when there are errors. To get more information, retrieve the transformation log lines.
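The following minimal sketch illustrates this sequence for the file system case. It assumes the sample file etl/parametrized_transformation.ktr and a declared parameter named param1 (the parameter name is illustrative only); error handling is omitted for brevity:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformationSketch {
  public static void main(String[] args) throws Exception {
    // Always initialize the Kettle environment first
    KettleEnvironment.init();
    // Load the transformation definition from the file system
    TransMeta transMeta = new TransMeta("etl/parametrized_transformation.ktr");
    // Assign a value to a declared parameter ("param1" is illustrative)
    transMeta.setParameterValue("param1", "some value");
    // Derive an executable Trans object and run it
    Trans trans = new Trans(transMeta);
    trans.execute(null); // starts execution asynchronously
    trans.waitUntilFinished(); // block until all steps complete
    // Evaluate the outcome: zero errors means success
    Result result = trans.getResult();
    if (result.getNrErrors() != 0) {
      System.err.println("Transformation finished with errors");
    }
  }
}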
Run Jobs
The org.pentaho.di.sdk.samples.embedding.RunningJobs class is an example of how to run a PDI job from Java code in a stand-alone application. This class sets the parameters and executes the job in etl/parametrized_job.kjb. The job can be run from the .kjb file using runJobFromFileSystem() or from a repository using runJobFromRepository(). Important considerations:
- Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
- Prepare the job: The definition of a PDI job is represented by a JobMeta object. You can load this object from a .kjb file, a PDI repository, or you can generate it dynamically. To query the declared parameters of the job definition use listParameters(). To set the assigned values use setParameterValue().
- Execute the job: An executable Job object is derived from the JobMeta object that is passed in to the constructor. The Job object starts, and then executes in a separate thread. To wait for the job to complete, call waitUntilFinished().
- Evaluate the job outcome: After the Job completes, you can access the result using getResult(). The Result object can be queried for success using getResult(). This method returns true on success and false on failure. To get more information, retrieve the job log lines.
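As a minimal sketch of the file system case, assuming the sample file etl/parametrized_job.kjb and an illustrative parameter named param1:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJobSketch {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();
    // Load the job definition from the file system (no repository, hence null)
    JobMeta jobMeta = new JobMeta("etl/parametrized_job.kjb", null);
    jobMeta.setParameterValue("param1", "some value"); // "param1" is illustrative
    // Derive an executable Job object; the first argument is the repository (none here)
    Job job = new Job(null, jobMeta);
    job.start(); // executes in a separate thread
    job.waitUntilFinished();
    // Result.getResult() returns true on success, false on failure
    Result result = job.getResult();
    System.out.println("Success: " + result.getResult());
  }
}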
Build Transformations Dynamically
The org.pentaho.di.sdk.samples.embedding.GeneratingTransformations class is an example of a dynamic transformation. This class generates a transformation definition and saves it to a .ktr file. Important considerations:
- Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
- Create and configure a transformation definition object: A transformation definition is represented by a TransMeta object. Create this object using the default constructor. The transformation definition includes the name, the declared parameters, and the required database connections.
- Populate the TransMeta object with steps: The data flow of a transformation is defined by steps that are connected by hops.
  - Create the step by instantiating its class directly and configure it using its get and set methods. Transformation steps reside in sub-packages of org.pentaho.di.trans.steps. For example, to use the Get File Names step, create an instance of org.pentaho.di.trans.steps.getfilenames.GetFileNamesMeta and use its get and set methods to configure it.
  - Obtain the step id string. Each PDI step has an id that can be retrieved from the PDI plugin registry. A simple way to retrieve the step id is to call PluginRegistry.getInstance().getPluginId(StepPluginType.class, theStepMetaObject).
  - Create an instance of org.pentaho.di.trans.step.StepMeta, passing the step id string, the name, and the configured step object to the constructor. An instance of StepMeta encapsulates the step properties, as well as controls the placement of the step on the PDI client (Spoon) canvas and its connections to hops. Once the StepMeta object has been created, call setDrawn(true) and setLocation(x,y) to make sure the step appears correctly on the PDI client canvas. Finally, add the step to the transformation by calling addStep() on the transformation definition object.
  - Once steps have been added to the transformation definition, they need to be connected by hops. To create a hop, create an instance of org.pentaho.di.trans.TransHopMeta, passing in the From and To steps as arguments to the constructor. Add the hop to the transformation definition by calling addTransHop().
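The following sketch illustrates these steps for a minimal two-step flow. The choice of a Dummy step (org.pentaho.di.trans.steps.dummytrans.DummyTransMeta) as the hop target, the step names, and the canvas coordinates are all illustrative:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.plugins.PluginRegistry;
import org.pentaho.di.core.plugins.StepPluginType;
import org.pentaho.di.trans.TransHopMeta;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.steps.dummytrans.DummyTransMeta;
import org.pentaho.di.trans.steps.getfilenames.GetFileNamesMeta;

public class GenerateTransformationSketch {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();
    // Create the transformation definition using the default constructor
    TransMeta transMeta = new TransMeta();
    transMeta.setName("generated_transformation");
    PluginRegistry registry = PluginRegistry.getInstance();
    // Configure a Get File Names step and look up its id in the plugin registry
    GetFileNamesMeta filesMeta = new GetFileNamesMeta();
    filesMeta.setDefault();
    String filesId = registry.getPluginId(StepPluginType.class, filesMeta);
    StepMeta filesStep = new StepMeta(filesId, "Get File Names", filesMeta);
    filesStep.setDrawn(true);
    filesStep.setLocation(100, 100);
    transMeta.addStep(filesStep);
    // Add a Dummy step as the hop target
    DummyTransMeta dummyMeta = new DummyTransMeta();
    String dummyId = registry.getPluginId(StepPluginType.class, dummyMeta);
    StepMeta dummyStep = new StepMeta(dummyId, "Dummy", dummyMeta);
    dummyStep.setDrawn(true);
    dummyStep.setLocation(300, 100);
    transMeta.addStep(dummyStep);
    // Connect the two steps with a hop
    transMeta.addTransHop(new TransHopMeta(filesStep, dummyStep));
    // Serialize the definition to a .ktr file
    java.nio.file.Files.write(java.nio.file.Paths.get("generated_transformation.ktr"),
        transMeta.getXML().getBytes("UTF-8"));
  }
}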
After all steps have been added and connected by hops, the transformation definition object can be serialized to a .ktr file by calling getXML(), and the resulting file can be opened in the PDI client for inspection. The sample class org.pentaho.di.sdk.samples.embedding.GeneratingTransformations generates the transformation shown below:
Build Jobs Dynamically
The org.pentaho.di.sdk.samples.embedding.GeneratingJobs class is an example of a dynamic job. This class generates a job definition and saves it to a .kjb file. Important considerations:
- Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
- Create and configure a job definition object: A job definition is represented by a JobMeta object. Create this object using the default constructor. The job definition includes the name, the declared parameters, and the required database connections.
- Populate the JobMeta object with job entries: The control flow of a job is defined by job entries that are connected by hops.
  - Create the job entry by instantiating its class directly and configure it using its get and set methods. Job entries reside in sub-packages of org.pentaho.di.job.entries. For example, to use the File Exists job entry, create an instance of org.pentaho.di.job.entries.fileexists.JobEntryFileExists and use setFilename() to configure it. The Start job entry is implemented by org.pentaho.di.job.entries.special.JobEntrySpecial.
  - Create an instance of org.pentaho.di.job.entry.JobEntryCopy by passing the job entry created in the previous step to the constructor. An instance of JobEntryCopy encapsulates the properties of a job entry, as well as controls the placement of the job entry on the PDI client canvas and its connections to hops. Once created, call setDrawn(true) and setLocation(x,y) to make sure the job entry appears correctly on the PDI client canvas. Finally, add the job entry to the job by calling addJobEntry() on the job definition object. It is possible to place the same job entry in several places on the canvas by creating multiple instances of JobEntryCopy and passing in the same job entry instance.
  - Once job entries have been added to the job definition, they need to be connected by hops. To create a hop, create an instance of org.pentaho.di.job.JobHopMeta, passing in the From and To job entries as arguments to the constructor. Configure the hop as a green or red hop by calling setConditional() and setEvaluation(true/false), or as an unconditional hop by calling setUnconditional(). Add the hop to the job definition by calling addJobHop().
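As a sketch of these steps, the following example builds a job with a Start entry connected to a File Exists entry by an unconditional hop. The entry names, the file path, and the canvas coordinates are illustrative:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.JobHopMeta;
import org.pentaho.di.job.JobMeta;
import org.pentaho.di.job.entries.fileexists.JobEntryFileExists;
import org.pentaho.di.job.entries.special.JobEntrySpecial;
import org.pentaho.di.job.entry.JobEntryCopy;

public class GenerateJobSketch {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();
    // Create the job definition using the default constructor
    JobMeta jobMeta = new JobMeta();
    jobMeta.setName("generated_job");
    // Every job begins with a Start entry, implemented by JobEntrySpecial
    JobEntrySpecial start = new JobEntrySpecial("START", true, false);
    JobEntryCopy startCopy = new JobEntryCopy(start);
    startCopy.setDrawn(true);
    startCopy.setLocation(100, 100);
    jobMeta.addJobEntry(startCopy);
    // Configure a File Exists entry
    JobEntryFileExists fileExists = new JobEntryFileExists("Check file");
    fileExists.setFilename("/tmp/some_file.txt"); // illustrative path
    JobEntryCopy fileExistsCopy = new JobEntryCopy(fileExists);
    fileExistsCopy.setDrawn(true);
    fileExistsCopy.setLocation(300, 100);
    jobMeta.addJobEntry(fileExistsCopy);
    // Connect Start to File Exists with an unconditional hop
    JobHopMeta hop = new JobHopMeta(startCopy, fileExistsCopy);
    hop.setUnconditional();
    jobMeta.addJobHop(hop);
    // Serialize the definition to a .kjb file
    java.nio.file.Files.write(java.nio.file.Paths.get("generated_job.kjb"),
        jobMeta.getXML().getBytes("UTF-8"));
  }
}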
After all job entries have been added and connected by hops, the job definition object can be serialized to a .kjb file by calling getXML(), and the resulting file can be opened in the PDI client for inspection. The sample class org.pentaho.di.sdk.samples.embedding.GeneratingJobs generates the job shown below:
Obtain Logging Information
When you need more information about how transformations and jobs execute, you can view PDI log lines and text.
PDI collects log lines in a central place. The org.pentaho.di.core.logging.KettleLogStore class manages all log lines and provides methods for retrieving the log text for specific entities. To retrieve log text or log lines, supply the log channel id generated by PDI during runtime. You can obtain the log channel id by calling getLogChannelId(), which is part of LoggingObjectInterface. Jobs, transformations, job entries, and transformation steps all implement this interface.
For example, assuming the job variable is an instance of a running or completed job, the following code shows how you retrieve the job's log lines:
LoggingBuffer appender = KettleLogStore.getAppender();
String logText = appender.getBuffer(job.getLogChannelId(), false).toString();
The main methods in the sample classes org.pentaho.di.sdk.samples.embedding.RunningJobs and org.pentaho.di.sdk.samples.embedding.RunningTransformations retrieve log information from the executed job or transformation in this manner.
Expose a Transformation or Job as a Web Service
Running a PDI job or transformation as part of a web service is implemented by writing a servlet that maps incoming request parameters to the parameters of a transformation or job and executes it as part of the request cycle.
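A minimal sketch of this approach is shown below. It assumes the servlet API is on the class path, along with an illustrative transformation file etl/export.ktr that declares a parameter named param1; a production servlet would cache the TransMeta and handle errors more carefully:
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class TransformationServlet extends HttpServlet {
  @Override
  public void init() throws ServletException {
    try {
      KettleEnvironment.init(); // initialize once per application
    } catch (Exception e) {
      throw new ServletException(e);
    }
  }

  @Override
  protected void doGet(HttpServletRequest req, HttpServletResponse resp)
      throws ServletException, IOException {
    try {
      // Map an incoming request parameter to a transformation parameter
      TransMeta transMeta = new TransMeta("etl/export.ktr"); // illustrative path
      transMeta.setParameterValue("param1", req.getParameter("param1"));
      // Execute the transformation as part of the request cycle
      Trans trans = new Trans(transMeta);
      trans.execute(null);
      trans.waitUntilFinished();
      resp.getWriter().println("Errors: " + trans.getResult().getNrErrors());
    } catch (Exception e) {
      throw new ServletException(e);
    }
  }
}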
Instead of writing a servlet, you can use the Carte server or the Pentaho Server directly by building a transformation that writes its output to the HTTP response of the Carte server. This is achieved by using the Pass Output to Servlet feature of the Text output, XML output, JSON output, or scripting steps. For an example, run the sample transformation, /data-integration/samples/transformations/Servlet Data Example.ktr, on Carte.
Use Non-Native Plugins
To use non-native plugins with an embedded Pentaho Server, you must configure the server to find where the plugins reside. How you configure the server depends on whether your plugin is a folder with associated files or a single JAR file.
If your plugins are folders with associated files, register the folders by setting the KETTLE_PLUGIN_BASE_FOLDERS system property just before the call to KettleEnvironment.init(), as shown in the following example for the "plugins" and "plugins2" folders:
System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS", "C:\\pentaho\\data-integration\\plugins,c:\\plugins2"); KettleEnvironment.init();
If your plugin is a single JAR file, annotate the classes for the plugin and include them in the class path, then set the KETTLE_PLUGIN_CLASSES system property to register the fully-qualified class names just before the call to KettleEnvironment.init(), as shown in the following example for a “jsonoutput” plugin:
System.setProperty("KETTLE_PLUGIN_CLASSES","org.pentaho.di.trans.steps.jsonoutput.JsonOutputMeta"); KettleEnvironment.init();
Refer to the Extend Pentaho Data Integration article for more information on creating plugins.
If you have custom job entries or custom transformation steps, you must use one of the two methods above to configure the locations where the embedded server searches for them.