Hitachi Vantara Lumada and Pentaho Documentation

Use Dataflows

You can manipulate your Lumada DataOps Suite data using the Lumada Data Integration solutions: PDI client, Data Transformation Editor (DTE), and dataflows.

  • Use PDI client to extract, transform, and load (ETL) your data using transformation and job files.
  • Use Lumada Data Transformation Editor to make simple modifications to your transformation and job files in a web-based environment.
  • Use dataflows to execute and schedule your transformation and job files in a web-based environment.
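For context, the transformation and job files that dataflows wrap are the same files the PDI client runs locally. Stock PDI ships command-line runners for them, Pan (transformations) and Kitchen (jobs); the sketch below uses those standard tools with illustrative file paths and parameter names, not values from this document:

```shell
# Run a transformation (.ktr) with Pan, passing a named parameter.
# The path and INPUT_DIR parameter are illustrative assumptions.
./pan.sh -file=/path/to/transform.ktr -param:INPUT_DIR=/data/in

# Run a job (.kjb) with Kitchen at basic logging verbosity.
./kitchen.sh -file=/path/to/job.kjb -level=Basic
```

Dataflows expose the same named parameters through web-based prompts instead of command-line flags.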

Dataflows are imported PDI client transformation (KTR) and job (KJB) files wrapped with enhancements so you can perform extract, transform, and load (ETL) operations on your data without specialized ETL knowledge. For example, dataflows in Lumada DataOps Suite use prompts to simplify capturing, cleansing, and storing data. After you complete the prompt entries and selections for the required parameters at runtime, the dataflow is ready to run. Administrators have permissions to customize the displayed prompts and exposed parameters.
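As background, the parameters a dataflow prompts for at runtime correspond to named parameters defined in the underlying KTR or KJB file. A minimal sketch of such a definition, as it typically appears inside the `<info>` section of a .ktr file (the parameter name and values here are illustrative assumptions, not from this document):

```xml
<!-- Illustrative named-parameter definition in a PDI transformation file -->
<parameters>
  <parameter>
    <name>INPUT_DIR</name>
    <default_value>/data/incoming</default_value>
    <description>Directory scanned for source files at runtime</description>
  </parameter>
</parameters>
```

When the file is imported as a dataflow, each exposed parameter like this can surface as a prompt that users fill in before execution.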

Depending on your assigned user role, you can run, monitor, and schedule executions of your dataflows. Administrators have access to additional features for analyzing summary and logging information to fine-tune and enhance dataflow performance.

To access your available dataflows, click Dataflow in the Lumada DataOps Suite menu bar.

View these sections to get started with dataflows:

  • Importing dataflows

    Learn how to import transformation (KTR) and job (KJB) files into Dataflow Studio.

  • Navigating dataflows

    Learn how to work with dataflows available to your user profile in Lumada DataOps Suite.

  • Executing dataflows

    Learn how to execute dataflows, configure parameters, view execution reports, and adjust advanced settings.

  • Editing dataflows

    Learn how to lightly edit dataflows using the web-based Data Transformation Editor.

  • Monitoring dataflows

    Learn how to monitor the status and performance of your executed dataflows to verify completed processes and maximize your resource usage.

  • Scheduling dataflows

    Learn how to run a dataflow at a scheduled time or set it to run on a recurring schedule.

  • Managing dataflow permissions

    Learn how to assign groups to a dataflow in Dataflow Studio.

  • Deleting a dataflow

    Learn how to remove a dataflow from Dataflow Studio.

For more information about transformations and jobs, including details for specific steps and job entries, see Pentaho Data Integration.


Dataflow user documentation is intended for data engineers, data scientists, and business users. Configuration documentation is intended for administrators who know where data is stored, how to connect to it, and details about the computing environment.

Note that this feature documentation is intended for selected Lumada DataOps Suite customers who are interested in exploring and using dataflows. For more information, contact your Hitachi Vantara account representative.