Pentaho EE Marketplace Plugins
For the Pentaho EE 9.5 GA release, two new plugins were released as part of the Pentaho EE Marketplace Plugin release, with new features to improve your data management operations. The plugins are available from the Support Portal home page. Sign in to the portal using the Pentaho support username and password provided in your Pentaho Welcome Packet. The plugins are:
- A hierarchical data plugin that adds five new steps for working with hierarchical data.
- A Kafka streaming plugin that enhances the Kafka consumer and producer steps and adds a Kafka Offset job with the ability to reset the offset.
Pentaho has added a hierarchical data type, along with five new steps for processing structured, complex, and nested data. Steps from previous releases of Pentaho that can handle hierarchical data also support this new data type. The five new steps are:
- This step accepts a JSON or JSONL file from a previous step or a file location and converts it into a hierarchical object.
- This step accepts hierarchical data from a previous step and converts it into a JSON-formatted string.
- This step parses hierarchical data from input steps.
- Modify values from a single row: This step modifies hierarchical data using incoming columns or creates hierarchical data for output to another step.
- Modify values from grouped rows: This step builds complex hierarchical data or groups data based on a field.
The last three steps are used within the transformation flow to interact with the hierarchical data type structure in place, without needing to flatten the data to a row structure.
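As an illustration of what these steps do conceptually, the sketch below uses plain Python with hypothetical JSONL data (it is not the plugin's API): parse JSONL into nested objects, modify a nested value per row, group rows by a field into a larger hierarchical structure, and serialize back to a JSON string.

```python
import json
from collections import defaultdict

# Hypothetical JSONL input: each line is one row of nested (hierarchical) data.
jsonl_text = """\
{"id": 1, "customer": {"name": "Ada", "region": "EU"}, "total": 10.0}
{"id": 2, "customer": {"name": "Bob", "region": "US"}, "total": 7.5}
{"id": 3, "customer": {"name": "Cy", "region": "EU"}, "total": 4.0}
"""

# "Convert JSON/JSONL into a hierarchical object": parse each line into a nested dict.
rows = [json.loads(line) for line in jsonl_text.splitlines()]

# "Modify values from a single row": change a nested value in place, row by row.
for row in rows:
    row["customer"]["region"] = row["customer"]["region"].lower()

# "Modify values from grouped rows": group rows by a field into one nested structure.
grouped = defaultdict(list)
for row in rows:
    grouped[row["customer"]["region"]].append({"id": row["id"], "total": row["total"]})

# "Convert hierarchical data into a JSON-formatted string".
output = json.dumps(dict(grouped), sort_keys=True)
print(output)
```

The point of the dedicated steps is that this kind of manipulation happens on the hierarchical object itself, rather than by flattening everything to rows first.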
A new job entry called Kafka Offset has been added so that you can change the offset of a topic partition. This job entry has fields for connecting to a Kafka broker or cluster in its Setup and Options tabs.
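To make the effect of an offset reset concrete, here is a minimal in-memory model of a topic partition and a consumer group's committed offset. This is a conceptual sketch only, not the Kafka client API or the plugin's implementation; the class and method names are invented for illustration.

```python
class PartitionModel:
    """Toy model of one topic partition plus a consumer group's committed offset."""

    def __init__(self, messages):
        self.messages = list(messages)   # the log: list index == message offset
        self.committed = 0               # the consumer group's committed offset

    def poll(self, max_records=10):
        """Return the next batch starting at the committed offset, then advance it."""
        batch = self.messages[self.committed:self.committed + max_records]
        self.committed += len(batch)
        return batch

    def reset_offset(self, offset):
        """What an offset reset does: move the committed offset, clamped to the log."""
        self.committed = max(0, min(offset, len(self.messages)))


part = PartitionModel(["m0", "m1", "m2", "m3", "m4"])
first = part.poll()          # consumes all five messages
part.reset_offset(2)         # reset the offset, e.g. to replay from offset 2
replayed = part.poll()       # the consumer now re-reads m2 onward
print(first, replayed)
```

Resetting the committed offset is what lets a consumer group replay (or skip) messages without changing the log itself.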
The following improvements have been made to the Kafka Producer and Kafka Consumer steps:
- Encryption is supported for connection parameters.
- SSL and Kerberos (SASL) connectivity have been certified.
- You can now use variables from Kettle properties, PDI environment variables, and parameter variables in the Kafka properties settings on the Options tab.
- The Kafka client library has been upgraded to 3.4.0.
- Logging has been improved to make debugging easier.
- The Kafka Consumer step has been improved to consume messages until the time stamp set on the Offset Settings tab of the Kafka Offset job.
- An offset rebalancer has been added to correctly commit offsets if a rebalance occurs when a new consumer is added to, or an existing consumer is removed from, the consumer group.
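The timestamp-bounded consumption described above can be sketched in a few lines: read messages in offset order and stop at the first message whose timestamp is later than the configured cutoff. This is a conceptual illustration in plain Python, not the plugin's code; `consume_until` and the message tuples are invented for the example.

```python
def consume_until(messages, cutoff_ts):
    """Consume (timestamp_ms, payload) tuples in offset order until the cutoff.

    Stops at the first message whose timestamp exceeds cutoff_ts, mirroring
    "consume messages until the time stamp" behavior.
    """
    consumed = []
    for ts, payload in messages:
        if ts > cutoff_ts:
            break                    # past the configured time stamp: stop
        consumed.append(payload)
    return consumed


log = [(1000, "a"), (2000, "b"), (3000, "c"), (4000, "d")]
print(consume_until(log, 2500))      # stops before the 3000 ms message
```

Because Kafka message timestamps are non-decreasing per partition when using log-append time, stopping at the first too-late message is sufficient; with producer-supplied timestamps the real implementation has to be more careful.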