Job Entry Reference
Job entries extend the functionality of PDI jobs. This page lists the supported job entries.
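For orientation, a job assembled from these entries is typically run from Spoon (the PDI designer) or from the command line with the Kitchen job runner. Below is a minimal sketch of launching such a job programmatically; the install path, job file, and the INPUT_DIR parameter are hypothetical placeholders, not part of this reference.

```python
import subprocess

# Minimal sketch: launch a PDI job by shelling out to Kitchen, the PDI
# command-line job runner. The install location, job file path, and the
# INPUT_DIR parameter below are hypothetical placeholders.
KITCHEN = "/opt/pentaho/data-integration/kitchen.sh"  # assumed install path

result = subprocess.run(
    [
        KITCHEN,
        "-file=/jobs/load_warehouse.kjb",   # hypothetical job file
        "-level=Basic",                     # standard Kitchen logging level
        "-param:INPUT_DIR=/data/incoming",  # hypothetical named parameter
    ],
    capture_output=True,
    text=True,
)

# Kitchen reports success or failure through its exit code.
print(result.returncode)
print(result.stdout[-500:])
```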
A-F
Name | Category | Description |
---|---|---|
Abort job | Utility | Abort the job |
Add filenames to result | File management | Add filenames to result |
Amazon EMR Job Executor | Big Data | Execute MapReduce jobs in Amazon EMR |
Amazon Hive Job Executor | Big Data | Execute Hive jobs in Amazon EMR |
BulkLoad from Mysql into file | Bulk loading | Load from a MySQL table into a file |
BulkLoad into MSSQL | Bulk loading | Load data from a file into an MSSQL table |
BulkLoad into Mysql | Bulk loading | Load data from a file into a MySQL table |
Check Db connections | Conditions | Check if we can connect to one or several databases. |
Check files locked | Conditions | Check if one or several files are locked by another process |
Check if a folder is empty | Conditions | Check if a folder is empty |
Check if connected to repository | Repository | Return true if we are connected to a repository |
Check if XML file is well formed | XML | Check whether one or several XML files are well formed |
Check webservice availability | Conditions | Check if a webservice is available |
Checks if files exist | Conditions | Checks if files exist |
Columns exist in a table | Conditions | Check if one or several columns exist in a table on a specified connection |
Compare folders | File management | Compare two folders (or two files) |
Convert file between Windows and Unix | File management | Convert file content between Windows and Unix line endings. Converting to Unix replaces CRLF (Carriage Return and Line Feed) with LF (Line Feed) |
Copy Files | File management | Copy Files |
Copy or Move result filenames | File management | Copy or Move result filenames (since version 5.0, this job entry has been renamed to Process result filenames and handles Delete as well) |
Create a folder | File management | Create a folder |
Create file | File management | Create (an empty) file |
Decrypt files with PGP | File encryption | Decrypt files encrypted with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
Delete file | File management | Delete a file |
Delete filenames from result | File management | Delete filenames from result |
Delete files | File management | Delete files |
Delete folders | File management | Delete specified folders. Attention: if the folder contains files, PDI will delete them all! |
Display Msgbox Info | Utility | Display a simple informational message box |
DTD Validator | XML | DTD Validator |
Dummy | General | Use the Dummy job entry to do nothing in a job. |
Encrypt files with PGP | File encryption | Encrypt files with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
Evaluate files metrics | Conditions | Evaluate file size or file count |
Evaluate rows number in a table | Conditions | Evaluate the content of a table. You can also specify a SQL query. |
Example plugin | General | This is an example test job entry for a plugin |
Export repository to XML file | Repository | Export repository to XML file |
File Compare | File management | Compare 2 files |
File Exists | Conditions | Checks if a file exists |
FTP Delete | File transfer | Delete files on a remote host |
G-L
Name | Category | Description |
---|---|---|
Get a file with FTP | File transfer | Get files using FTP (File Transfer Protocol) |
Get a file with FTPS | File transfer | Get files using FTPS (FTP secure) |
Get a file with SFTP | File transfer | Get files using Secure FTP (Secure File Transfer Protocol) |
Get mails (POP3/IMAP) | Mail | Retrieve mails from a POP3/IMAP server and save them into a local folder |
Hadoop Copy Files | Big Data | Copies files in a Hadoop cluster from one location to another |
Hadoop job executor | Big Data | Execute a map/reduce job contained in a jar file |
HL7 MLLP Acknowledge | Utility | Acknowledges HL7 messages |
HL7 MLLP Input | Utility | Reads data from HL7 data streams within a transformation |
HTTP | File management | Gets or uploads a file using HTTP (Hypertext Transfer Protocol) |
JavaScript | Scripting | Evaluates the result of the execution of a previous job entry |
Job | General | Executes a job |
M-R
Name | Category | Description |
---|---|---|
Mail | Mail | Sends an e-Mail |
Mail validator | Mail | Check the validity of an email address |
Move Files | File management | Move Files |
MS Access Bulk Load | Deprecated | Load data into a Microsoft Access table from a CSV file. Attention: at the moment only insertion is available! If the target table exists, a new one will be created and the data inserted. |
Oozie Job Executor | Big Data | Executes Oozie workflows |
Palo Cube Create | Palo | Creates a cube on a Palo server |
Palo Cube Delete | Palo | Deletes a cube on a Palo server |
Pentaho MapReduce | Big Data | Execute transformation-based MapReduce jobs in Hadoop |
Pig Script Executor | Big Data | Execute a Pig script on a Hadoop cluster |
Ping a host | Utility | Ping a host |
Process result filenames | File management | Copy, Move or Delete result filenames |
Put a file with FTP | File transfer | Put a file with FTP |
Put a file with SFTP | File transfer | Put files using SFTP (Secure File Transfer Protocol) |
S-Z
Name | Category | Description |
---|---|---|
Send information using Syslog | Utility | Sends information to another server using the Syslog protocol |
Send Nagios passive check | Utility | Send Nagios passive checks |
Send SNMP trap | Utility | Send SNMP trap to a target host |
Set variables | General | Set one or several variables |
Shell | Scripting | Executes a shell script |
Simple evaluation | Conditions | Evaluate one field or variable |
Spark Submit | Big Data | Submit Spark jobs to Hadoop clusters |
SQL | Scripting | Executes SQL on a certain database connection |
Sqoop Export | Big Data | Export data from the Hadoop Distributed File System (HDFS) into a relational database (RDBMS) using Apache Sqoop |
Sqoop Import | Big Data | Import data from a relational database (RDBMS) into the Hadoop Distributed File System (HDFS) using Apache Sqoop |
SSH2 Get | Deprecated | Get files using SSH2 (Deprecated in 5.0 in favor of the SFTP job entry) |
SSH2 Put | Deprecated | Put files in a remote host using SSH2 (Deprecated in 5.0 in favor of the SFTP job entry) |
Start | General | Start marks where job execution begins; every job requires a Start entry before it can be executed. |
Start a PDI Cluster on YARN | Big Data | Starts a PDI Cluster on YARN |
Stop a PDI Cluster on YARN | Big Data | Stops a PDI Cluster on YARN |
Success | General | Success |
Table exists | Conditions | Checks if a table exists on a database connection |
Talend Job Execution | Utility | This job entry executes an exported Talend Job |
Transformation | General | Executes a transformation |
Truncate tables | Utility | Truncate one or several tables. |
Unzip file | File management | Unzip file in a target folder |
Upload files to FTPS | File transfer | Upload files to an FTPS (FTP secure) server |
Verify file signature with PGP | File encryption | Verify file signature with PGP (Pretty Good Privacy). This job entry needs GnuPG to work properly. |
Wait for | Conditions | Wait for a delay |
Wait for file | File management | Wait for a file |
Wait for SQL | Utility | Scan a database and succeed when a specified condition on the returned rows is true. |
Write to file | File management | Write text content to a file. |
Write To Log | Utility | Write message to log |
XSD Validator | XML | XSD Validator |
XSL Transformation | XML | Perform an XSL transformation |
Zip file | File management | Zip files from a directory and optionally process the source files afterwards |
Replacements for Deprecated or Removed Job Entries
The following table lists deprecated or removed entries and their suggested replacements.
Removed or Deprecated Entry | Replacement Entry |
---|---|
MS Access Bulk Load Entry | Microsoft Access Output Step |
SSH2 Get Entry | Get a file with FTPS Entry |
SSH2 Put Entry | Upload files to FTPS Entry |