Hitachi Vantara Lumada and Pentaho Documentation

General issues


Follow the suggestions in the topics below to help resolve common software issues associated with the Pentaho suite.

See Pentaho Troubleshooting articles for additional topics.

JDBC driver issues

Before you begin

Before you begin troubleshooting suspected JDBC driver issues, make sure that the correct JDBC driver JARs are installed in the correct locations, and that no conflicting driver versions are installed. You can use the JDBC Distribution Tool to install your drivers and ensure they are placed correctly. If you suspect a JDBC driver compatibility issue, confirm it with your database or driver vendor.
The Pentaho Server needs the appropriate driver to connect to the database that stores your data. You can download drivers from your database vendor's website. Check the JDBC drivers reference for a list of supported drivers and links to vendor websites.

Perform the following steps to install the appropriate driver for your Pentaho Server:


  1. Stop the Pentaho Server.

  2. Copy your driver into this location: <pentaho-install-directory>/server/pentaho-server/tomcat/lib.

  3. Start the Pentaho Server.
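Because conflicting driver versions are a common cause of subtle failures, it can help to check the Tomcat lib folder for JARs that are present in more than one version. The script below is an illustrative sketch, not a Pentaho tool; the version-suffix pattern it strips is an assumption about typical driver file names:

```python
import re
from collections import defaultdict
from pathlib import Path

def find_duplicate_drivers(lib_dir):
    """Group JAR files by base name (trailing version stripped) and
    return only the groups that contain more than one version."""
    groups = defaultdict(list)
    for jar in Path(lib_dir).glob("*.jar"):
        # Strip a trailing version such as -5.0.8 from the file name.
        base = re.sub(r"-\d[\d.]*$", "", jar.stem)
        groups[base].append(jar.name)
    return {base: sorted(names) for base, names in groups.items()
            if len(names) > 1}
```

Pointing this at <pentaho-install-directory>/server/pentaho-server/tomcat/lib would list, for example, both mysql-connector-java-5.0.7.jar and mysql-connector-java-5.0.8.jar under one entry if both were present.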

Data conversion issues with MySQL driver 5.0.8

The MySQL JDBC driver version 5.0.8 may cause data conversion errors in the Pentaho Server. For example, SQL statements that convert a numeric field to a string are returned as a string in version 5.0.7, but return as a byte array in version 5.0.8.

To solve this problem, replace mysql-connector-java-5.0.8.jar with mysql-connector-java-5.0.7.jar in the lib folder of your client tool or application server.
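To illustrate the symptom, the helper below (a hypothetical sketch, not part of any Pentaho API) shows a defensive conversion in application code that yields the same string whether the driver returns a string, as in 5.0.7, or a byte array, as in 5.0.8:

```python
def as_string(value, encoding="utf-8"):
    """Normalize a column value that a JDBC-style driver may return
    either as a string (5.0.7 behavior) or as a byte array (5.0.8)."""
    if isinstance(value, (bytes, bytearray)):
        return value.decode(encoding)
    return value
```

Replacing the driver JAR remains the supported fix; this only shows why the two driver versions produce different results for the same query.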

Fixing JTDS varchar(MAX) limitations in MSSQL 2005

Creating a report that uses a column of type varchar(MAX) may result in an error such as net.sourceforge.jtds.jdbc.ClobImpl@83cf00 when using the JTDS SQL Server driver. This is caused by inherent limitations in older versions of the JTDS driver. The SQL Server JDBC driver version 1.2.2828 also has issues accessing a varchar(MAX) column.

The solution is to upgrade the MSSQL 2005 JDBC driver to version 1.0.809.102 or later. Download and install the updated driver, then restart your MSSQL server.

Snowflake timeout errors

When you pool Pentaho database connections to Snowflake, you may see connection errors. Snowflake JDBC connections use a default timeout of four hours, which contributes to these errors because the same pooled connection can be reused after more than four hours. See the Snowflake documentation for further details.

To resolve this issue, do one of the following actions:

Using a validation query

In Pentaho, you can execute a validation query every time a connection is borrowed from the pool. If the query fails, such as when the connection has timed out, the connection is evicted from the pool and a new connection is created.

Perform the following steps to use a validation query:


  1. Open the Database Connection dialog box, then access the Pooling tab.

  2. Scroll down in the Parameters table and select the check box to the left of the validationQuery parameter.

  3. Set the parameter value to select 1.
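The borrow-and-validate cycle described above can be sketched in miniature. This is not Pentaho's pooling implementation; it is a minimal illustration, using SQLite in place of Snowflake, of how a failed select 1 evicts a stale connection and creates a fresh one:

```python
import sqlite3

class ValidatingPool:
    """Minimal connection pool: each borrow runs a validation query;
    a connection that fails validation is evicted and replaced."""

    def __init__(self, factory, validation_query="select 1"):
        self.factory = factory
        self.validation_query = validation_query
        self.idle = []

    def borrow(self):
        while self.idle:
            conn = self.idle.pop()
            try:
                conn.execute(self.validation_query)
                return conn          # still healthy: reuse it
            except Exception:
                conn.close()         # timed out or broken: evict it
        return self.factory()        # pool empty: create a new connection

    def give_back(self, conn):
        self.idle.append(conn)
```

The trade-off is one extra round trip per borrow, in exchange for never handing a timed-out connection to a job.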

Next steps

See Define connection pooling for more information on setting database connection pooling parameters.

Disabling the Snowflake timeout

This method avoids running the validation queries, but leaves the connection open indefinitely.

Perform the following steps to disable the timeout:


  1. Open the Database Connection dialog box, then access the Options tab.

  2. Enter the CLIENT_SESSION_KEEP_ALIVE parameter and set its value to true.
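The same parameter can also be supplied as a connection property in the JDBC connection string; the account identifier below is a placeholder:

```text
jdbc:snowflake://<account>.snowflakecomputing.com/?CLIENT_SESSION_KEEP_ALIVE=true
```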

Next steps

See the Snowflake documentation for more details on the CLIENT_SESSION_KEEP_ALIVE parameter.

Log table data is deleted

When you run a job or transformation, data in the log table is unexpectedly deleted.

When a job or transformation runs, the value in the Log line timeout (days) field determines when to delete the data in the log table, which is selected in the Log table field. In PDI, both fields are located in the job or transformation Properties page.

When more than one job or transformation is run using the same table, the data in the log table is deleted according to the shorter timeout value in either of the settings. For example, if you run two jobs with a timeout setting of 20 days in one job and 10 days in the other job, the log table data is deleted in 10 days.

Note: This setting deletes all the data at the table level, not at the log row level.

There are two ways to solve this problem:

  1. As a best practice, use a different log table for each job or transformation.
  2. Alternatively, set the Log line timeout (days) field to the same value in your jobs and transformations.
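The retention behavior described above reduces to a one-line rule: when several jobs or transformations share a log table, the effective retention is the minimum of their timeout settings. A small sketch (hypothetical, not PDI code):

```python
def effective_retention_days(timeouts_by_job):
    """When multiple jobs or transformations log to the same table,
    the shortest 'Log line timeout (days)' setting wins, because the
    delete runs at the table level, not per log row."""
    return min(timeouts_by_job.values())

# The example from the text: 20-day and 10-day jobs sharing one table.
retention = effective_retention_days({"job_a": 20, "job_b": 10})
```

This is why one log table per job or transformation is the safer default: each table then keeps exactly the retention its owner configured.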