Follow the suggestions in these topics to help resolve common software issues associated with the Pentaho suite:
- JDBC driver issues
- Data conversion issues with MySQL driver 5.0.8
- Fixing JTDS varchar(Max) limitations in MSSQL 2005
- Snowflake timeout errors
- Log table data is not deleted
See Pentaho Troubleshooting articles for additional topics.
JDBC driver issues
Before you begin
Perform the following steps to install the appropriate driver for your Pentaho Server:
1. Stop the Pentaho Server.
2. Copy your driver into this location: <pentaho-install-directory>/server/pentaho-server/tomcat/lib.
3. Start the Pentaho Server.
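After restarting the server, a quick way to confirm that a driver jar is visible to the JVM is to resolve its class by name. The sketch below is a generic check, not part of Pentaho itself; the driver class name you pass in depends on your JDBC vendor (the names used here are only examples):

```java
// Generic classpath check: returns true if the named driver class can be
// resolved by the current JVM. Use the driver class documented by your
// JDBC vendor (e.g., the MySQL or Snowflake driver class).
public class DriverCheck {
    static boolean driverOnClasspath(String driverClass) {
        try {
            Class.forName(driverClass);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.sql.DriverManager always resolves; a missing jar will not.
        System.out.println(driverOnClasspath("java.sql.DriverManager"));
        System.out.println(driverOnClasspath("com.example.MissingDriver"));
    }
}
```

If the check returns false inside the server's JVM, the jar is not in Tomcat's lib directory or the server was not restarted after the copy.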
Data conversion issues with MySQL driver 5.0.8
To solve this problem, replace mysql-connector-java-5.0.8.jar with mysql-connector-java-5.0.7.jar in your client tool or application server's lib folder.
Fixing JTDS varchar(Max) limitations in MSSQL 2005
The solution is to upgrade the MSSQL 2005 JDBC driver to version 1.0.809.102 or later. Download the driver from http://msdn.microsoft.com/en-us/sqlserver/aa937724, install it, and then restart your MSSQL server.
Snowflake timeout errors
When you pool Pentaho database connections to Snowflake, you may see timeout errors. Snowflake JDBC connections time out after four hours by default, and a pooled connection can be reused beyond that limit, which causes these errors. See the Snowflake documentation for further details.
To resolve this issue, do one of the following actions:
Using a validation query
Perform the following steps to use a validation query:
1. Open the Database Connection dialog box, then access the Pooling tab.
2. Scroll down in the Parameters table and select the check box to the left of the validationQuery parameter.
3. Set the parameter value to select 1.
Disabling the Snowflake timeout
Perform the following steps to disable the timeout:
1. Open the Database Connection dialog box, then access the Options tab.
2. Enter the CLIENT_SESSION_KEEP_ALIVE parameter and set its value to true.
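Outside the Pentaho dialog, CLIENT_SESSION_KEEP_ALIVE can also be passed as a Snowflake JDBC connection property. The sketch below only builds the property set; the credentials are placeholders, and the actual getConnection call is shown as a comment because it requires a live Snowflake account:

```java
import java.util.Properties;

public class SnowflakeProps {
    // Builds JDBC connection properties for Snowflake. Setting
    // CLIENT_SESSION_KEEP_ALIVE to true keeps the session alive instead of
    // letting it expire after the default four hours.
    static Properties keepAliveProps(String user, String password) {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        props.setProperty("CLIENT_SESSION_KEEP_ALIVE", "true");
        return props;
    }

    public static void main(String[] args) {
        Properties p = keepAliveProps("PENTAHO_USER", "secret"); // placeholder credentials
        // With the Snowflake driver on the classpath, these properties would be
        // used like this (account URL is a placeholder):
        // DriverManager.getConnection("jdbc:snowflake://<account>.snowflakecomputing.com/", p);
        System.out.println(p.getProperty("CLIENT_SESSION_KEEP_ALIVE"));
    }
}
```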
Log table data is not deleted
When you run a job or transformation, data in the log table may not be deleted as expected. Deletion is governed by the following settings:
- When a job runs, the value in the Log line timeout (days) field, together with the LOGDATE and JOBNAME or TRANSNAME fields selected in the Log table field pane, determines when data in the log table is deleted. These fields are in the Job properties window in PDI. See Set up job logging for details.
- When a transformation runs, the value in the Log record timeout (in days) field, together with the LOGDATE and TRANSNAME fields selected in the Fields to log pane, determines when data in the log table is deleted. These fields are in the Transformation properties window in PDI. See Set up transformation logging for details.
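The retention rule above reduces to a date cutoff: a row is eligible for deletion when its name field matches the current job or transformation and its LOGDATE is older than today minus the timeout. A minimal sketch of that cutoff arithmetic, for illustration only (the column names follow the PDI logging fields described above; this is not PDI's actual implementation):

```java
import java.time.LocalDate;

public class LogRetention {
    // Builds the filter a timeout-based cleanup implies: match the current
    // job/transformation name and select rows older than the cutoff date.
    static String cutoffClause(String nameColumn, String name,
                               LocalDate today, int timeoutDays) {
        LocalDate cutoff = today.minusDays(timeoutDays);
        return nameColumn + " = '" + name + "' AND LOGDATE < '" + cutoff + "'";
    }

    public static void main(String[] args) {
        // With a 30-day timeout, rows logged before 2024-01-01 are deleted.
        System.out.println(cutoffClause("TRANSNAME", "load_sales",
                LocalDate.of(2024, 1, 31), 30));
    }
}
```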
If you rename an existing job or transformation, log table entries recorded under the previous name are no longer recognized, so they are not deleted even when a Log line timeout (days) or Log record timeout (in days) value is set.
If your log table grows too large, manually purge these unrecognized rows using your SQL editor.
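A manual purge typically deletes rows whose name field no longer matches any current job or transformation. The sketch below only assembles such a statement as a string so its shape is explicit; the table and column names are assumptions for a typical transformation log (substitute your own log table, and JOBNAME for job logs), and in practice you would run the equivalent statement directly in your SQL editor:

```java
public class LogPurge {
    // Builds a DELETE for rows recorded under names other than the current
    // one. Illustrative only; table/column names are assumptions. If you run
    // this through JDBC rather than an SQL editor, prefer a PreparedStatement.
    static String purgeStatement(String logTable, String nameColumn, String currentName) {
        return "DELETE FROM " + logTable
             + " WHERE " + nameColumn + " <> '" + currentName + "'";
    }

    public static void main(String[] args) {
        System.out.println(purgeStatement("TRANS_LOG", "TRANSNAME", "load_sales"));
    }
}
```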