
Compatibility issues running Pentaho on Java 11 with your Hadoop cluster

Pentaho running on Java 11 is compatible with Hadoop clusters running Java 11. If your Hadoop clusters have only Java 8, some components may not be compatible with Pentaho running on Java 11. The following sections indicate which components are compatible when Pentaho runs on Java 11 and your Hadoop clusters use Java 8.
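
Before comparing components, it can help to confirm which Java runtime Pentaho is actually started with. The snippet below is a minimal, hypothetical sketch (it is not a Pentaho utility) that prints the version of the JVM it runs on using standard system properties.

    // Minimal sketch: report which JVM version the current process is running on.
    // Illustrative only; this is not part of the Pentaho product.
    public class JvmVersionCheck {
        public static void main(String[] args) {
            // "java.specification.version" is "1.8" on Java 8 and "11" on Java 11.
            String spec = System.getProperty("java.specification.version");
            String full = System.getProperty("java.version");
            System.out.println("JVM specification version: " + spec);
            System.out.println("JVM full version: " + full);
            if (!"1.8".equals(spec) && !"11".equals(spec)) {
                System.out.println("Warning: this JVM is neither Java 8 nor Java 11.");
            }
        }
    }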

Cloudera 7.1 secured cluster

When connecting to a Cloudera 7.1 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 7.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, Hive, HadoopExecutor*, Oozie, Pig, Pentaho MapReduce

  • Non-compatible components

    Sqoop

* Compatible as long as the source JAR is compiled with Java 8.
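
One way to check this requirement is to inspect the class-file major version of the classes inside the JAR: major version 52 corresponds to Java 8, and 55 corresponds to Java 11. The following is a minimal, hypothetical sketch (not a Pentaho tool) that prints the major version of each .class entry in a JAR.

    // Minimal sketch: print the class-file major version of every .class entry in a JAR.
    // Major version 52 = compiled for Java 8; 55 = compiled for Java 11.
    // Illustrative only; this is not part of the Pentaho product.
    import java.io.DataInputStream;
    import java.io.InputStream;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    public class JarClassVersion {
        public static void main(String[] args) throws Exception {
            try (JarFile jar = new JarFile(args[0])) {
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    JarEntry entry = entries.nextElement();
                    if (!entry.getName().endsWith(".class")) {
                        continue;
                    }
                    try (InputStream in = jar.getInputStream(entry);
                         DataInputStream data = new DataInputStream(in)) {
                        int magic = data.readInt();           // class files start with 0xCAFEBABE
                        data.readUnsignedShort();             // minor version (not needed here)
                        int major = data.readUnsignedShort(); // major version
                        if (magic == 0xCAFEBABE) {
                            System.out.println(entry.getName() + " -> major version " + major);
                        }
                    }
                }
            }
        }
    }

If any entry reports a major version above 52, recompile the source with a Java 8 target (for example, javac --release 8) before using it with the HadoopExecutor component.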

Hortonworks 3.1 secured cluster

When connecting to a Hortonworks 3.1 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Hortonworks 3.0 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, Hive, HadoopExecutor*

  • Non-compatible components

    Sqoop, Oozie, Pig

* Compatible as long as the source JAR is compiled with Java 8.

Note: The Pentaho MapReduce component was not tested.

Cloudera 6.3 secured cluster

When connecting to a Cloudera 6.3 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 7.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, Oozie, Pig, Pentaho MapReduce

  • Non-compatible components

    HBase, Hive, HadoopExecutor, Sqoop

When connecting to a Cloudera 6.3 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 6.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, Hive, HadoopExecutor, Oozie, Pentaho MapReduce

  • Non-compatible components

    Sqoop, Pig

Cloudera 6.2 secured cluster

When connecting to a Cloudera 6.2 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 7.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, HadoopExecutor, Oozie, Pig, Pentaho MapReduce

  • Non-compatible components

    Hive, Sqoop

When connecting to a Cloudera 6.2 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 6.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, Hive, HadoopExecutor, Oozie, Pentaho MapReduce

  • Non-compatible components

    Sqoop, Pig

Cloudera 6.1 secured cluster

When connecting to a Cloudera 6.1 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 7.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, HadoopExecutor*, Oozie, Pig, Pentaho MapReduce

  • Non-compatible components

    Hive, Sqoop

* Compatible as long as the source JAR is compiled with Java 8.

When connecting to a Cloudera 6.1 secured cluster running on Java 8 from Pentaho running on Java 11 with the Pentaho Cloudera 6.1 driver, the following component compatibilities apply:

  • Compatible components

    HDFS, Avro, Parquet, ORC, HBase, Hive, HadoopExecutor*, Oozie

  • Non-compatible components

    Sqoop, Pig**, Pentaho MapReduce**

* Compatible as long as the source JAR is compiled with Java 8.

** Pig and Pentaho MapReduce fail with missing-method errors.