This guide helps you address performance issues with Karaf, providing background on known Karaf issues as well as step-by-step instructions for resolving them in Pentaho.
Set up write permissions for directories containing Karaf
Unless you set some Karaf-specific variables, your users may experience performance issues when launching Kitchen or Pan. You need to set these variables on the individual workstations of Kitchen and Pan users; they ensure that Kitchen and Pan launch as quickly as they did in 5.x. You can also use these variables for Pentaho Report Designer, which uses Karaf as well.
Issue description
Previously, non-admin users needed write permissions for the directories where they installed Pentaho client tools. If they had read and execute permissions only, their client tools failed for two reasons:
- A data/cache folder could not be created
- Contents in the karaf/etc folder could not be modified
Note: The same issue occurred when trying to deploy a Pentaho Server within a YARN cluster.
Solution: Define -D variables for Karaf
As of 6.1.0.1, when a non-admin user deploys a client tool with Karaf, or executes a MapReduce job in Hadoop, and the user's file-system account has only read and execute permissions, the Karaf deployable content is written to a temporary directory where the user has write access: either the user's home directory or a global tmp directory, depending on the file system. This deployment is used for that execution only and is deleted when the application exits.
Two variables are used together specifically for Karaf deployments: one defines the location, and the other controls directory cleanup.
Below is an example of how to configure these variables to leverage Karaf. You need to define your system variable and then add -D parameters to Spoon and/or PRD.
Define system variable
An administrator must define the PENTAHO_KARAF_ROOT system variable on the file system. The directory it points to should not already exist; the executed application creates it on initial execution. For example:
export PENTAHO_KARAF_ROOT=/my/karaf/dir
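As a sketch, an administrator might set the variable and confirm that the target directory does not yet exist before the first launch. The path /opt/pentaho/karaf-root below is a hypothetical example, not a required location:

```shell
# Hypothetical Karaf root; the directory must NOT exist yet --
# the client tool creates it on its first execution.
export PENTAHO_KARAF_ROOT=/opt/pentaho/karaf-root

# Sanity check: warn if the directory already exists.
if [ -d "$PENTAHO_KARAF_ROOT" ]; then
  echo "Warning: $PENTAHO_KARAF_ROOT already exists; choose a fresh directory."
else
  echo "PENTAHO_KARAF_ROOT is set to $PENTAHO_KARAF_ROOT"
fi
```

Note that `export` only affects the current shell session; to make the variable permanent, add it to a profile script (or the system environment on Windows).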
Add -D parameters to PDI client and/or PRD
For the PENTAHO_KARAF_ROOT variable to be used by the application, the following -D parameters need to be added to the PDI client and/or PRD. Tools such as Kitchen and Pan reference these parameters from Spoon.
These -D parameters, with their values, are needed:
-Dpentaho.karaf.root.copy.dest.folder=$PENTAHO_KARAF_ROOT
-Dpentaho.karaf.root.transient=false
In Spoon and PRD, add these parameters to the java OPT definition in the spoon.sh or spoon.bat file. Typically, these changes should only be applied to client tools.
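As a rough sketch of the spoon.sh edit, the two Karaf parameters are appended to the existing OPT definition. The `-Xmx2048m` placeholder below stands in for the release-specific options that normally populate OPT; only the two -D parameters come from this guide:

```shell
# Hypothetical excerpt from spoon.sh. Release-specific options that
# normally populate OPT are abbreviated to a single placeholder here.
OPT="$OPT -Xmx2048m"   # existing options (example placeholder)

# Append the Karaf deployment parameters. PENTAHO_KARAF_ROOT must
# already be defined as a system variable.
OPT="$OPT -Dpentaho.karaf.root.copy.dest.folder=$PENTAHO_KARAF_ROOT"
OPT="$OPT -Dpentaho.karaf.root.transient=false"

echo "$OPT"
```

Keeping each `-D` parameter on its own appended line makes the edit easy to spot when a later upgrade replaces spoon.sh and the change must be reapplied.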