As the LDOS administrator, you can configure solutions for your organization. Current configuration information is stored in the values.yaml file in the Configuration tab of each solution.
Changes to the values.yaml file are saved as a new configuration in the Configuration tab. New run resources are created and previously used resources are destroyed. The History tab tracks the configuration changes, including information about the revision, status, and description of the action taken.
To edit the configuration of a solution, click inside the editor and make your changes to the currently running configuration parameters in the values.yaml file. When finished, click Save to save the file and restart the service. Clicking Reset discards the changes you are making in the configuration file.
If you have already saved your changes but want to revert to the previous values, use the Rollback action on the History tab. For more information, see Rollback a revision.
The Messaging Service solution provides the communications infrastructure for the Dataflow Engine and Dataflow Studio solutions in Lumada DataOps Suite. You can scale the messaging queues by configuring the number of replicas (message brokers) and the size of the persistent message stores.
This solution is shared by other solutions. As a best practice, install only one version and do not append a timestamp to the solution name; this makes the solution easier for other solutions to discover.
The user-configurable values for the Messaging Service are defined in the following table:
|Description|Default value|
|---|---|
|The number of replicas used for the Messaging Service solution.|3|
|The size of the persistent volumes. This value can only be changed during installation.| |
Caution: If the installation is removed, the persistent volumes and all messages they contain are lost.
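For illustration, a values.yaml fragment for the Messaging Service might look like the following. The key names `replicas` and `persistentVolumeSize` are assumptions for this sketch; confirm the actual keys in the Configuration tab of your installation.

```yaml
# Hypothetical values.yaml fragment for the Messaging Service.
# Key names are illustrative; check the Configuration tab for the
# actual schema in your installation.
replicas: 3                 # number of message brokers (default: 3)
persistentVolumeSize: 8Gi   # set at installation only; cannot be changed later
```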
The Dataflow Importer solution is the background service that automatically imports your staged dataflow files into Dataflow Studio. You must ensure that the files are tagged correctly and placed in the applicable network file system (NFS) folder for successful importing. See Importing dataflows for details.
You can configure the scanning interval of the Dataflow Importer for ingestion of new, revised, and deleted files. As a best practice, begin with an interval of 6 minutes (360000 milliseconds) and then add minutes as your expected file count increases. Use an interval proportional to the number of files imported to allow the process to complete. For example, if you have 100,000 files to import, your setting should be much longer than the interval used to import 1,000 files.
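The proportional-interval guidance above can be sketched as a simple heuristic. This formula is an illustration of the idea, not a documented calculation; tune the baseline values for your environment.

```python
def suggested_interval_ms(file_count: int,
                          base_ms: int = 360_000,
                          base_files: int = 1_000) -> int:
    """Scale the Dataflow Importer scan interval proportionally with the
    expected file count, starting from the 6-minute (360,000 ms) baseline.
    Illustrative heuristic only, not a documented formula."""
    if file_count <= base_files:
        return base_ms
    return base_ms * file_count // base_files

# 1,000 files keeps the 6-minute baseline; 100,000 files scales it 100x.
print(suggested_interval_ms(1_000))    # 360000
print(suggested_interval_ms(100_000))  # 36000000
```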
To access the Interval setting, from the Solution management window, click Installed in the navigation pane, and then click Dataflow Importer. Click the Configuration tab to view the details for the Interval setting.
|Description|Default value|
|---|---|
|The time, in milliseconds, that elapses before the Dataflow Importer scans the staging folder. This setting should always be greater than 0, but less than 2147483647.|10000|
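A values.yaml fragment for this setting might look like the following. The key name `interval` is an assumption for this sketch; confirm the actual key in the Configuration tab.

```yaml
# Hypothetical values.yaml fragment for the Dataflow Importer.
# The key name is illustrative; confirm it in the Configuration tab.
interval: 10000   # scan interval in milliseconds; must be > 0 and < 2147483647
```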
Data Processing Service
Lumada Data Catalog uses the Data Processing Service for multi-node processing using a secure Spark 3.1.1 history server and S3 storage that you provide.
The Data Processing Service provides a Spark history server instance for convenience, but you do not have to use this instance. The Data Processing Service history server is configured to connect to a valid S3 filesystem and valid S3 path. Amazon Web Services (AWS) S3 and Minio are examples of valid S3 filesystems. If a valid S3 path is not provided during installation, the Data Processing Service will not install successfully.
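To make the S3 requirement concrete, an installation-time values.yaml fragment might resemble the following. All key names, the endpoint, and the path are assumptions for this sketch; the actual schema of your installation may differ.

```yaml
# Hypothetical values.yaml fragment for the Data Processing Service.
# All key names and values are illustrative; the actual schema may differ.
sparkHistoryServer:
  s3Endpoint: http://minio.example.local:9000   # AWS S3 or MinIO endpoint
  s3Path: s3a://spark-history/events            # must be a valid S3 path at install time
```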