Use machine learning service with IIoT Core
The Machine learning service is a component of Solution Management that enables you to train ML models on historical data and route datasets through those models to influence responses.
The Machine learning service can optionally call third-party simulation engines, sending payloads and synchronously receiving results. This makes it possible to augment existing data with inferred values that can’t be easily measured in the physical world, such as angular acceleration, torque, and so on.
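For illustration only, a synchronous round trip to a third-party simulation engine might look like the following sketch. The engine URL, payload fields, and returned values are assumptions, not a documented API:

import requests

# Hypothetical sketch of a synchronous call to a third-party simulation
# engine: send measured values, receive inferred values back.
SIM_URL = "https://simulation.example.com/api/simulate"  # assumed endpoint

measured = {"rpm": 1480, "load_nm": 52.3}  # example measured payload

resp = requests.post(SIM_URL, json=measured, timeout=10)
resp.raise_for_status()

# e.g. {"angular_acceleration": ..., "torque": ...} in this sketch
inferred = resp.json()
augmented = {**measured, **inferred}  # augment the dataset with inferred values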
Solution Management is where you train, track, manage, deploy, and monitor ML models.
For help creating projects and building, retraining, deploying, and deleting ML models, contact your Hitachi Vantara representative.
Log in to the Solution Management UI
To access the ML Model Manager application, log in to the Solution Management UI.
Procedure
From the command line on the installation node, get the username and password for the Solution Management UI:
- Username:
echo $(kubectl get keycloakusers -n hiota keycloak-user -o jsonpath='{.spec.user.username}')
- Password:
echo $(kubectl get keycloakusers -n hiota keycloak-user -o jsonpath='{.spec.user.credentials[0].value}')
Log in to the Solution Management UI using the acquired credentials:
https://<cluster-fqdn>:30443/hiota/hscp-hiota/solution-control-plane/
where <cluster-fqdn> is the fully qualified domain name of the cluster where IIoT Core Services is installed.
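If you prefer to script the lookup, the following sketch shells out to the same kubectl commands shown above and prints the login URL. The namespace, resource name, and port come from the steps above; the cluster FQDN is a placeholder you must supply:

import subprocess

def kc_field(jsonpath: str) -> str:
    """Read one field from the keycloak-user resource via kubectl."""
    return subprocess.run(
        ["kubectl", "get", "keycloakusers", "-n", "hiota", "keycloak-user",
         "-o", f"jsonpath={jsonpath}"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

username = kc_field("{.spec.user.username}")
password = kc_field("{.spec.user.credentials[0].value}")

cluster_fqdn = "cluster.example.com"  # replace with your cluster FQDN
print(f"Log in at https://{cluster_fqdn}:30443/hiota/hscp-hiota/solution-control-plane/")
print(f"Username: {username}")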
Use Model Management
Use the following actions to manage your Machine learning service models:
- View Machine learning service projects
- View project details
- View models
- View model repository
- View model details
- View model versions
- Compare model versions
View Machine learning service projects
Procedure
Open the Lumada ML Model Manager application.
Select the Projects menu option at the top of the page.
Results
The list of Machine learning service projects displays with the following fields:
Field | Description |
Name | Name of the ML service project |
Status | Status of the ML service project |
Tags | Keywords that describe the ML service project |
Description | Purpose of the ML service project |
Created | Date the ML service project was created |
Created By | User who created the ML service project |
Modified | Date the ML service project was modified |
Modified By | User who modified the ML service project |
View project details
You can view project details such as properties and deployment on the Projects page using the following steps:
Procedure
Open the Lumada ML Model Manager application.
Select Projects from the menu at the top of the page.
Select the project for which you want to view the details.
Select View Details in the Actions menu.
The PROJECT PROPERTIES section contains information on the following fields:

Field | Description |
Description | Purpose of the project |
Status | Status of the project. The values are: Draft (the project has no associated models with the status of Published) and Published (the project has at least one associated model with the status of Published) |
Tags | Keywords that describe the project |
Created | Date the project was created |
Created By | User who created the project |
Modified | Date the project was modified |
Modified By | User who modified the project |

The DEPLOYMENT section contains information on the following fields of the ML model deployed to the project:
Note: You may have to scroll down the Project details page to view the DEPLOYMENT section.

Field | Description |
Name | Name of the deployment |
Endpoint | Model endpoint where inferencing applications can integrate with the REST endpoint |
Status | Status of the deployment. The values are: Pending (waiting for resources), Deploying (deployment is in progress), Deployed (in production), Failed (the deployment request failed), Not Found (the deployment was not found), and Timeout (the configured timeout was reached) |
ASC | Category name of the analytic. For example, failure prediction |
Model | Name of the machine learning model |
Version | Version of the model |
Inferences/Last Hour | Number of inferences that occurred during the last hour |
Total Inferences | Total number of inferences that have occurred |
Start Time | Time that the version’s deployment was requested |
Average Elapsed Time | Average elapsed time of the inferences |
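As a hedged illustration, an inferencing application could call the deployed model through the REST endpoint shown in the Endpoint field. The URL and payload shape below are assumptions; the real payload depends on how the model was built, and authentication requirements depend on your deployment:

import requests

# Hypothetical sketch: calling a deployed model's REST endpoint.
# Copy the real URL from the Endpoint field in the DEPLOYMENT section.
endpoint = "https://cluster.example.com:30443/models/failure-prediction/v1/predict"

payload = {"features": [0.12, 3.4, 7.8]}  # illustrative feature vector

# verify=False only if the cluster uses a self-signed certificate
resp = requests.post(endpoint, json=payload, timeout=30, verify=False)
resp.raise_for_status()
print(resp.json())  # e.g. a prediction object, shape depends on the model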
View models
Procedure
Open the Lumada ML Model Manager application.
Click Model Repository from the menu at the top of the page.
Select the model for which you want the details.
Note: You can also view the list of models by selecting View Models from the Actions menu next to the project for which you want to view the details.

The MODELS LIST displays with the following fields:

Field | Description |
Name | Name of the machine learning model |
Status | Status of the ML model. The values are: Draft (a model that has been created but does not have a version; it is an empty model), Ready (a model that has at least one version with Trained status), and Published (a model that has at least one version with Deployed status) |
Project | The project to which the model belongs |
Tags | Keywords that describe the ML model |
ASC | Category name of the analytic. For example, Failure Prediction |
Created | Date the ML model was created |
Created By | User who created the ML model |
Modified | Date the ML model was modified |
Modified By | User who modified the ML model |
View model repository
Procedure
Open the Lumada ML Model Manager application.
Select Model Repository from the menu at the top of the page.
The list of models in the project displays with the following fields:

Field | Description |
Name | Name of the ML model |
Status | Status of the ML model. The values are: Draft (a model has been created but has no version; it is an empty model), Ready (a model has at least one version with Trained status), and Published (a model has at least one version with Published status) |
Project | Project to which the ML model belongs |
Tags | Keywords that describe the ML model |
ASC | Category name of the analytic. For example, Failure Prediction |
Created | Date the ML model was created |
Created By | User who created the ML model |
Modified | Date the ML model was modified |
Modified By | User who modified the ML model |
View model details
Procedure
Open the Lumada ML Model Manager application.
Select Model Repository from the menu at the top of the page.
Select View Details from the Actions menu next to the model for which you want to view the details.
The following properties of the model you selected are displayed:
Field | Description |
Description | Description of the ML model. A counter displays the number of characters in the description and the maximum permitted |
Status | Status of the ML model. The values are: Draft (a model has been created but has no version; it is an empty model), Ready (a model has at least one version with Trained status), and Published (a model has at least one version with Published status) |
Tags | Keywords that describe the ML model |
Created | Date the ML model was created |
Created By | User who created the ML model |
Modified | Date the ML model was modified |
Modified By | User who modified the ML model |
Project | Project to which the ML model belongs |
ASC | Category name of the analytic. For example, Failure Prediction |
View model versions
Procedure
Open the Lumada ML Model Manager application.
Select Model Repository from the menu at the top of the page.
Select View Versions from the Actions menu next to the model for which you want to view the versions.
The following properties of the model versions are displayed:
Field | Description |
Name | Version name |
Status | Status of each version. The values are: Trained (a version has been created or trained but has not been deployed), Deployment Pending (the model is in the process of being deployed, pending resource availability), Deploying (deployment is in progress), and Deployed (the version is deployed) |
Datasets | The datasets used for the model training. For example, the dataset name and the location from which the dataset is retrieved. This field varies depending on how the ML model is built |
Metrics | Performance metrics of the version. This field varies depending on how the ML model is built |
Parameters | The parameters of the model version. This field varies depending on how the ML model is built |
Training | The training duration of each version |
Compare model versions
Procedure
Open the Lumada ML Model Manager application.
Select Model Repository from the menu at the top of the page.
Select your project from the Filter by project menu.
Select View Versions in the Actions menu.
Select the check boxes for the two versions that you want to compare, and then click Compare.
View model performance
Machine learning projects can have multiple models, and models can have multiple versions. Lumada ML Model Manager provides a way to view the performance of each of your models. To view a model’s performance, click the Projects tab and select the project containing the model you want to investigate. On the Details tab of the Projects page, click the More actions menu of that model and choose View inference. On the Select Class menu, select the class you want to view. The statistics for your model are displayed on the Inference tab.
This tab contains a collection of widgets, KPIs (key performance indicators), and charts that display the metrics collected from the inference results and the subject matter expert review of the data. The data on the tab is divided into these sections:
- KPI
- Metrics
- Classifications
- Trends
- Summary
You can view the sections by scrolling the page or by clicking the label for that section.
The KPI section provides the logged metrics in numeric form. Standard KPIs displayed are:
- Precision is the proportion of predicted positives that are actual positives.
- Recall is the proportion of actual positives that the model successfully identified.
- F1 score is the harmonic mean of Precision and Recall. (A computation sketch follows this list.)
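As a minimal sketch, these standard KPIs can be derived from confusion-matrix counts. The counts below are made-up example values, not output from any real model:

# Illustrative sketch: computing the standard KPIs from
# confusion-matrix counts (hypothetical example values).
tp, fp, fn, tn = 88, 7, 12, 93

precision = tp / (tp + fp)  # predicted positives that are correct
recall = tp / (tp + fn)     # actual positives the model found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

print(f"Precision: {precision:.2%}, Recall: {recall:.2%}, F1: {f1:.2%}")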
The Metrics section provides a visualization of these metrics and the true positive, true negative, and false positive results. When you hover over the bar graphs, the threshold levels of the quality are displayed. The color-code legend and example KPI threshold values are shown in the following table:
Threshold ranges | Rating | Color |
Values greater than 89.20% | Good | Green |
Values between 79.20% and 89.20% | Warning | Orange |
Values less than 79.20% | Critical | Red |
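As a hedged illustration, thresholds like those above could be applied in code as follows. The cutoff values are the example values from the table, not fixed product settings:

def rate_kpi(value: float, good: float = 89.20, warning: float = 79.20) -> str:
    """Map a KPI percentage to the example quality ratings above."""
    if value > good:
        return "Good (green)"
    if value >= warning:
        return "Warning (orange)"
    return "Critical (red)"

print(rate_kpi(91.5))  # Good (green)
print(rate_kpi(82.0))  # Warning (orange)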
The Classifications section contains the Confusion Matrix, which maps the distribution of the predicted performance versus the ground truth for the model. The total number of observations and the industry-standard matrix breakdown are displayed as shown in the following table:

 | Predicted positive | Predicted negative |
Actual positive | True positive | False negative |
Actual negative | False positive | True negative |
In the Trends section of the page, historical data about the performance results is graphed on a timeline for each metric, where inference data is shown in blue and training data in purple. You can hover over each graph to see the inference percentage for all metrics along the selected timeline, as well as the model training thresholds for the F1 score, Precision, and Recall metrics.
The Summary section contains summary information on the version and the metrics.