This repository contains sample configuration, data, and Postman files for creating and running an orchestration in your instance of the Predix Analytics Runtime, as described in the 'Running an Orchestration Using Predix Time Series Tags' roadmap (https://www.predix.io/docs#ogUlV1fl). The orchestration runs the demo-timeseries-adder-java analytic: it takes two sets of (aligned) time series data as input, computes their row-wise sum, and writes the output to a new Time Series tag.
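For example (a sketch with made-up values; the actual inputs are in the sample data files), two aligned input arrays and their row-wise sum look like:

```
{
  "numberArray1": [1, 2, 3],
  "numberArray2": [2, 3, 4],
  "sum": [3, 5, 7]
}
```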
Sample data files:
- data files with the time series inputs for the two addends,
- the expected output, and
- sample data files for validating that the demo-timeseries-adder-java analytic has been properly deployed to your Predix Analytics Catalog.
A Postman collection containing requests for:
- getting a user token
- loading the orchestration configuration
- running the orchestration
Configuration files for the sample orchestration
Follow the instructions below to set up the orchestration and trigger its execution in your Predix Analytics environment.
- Create subscriptions to:
- Predix Analytics Framework,
- Predix Time Series, and
- Predix UAA
(see the Getting Started instructions for each service in https://docs.predix.io/en-US/service)
- Use the websocket connection in the Predix Tool Kit (for basic: https://predix-starter.run.aws-usw02-pr.ice.predix.io/#!/wsClient; for select: ????) to load the sample data into your Predix Time Series instance.
- log in as a user from your UAA
- use Time Series Ingest to load the data from supportingDataFiles/time-series-tag-A-data.json and supportingDataFiles/time-series-tag-B-data.json:
- predix-zone-id is your Time Series guid (zone id/instance id)
- open the socket
- use supportingDataFiles/time-series-tag-A-data.json as the request body and send the message
- use supportingDataFiles/time-series-tag-B-data.json as the request body and send the message
- close the socket
- use Time Series Query with a Time Bounded Request to verify that tag-A and tag-B have been loaded. Request bodies can be found in:
- supportingDataFiles/tag-A-time-bounded-request.json
- supportingDataFiles/tag-B-time-bounded-request.json
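For reference, a time-bounded query body for Predix Time Series generally has the following shape (a minimal sketch; the start/end values here are made up, and the real values are in the request files above):

```
{
  "start": 1453338376200,
  "end": 1453338376500,
  "tags": [
    { "name": "tag-A" }
  ]
}
```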
- Set up Postman
- Install Postman (from the chrome web store, https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en)
- Open Postman and import 'SingleStepOrchestrationDemoUsingTagMap.postman_collection'
- Define the Postman environment (AnalyticsDemo)
- uaa_uri: the uri for your UAA instance
- uaa_client_id: the uaa client id for your Analytics and Time Series instances
- uaa_authorization_id: the base64 encoding of <client_id>:<client_secret> from your UAA instance
- userId: a user id from the UAA instance
- userPassword: the password for the userId
- user_token: a valid UAA token from your Predix UAA service for your Predix Analytics and Time Series instances.
- Use the user token returned when you logged in to ingest tag-A and tag-B into Predix Time Series, or use the 'Request user token' request from the Postman collection to get a new user token
- catalog_uri, config_uri, execution_uri: from running 'cf env <app_name>' after binding your Predix Analytics instances to <app_name>. (See the Getting Started guides.)
- runtime-zone-id: the guid for your Predix Analytics Runtime
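As a sanity check, a completed AnalyticsDemo environment should look roughly like the following (every value below is a placeholder, not a real endpoint or credential):

```
{
  "uaa_uri": "https://<your-uaa-guid>.predix-uaa.run.aws-usw02-pr.ice.predix.io",
  "uaa_client_id": "<client-id>",
  "uaa_authorization_id": "<base64 of client_id:client_secret>",
  "userId": "<user>",
  "userPassword": "<password>",
  "user_token": "<token from 'Request user token'>",
  "catalog_uri": "https://<catalog-uri-from-cf-env>",
  "config_uri": "https://<config-uri-from-cf-env>",
  "execution_uri": "https://<execution-uri-from-cf-env>",
  "runtime-zone-id": "<analytics-runtime-guid>"
}
```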
- Deploy the demo-timeseries-adder-java analytic (https://github.com/PredixDev/predix-analytics-sample/tree/master/analytics/demo-timeseries-adder-java) to your Analytics Catalog (using your Analytics UI)
- add the demo-timeseries-adder-java.jar and demo-timeseries-adder-template.json as 'Executable' and 'Template' attachments
- deploy and test the analytic using supportingDataFiles/analytic-input-for-demo-timeseries-adder.json
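For orientation, the analytic input follows the data.time_series shape described later in this document; a minimal sketch with illustrative values (the time_stamp field is an assumption based on the time series format, so check it against analytic-input-for-demo-timeseries-adder.json) is:

```
{
  "data": {
    "time_series": {
      "time_stamp": [1453338376200, 1453338376201],
      "numberArray1": [1, 2],
      "numberArray2": [3, 4]
    }
  }
}
```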
- Create the orchestration configuration file from orchestrationConfigurationFiles/orchestration-workflow.xml by updating the following entries on line 19:
- Analytic Catalog Entry Id is the catalog's guid for the analytic. To find it, go to the Analytic Detail page for the demo adder analytic in the Analytics UI; the guid appears after '.../view/' in the URI.
- Analytic Name is the name you entered when you created the analytic in the Analytics Catalog
- Analytic Version is the version you entered when you created the analytic in the Analytics Catalog
- Validate the orchestration workflow file using the 'Workflow Validation' request from the Postman collection
- choose the updated orchestration-workflow.xml (BPMN) file in the Body of the request
- submit the request and look for 'Status: 200 OK'
- Create a port-to-field map using orchestrationConfigurationFiles/port-to-field-map-for-demoTimeseriesAdder.json. You can use this file as is; no changes are needed.
- Create an Orchestration Configuration Entry using the following Postman requests:
- 'Create Orchestration Configuration Entry' request. Note down the 'id' in the response - this is the orchestration entry id.
- 'Upload Orchestration Workflow File' request - change <id> to the orchestration entry id from the previous request and choose orchestrationConfigurationFiles/orchestration-workflow.xml in the body of the request before sending it.
- 'Upload port-to-field Map for Orchestration Step' request - change <id> to the orchestration entry id from the previous request and choose orchestrationConfigurationFiles/port-to-field-map-for-demoTimeseriesAdder.json before sending the request.
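For orientation, the body of the 'Create Orchestration Configuration Entry' request is a small JSON document along these lines (the field names below are an assumption based on typical configuration entries; rely on the body shipped in the Postman collection):

```
{
  "name": "single-step-orchestration-demo",
  "author": "<your name>",
  "description": "Adds tag-A and tag-B row-wise and writes the sum to tag-C"
}
```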
- Run the orchestration using the 'Run Single Step Orchestration Using Tag Map' request - change <id> to the orchestration entry id from the prior step before submitting the request.
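Conceptually, the run request ties everything together: the orchestration entry id selects the configuration, and the assetDataFieldsMap binds the fieldIds from the port-to-field map to concrete Time Series tags. A simplified sketch (the exact field names are in the Postman request; orchestrationConfigurationId is an assumed name for where the entry id goes):

```
{
  "orchestrationConfigurationId": "<orchestration entry id>",
  "assetDataFieldsMap": {
    "temperature sensor": "tag-A",
    "vibration sensor": "tag-B",
    "demo sum": "tag-C"
  }
}
```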
You have just created and run an orchestration.
The following highlights important entries in the sample files and how values are related across the files.
time-series-tag-A-data.json contains:

```
{
  "messageId": "1453338376222",
  "body": [
    {
      "name": "tag-A",
      "datapoints": [
        [ 1453338376200, 1, 3 ],
        [ 1453338376201, 2, 3 ],
        ...
```
Note the "name": "tag-A" entry in the JSON object: it causes Time Series to store the data under the tag-A key when the data is ingested.
The port-to-field map file (port-to-field-map-for-demoTimeseriesAdder.json) defines the input as coming from fieldIds 'temperature sensor' and 'vibration sensor' (see below), and the orchestration run request (in Postman) maps 'temperature sensor' to tag-A and 'vibration sensor' to tag-B. So tag-A's values are read for the temperature sensor and tag-B's values for the vibration sensor.
port-to-field map entry:

```
"inputMaps": [
  {
    "valueSourceType": "DATA_CONNECTOR",
    "fullyQualifiedPortName": "data.time_series.numberArray1",
    "fieldId": "temperature sensor",
    "queryCriteria": { "start": 0, "end": -1 },
    "dataSourceId": "PredixTimeSeries"
  },
  {
    "valueSourceType": "DATA_CONNECTOR",
    "fullyQualifiedPortName": "data.time_series.numberArray2",
    "fieldId": "vibration sensor",
    "queryCriteria": { "start": 0, "end": -1 },
    "dataSourceId": "PredixTimeSeries"
  }
],
```
orchestration run request:

```
"assetDataFieldsMap": {
  "temperature sensor": "tag-A",
  "vibration sensor": "tag-B",
  "demo sum": "tag-C"
},
```
The port-to-field map entries

"fullyQualifiedPortName": "data.time_series.numberArray1" and "fullyQualifiedPortName": "data.time_series.numberArray2"

tell the processing to put the values into the input JSON object as:

```
{
  "data": {
    "time_series": {
      "numberArray1": [],
      "numberArray2": []
    }
  }
}
```
The output mapping works essentially the same way as the input mapping: the orchestration run request maps 'demo sum' to tag-C, and the port-to-field map maps 'demo sum' to the value of the data.time_series.sum object in the analytic output.
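By analogy with the inputMaps entry shown above, the output side of the port-to-field map would look roughly like the following sketch (the actual entry is in port-to-field-map-for-demoTimeseriesAdder.json; the exact shape here is an assumption):

```
"outputMaps": [
  {
    "fullyQualifiedPortName": "data.time_series.sum",
    "fieldId": "demo sum",
    "dataSourceId": "PredixTimeSeries"
  }
],
```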
When a port-to-field map is uploaded to the system, the request contains a name field (see the request in Postman). This value must match the id in the <serviceTask ... id="<value>" ... /> element in the orchestration's BPMN specification.