

Intro


Once you start seriously implementing and orchestrating pipelines in SAP DI, a question will come your way: how can I dynamically run a (sub)graph, and how do I pass parameters to it? The standard delivery offers two ways to solve this problem, but neither is ideal. Fortunately, we can solve it ourselves by creating our own operator tailored to our needs.

Let's start with available options.

Standard ways


Pipeline operator 👨‍✈️



Data Workflows - Pipeline


The Pipeline operator from Data Workflows allows us to launch graphs on the current instance as well as on a remote system. To start it, we need to pass a JSON message as a string:
'{"event":"trigger","data":{}}'

That message is also generated by the Workflow Trigger operator, which is recommended when the Pipeline operator is the first operator in the graph.
Connect the Pipeline operator to the "Trigger" or "Terminator" operator if it is the first or last operator you want to execute.

A useful property of this operator is "Configuration Substitutions". With it, we can specify parameter names and values. For example, if our graph has a ${BATCH} parameter that must be defined at startup, we can set it using this property.
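
To make this concrete, here is a minimal sketch; the configuration property name "batchSize" and the values are only illustrative and not taken from the operator documentation. The subgraph references the parameter through a ${BATCH} placeholder, and the Pipeline operator's Configuration Substitutions maps the name to a static value:

// Illustrative: a subgraph operator configuration referencing the parameter
"batchSize": "${BATCH}"

// Illustrative: the Configuration Substitutions entry on the Pipeline operator
{"BATCH": "1000"}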


Looks like a complete solution, doesn't it? There is only one problem.

I could not find a way to pass parameters dynamically. In my case, I would like to pass, for example, today's date or the current time.


That led me to solution number 2...

JavaScript Custom operator 🛰


Of all the custom operator languages available in DI, only plain JavaScript offers this wonderful functionality. Worth noting: the JavaScript internal API looks the most elaborate. (It's a shame it isn't also available in Python or Node.js.)

https://help.sap.com/docs/SAP_DATA_INTELLIGENCE/97fce0b6d93e490fadec7e7021e9016e/a59151380fdc461e97c...

With a bit of coding, we can make our scenario fly. We can implement all the logic directly in the JavaScript base operator, or pass only the required data to it.

Below is the code from the documentation. If you have any questions, just let me know (or simply use my operator).
$.addGenerator(gen)

// This example shows basic usage of a subgraph.
// It calls a graph with a parameter, provides input to a port,
// checks a port's output and stops the graph.

var graphName = "com.sap.demo.subgraph.call-and-wait.sub"
var subgraphInput = "subgraph-input "
var paramValue = "value1"

function gen(ctx) {
    // start the subgraph
    var g
    try {
        g = $.instantiateGraph(graphName,
            // substitution value for the graph parameter 'param1'
            {"param1": paramValue},
            // handler for the subgraph output port named 'output'
            {"output": function(ctx, s) {
                // check the subgraph output
                if (s != subgraphInput + paramValue) {
                    $.fail("Unexpected subgraph output: " + s + " Expected: " + subgraphInput + paramValue)
                }
                // stop the subgraph
                g.stop()
            }}
        )
    } catch (e) {
        $.fail(e.message)
        return
    }
    // write a string to the subgraph input port named 'input'
    g.input(subgraphInput)
    // wait until graph execution has finished
    var status = g.wait()
    // check the graph status
    if (status != $.graphStatus.completed) {
        $.fail("Subgraph failed")
    } else {
        $.done()
    }
}

 

My Pipeline 🦾 operator


I wanted to reuse this functionality in many pipelines and make life easier for others. The ability to create your own operators is an important feature of DI, as it takes some of the burden of creating new operators off the software provider.

Consequently, I decided to implement such an operator and share it with you.

mm.dh.vflowpipeline is similar to the standard com.sap.dh.vflowpipeline. The only important difference is that it can receive substitution parameters from an input port.


 

As in the standard operator, you can define substitution parameters using the operator's configuration properties, or you can pass them as a message through the substitution input port. All parameters are described in the operator documentation.

Main graph




Subgraph



The port must be set as "exported": right-click the outport and select "Export port".

Node Base Operator


const SDK = require("@sap/vflow-sub-node-sdk");
const operator = SDK.Operator.getInstance();

// send the substitution parameters as name/value pairs to the subgraph trigger
operator.getOutPort("out1").send({
    Attributes: {},
    Body: [{
        "name": "BATCH",
        "value": 1000
    }]
});
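
To cover the dynamic case mentioned earlier (for example, today's date), the same message can be built at runtime. Below is a minimal sketch based on the snippet above; the LOAD_DATE parameter name is only an illustration and assumes the subgraph declares a matching ${LOAD_DATE} placeholder:

const SDK = require("@sap/vflow-sub-node-sdk");
const operator = SDK.Operator.getInstance();

// compute today's date at runtime, e.g. "2024-01-31"
const today = new Date().toISOString().slice(0, 10);

// LOAD_DATE is a hypothetical parameter name used only for illustration
operator.getOutPort("out1").send({
    Attributes: {},
    Body: [
        { "name": "BATCH", "value": 1000 },
        { "name": "LOAD_DATE", "value": today }
    ]
});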

And that's all!

Please let me know if you think this is a useful solution. If you use the operator in your development, please tell me about it!

Please feel free to make any changes, from the operator ID to the program code.

 

DOWNLOAD mm.dh.vflowpipeline-1.0.0


 

How to import operators?



  1. Navigate to the Data Intelligence System Management application.

  2. Switch to the Files tab.

  3. Press the Import button.








 
