This article is a successor to Use SAP Build Process Automation data for process mining.

Introduction


Last time I shared my thoughts on a Signavio Process Intelligence use case for workflow processes developed with SAP Build Process Automation. That was a generic idea to reduce the initial setup effort of a process mining project, and I suspected it could be automated by some integration solution.

After some time, I had the chance to develop this solution myself. During a one-week hackathon event I completed the implementation with SAP BTP Integration Suite (see the architecture below). Although my team did not win an award at that event, I would now like to share this solution to help with your future process mining projects. This activity also helped me catch up on the capabilities of SAP BTP Integration Suite.

In the rest of this post I will explain my solution, but you can also go directly to my GitHub repository to use the artifacts.


Figure 1: Solution Architecture.



Preparation


Before starting the integration, we will prepare the source and target applications.

Process Automation setup


As discussed in my previous post, any kind of process (workflow) will work. This time I used the Change and Innovation Approval sample process.

Please do not forget to add a Visibility Scenario. This time I set up visibility attributes (e.g. Line of Business) for detailed analysis.

After deployment, entering a few records is helpful for testing the integration.


Figure 2: Visibility Attributes in Process Details


 

Signavio Process data pipeline setup


Creating a new Process data pipeline is a standard operation. Select Ingestion API as the connection type, then follow the instructions to complete the wizard.

After that, click the connection (first) box in the pipeline to see the API information. This information is required during the integration setup later.


Figure 3: Signavio API Credentials in Ingestion API connection



Integration overview


Let me give an overview of this integration developed with SAP BTP Integration Suite.

Required tables


"events" table storing event log is mandatory for process mining. This data is enough for minimum process mining, but normally we would like to do detailed analysis using attribute of workflow instance.

This is made possible by a "cases" table. It has the case ID as its primary key plus arbitrary attribute columns. It is joined to the "events" table by case ID, providing attributes for each event log entry.

Finally, I need one more "tasks" table to decorate the event descriptions.
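
As a rough sketch, here is the column layout implied by the SQL used later in this post (the extra columns of "cases" depend on your own visibility attributes):

events: workflowInstanceId, definitionId, completedAt, processor
cases:  workflow, subject, startedAt, startedBy, &lt;one column per visibility attribute&gt;
tasks:  id, name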

Source API endpoints


Based on the table requirements above, I mapped Process Automation API endpoints to the target tables.

  1. GET /v1/task-instances -> events table (completed events, one per workflow step)

  2. GET /v1/workflow-instances/{workflowInstanceId} -> cases table (workflow started event)

  3. GET /v1/workflow-instances/{workflowInstanceId}/attributes -> cases table (visibility attributes)

  4. GET /v1/task-definitions -> tasks table (event name)


You may notice that the "cases" table is fed from two sources. That is just a technical measure to minimize the number of Signavio Ingestion API calls.
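
To make the mapping concrete, here is a minimal standalone Groovy sketch of the first call (not the packaged integration flow). The query parameters follow my reading of the SAP Workflow API, and the placeholder values are hypothetical, so please verify the details against the API reference:

import groovy.json.JsonSlurper

// Hypothetical placeholders: endpoint URL from the service key, a token from
// the client credentials flow, and your workflow definition ID.
def btpBaseUrl = 'https://<endpoint-from-service-key>'
def accessToken = '<oauth-access-token>'
def definitionId = '<workflowDefinitionId>'

def conn = new URL("${btpBaseUrl}/v1/task-instances?workflowDefinitionId=${definitionId}&status=COMPLETED").openConnection()
conn.setRequestProperty('Authorization', "Bearer ${accessToken}")

// Each completed task instance becomes one row of the "events" table.
new JsonSlurper().parse(conn.inputStream).each { t ->
    println([t.workflowInstanceId, t.definitionId, t.completedAt, t.processor].join(','))
}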

Implementation key points


As discussed, I need to call four types of APIs to get the data. I unified these API calls into a single integration flow, which is called from the main integration flows via ProcessDirect. The API endpoint is passed as a parameter from the main flow.


Figure 4: BTP Workflow API: the endpoint is passed as a parameter
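
For illustration, a script step inside the shared flow could look like the minimal sketch below. The property name "endpoint" and the logging are my assumptions, not necessarily the exact implementation in the package; the HTTP receiver adapter address can then reference the property dynamically (e.g. ${property.endpoint}).

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // The main flow sets this exchange property before the ProcessDirect call;
    // the HTTP receiver adapter appends it to btpBaseUrl as the request path.
    def endpoint = message.getProperty('endpoint')
    messageLogFactory.getMessageLog(message)?.setStringProperty('calledEndpoint', endpoint?.toString())
    return message
}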


Similarly, when uploading the three tables, I call a single integration flow from the main flow via ProcessDirect. The Ingestion API needs several parameters: the table's primary keys, the table schema, and the data itself in CSV format. To pass these, I implemented a Groovy script that generates multipart form data.


Figure 5: Signavio Ingestion API: generating multipart form data
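
For reference, a minimal sketch of such a script follows. The part names (primaryKeys, schema, file) and the sample payload are assumptions for illustration; please check the Ingestion API documentation and the packaged script collection for the exact names:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def boundary = 'Boundary_' + UUID.randomUUID().toString()
    def textPart = { String name, String value ->
        "--${boundary}\r\nContent-Disposition: form-data; name=\"${name}\"\r\n\r\n${value}\r\n"
    }

    // Hypothetical part names and payload; the packaged script derives these
    // from whichever table is being uploaded.
    def body = textPart('primaryKeys', '["id"]') +
        textPart('schema', '{"id":"string","name":"string"}') +
        "--${boundary}\r\nContent-Disposition: form-data; name=\"file\"; filename=\"tasks.csv\"\r\n" +
        "Content-Type: text/csv\r\n\r\nid,name\r\nusertask1,Approve request\r\n" +
        "--${boundary}--\r\n"

    message.setBody(body)
    message.setHeader('Content-Type', "multipart/form-data; boundary=${boundary}")
    return message
}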



Configure and deploy Integration Flow


From here on, I will explain the procedure to use this integration in SAP BTP Integration Suite.

Download and import integration package


Download the integration package from the GitHub repository to your local machine, then import it into your Integration Suite environment.

Create Security materials


To securely access BTP, create a security material (an OAuth2 Client Credentials credential) in Integration Suite. Fill in the token service URL, client ID, and client secret, referring to the service key of the Process Automation instance in your subaccount.


Figure 6: Setup BTP Process Automation client credentials
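
Integration Suite handles the token exchange for you once this credential is deployed. Purely to illustrate what the three values do, here is a hedged standalone sketch of the client credentials request performed behind the scenes (URL shape and field names as in a standard BTP service key):

import groovy.json.JsonSlurper

// Hypothetical values copied from the Process Automation service key.
def tokenServiceUrl = 'https://<subdomain>.authentication.<region>.hana.ondemand.com/oauth/token'
def clientId = '<clientid>'
def clientSecret = '<clientsecret>'

def conn = new URL("${tokenServiceUrl}?grant_type=client_credentials").openConnection()
conn.requestMethod = 'POST'
conn.setRequestProperty('Authorization', 'Basic ' + (clientId + ':' + clientSecret).bytes.encodeBase64().toString())

// The returned access_token authorizes the workflow API calls.
println new JsonSlurper().parse(conn.inputStream).access_token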


Similarly, create a security material for the Signavio Ingestion API. Create a Secure Parameter and fill in the API key in the secure parameter field, referring to the connection in the pipeline.

Configure parameters


Go to the imported integration package and open the Artifacts tab. You will see four integration flows and one script collection.

Click the action button of the first integration flow "BTP Workflow API" and choose Configure.

  • btpBaseUrl is the endpoint URL from the Process Automation service key.

  • btpSecurityName is the name of the security material you created above.


Next, open the configuration of the last integration flow "Signavio Ingestion API".

  • signavioApiEndpoint is the endpoint URL shown in the pipeline connection.

  • signavioSecurityName is the name of the security material you created above.


Finally, open the configuration of the third integration flow "Regular Integration".

  • workflowDefinitionId is the technical ID of the Process Automation workflow. You can find it in SAP Build under Monitor > Processes and Workflows.



Figure 7: Workflow ID in SAP Build Monitor



Deploy Integration Flows


We are ready to start the integration. In the Artifacts tab, first deploy "BTP Workflow API", "scripts", and "Signavio Ingestion API". These are required for executing the main integration flows.

Then deploy "Initial Integation" and monitor the result of this integration flow. If it is successfully done, you will go to Signavio pipeline and Source data tab, then find execution log of "tasks" table.


Figure 8: Result of Initial Integration


Then deploy "Regular Integration" and do same things. You can find execution log of "cases" and "events" tables. Please note that you do not need to adapt attribute columns in "cases" table, this determination is automatically done.


Figure 9: Result of Regular Integration
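
To illustrate that automatic determination, here is a small sketch. My assumption is that the attributes endpoint returns id/value pairs per instance, which the flow pivots into extra "cases" columns; the sample values are hypothetical:

import groovy.json.JsonSlurper

// Hypothetical response from GET /v1/workflow-instances/{workflowInstanceId}/attributes.
def attributesJson = '[{"id":"lineOfBusiness","value":"Sales"},{"id":"region","value":"EMEA"}]'
def attrs = new JsonSlurper().parseText(attributesJson)

// Pivot the id/value pairs into extra columns next to the fixed "cases" columns.
def header = (['workflow', 'subject', 'startedAt', 'startedBy'] + attrs*.id).join(',')
def row = (['<instance-id>', 'Change request', '1700000000000', 'jane.doe@example.com'] + attrs*.value).join(',')
println header   // workflow,subject,startedAt,startedBy,lineOfBusiness,region
println row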


This integration flow is designed to run on a schedule. You can open the integration flow and change the start timer setting based on your requirements.



Figure 10: Configuring the start timer



Remaining Signavio configurations


The integration is done. What remains before starting the analysis are the Signavio configurations.

Data Model and SQL


In the pipeline, click Process Data Model to create one, and create a Business Object as well.

In the Business Object, create an event collector SQL for the completed events. Just copy the SQL below. It transforms the "events" table, joined with the "tasks" table to retrieve the task name.
SELECT
  e.workflowInstanceId AS c_caseid,
  'Completed ' || t.name AS c_eventname,
  FROM_UNIXTIME(e.completedAt/1000) AS c_time,
  e.processor AS c_user
FROM events AS e
LEFT JOIN tasks AS t
  ON e.definitionId = t.id

Second, create one more event collector SQL for the started event. Just copy the SQL below. It is transformed from the "cases" table.
SELECT
  c.workflow AS c_caseid,
  'Started Workflow ' || c.subject AS c_eventname,
  FROM_UNIXTIME(c.startedAt/1000) AS c_time,
  c.startedBy AS c_user
FROM cases AS c

Finally, create the case attribute SQL. It is also transformed from the "cases" table. Only this SQL depends on your visibility attributes, so replace the "c.*" part of the SQL below with your attribute columns.
SELECT
  c.workflow AS c_caseid,
  c.*
FROM cases AS c

After configuration, your object looks like this. Check that the preview works correctly before executing the ETL. If you have worked on a process mining project before, you may be surprised at how little SQL there is.


Figure 11: Image of Business Object in Process Data Model



Process


Click Process in the pipeline and follow the instructions to create a new Process.

ETL schedule


Now we are ready to start the ETL. First, run the ETL manually and check that no errors occur. Then schedule the ETL based on your requirements (the shortest period is one hour).

Closing


If everything works well, you can use the Process and its investigations. Compared with my previous post, the current investigation has attributes (e.g. Line of Business) for detailed analysis.

Using this integration, you can very quickly start analyzing workflows with Signavio Process Intelligence.


Figure 12: Sample investigation


I hope this blog helps you. Please leave feedback or thoughts in a comment. Also, please follow the topics above and kaztakata.

 

References


SAP Build Process Automation | SAP Community

SAP Integration Suite | SAP Community

SAP Signavio | SAP Community