olaf_fischer

Introduction


When using the data integration functionality, regularly checking execution status and errors is a common task.

Based on the delivered task monitoring views (see https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/4ab45090c5684ebf8765757a1df... for details), this blog describes an SQL view that combines them into a good basis for building reports.

It starts with some sample scenarios that provide ideas and inspiration for what can be achieved with the task log views, and then describes a concrete example.

Sample Scenarios


Number of Failing Runs


The view provides a list of schedules that had at least one failing run in the past days. In the example below we see that there are six schedules with at least one failure on the 28th or 29th of September (the first one shows up because it has an error that is not part of the screenshot).

For these schedules, the number of completed and failed runs is counted per day. In the example below you see that the SOURCE LEDGER remote table replication failed 19 times and completed successfully 5 times on the 29th. This sums up to 24, which is expected given that the cron string 0 */1 * * * stands for an hourly run.

Hint: to easily decode a cron string, use e.g. https://crontab.guru . For this example, see https://crontab.guru/#0_*/1_*_*_* .

The column with the cron strings indicates the expected frequency of runs. The completed/failed columns show the count of runs per execution status.

For step-by-step instructions on how to create a story showing the information below, please scroll down to the chapter “Create the SQL View for Reporting”.
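
If you prefer to check the numbers directly in SQL, a minimal sketch against the reporting view created below could look like this. The view name TASK_LOGS_V_R is only an assumption, adjust it to your own naming:

-- Completed/failed runs per day, restricted to schedules with at least one failure on that day
-- (TASK_LOGS_V_R is an assumed name for the reporting view created below)
SELECT "START_DATE",
       "SPACE_ID",
       "OBJECT_ID",
       "ACTIVITY",
       "CRON",
       SUM("COUNT_COMPLETED_CAL") "COMPLETED",
       SUM("COUNT_FAILED_CAL") "FAILED"
FROM "TASK_LOGS_V_R"
WHERE "OBJECT_ID_WITH_FAILURE" = 'X'
GROUP BY "START_DATE", "SPACE_ID", "OBJECT_ID", "ACTIVITY", "CRON"
ORDER BY "START_DATE", "OBJECT_ID"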


These are some examples of reports you can build on the view:

Number of Failing Schedules


To get a better overview of failing schedules, the following proposal simply counts how many schedules failed at least once per day.

For the 28th of September you see one schedule with a failure, hence the summary below shows a count of 1; for the 29th you see six rows, and the summary shows a count of six.
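
Expressed as SQL on the reporting view (again assuming the name TASK_LOGS_V_R), the count could be sketched like this:

-- Number of schedules with at least one failed run per day
-- (counting distinct OBJECT_ID is a simplification; add SPACE_ID/ACTIVITY if object names repeat)
SELECT "START_DATE",
       COUNT(DISTINCT "OBJECT_ID") "FAILING_SCHEDULES"
FROM "TASK_LOGS_V_R"
WHERE "STATUS" = 'FAILED'
GROUP BY "START_DATE"
ORDER BY "START_DATE"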



Currently Running Tasks


In addition to information on finished schedules, an overview of the currently running tasks and a comparison with past runs can be helpful.

In the given example the green line looks good: the duration is still below the average and median of the past executions.

If you check the blue line you see something suspicious: it has been running unexpectedly long.

For the red ones we see that they still appear to be running according to the task framework, yet the absence of an active lock indicates that they have already stopped or finished.
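
As a rough sketch of the underlying logic, assuming again a reporting view named TASK_LOGS_V_R, running tasks can be compared with the duration of their past completed runs like this; the linked blog post below describes the full step-by-step setup:

-- Currently running tasks compared with average/median duration of past completed runs;
-- a running task with ACTIVE_LOCK = 0 is the suspicious case described above
SELECT run."OBJECT_ID",
       run."ACTIVITY",
       run."START_TIME",
       run."ACTIVE_LOCK",
       hist."AVG_DURATION",
       hist."MEDIAN_DURATION"
FROM "TASK_LOGS_V_R" run
LEFT OUTER JOIN (
    SELECT "OBJECT_ID",
           "ACTIVITY",
           AVG("DURATION") "AVG_DURATION",
           MEDIAN("DURATION") "MEDIAN_DURATION"
    FROM "TASK_LOGS_V_R"
    WHERE "STATUS" = 'COMPLETED'
    GROUP BY "OBJECT_ID", "ACTIVITY"
) hist
  ON run."OBJECT_ID" = hist."OBJECT_ID"
 AND run."ACTIVITY" = hist."ACTIVITY"
WHERE run."STATUS" = 'RUNNING'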


The step-by-step instructions for this sample are described here: https://blogs.sap.com/2021/11/04/sap-data-warehouse-cloud-data-integration-monitoring-running-task-o...

Create the SQL View for Reporting


Now let’s start with the creation of the view that can be consumed in an SAP Analytics Cloud story.

As preparation, we have to import the following tables:

  • TASK_LOGS_V_EXT

  • TASK_LOCKS_V_EXT

  • TASK_SCHEDULES_V_EXT


Open the Editor for SQL View creation and drag & drop the tables from the sources into the SQL Editor field:







You will then be prompted with the Import and Deploy dialog:







 

Navigate to your monitor space and create an SQL view with the following content. As a proposed naming convention, let the name end with V_R (it is a view on the reporting layer).

Settings:


Choose the following dimensions as measures:


The mappings with the SQL column names are:

  • DURATION -> Duration

  • ROW_COUNT -> Count

  • COUNT_RUNNING_CAL -> Count Running

  • COUNT_FAILED_CAL -> Count Failed

  • COUNT_COMPLETED_CAL -> Count Completed


And here is the SQL; you might want to check the comments for more details. Don't forget to verify upfront that the space is set up as a monitor space.
/* --------------------------------------------------------------------------------------------

Monitoring of DWC Data Integration Tasks (dataFlow, persistence and replication activities)

Link for documentation:
- Main page for Data Integration Monitoring
https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/4cbf7c7fc64645bfa3643328275...
- Technical description of TASK_LOG view
https://help.sap.com/viewer/9f804b8efa8043539289f42f372c4862/cloud/en-US/4ab45090c5684ebf8765757a1df...

This view exposes:
- task properties like duration and execution status (e.g. failed, completed, ...).
- various measures for counting tasks, e.g. by failed status
- the cron description of the schedule defined
- locking status

Todo before usage:
- Check that the space is enabled as a monitor space
- Change the URL_HOST constant to your DWC instance
------------------------------------------------------------------------------------------ */

SELECT

base."TASK_LOG_ID",

base."SPACE_ID",
base."APPLICATION_ID",
base."OBJECT_ID",
base."ACTIVITY",

base."STATUS",

base."START_TIME",
to_date(base."START_DATE") "START_DATE",
base."END_TIME",

-- manual or scheduled execution on behalf of
base."TRIGGER_TYPE",
base."TRIGGERED_BY",

-- enable jump from SAC report to DWC detailed view
'<my-dwc-system>.hcs.cloud.sap' URL_HOST,
concat(concat('/dwaas-ui/index.html#/dataintegration&/di/'
, base."SPACE_ID"), base."MONITOR_TYPE") URL_PATH,

-- at least one run on the same day has a failure
failed."HAS_FAILURE" OBJECT_ID_WITH_FAILURE,

-- the recurrence of the task in cron notation
"TASK_SCHEDULES_V_EXT".CRON,

to_int(base."DURATION") "DURATION",
to_int(base."ROW_COUNT") "ROW_COUNT",
to_int(base."COUNT_RUNNING_CAL") "COUNT_RUNNING_CAL",
to_int(base."COUNT_FAILED_CAL") "COUNT_FAILED_CAL",
to_int(base."COUNT_COMPLETED_CAL") "COUNT_COMPLETED_CAL",

CASE WHEN "TASK_LOCKS_V_EXT"."TASK_LOG_ID" IS NULL
then to_int(0)
ELSE to_int(1)
END "ACTIVE_LOCK"

FROM (
SELECT "TASK_LOG_ID",
"SPACE_ID",
"APPLICATION_ID",
CASE WHEN "APPLICATION_ID" = 'VIEWS' then '/viewMonitor/'
WHEN "APPLICATION_ID" = 'REMOTE_TABLES' then '/remoteTableMonitor/'
WHEN "APPLICATION_ID" = 'DATA_FLOWS' then '/dataFlowMonitor/'
ELSE "APPLICATION_ID" END "MONITOR_TYPE",
"OBJECT_ID",
"ACTIVITY",
"STATUS",
"START_TIME",
"START_DATE",
"END_TIME",
"TRIGGER_TYPE",
"TRIGGERED_BY",
"DURATION",
to_int(1) "ROW_COUNT",
CASE WHEN "STATUS" = 'RUNNING' then to_int(1) ELSE to_int(0) END "COUNT_RUNNING_CAL",
CASE WHEN "STATUS" = 'FAILED' then to_int(1) ELSE to_int(0) END "COUNT_FAILED_CAL",
CASE WHEN "STATUS" = 'COMPLETED' then to_int(1) ELSE to_int(0) END "COUNT_COMPLETED_CAL"
FROM "TASK_LOGS_V_EXT"
) base
-- add locking information
LEFT OUTER JOIN "TASK_LOCKS_V_EXT"
ON base."TASK_LOG_ID" = "TASK_LOCKS_V_EXT"."TASK_LOG_ID"

-- add failed run during same day property
LEFT OUTER JOIN (
SELECT DISTINCT 'X' HAS_FAILURE,
"START_DATE",
"SPACE_ID",
"APPLICATION_ID",
"OBJECT_ID",
"ACTIVITY"
FROM "TASK_LOGS_V_EXT"
WHERE "STATUS" = 'FAILED'
) failed ON
base."START_DATE" = failed."START_DATE"
-- include SPACE_ID so failures of same-named objects in other spaces do not match
AND base."SPACE_ID" = failed."SPACE_ID"
AND base."APPLICATION_ID" = failed."APPLICATION_ID"
AND base."OBJECT_ID" = failed."OBJECT_ID"
AND base."ACTIVITY" = failed."ACTIVITY"

-- add cron information
INNER JOIN "TASK_SCHEDULES_V_EXT" ON
base."SPACE_ID" = "TASK_SCHEDULES_V_EXT"."SPACE_ID"
AND base."APPLICATION_ID" = "TASK_SCHEDULES_V_EXT"."APPLICATION_ID"
AND base."OBJECT_ID" = "TASK_SCHEDULES_V_EXT"."OBJECT_ID"
AND base."ACTIVITY" = "TASK_SCHEDULES_V_EXT"."ACTIVITY"
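
After deploying the view, a quick sanity check could look like the following; TASK_LOGS_V_R is again only an assumed name for the view you just created:

-- Latest 20 task log entries with status and lock information
SELECT TOP 20
       "START_TIME", "SPACE_ID", "OBJECT_ID", "ACTIVITY", "STATUS", "DURATION", "ACTIVE_LOCK"
FROM "TASK_LOGS_V_R"
ORDER BY "START_TIME" DESC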

Create an SAP Analytics Cloud Story for the Failing Runs



  1. Create a blank story in your SAC system

  2. Insert your DWC analytical dataset as data model

  3. Insert a table and configure the point-of-view as outlined below


The following measures are needed:

  • Failed

  • Completed


Create Measure 'Failed':


Use the measure COUNT_FAILED_CAL from the SQL view and set the description to Failed. You can also use the Count Failed measure directly, but a shorter name gives a more compact view.

Create Measure 'Completed':



Important: Select “Enable Constant Selection” so that the correct values are displayed.

Configure table: Set the drill state and filters according to the screenshot below:




Hyperlink to Open DWC Monitoring Page


If you would like to enable a jump from your story to the integration monitor, you can add a hyperlink:

How to do it: choose the Add -> Hyperlink option and specify the URL as shown below:
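
For orientation, the URL_HOST and URL_PATH columns from the SQL view concatenate to a deep link of the following shape (the remote table monitor is taken as an example; <SPACE_ID> is filled from the respective row):

<my-dwc-system>.hcs.cloud.sap/dwaas-ui/index.html#/dataintegration&/di/<SPACE_ID>/remoteTableMonitor/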


 

Conclusion


The new task-related views are a great addition to the existing monitoring around SAP Data Warehouse Cloud Data Integration.

Take this sample and tailor and enrich it to your daily needs.

I’d be happy to hear your thoughts, ideas and comments on this monitoring topic. Let me know in the comments!
6 Comments
olaf_fischer
Please check out the blog post https://blogs.sap.com/2021/11/04/sap-data-warehouse-cloud-data-integration-monitoring-running-task-o... - it continues the series with sample monitoring content.
dasari_reddy_ext
Hi olaf.fischer

 

Thank you for this detailed blog.

I have a query related to this: when we replicate new data in the DWC Integration Monitor, does the old data get deleted or is it still stored in disk storage?

 

Regards,
olaf_fischer
Hi Reddi, from what I have seen so far, the data will be removed once the data import has been finished with success. Best regards, Olaf
MKlare
Hi olaf.fischer ,

the newly introduced task chains are not represented in the monitoring views. It seems the filter for the application ID must be extended with the value 'TASK_CHAINS', e.g. in the view TASK_LOG_MESSAGES_V_EXT.

Is there any plan to enhance the current monitoring views?

Best regards,
Mathias
olaf_fischer
Hi Mathias,

I have just checked the documentation - and it doesn't list the task chains. Unfortunately I am not aware of any enhancements in that direction.

Sorry for not having positive news for you.

Best regards, Olaf
paulvatter
Hi olaf.fischer

how can this request be positioned on the DWC roadmap? For me the requirement to have the task chains included in the monitoring is very valuable.

Thanks and best regards
Paul