Phillip_P
Product and Topic Expert
In a previous series we looked at how to configure connectivity between S/4HANA Cloud and SAP Data Intelligence. In this blog post we'll look at how to configure connectivity between SAP S/4HANA AnyPremise and SAP Data Intelligence. britta.jochum and tobias.koebler have written in detail about SAP Data Intelligence ABAP Integration.

Table of Contents

Prerequisites
  • An explanation of the steps we'll follow, the systems you need and some caveats

Connection Management on Data Intelligence
  • Configuration required in Data Intelligence to connect to S/4HANA On-Premise

SAP S/4HANA Security Settings
  • Security steps, required notes and user management on S/4HANA for connectivity

SAP S/4HANA Whitelisting
  • Required whitelisting steps to allow metadata browsing and data extraction

Important Notes
  • Notes used during the configuration for required steps or additional info

Connection Test, Metadata Explorer and Data Extraction
  • Test your configuration in Data Intelligence and run a first pipeline for data extraction
 

Prerequisites


We'll use Data Intelligence, Cloud Edition, but the same scenario works with Data Intelligence On-Premise. For S/4HANA, we'll use the 1909 FPS01 Fully Activated Appliance available from SAP CAL. I would advise against using the plain S/4HANA 1909 instance available on SAP CAL: it comes with no pre-configuration, only standard clients and no additional users other than SAP* and DDIC, and for Data Intelligence connectivity I found that some required SICF entries were missing. You'd have to perform a lot of additional steps to get the scenario working with that instance, including applying the required TCI note 2873666. Likewise, with the SAP S/4HANA 1909 FPS00 Fully Activated Appliance, the connectivity to Data Intelligence does work, but a number of notes are required to get to that point, including the above-mentioned TCI note.

The S/4HANA 1909 FPS01 instance on CAL does not come SSL-enabled. In order to use WebSocket RFC connections to Data Intelligence you'll need to first SSL-enable your CAL instance and have valid certificates for ports 44300 and 44301. There are a number of guides available for this, and you can use freely available certificates from Let's Encrypt; these may need to be renewed every three months, but for test and demo purposes this will suffice. More details about SSL-enabling your S/4HANA CAL instance are available here.
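Before importing anything into Data Intelligence, you may want to confirm that both secure ports actually present valid certificates and check when they expire (useful with Let's Encrypt and its three-month renewal cycle). The following is a minimal sketch in Python 3, assuming a hypothetical hostname that you should replace with your own FQDN; it simply performs a TLS handshake against ports 44300 and 44301 and prints the certificate expiry date.

import datetime
import socket
import ssl

HOST = "s4hana1909.mydomain.com"  # hypothetical FQDN, replace with your own

for port in (44300, 44301):
    # create_default_context() verifies the certificate chain against the OS trust store,
    # so the handshake fails if the certificate is invalid or untrusted.
    context = ssl.create_default_context()
    with socket.create_connection((HOST, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
            # "notAfter" has the form "Mar 15 12:00:00 2025 GMT"
            expires = datetime.datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
            print(f"Port {port}: certificate OK, expires {expires:%Y-%m-%d}")

If the handshake raises a certificate verification error, fix the certificate setup on the S/4HANA side before you continue with the Data Intelligence configuration.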

The scenario described here is for test and demo purposes. For production scenarios, you should pay closer attention to the provided security mechanisms to protect the systems and data involved. For metadata explorer and data extraction there are fine-grained options that allow you to restrict or allow access to specific CDS Views in the S/4HANA system; Note 2831756 describes some of these details. Also pay attention to your SAP kernel release version: if you use SAP Kernel 7.77 at patch level 100 or lower, you may encounter ABAP dumps and no data being transferred when you run a pipeline. SAP Note 2898937 has more details.

 

Connection Management on Data Intelligence


In Data Intelligence Connection Management, select the Certificates tab and import the SSL certificate from your S/4HANA system. You get the certificate by exporting it from the secure port, for example 44300, when accessing it via a browser. You could go to the Fiori Launchpad, for example, and export the certificate from there. More details about exporting the certificate are available in Note 2849542.
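If you prefer not to use a browser, a small Python 3 sketch like the one below can fetch the certificate presented on port 44300 and write it to a PEM file that you can then import on the Certificates tab. The hostname is a placeholder; also note that ssl.get_server_certificate returns only the certificate the server presents, so if your setup needs the full chain, export it via the browser as described in Note 2849542.

import ssl

HOST = "s4hana1909.mydomain.com"  # hypothetical FQDN, replace with your own
PORT = 44300

# Fetch the server certificate in PEM format and save it for import into
# Data Intelligence Connection Management (Certificates tab).
pem = ssl.get_server_certificate((HOST, PORT))
with open("s4h_server_cert.pem", "w") as f:
    f.write(pem)
print("Saved certificate to s4h_server_cert.pem")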


Next, go to the Connections tab and create a new WebSocket RFC connection by selecting the + icon.


Fill out the required details of the connection.

  • Connection Type: ABAP
  • Protocol: WebSocket RFC
  • SID: S4H if you use CAL
  • Port: usually 44300 or 44301, but this depends on your configuration
  • Hostname: the FQDN of your host, such as s4hana1909.mydomain.com
  • Client: 100 if you use CAL
  • Username: S4H_DI (we'll create this user in the next step on S/4HANA)
  • Password: Welcome01 (we'll set this in the next step on S/4HANA)



 

Save the connection and let's move on to the required configuration in S/4HANA.

SAP S/4HANA Security Settings


Log into client 100 on the S/4HANA CAL system using SAP GUI and go to transaction SU01. Create a new technical user: enter the username S4H_DI, set the Alias to the same value and select the user type Technical User.


Enter the required info on the Logon Data tab (Alias and User Type) and set the same password you used in Data Intelligence Connection Management.


 

On the Profiles tab, I added SAP_ALL and SAP_NEW, but these were just additional; the main requirement is the role we will upload into PFCG and assign to the S4H_DI user.

Next, let's go to SAP Note 2831756 and download the required role SAP_DH_ABAP_MASTER. In transaction PFCG go to Role-->Upload and select the role you just downloaded from the note.

In the Authorisations tab, save and generate the role.


In the User tab, add the S4H_DI user we created and save this. You'll need to do a User Comparison here to get to the green status. Once it looks as below, your user has the required role and authorisations for metadata explorer and data extraction in Data Intelligence.


If you go back to SU01 and view the Roles and Profiles tabs for your S4H_DI user they should look as follows.


 


 

SAP S/4HANA Whitelisting


The Security Settings Note 2831756 provides detailed steps for whitelisting objects to be allowed for metadata explorer and data extraction in a pipeline. If you want to whitelist at a detailed level you can do so. For the purpose of this blog we'll follow the same steps outlined in the note, but we'll allow all eligible objects to be viewed and used for data extraction. As is the case with S/4HANA Cloud, mentioned in the previous posts, there are restrictions around which CDS Views are eligible for data extraction. Data Preview in S/4HANA On-Premise is allowed though, and that is what the whitelisting will cater for.

Okay, let's go to transaction SM30, enter DHBAS_WL_OBJ_V as the view and click Display.


In the popup, select F4 on the whitelist scenario input. For each of these whitelist scenarios we will need to complete the whitelist to allow the relevant objects.


Start with MD Browse: double-click it and then select Continue.


Make sure you are in change mode to edit the scenario and select New Entries (F5) for the six Dataset Object Types. For each whitelist scenario, we need to assign Include values, Options and a Value for each Dataset Object Type as shown below. Save the scenario.


Navigate back to the screen where you can select the whitelist scenario and do the same as above for the MD_PREVIEW scenario. Complete the required table, save, and then do the same for OP_READ and OP_USE. Once these are complete you can move on to the next step of whitelisting.

Go to transaction UCONCOCKPIT which is the Unified Connectivity Cockpit. Here we can complete the whitelisting for the WebSocket RFC scenario, adding in the required function modules that are used during metadata browsing and data extraction.

In the scenario dropdown, select the WebSocket RFC Scenario option to proceed.


Whilst in the WebSocket RFC scenario in the UCON cockpit, make sure you are in change mode, select Insert Row in the toolbar at the top and add the required function modules as below. Use the Check, Save and Execute Selection buttons to complete your configuration. Your config should look as follows.




Important Notes


A number of useful and required notes are available for this scenario. Some of the ones I referenced are linked throughout this post, including 2873666, 2831756, 2849542 and 2898937.

Be sure to check for notes that might need to be applied to your system. In general, look out for notes under component EIM-DH-ABA.

Connection Test, Metadata Explorer and Data Extraction


Let's go back to Data Intelligence. In Connection Management, under the Connections tab, select the three dots at the end of your WRFC connection entry and select Check Status.


You should have a successful test as follows.


Open up the metadata explorer and let's browse a CDS View and do a data preview.

In metadata explorer, select the connection you've created to S/4HANA and then select CDS Views.


 

As we enabled all eligible CDS Views for metadata browsing and extraction, you should see a large number of CDS Views from various functional areas and be able to perform a data preview on them if they have data.


In the Filter Items text box enter I_JOURNALENTRY and then navigate to /CDS/FI/GL/IS and select the CDS View I_JOURNALENTRY.


You can view the structure of the CDS View here, its rows and columns: the so-called Factsheet.



Select Data Preview to view the data and test whether the whitelisting was successful. This is where your choices in SM30 translate into fine-grained restrictions, if you chose to apply any.


If you'd like to test data extraction using a pipeline, navigate to the Data Intelligence Modeler tool and create a new, simple ad-hoc pipeline.

In the modeler, create a pipeline and add the following operators:

  • ABAP CDS Reader

  • ABAP Converter

  • Wiretap


The ABAP operators are required for working with ABAP integration scenarios. The CDS Reader is specific to CDS View scenarios and the ABAP Converter is used for both CDS View and SLT scenarios.

Your pipeline will look as follows:


The settings for the ABAP CDS Reader are as follows:


The settings for the ABAP Converter are as follows:


Run the pipeline and then open the Wiretap; it opens in a new tab and you should start to see the data flowing.

This pipeline is simple enough for a quick test of the configuration; the Wiretap will show if there is data moving between the systems. After this is successful, you could of course build on the pipeline and write the data to a file on the Data Lake or cloud storage. You should also add the scenario to the Machine Learning Scenario Manager for structure and repeatability, but for a test this will suffice.

You can of course also monitor the data flows between the systems from the S/4HANA side. The usual suspects like logs etc. apply, but you could also use transaction DHCDCMON, the CDS View Replication Monitor for Change Data Capture.


Side Note: The red highlight in the ABAP CDS Reader screenshot above is where the documentation is accessible; for these operators there is quite detailed documentation now. It also mentions a CDS View you can use to find other CDS Views eligible for data extraction scenarios: DHCDC_AUTH_CdsExtrctnDHAPEFltr. That's so meta 🙂 Try this CDS View out in the Metadata Explorer for more info.

Remember you can also view available CDS Views in S/4HANA On-Premise using the Fiori app: https://<FQDN>:44300/sap/bc/ui2/flp#CDSView-browse. You'll still need to check which ones meet the data extraction criteria though.

Conclusion


Hopefully this step by step blog post has shown what's possible with SAP Data Intelligence and SAP S/4HANA On-Premise in terms of data integration.

A big thanks to all the colleagues that answered my many millions of questions while I figured all this out 🙂
15 Comments
inayah01
Discoverer
0 Kudos
Hi Phillip,

Thank you for this blog which has detailed steps to establish a connection between SAP DI and SAP S/4HANA. My question is: can we follow the same steps for connecting an on-premise HANA DB instance to SAP DI (On-Cloud)? Thank you
Phillip_P
Product and Topic Expert
0 Kudos
Hi Nazia,

For HANA DB on-prem to DI Cloud you could use the SAP Cloud Connector.

More details available in section 2.2 of the following guide and note 2485925.

Hope this helps.
0 Kudos
Hi Philip,

The guide (section 4.5) says that only the HANA_DB connection is possible through the cloud gateway. Can we write data to the HANA database using the HANA_DB connection type? Is there an alternative way to write the processed data from DI cloud to the HANA DB on premise?

Thanks

Nishant Kathuria

 
PhilMad
Participant
0 Kudos
Hello Philip,

 

Many thanks for this effort. I was checking transaction DHCDCMON and I observe that the field "Observer" is green in your screenshot. I have a red light there. I assume that it can likely be set to green by corresponding entries in transaction DHCDCSTG. While there is only a little documentation available for the monitor transaction, there is none at all for the control transaction, either in documentation, notes or blogs; at least I couldn't find anything. Do you have a pointer here for me?

Many thanks and kind regards, Philipp
Phillip_P
Product and Topic Expert
Hi Philipp,

I only used transaction DHCDCMON and didn't set up the jobs manually; this might be because I'm using the fully activated CAL appliance.

About the other questions, I asked a colleague from development and this was the response.

For job scheduling, we make use of the technical job framework in S/4HANA. Note 2190119 has more information, in case of any issues when scheduling jobs. In DHCDCSTG you only have the option to change the period (in minutes) after which the observer is rescheduled or you can activate some statistics. The statistics can be helpful to investigate any performance issues with this job.

The observer job (/1DH/OBSERVE_LOGTAB) is scheduled after the first subscription to a CDS view, once the corresponding database triggers and logging tables have been created. The observer always runs for an hour and then reschedules itself.

As soon as the observer job is copying logging table records and there are ODP subscribers, the observer job would schedule the transfer job (/1DH/PUSH_CDS_DELTA). If the jobs are shown as red in DHCDCMON it means that there are activities available for the respective jobs but they are neither running nor scheduled. You can take a look at SM37 to see if these jobs ran into issues. You can also select the button “Dispatcher Job”  in DHCDCMON which would schedule all required and missing jobs.

Unfortunately there is no official guide available currently. I hope this helps with your issue, if not please let me know.

Regards

Phillip
former_member598107
Participant
0 Kudos
Hi pparkinson

This is great content!!

In this blog post you have mentioned data extraction using CDS Views. Can we extract data from an SAP ECC system, do some transformation, and then load it into S/4HANA on-premise?

Please share your insights on this

Regards

Arun Sasi

 
Phillip_P
Product and Topic Expert
0 Kudos
Hi Arun,

I'm glad the post was useful for you.

Technically it's feasible; depending on the use case you could use SLT via DI or consider using the Migration Cockpit.

It would also be good to ask in the Q and A section here for DI what the official guidance is around your particular use case.

Regards

Phillip
former_member598107
Participant
Thanks for the reply Phillip!! I have posted the question in Q&A. I know there is a new feature called Direct Transfer in S/4HANA and we can also leverage the SAP Data Migration Cockpit for migration of standard data objects, so I believe that DI can extract the data into a staging layer and then it can be consumed into a target S/4HANA structure using SAP DMC.

Not sure how we can post through IDOC in SAP DI. This is something I cannot visualize.

Regards

Arun Sasi
Gurewitsch_IT
Explorer
0 Kudos

... Well it works fine if you run DI in BTP/SAP Cloud (using Cloud Connector).

Which approach would be feasible to connect DI on some AWS/Azure/GCP with on-premise sources?

"VPN" is not elegant having different source systems running in the different networks 🙁

Expose "backend" using BTP as a "proxy"? Should work, right?

Do you know another approach without BTP?

Thank you in advance!

Cheers.

Dimitri.

 

0 Kudos

Hi pparkinson, do you need SAP_ALL and SAP_NEW at all? If I only add the role SAP_DH_ABAP_MASTER and then whitelist the CDS Views I want to access, should this be enough?

aghoshal
Explorer
0 Kudos
Hello Philip, thank you for the detailed explanation.

As mentioned at the end, we can similarly set up a connection from DI to the HANA Data Lake; can you please suggest the approach or any blogs to achieve the same?

Thanks

Aporup
rajeshps
Participant
0 Kudos
pparkinson

Gday! Requesting you to please check on the below Question and revert

https://answers.sap.com/questions/13795123/call-abap-proxy-from-sap-di.html

Thank you in advance! I appreciate your valuable inputs on the above question.
amish1980_95
Participant
0 Kudos
Hi pparkinson

Thank you for the details.

I have followed all the steps, but while testing the connection status in SAP Data Intelligence Connection Management I am getting a "connection not opened" error.

The certificate I downloaded from the S/4HANA Fiori server link is invalid, hence it is not getting uploaded in SAP DI.

The S/4HANA Fiori server link was provided in the bootstrap file (SAP CAL).

I don't see a way out of this, could you please suggest?

Thanks,

Indu Khurana.
MKreitlein
Active Contributor
0 Kudos
Hello pparkinson

I found your blog comment today by googling OBSERVE_LOGTAB.

Do you know if there is any official documentation in the meanwhile?

I would have expected to find something about it in the SAP Datasphere help pages, but could not... it seems this topic is excluded completely to this day.

I only found this note: https://launchpad.support.sap.com/#/notes/2190119 but this is more general than specific.

Thanks and best regards,

Martin
albertosimeoni
Participant
0 Kudos

Hello Phillip,

I would like to enrich the topic of the CDS mechanism. As SAP Datasphere / DWC makes use of it, there is tight integration between ODP CDC and the replication capabilities in DWC (remote table real-time replication and replication flows).

I was able to track some of what is done at the database level (S/4 side) by these mechanisms, but many things remain obscure.

  1. For every table in every extractor it creates 4 triggers (insert, update non-key column, update key column, delete) plus 4 sequences.
  2. We are not able to test the heaviness of this mechanism: we do not want to deploy into a customer system a solution that could potentially slow down the ERP, and if we test on our own instances we do not have the OLTP workload to see whether the solution slows down the system, so I was not able to track completely what's going on behind the scenes.

Do you have some practical experience with CDC deltas? For example, if we set up half of the older 2LIS extractors (using CDS Views) plus MATDOC + ACDOCA, will this run easily or will it slow down the ERP?

Another consideration is that for Datasphere, remote table replication with "Real-Time replication" enabled issues a request to the ERP every 15 seconds (that does not seem feasible)!!
Now with replication flows this interval changes to 1 hour.

Another idea of mine was to test whether CDC can capture data to reconstruct CDS Views with joins after aggregations (with logic like this):

Select A.*, B.ACT_DELIV_QTY
from VBAP A
left join ( select KDAUF, KDPOS, SUM(LFIMG) as ACT_DELIV_QTY
            from LIPS
            group by KDAUF, KDPOS ) B
on B.KDAUF = A.VBELN
and B.KDPOS = A.POSNR

I expect that LIPS will be logged (triggers INS, UPD, DEL with keys VBELN, POSNR), but that the final CDS View cannot be reconstructed, as VBELN and POSNR from LIPS are not projected in the final CDS View (the aggregation is on KDAUF, KDPOS; the VBELN/POSNR detail from LIPS cannot be projected to the final CDS View as it would change the aggregation level of the output, which is the VBAP keys).

If my guesses are right, this problem will be an obstacle to having CDC delta extractors for standard compatibility CDS Views (the ones used for the ECC -> S/4 migration):

MBV_MBEW -> Compatibility CDS View to MBEW table.

NSDM_V_MARDH -> MARDH

Many columns of these compatibility views are calculated on the fly based on aggregations of transactional tables (ACDOCA_ML_EXTRACT, MATDOC_EXTRACT).

These compatibility views are used extensively in S/4, and for end customers who work in ABAP rather than at the DB layer, "these are ERP tables".

Do you have some example of this "join after aggregation" that works?

Best

Regards,

Alberto