Thomas_Cayzerg
Product and Topic Expert

Introduction

Building and executing a complete end-to-end flow for eXtended Planning & Analysis (xP&A) purposes can be painful for business users who need to trigger the integration of their own datasets without assistance from the IT department.

Since input data usually has to be transformed and harmonized before being used as planning drivers or as base data for calculations, this data preparation step is often placed outside the responsibility of the business department, especially when robust controls and quality checks have to be performed.

The classical IT integration approach for getting the appropriate data into SAP Analytics Cloud relies on a batch or scheduled flow that consists of:

  • Extracting a global dataset from the source operational system; delta options are always tricky to manage.
  • Preparing the extracted data with a dedicated middleware, applying transformation rules provided by the business.
  • Importing the data into SAP Analytics Cloud models at a frequency that may not match the business calendar.

In this blog, we are going to present how you can set up a complete integration flow in which business users are autonomous for execution and transformations, while full control over the integrated data is retained.

 

Description of the business scenario

Let’s try to make it simple, but realistic.

Financial controllers in charge of a scope of reporting units must usually collect their key figures from source systems on a regular basis (quarter, month, week, …).

For simplification, the source extraction has already been done and each controller can access the results as flat files organized in a shared and secured file repository (SFTP, SharePoint, etc.).

Here are the different steps of this classical reporting workflow:

  • The business user requests data integration from their SAC Planning story through a multi-action trigger button.
  • Contextualized data is extracted from the shared repository in alignment with the business user's data access rights.
  • The source data is injected into an SAP Datasphere staging table, where specific transformations or validations are managed before import.
  • The ready-to-import data is exposed through an analytical view to the SAP Analytics Cloud Planning import job.
  • The SAC Planning import job executes the data integration into the target model.
  • The business user can verify the newly integrated data.

Let’s now define the different elements needed to make this scenario work properly.

 

Set up a shared file repository with Google Cloud Platform

We are going to create a new file repository that will be securely accessible by our middleware solution through SFTP (SSH File Transfer Protocol).

Navigate to the Google Cloud console (https://console.cloud.google.com/) and create a new project from the welcome screen.

tcayzergues_1-1710856619780.png

 

Give an explicit name to your project and click the Create button.

tcayzergues_2-1710856619782.png

 

Navigate to your newly created project from the console list box:

tcayzergues_3-1710856619783.png

 

 

Enable the Compute Engine API in your new project by clicking Compute Engine, then the Enable button.

 

Create a new VM instance with the Create Instance button:

tcayzergues_4-1710856619789.png

 

 

Define the characteristics of your new server; for file sharing, a low-cost instance should be enough, at least for development purposes:

tcayzergues_5-1710856619803.png

 

 

 

Ensure that the operating system is Debian GNU/Linux and that inbound HTTPS traffic is allowed:

tcayzergues_6-1710856619807.png
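
If you prefer the command line, a similar instance can also be created with the gcloud CLI. This is only a sketch; the instance name, zone and machine type below are illustrative assumptions:

gcloud compute instances create sftp-server \
    --zone=europe-west1-b \
    --machine-type=e2-micro \
    --image-family=debian-12 \
    --image-project=debian-cloud \
    --tags=https-server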

 

 

Click on the Create button to initiate the server creation.

Once the server is instantiated, it will be visible in the console with an external IP address to connect to:

tcayzergues_7-1710856619811.png

 

Connect to your instance by clicking the SSH button:

tcayzergues_8-1710856619815.png

 

 

Confirm the authorization request so the browser can connect over SSH:

tcayzergues_9-1710856619817.png

 

 

You are now connected to your server through the SSH console:

tcayzergues_10-1710856619820.png

 

 

To connect to your server through SFTP, it is highly recommended to add a dedicated user.

Add a new sftp group:

tcayzergues_11-1710856619821.png

Create a directory for the SFTP backend user:

tcayzergues_12-1710856619821.png

Add the sftp_user to the sftp group:

tcayzergues_13-1710856619822.png

Define the password for the new sftp_user:

tcayzergues_14-1710856619823.png

Change the ownership of the directory /home/sftp_user:

tcayzergues_15-1710856619823.png

Change the access mode to the directory /home/sftp_user:

tcayzergues_16-1710856619824.png

Now modify the standard SSH behaviour to authorize password authentication. Edit the file /etc/ssh/sshd_config in a text editor like vi and set the option PasswordAuthentication to yes:

tcayzergues_17-1710856619824.png

 

tcayzergues_18-1710856619829.png

Save the file with the vi command: Esc, then :wq.

Restart the SSH daemon:

tcayzergues_19-1710856619829.png
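
For reference, the individual screenshots above correspond to standard Debian commands. Here is a consolidated sketch, assuming the user is named sftp_user and is created with useradd (the 755 access mode is an assumption; adjust it to your own security requirements):

# Create a dedicated group, user and home directory for SFTP access
sudo groupadd sftp
sudo mkdir -p /home/sftp_user
sudo useradd -d /home/sftp_user sftp_user
sudo usermod -aG sftp sftp_user
sudo passwd sftp_user
# Give the user ownership of its directory and set the access mode
sudo chown sftp_user:sftp /home/sftp_user
sudo chmod 755 /home/sftp_user
# Allow password authentication, then restart the SSH daemon
sudo vi /etc/ssh/sshd_config     # set: PasswordAuthentication yes
sudo systemctl restart ssh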

 

Test the SFTP connection with an SFTP client:

tcayzergues_20-1710856619832.png

 

Ensure that you can create a new folder at the root level of the SFTP directory:

tcayzergues_21-1710856619834.png

You can upload any file to the SFTP repository.
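
The connection can also be verified from the command line with any SFTP client; the IP address, folder and file names below are placeholders:

sftp sftp_user@<external-ip>
sftp> mkdir upload
sftp> cd upload
sftp> put actuals_2024_Q1.csv
sftp> ls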

 

Create the staging data layer with SAP Datasphere

Data upload to SAP Datasphere will be performed through Open SQL access to the underlying SAP HANA Cloud layer of the instance.

 In SAP Datasphere, you need to create a new space or select an existing one. Then open the configuration of the space from the Edit button:

tcayzergues_22-1710856619838.png

 

 

Click on the Database Access tab:

tcayzergues_23-1710856619841.png

 

 

Create a new database user by clicking the Create button:

tcayzergues_24-1710856619843.png

 

Set username suffix and enable SQL read & write accesses by clicking the checkboxes:

tcayzergues_25-1710856619849.png

 

The new Open SQL user has been created but is not yet deployed:

tcayzergues_26-1710856619850.png

 

 Click on the space Deploy button to save and deploy the new Open SQL user:

tcayzergues_27-1710856619853.png

 

The status of the space deployment is displayed by SAP Datasphere:

tcayzergues_28-1710856619854.png

 

Click on the Info symbol:

tcayzergues_29-1710856619855.png

 

 Click on Request New Password button:

tcayzergues_30-1710856619857.png

 

Copy the newly generated password for the user and close the dialog box:

tcayzergues_31-1710856619858.png

 

Click on the newly created user in the list and connect to the schema with the SAP HANA Database Explorer:

tcayzergues_32-1710856619860.png

 

Ensure that “Authenticate using username and password” is selected, paste the new password from the clipboard, and select the checkbox to save the credentials:

tcayzergues_33-1710856619863.png

You are now connected to the SAP HANA Cloud instance of SAP Datasphere through Open SQL:

tcayzergues_34-1710856619865.png

 

Expand the Catalog in the left panel, right-click on Tables and click on the Open SQL Console menu entry:

tcayzergues_35-1710856619869.png

 

Create a new SQL table by typing the DDL statement into the console and execute it with F8:

tcayzergues_36-1710856619872.png
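
The DDL statement itself is only shown in the screenshot, so here is a sketch. It assumes the staging table is named TBL_SFTP_STAGING (as referenced later in the Integration Flow) and contains the fields used throughout this blog; adjust the column list and data types to your own source files:

CREATE COLUMN TABLE "TBL_SFTP_STAGING" (
    "VERSION"   NVARCHAR(50),
    "DATE"      NVARCHAR(10),
    "CONSUNIT"  NVARCHAR(50),
    "LC_AMOUNT" DECIMAL(23,7)
);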

 

 

The new staging table has been created and is now available for data upload:

tcayzergues_37-1710856619874.png

 

 

Build a REST API with SAP Cloud Integration

 Connect to your SAP Integration Suite tenant and click on Integrations and APIs:

tcayzergues_38-1710856619877.png

 

Create a new package and save it with the Save button:

tcayzergues_39-1710856619881.png

 

 Click on the Artifacts tab and add a new REST API artifact from the Add list box:

tcayzergues_40-1710856619885.png

 

Fill in the Name, ID and Description of the new artifact, then click the Add and Open in Editor button:

tcayzergues_41-1710856619889.png

 

A new Integration Flow has been created with the basic structure of your REST API:

tcayzergues_42-1710856619892.png

 

Click on the Edit button on the top right menu to modify the predefined flow:

tcayzergues_43-1710856619893.png

 

 Remove the Receiver box on the right by clicking on it then selecting the trash bin icon:

tcayzergues_44-1710856619894.png

 

 Double-click on the local sub-process Handle Request and rename it “Get SFTP Data”:

tcayzergues_45-1710856619897.png

 

 

In the Integration Process, click on Handle Request box and rename it “Get SFTP Data”:

tcayzergues_46-1710856619900.png

 

 Select the right arrow after Get SFTP Data and add a new step of type Process Call by scrolling down the list:

tcayzergues_47-1710856619902.png

 

Adjust the size of the canvas by extending it as needed.

Add a new local integration process by choosing it from the top bar menu:

tcayzergues_48-1710856619904.png

 

 

Position the new local process box to the right bottom of the page:

tcayzergues_49-1710856619908.png

 

Rename the new local process “Post Data to Datasphere”:

tcayzergues_50-1710856619910.png

 

 Click on the “Process Call 1” box and rename it “Post to Datasphere”:

tcayzergues_51-1710856619914.png

 

 From the Process call menu, click the Process tab and hit Select button:

tcayzergues_52-1710856619915.png

 

Choose the Post Data to Datasphere local process from the list:

tcayzergues_53-1710856619917.png

 

In the Post Data to Datasphere local process, add a flow step of type Content Modifier:

tcayzergues_54-1710856619920.png

 

The Content Modifier step is now part of the Post Data to Datasphere local process:

tcayzergues_55-1710856619922.png

 

 

Click on the Save as version button from the top right menu:

tcayzergues_56-1710856619923.png

 

Fill in the version information and click the OK button:

tcayzergues_57-1710856619924.png

 

 

The execution context of the Integration Flow will be controlled through variables sent by SAP Analytics Cloud in JSON format. The request body will thus contain dynamic information that must be used during processing.

tcayzergues_58-1710856619924.png

SAP Cloud Integration mostly handles its messages in XML format, so we will transform the JSON body from the caller into an XML message.

Edit the Integration Flow and click on the arrow before the Get SFTP Data process call box to add a new JSON to XML Converter step:

tcayzergues_59-1710856619927.png

 

 

Rename the step JSON to XML Converter and go to the Processing tab to uncheck the Use Namespace Mapping option:

tcayzergues_60-1710856619929.png

 

We will take the values, now formatted as XML, and save them as message properties through a new Content Modifier step added after the JSON to XML Converter:

tcayzergues_61-1710856619931.png

 

 Name the new step Set Job Parameters:

tcayzergues_62-1710856619932.png

 

Go to the Exchange Property tab and add a new constant property named STAGING_TABLE with the value TBL_SFTP_STAGING:

tcayzergues_63-1710856619933.png

 

Add three additional properties named SAC_VERSION, SAC_DATE and SAC_CONSUNIT of type XPath with the below source values:

tcayzergues_64-1710856619937.png
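
The actual XPath expressions depend on the JSON body sent by the SAC multi-action (shown later in the Edit JSON step) and on the root element added by the JSON to XML Converter. As an illustration only, assuming keys named version, date and consunit wrapped under a root element, the three properties could look like this:

Name            Type     Source Value       Data Type
SAC_VERSION     XPath    //root/version     java.lang.String
SAC_DATE        XPath    //root/date        java.lang.String
SAC_CONSUNIT    XPath    //root/consunit    java.lang.String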

 

In the Get SFTP Data local process, add a new step of type Poll Enrich before the End event:

tcayzergues_65-1710856619940.png

 

Rename the step Get SFTP File:

tcayzergues_66-1710856619941.png

 

Add a new Sender participant (the external SFTP system to poll) from the top menu participant list:

tcayzergues_67-1710856619944.png

 

Rename it SFTP_Server:

tcayzergues_68-1710856619945.png

 

Link the SFTP_Server to the Poll and choose SFTP as communication protocol:

tcayzergues_69-1710856619947.png

 

Save the Integration Flow before going further.

To create the SFTP credentials, you need to go to the Monitor / Integrations and APIs menu from the left bar:

tcayzergues_70-1710856619948.png

 

Click the Security Material tile:

tcayzergues_71-1710856619949.png

 

Click on the Create button and choose User Credentials from the list:

tcayzergues_72-1710856619951.png

 

Enter the required fields and click on Deploy when finished:

tcayzergues_73-1710856619952.png

To ensure a secure dialog between the SFTP server and Cloud Integration, you must update the known_hosts config file. You can use the connectivity test to get the host key of the SFTP server.

In the security management tiles, click on Connectivity Tests:

Thomas_Cayzerg_0-1712579870887.png

Go to the SSH menu tab, fill in the fields with your SFTP server properties, and click the Send button:

Thomas_Cayzerg_1-1712579909746.png

Once the server answers, copy the host key with the Copy Host Key button:

Thomas_Cayzerg_2-1712579942543.png

Thomas_Cayzerg_3-1712579959950.png

From the Monitor menu, open the Security Material tile:

Thomas_Cayzerg_4-1712579977997.png

Download a copy of the known.hosts file:

Thomas_Cayzerg_5-1712579998070.png

Open the known.hosts file with a text editor and paste the content of the clipboard (Ctrl+V) at the end of the file as a new line:

Thomas_Cayzerg_6-1712580037062.png

Save and close the known.hosts file.

Now, upload the modified file in the security material list:

Thomas_Cayzerg_7-1712580071020.png

Select the modified file from your computer and click the Deploy button:

Thomas_Cayzerg_8-1712580094284.png

 

Reopen the Integration Flow designer, click on the SFTP channel arrow, and go to the Source tab.

Enter the connection information, navigate to the right directory, and build the file name from the dynamic parameters stored as properties:

tcayzergues_74-1710856619956.png
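
As an illustration only (the directory and file naming convention below are assumptions and must match how your extraction files are actually organized), the Source tab could combine the properties set earlier into a dynamic file name:

Address:          <external-ip>:22
Directory:        upload
File Name:        ${property.SAC_CONSUNIT}_${property.SAC_DATE}_${property.SAC_VERSION}.csv
Credential Name:  <name of the User Credentials deployed above>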

 

Click the Processing tab to set up the behaviour of the adapter once the file has been processed:

tcayzergues_75-1710856619958.png

 

Save the Integration Flow once you are done.

The file content read by the SFTP adapter is stored as CSV data in the message body of the Integration Flow. Translate it into XML format through a CSV to XML Converter step:

tcayzergues_76-1710856619961.png

 

You need to describe how the XML data should be represented after the transformation. Create an XSD document in a text editor, then upload it as a reference of your Integration Flow.

Content of the XSD schema file, to be created with a text editor:

tcayzergues_77-1710856619966.png
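
The XSD must match the structure of your CSV files. A minimal sketch for the illustrative columns used in this blog, with one Record element per CSV row wrapped under a root element (the element names are assumptions):

<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="root">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Record" maxOccurs="unbounded">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="VERSION" type="xs:string"/>
              <xs:element name="DATE" type="xs:string"/>
              <xs:element name="CONSUNIT" type="xs:string"/>
              <xs:element name="LC_AMOUNT" type="xs:decimal"/>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>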

 

Click on the global area of the Integration Flow, then navigate to the References tab and click the Add button to add a new XSD schema:

tcayzergues_78-1710856619967.png

Select the newly created file from your file system, then click Add:

tcayzergues_79-1710856619968.png

 

Select the CSV to XML step, navigate to Processing tab and set the different fields:

tcayzergues_80-1710856619972.png

 

Save the Integration Flow once done.

In the Post Data to Datasphere local process, rename the “Content Modifier 2” step to “Prepare DELETE” from the General tab:

tcayzergues_81-1710856619973.png

 

Add a new Exchange Property of type Expression with the following setup to persist the retrieved content in a property:

tcayzergues_82-1710856619975.png

 

In the Message Body tab, enter the following Expression:

<root>
    <Delete_Statement1>
        <dbTableName action="DELETE">
            <table>${property.STAGING_TABLE}</table>
            <access>
                <VERSION>${property.SAC_VERSION}</VERSION>
                <DATE>${property.SAC_DATE}</DATE>
                <CONSUNIT>${property.SAC_CONSUNIT}</CONSUNIT>
            </access>
        </dbTableName>
    </Delete_Statement1>
</root>

tcayzergues_83-1710856619979.png

 

This dynamic SQL DELETE statement clears existing data with the same execution scope from the staging table.

Select the arrow after the Prepare DELETE step and add a new step of type Request Reply:

tcayzergues_84-1710856619982.png

 

Name it Post DELETE in the general tab.

Add a new Receiver from the top bar menu and name it DELETE_Data:

tcayzergues_85-1710856619983.png

 

Link the Post DELETE to the DELETE_Data receiver by choosing a JDBC adapter type:

tcayzergues_86-1710856619985.png

 

A JDBC connection to the Open SQL schema of Datasphere must be set up before continuing with the Integration Flow.

Save your Integration Flow and navigate to Monitor / Integrations and APIs menu from the left bar:

tcayzergues_87-1710856619986.png

 

Click on the JDBC Material tile:

tcayzergues_88-1710856619987.png

 

Add a new JDBC data source by filling in the dialog fields to connect to the Open SQL schema of Datasphere, then click on Deploy to save:

tcayzergues_89-1710856619989.png
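
The JDBC URL points to the HANA Cloud endpoint of your Datasphere tenant (host and port are displayed in the space's Database Access info dialog). A hedged example of the values to enter, where the data source name is arbitrary:

Name:      JDBC_DSP_OPENSQL
JDBC URL:  jdbc:sap://<hana-cloud-host>:443/?encrypt=true&validateCertificate=true
User:      <SPACE>#<USERNAME_SUFFIX>
Password:  <password generated for the Open SQL user>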

When successfully deployed, the JDBC connection appears in the list:

tcayzergues_90-1710856619992.png

 

Go back to the Integration Flow in edit mode and navigate to the JDBC link connected to the Post DELETE step. Enter the connection properties in the Connection tab:

tcayzergues_91-1710856619994.png

 

Add a new flow step of type Groovy Script after Post DELETE to prepare the INSERT SQL statements that will write our data to the staging table, and name it Prepare INSERT:

tcayzergues_92-1710856619996.png

 

 

In the processing tab of the Prepare INSERT step, add a new Script File from the Create button:

tcayzergues_93-1710856619997.png

 

Replace the default content with the following script:

tcayzergues_94-1710856620003.png
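
The script itself is only shown in the screenshot, so the following is an illustrative sketch rather than the author's exact code. It assumes the XML body produced by the CSV to XML Converter uses the root/Record element names from the XSD sketch above, and it builds a JDBC XML message with one access block per row, mirroring the structure of the DELETE statement:

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.xml.MarkupBuilder

def Message processData(Message message) {
    def table = message.getProperty("STAGING_TABLE")
    def rows  = new XmlSlurper().parseText(message.getBody(java.lang.String) as String)

    def writer = new StringWriter()
    def xml = new MarkupBuilder(writer)
    // Build the JDBC adapter payload: one INSERT statement, one <access> block per CSV row
    xml.root {
        Insert_Statement1 {
            dbTableName(action: "INSERT") {
                'table'(table)
                rows.Record.each { rec ->
                    access {
                        VERSION(rec.VERSION.text())
                        DATE(rec.DATE.text())
                        CONSUNIT(rec.CONSUNIT.text())
                        LC_AMOUNT(rec.LC_AMOUNT.text())
                    }
                }
            }
        }
    }
    message.setBody(writer.toString())
    return message
}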

 

Add a new Request Reply step after the Prepare INSERT and name it Post INSERT:

tcayzergues_95-1710856620006.png

 

 

Add a new receiver from the top bar menu, name it INSERT_Data:

tcayzergues_96-1710856620007.png

 

Link Post INSERT to the receiver INSERT_Data through a JDBC adapter:

tcayzergues_97-1710856620010.png

 

Configure the JDBC link with the JDBC Open SQL alias and the Batch Mode option checked:

tcayzergues_98-1710856620013.png

 

The setup of the Integration Flow is now complete; save it as the initial version:

tcayzergues_99-1710856620016.png

 

And deploy it:

tcayzergues_0-1710858012639.png

tcayzergues_1-1710858021246.png

 

Once the deployment is done, navigate to Monitor / Integrations and APIs left bar menu:

tcayzergues_2-1710858036442.png

 

Click on Manage Integration Content / All Started tile:

tcayzergues_3-1710858083609.png

Click on your SFTP_Integration API from the list and copy the URL of the service with the dedicated button:

tcayzergues_4-1710858097294.png
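
Before moving to SAP Analytics Cloud, the endpoint can be smoke-tested, for example with curl. This sketch assumes you have already obtained an OAuth access token from the service key of your Integration Suite instance, and the JSON keys are the illustrative ones used for the XPath properties earlier:

# ACCESS_TOKEN: OAuth token obtained with the client credentials from the service key
curl -X POST "<endpoint URL copied above>" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"version": "Actual", "date": "202403", "consunit": "CU1000"}'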

 

Create a REST API connection in SAP Analytics Cloud

Go to SAP Analytics Cloud and open the connections menu from the left menu panel.

Add a new source of type HTTP API:

tcayzergues_5-1710858128422.png

 

Set the information to connect to the REST API and the OAuth 2.0 security values provided by your technical administrator (taken from the BTP cockpit), then click on Create to validate:

tcayzergues_6-1710858148766.png

 

Create the data transformation view in SAP Datasphere

Navigate to SAP Datasphere and from the data builder connected to your space, create a new graphical view:

tcayzergues_7-1710858172947.png

From the left bar, select the staging table as a source of your view by expanding the Open SQL schema:

tcayzergues_8-1710858189183.png

 

Import and deploy the remote table in your space:

tcayzergues_9-1710858207879.png

 

You can now decide to enhance the calculation flow of your view to manage transformation or mapping rules with standard Datasphere features:

tcayzergues_10-1710858224411.png

In our current example, we just want to expose the data as it was uploaded into the staging table, so rename the view VW_SFTP_STAGING and set it as a Fact view exposed for consumption (define LC_AMOUNT as a measure):

tcayzergues_11-1710858238889.png

Save and deploy the view to Datasphere.

 

Create a source view connection in SAP Analytics Cloud

Navigate to SAP Analytics Cloud and go to the connections list from the left menu panel.

Add a new OData Services connection:

tcayzergues_12-1710858278765.png

 

Set connection parameters in the dialog box.

OData consumption view URL has the following structure: <Datasphere based URL>/dwaas-core/odata/v4/consumption/relational/<SPACE>/VW_SFTP_STAGING.

Click on Create to validate the connection:

tcayzergues_13-1710858305331.png

 

Create a SAC Planning data import job

Navigate to the modeler view of your SAC Planning model and add a new source of type OData Services.

Select the connection created in the last step and click Next:

tcayzergues_14-1710858325469.png

 

Select your Datasphere consumption view from the list and click Next:

tcayzergues_15-1710858353012.png

 

Select all fields by double clicking the view name, then click Create:

tcayzergues_16-1710858370875.png

 

You can now set up the transformations as in any SAC Planning data import and rename your job SFTP Upload from DSP:

tcayzergues_17-1710858391246.png

 

Create the SAC Planning multi-action to trigger the data import job

Let’s now create the main sequence of actions in SAP Analytics Cloud to trigger the SFTP extraction flow, data staging and transformation in SAP Datasphere and import data in our SAC Planning model.

Go to the multi-action menu and create a new multi-action from the tile:

tcayzergues_18-1710858410653.png

 

Add a new API step in the canvas:

tcayzergues_19-1710858422855.png

 

Rename it and choose your REST API connection from the list. Fill the REST service URL as set in the connection:

tcayzergues_20-1710858437822.png

 

To add dynamic input parameters for the integration flow execution, insert parameters:

tcayzergues_21-1710858453245.png

 

Create a parameter for Version and do the same for Date and for the organization-type dimension (ConsolidationUnit in this example):

tcayzergues_22-1710858491360.png

Once created, click on Edit JSON to organize the body of the calling message with correct JSON syntax:

tcayzergues_23-1710858513275.png

tcayzergues_24-1710858530643.png

Click OK to validate the JSON syntax.
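
As an illustration, the resulting body could look like the example below, where the literal values stand for the inserted Version, Date and ConsolidationUnit parameters; the key names are assumptions and must match the XPath expressions defined in the Integration Flow's Set Job Parameters step:

{
  "version": "Actual",
  "date": "202403",
  "consunit": "CU1000"
}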

Save the multi-action to the directory of your choice in SAC.

 

Add a Data Import Step after the execution of the REST API:

tcayzergues_25-1710858568834.png

 

Set the properties of your import step to connect to the target planning model and select the import job you have configured in a previous step:

tcayzergues_26-1710858585013.png

Save the multi-action.

 

Open your SAC Planning story in edit mode and add a new multi-action trigger from the Planning Actions menu:

 

tcayzergues_28-1710858627745.png

 

Set the name and the multi-action to trigger, add any relevant story filter to be passed directly as execution parameters:

tcayzergues_29-1710858659994.png

 

Set the multi-action trigger style properties to fit your enterprise rendering and save the story.

 

Testing the solution

Put a new CSV file into your SFTP server directory:

tcayzergues_30-1710858682141.png

 

Go to your SAP Analytics Cloud story and trigger the multi-action button:

tcayzergues_31-1710858716273.png

 

Parameters are read from the story filters and you can now run the multi-action:

tcayzergues_32-1710858740737.png

 

Once the execution has ended successfully, the story will be populated with data loaded from the original file.

 

Conclusion

In this use case, we have seen a concrete example of how to combine several pieces from the SAP BTP catalogue to serve a classical xP&A reporting process.

The use of an intermediate layer like SAP Datasphere rapidly becomes a must-have when you need to transform, harmonize, or map data from several sources.

Because business users want to keep a high degree of autonomy and manage this at their own pace, you also need the gluing capability of a message-oriented middleware like SAP Integration Suite.

We hope you can expand this starting use case and adapt it to create the one that fits your own situation.
