mario_nadegger
Participant

Scope


SAP Analytics Cloud offers many possibilities for reporting and planning. SAP Analytics Cloud core planning in particular includes features such as generating private versions and advanced commenting. To use SAP Analytics Cloud core planning, data has to be imported into SAP Analytics Cloud planning models. These planning models can use global dimensions that import data from different sources. In a hybrid scenario with SAP BW as the source system, it can be very useful to use existing InfoObjects as the import source for SAP Analytics Cloud global dimensions and keep the data in sync.

Prepare


In this example, the SAP BW InfoObject ZEPPART is used as the template for our SAP Analytics Cloud global dimension.


In SAP Analytics Cloud, a new global dimension called ZEPPART_001 was created with the objective of loading the key and the country key out of the SAP BW source system.


In Data Management, the data source SAP BW (including BW/4HANA) with data source type “On-Premise” was selected, followed by the import data source for the SAP BW system – in our case ZW2_IMPORT.


Filter to the right data source.


Select the SAP BW Import Connection and click next.


After searching for the InfoObject – in our case ZEPPART – we selected fields to be extracted, did some mapping and created the global dimension.

 

Search for the InfoObject – in our case ZEPPART.


Select the data you want to integrate and go on to finish the mapping.



Now we were ready to schedule a data load in SAP Analytics Cloud. We scheduled a load and received 89 records out of SAP BW (ZW2).



Get into scheduling


A first look at the requests with the help of the Chrome developer tools helps to identify the requests that are used to trigger schedules and to read out the status of the scheduling.



 

The main service that was called is located at /sap/fpa/services/rest/fpa/dataintegration – in our case https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration. Different parameters steer the scheduling via this service.

To get the logs by model, a request was posted to the main service with the parameters action=getScheduleLogsByModel&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=G

 

The request payload of this call includes the model name in JSON format.

In our example:

{"jobType":"bpcDataImport","modelName":"t.G:ZEPPART_001","mappingId":[]}

 

To schedule a run, a request was posted to the main service with the parameters action=runScheduledJob&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=runScheduledJob&tenant=G

The payload includes the name of the DataImport “F58CA15F78AA9039E200586F05E2BB93” and the mappingId “F48CA15F78AA9039E200586F05E2BB93” in JSON format, like:

{"jobType":"DAFrameworkDataImport","name":"F58CA15F78AA9039E200586F05E2BB93","description":null,"recurrence":null,"param":{"mappingId":"F48CA15F78AA9039E200586F05E2BB93","optParams":{}},"status":"STOPPED"}

Trigger schedule via REST API Call


Wouldn’t it be great to be able to trigger the loading from outside, for instance from a process chain?

As a first step, we succeeded in triggering and checking a run with the help of the Advanced REST Client version 10.0.12 in Chrome.

User Authorization for REST API Calls


The first hurdle is being able to post a request from an external client. The blog by Patrick Volker https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/ helped us a lot to establish the setup and to manage our REST API calls with OAuth.

If you did the setup correctly, you should now have a cookie and a bearer token that enable you to call the REST API located in your SAP Analytics Cloud instance.
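
For orientation, here is a minimal Python sketch of such a token request, assuming an OAuth client configured for the client_credentials grant as described in the guide above; the token URL and client credentials are placeholders you have to replace with your tenant's values.

import requests

# Placeholders - replace with your tenant's OAuth token URL and OAuth client credentials
TOKEN_URL = "https://<TENANT_AUTH>/oauth/token"
CLIENT_ID = "<OAUTH_CLIENT_ID>"
CLIENT_SECRET = "<OAUTH_CLIENT_SECRET>"

# Request a bearer token via the client_credentials grant
# (the OAuth client credentials are sent via basic authentication)
response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
response.raise_for_status()
access_token = response.json()["access_token"]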



Get Schedule Logs by Model


First, we checked the current status of the schedules of our model ZEPPART_001 by posting a request to the main service /sap/fpa/services/rest/fpa/dataintegration with the parameter action=getScheduleLogsByModel and the following request payload: {"jobType":"bpcDataImport","modelName":"t.G:ZEPPART_001","mappingId":[]}

In our example:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=G

 

As already mentioned, the request payload includes the model name – in our case ZEPPART_001.

 

As a response, we receive a list of runs, each identified by a “scheduleId” and including the status and the number of rows imported.
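
For illustration, a minimal Python sketch of this call; the requests library and the bearer token from the sketch above are assumed, the tenant host is a placeholder, and any cookie or CSRF token your tenant may additionally require is not shown.

import requests

BASE_URL = "https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration"  # placeholder tenant host
headers = {
    "Authorization": f"Bearer {access_token}",  # token from the OAuth sketch above
    "Content-Type": "application/json",
}

# Read the schedule logs for our model ZEPPART_001
payload = {"jobType": "bpcDataImport", "modelName": "t.G:ZEPPART_001", "mappingId": []}
resp = requests.post(
    BASE_URL,
    params={"action": "getScheduleLogsByModel", "tenant": "G"},
    headers=headers,
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # list of runs with scheduleId, status and row counts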



Trigger your Schedule


In the next step, we try to trigger our load with action=runScheduledJob.

In our case, this is a request to the main service with the parameters action=runScheduledJob&tenant=G:

https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration?action=runScheduledJob&tenant=G

In the payload of the request, the name of the DataImport "F58CA15F78AA9039E200586F05E2BB93" and the mappingId “F48CA15F78AA9039E200586F05E2BB93” have to be provided, like:

{"jobType":"DAFrameworkDataImport","name":"F58CA15F78AA9039E200586F05E2BB93","description":null,"recurrence":null,"param":{"mappingId":"F48CA15F78AA9039E200586F05E2BB93","optParams":{}},"status":"STOPPED"}

The DataImport name and the mappingId can be found in the previous response when getting the logs for the model in question.


Status 200 in the payload of the response confirms that our request was executed successfully. In addition, we receive the scheduleId that identifies the load we triggered.

{
  "status": 200,
  "data": {
    "scheduleId": "F58CA15F78AA9039E200586F05E2BB93"
  }
}
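
Continuing the sketch above, the same trigger call could look like this in Python; BASE_URL and headers are the hypothetical helpers defined earlier, and the DataImport name and mappingId are our example values.

# Trigger the scheduled job for the DataImport identified above
job_payload = {
    "jobType": "DAFrameworkDataImport",
    "name": "F58CA15F78AA9039E200586F05E2BB93",
    "description": None,
    "recurrence": None,
    "param": {"mappingId": "F48CA15F78AA9039E200586F05E2BB93", "optParams": {}},
    "status": "STOPPED",
}
resp = requests.post(
    BASE_URL,
    params={"action": "runScheduledJob", "tenant": "G"},
    headers=headers,
    json=job_payload,
)
resp.raise_for_status()
result = resp.json()
schedule_id = result["data"]["scheduleId"]  # identifies the load we just triggered
print(result["status"], schedule_id)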

In SAP Analytics Cloud, we can now spot a new schedule triggered by our REST API call.



Check Data Load


It would be very helpful to check the status of our run via the REST API.

We call the REST API again with the parameter action=getScheduleLogsByModel for our model. With the help of the scheduleId "F58CA15F78AA9039E200586F05E2BB93" provided by the previous response, we are able to identify the status and some metadata of our load.
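
A minimal sketch of this check, reusing the helpers from the earlier sketches; the exact field names of the log entries are an assumption, so adjust the filtering to the response returned by your tenant.

# Read the logs again and look up the run we triggered
resp = requests.post(
    BASE_URL,
    params={"action": "getScheduleLogsByModel", "tenant": "G"},
    headers=headers,
    json={"jobType": "bpcDataImport", "modelName": "t.G:ZEPPART_001", "mappingId": []},
)
resp.raise_for_status()
logs = resp.json()
# Inspect the returned runs and pick the one matching our scheduleId
# (field names are assumptions based on the response shown above)
for run in logs.get("data", []):
    if run.get("scheduleId") == schedule_id:
        print(run)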



Conclusion


With this approach, you are able to execute a data load in SAP Analytics Cloud with the help of a REST API call and check its execution status. This can help you to integrate and/or trigger a load from external tools.

In our scenario, SAP BW could trigger the SAP Analytics Cloud load with the help of a process chain right after updating the InfoObject.
35 Comments
ThomasK
Participant

Great article! This might also be a potential solution for triggering data actions via REST.

JefB
Active Contributor
Is this approach officially supported by SAP?
mario_nadegger
Participant
Hi Jef,

Currently not. But I hope that will come soon.

br mario

 
Henry_Banks
Product and Topic Expert
Hi jefb

SAP supports the SAP Analytics Cloud application, the functioning of its API framework, and the availability of its Data Acquisition services.

However, any custom development on top of that falls within the remit of the customer's own code support.

In this case, I think it's Postman being used to test the program logic – you wouldn't use that to send REST requests productively.

Regards
H
JefB
Active Contributor
Thanks Henry. Great to hear the API framework itself is officially supported!
Now I wish we could use them more easily in application designer 🙂
petrisnovak
Explorer
0 Kudos
Fantastic article, thank you!

How can I obtain that x-csrf-token? I am able to obtain a bearer token, but I do not know where I should find the x-csrf-token value and also the cookie value mentioned in one of your screenshots. I am able to successfully use the API for users and groups resources, but this "modelling API" seems to need a slightly different approach.

Thanks for any additional info!

Regards,

PN
petrisnovak
Explorer
0 Kudos
At this moment I am getting the message below while calling the getScheduleLogsByModel action:

Error Code: 3000, Error Message: User is not properly configured or user may not exist in tenant



Thank you!


PN


vitran23
Active Participant
How do you get this setup to work if you're using an identity provider to authenticate users and not the built-in authentication?
mario_nadegger
Participant
0 Kudos
Hi Vi Tran,

We are using OAuth2 with an OAuth2 client from the backend.

 

As described in https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/

br mario
vitran23
Active Participant

Is there a good guide out there for using OAuth2 with SAML? We are using a SAML IdP to authenticate the user and not the built-in SAC user authentication.

vitran23
Active Participant
0 Kudos
Would this workaround work for file server data loads?
kurt_renner
Explorer
0 Kudos
I am trying to do a similar thing. We are using Azure AD FS to do single sign-on federation in our SAC environment for end users and it is working fine. Ours is a Cloud Foundry (CF) instance as opposed to a Neo instance. Much of the documentation I've stumbled upon makes more references to Neo environments as opposed to CF environments.

My goal is to create a PowerShell script that can start an SAC dimension data management refresh job which would be called by an external job scheduler.  I have reviewed https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/ many times but am unable to get it working.

Using Postman, I am able to retrieve an OAuth token with basic authentication and a grant type of client_credentials.


OAuth Client URLs


Using that token as a bearer token for calls to various endpoints under https://{SAC_Hostname}.hcs.cloud.sap/api/v1, I am able to retrieve a list of groups and users, get details on an individual user, and retrieve a list of resources (SAP Analytics Cloud User and Team Provisioning API – SAP Help Portal).

However, this same bearer token does not work for the dataintegration endpoint which is used to initiate data refreshes within SAC (https://{SAC_Hostname}.hcs.cloud.sap/sap/fpa/services/rest/fpa/dataintegration).

 

The path is different, and I'm assuming it is a different API altogether requiring its own authentication and token retrieval steps. I can't figure out how to authenticate to this API to interact with it, and cannot find any useful documentation besides this blog and the other one referenced (https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/).

Any additional guidance you can provide or working examples in a similar environment as ours would be much appreciated!
kurt_renner
Explorer
0 Kudos
I did find the following SAP article:

Configuring OAuth 2.0 SAML Bearer Assertion Workflow (sap.com), but it specifically refers to "server access to the REST API when your application is deployed on a SAP Cloud Platform (SAPCP) server and uses the SAPCP SDK." Ours is a Cloud Foundry (CF) instance, so it is not fully applicable to our environment.
hackal
Discoverer
0 Kudos
Hi Kurt,

I was trying exactly the same and was also facing the same issue. Working with groups, users and so on does work, but I couldn't find any solution so far to work with the endpoint/api for scheduling.

In detail, when I try to use this API I get some HTML/JavaScript content containing the comment "<!-- we should only load this content when not authenticated -->".

Any additional help/guidance for this set up would be really be great!
kurt_renner
Explorer
0 Kudos
Hi Michaela,

Thank you for validating my issue.

I am getting the same response you are seeing:

<html>
<head>
</head>
<body>
    <!-- we should only load this content when not authenticated-->
    <script>
        if (window.parent === window) {
            // the page is not loaded in an iframe
            var currentUrl = window.location.href;
            // e.g. https://myhost:433
            var baseUrl = window.location.protocol + '//' + window.location.host;
            var currentRelativeUrl = currentUrl.substring(baseUrl.length);
            var encodedRedirectUrl = encodeURIComponent(currentRelativeUrl);
            var redirectEndpoint = '/approuter/v1/redirect?url=' + encodedRedirectUrl;
            window.location.href = redirectEndpoint;
        } else {
            var url = '/approuter/v1/embedded-auth';
            var popupWindow;
            window.addEventListener('message', function(event) {
                // check event.origin
                console.log(event.data);
                if (popupWindow) {
                    popupWindow.close();
                    window.location.reload();
                }
            });
            popupWindow = window.open(url, 'SAP Analytics Cloud', 'width=600,height=500,menubar=0,dependent=1,status=0');
        }
    </script>
</body>
</html>

vitran23
Active Participant
0 Kudos
We are facing the same issue with our Okta SAML IDP. Still playing with things as far as getting a destination server setup to see if that will allow this workaround to work.
horynao
Participant
0 Kudos
Hello everyone,

I would like to ask if someone could help me with how to obtain the x-csrf-token – maybe from the API URL https://<TENANT>/sap/fpa/services/rest/fpa/dataintegration? Is this the right API URL that provides the x-csrf-token for the access_token? I am at the point where I got the access_token from https://<TENANT>/oauth/token.

Thanks in advance

Ondrej H.
vitran23
Active Participant
0 Kudos
What are you using to authenticate the users? The built-in SAP authentication or SAML SSO?
horynao
Participant
0 Kudos
Hello, built in SAP auth, Ondrej
vitran23
Active Participant
mario_nadegger I've got to the point of trying to pull the logs for the scheduled job, but whenever I submit my POST I get this HTML page returned, even though I have the correct bearer token and authorizations.
<html>
<head>
</head>
<body>
    <!-- we should only load this content when not authenticated-->
    <script>
        if (window.parent === window) {
            // the page is not loaded in an iframe
            var currentUrl = window.location.href;
            // e.g. https://myhost:433
            var baseUrl = window.location.protocol + '//' + window.location.host;
            var currentRelativeUrl = currentUrl.substring(baseUrl.length);
            var encodedRedirectUrl = encodeURIComponent(currentRelativeUrl);
            var redirectEndpoint = '/approuter/v1/redirect?url=' + encodedRedirectUrl;
            window.location.href = redirectEndpoint;
        } else {
            var url = '/approuter/v1/embedded-auth';
            var popupWindow;
            window.addEventListener('message', function(event) {
                // check event.origin
                console.log(event.data);
                if (popupWindow) {
                    popupWindow.close();
                    window.location.reload();
                }
            });
            popupWindow = window.open(url, 'SAP Analytics Cloud', 'width=600,height=500,menubar=0,dependent=1,status=0');
        }
    </script>
</body>
</html>
sccntt
Discoverer
Hi Guys,

I am getting the same response.  Did anybody fix this?

Thanks
vitran23
Active Participant
No I haven't gotten farther than what I've posted since it seems to want to authenticate with my Okta SAML and it gets stuck there.
sccntt
Discoverer
We have Okta SAML, but some other REST APIs that are in the official SAP documentation work fine, for example the user provisioning APIs.

I think SAP changed the "authorization grant" options a lot since this blog was published, so the new "authorization grant" options may not allow using this REST API anymore.

 

 
vitran23
Active Participant
Yeah, I noticed I'm able to get the other "released" APIs working. I was hoping I would be able to get this to work also, but as you stated, they could have closed the back door.
dan_dukan5
Explorer
Hi

We are facing the same issue. Did you fix it?

 

Thanks

Dan
eresresr
Member
I also get this error. The tokens are all generated.
Is there any news here? I would like to manage our SAC jobs through our job scheduler.
vitran23
Active Participant
I haven't had any success and dropped the project after getting stuck at that point. Hopefully there will be an API released down the road to be able to trigger the schedule.
0 Kudos

Hi Lukas,

thanks for your very interesting blog.

Currently, I am trying to use the API for the import job to the model by referring to this blog, but it does not work well.

In addition, by referring to the URL below, we were able to execute the process to get the list of stories.
(1) SAP Analytics Cloud APIs: Getting Started Guide
https://blogs.sap.com/2018/04/20/sap-analytics-cloud-apis-getting-started-guide/#url_api

Probably it is the "cookie" part of this statement in the blog that is not acquired correctly:
"If you did the setup correctly, you should now have a cookie and a bearer token that enable you to call the REST API located in your SAP Analytics Cloud instance."
I would appreciate it if you could give me some information on how to get this cookie information.

The information other than the cookie is obtained as follows.

Bearer token:
https://[tenant authentication URL]/oauth/token?grant_type=client_credentials
Use the access_token obtained by this call (the OAuth client login information is passed via basic authentication).

x-csrf-token:
https://[tenant URL]/api/v1/scim/Users
When calling this URL, specify the request header "x-csrf-token: fetch" and execute it.
Use the "x-csrf-token" from the returned response header.

Currently, what is not working is getting the job execution result list. I am trying to execute the following call, as described in blog (1):
https://[tenant authentication URL]/sap/fpa/services/rest/fpa/dataintegration?action=getScheduleLogsByModel&tenant=I
In addition, when executing this call, if I use the cookie and x-csrf-token of a request obtained by accessing SAC with a browser and remove the Authorization header, it works correctly. So I have confirmed that it works without any problem if the OAuth client is not used. (Therefore, I believe there is no particular problem with the JSON specified in the body.)

That's all, thank you for your cooperation.

moritzhofmaier
Explorer
Hi Mario,

 

thanks for the great blog. We are facing the same problems as some of the previous commenters. Does your solution still work for you, or did any of you find a solution to the "<!-- we should only load this content when not authenticated -->" problem?

Best regards,

Moritz
former_member803493
Discoverer
0 Kudos
Hi Kurt,

 

we are currently dealing with the same issue, did you already find a solution?

BR,

 

Christian
kurt_renner
Explorer
0 Kudos
Hello Christian,

No I did not find a solution.  I gave up.
former_member803493
Discoverer
0 Kudos

Hi Henry,

I think the described function is crucial for many planning scenarios in order to orchestrate planning processes across different systems – for example, loading reference data from BW on a daily basis.

Is there any supported solution available or on the roadmap?

 

BR

 

Christian

TarikOubedda
Explorer
0 Kudos
Did anyone find the solution for this issue?

We are using SSO (Azure IdP) and we are able to make it past all steps except the schedule listing and scheduling part:

The first GET for retrieving the access_token is successful:

  • Get
    *.us10.hana.ondemand.com/oauth/token?grant_type=client_credentials

    • using Basic Authentication (credentials from data integration... )

    • Response: access_token




The second GET for retrieving the x-csrf-token is successful:

  • Get
    *cloud.sap/api/v1/scim/Users

    •  Headers

      • authorization: bearer <access_token>

      • x-csrf-token: fetch

      • x-sap-sac-custom-auth: true




    • Response: header x-csrf-token




The third call, for getting the list of schedules, returns a redirect page.

Thank you
0 Kudos
Hi Tarik

Any luck? I'm also stuck in the same spot.

Thank You.

Vijhay Devarajan.

 