svenhuberti
Product and Topic Expert


Many customers use Splunk to aggregate and analyse logs from various applications.

If you want to do the same for Cloud Integration, part of SAP BTP Integration Suite, this blog is for you.

And I am not talking about the Splunk Adapter that was released in January. I am talking about the external logging feature.

Using the Splunk Adapter may give you flexibility, but keep the licensing in mind: it means that messages will be metered. There may also be technical limitations that could impact the rest of your integration flow. In short, it makes more sense to use the built-in feature of Cloud Integration.

Since I love to test things before I talk about them, here is a summary of how I configured external logging on my tenant. A "thank you" goes out especially to sunny.kapoor2 and the SAP PM, Dev & Ops teams, who are always keen to help.

Splunk Setup


Let's get your Splunk environment ready first. There is actually not much to do, but let's go through it.

First, define an index with a meaningful name on the Settings/Indexes page of your Splunk environment. I used "ca_de_ci_index", as you will see later. Make sure to remember the name.


I am no Splunk specialist, so I simply left the fields at their defaults:



Now we need to define how Cloud Integration will talk to Splunk. We'll do this via the "Data Inputs" settings, specifically the "HTTP Event Collector".

To do so, create a new "HTTP Event Collector" from the Settings/Data Inputs page.



Use the index you have created previously.


On the overview page, copy the token that has been generated for you. You will need it in a minute.
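
If you want to make sure the token actually accepts events before wiring up Cloud Integration, you can send a test event to the HTTP Event Collector. Here is a minimal sketch in Python using the requests library; the host, token and index name are placeholders for your own values:

import requests

# Hypothetical Splunk Cloud HEC endpoint: replace host and token with your own.
hec_url = "https://http-inputs-<your-stack>.splunkcloud.com/services/collector/event"

resp = requests.post(
    hec_url,
    headers={"Authorization": "Splunk <your-HEC-token>"},
    json={"event": "HEC test event", "index": "ca_de_ci_index"},
)

# A 200 response with {"text":"Success","code":0} means token and index work.
print(resp.status_code, resp.text)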


Now that we have configured Splunk, we can move on to the SAP BTP configuration.

SAP BTP configuration


Create Destination


As explained in the documentation, you first need to create a destination to Splunk. This happens in the BTP subaccount where Integration Suite resides.

The name of the destination should be "CloudIntegration_MonitoringDataConsumer"; otherwise you may get an error during the activation of the external logging.

  • The "URL" property is the one from your Splunk tenant (I am using Splunk Cloud)

  • The "AuthHeader" property contains "Splunk" followed by the token you previously generated in Splunk.

  • The "IndexName" property is the one you configured previously.

  • The "TargetSystem" property should be "Splunk".


For your convenience, here is a screenshot of the configuration that works for me.
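
If you prefer text over screenshots, the destination boils down to the following properties, sketched here as a Python dict for illustration only (the URL and token are placeholders for your own Splunk tenant):

# Illustrative only: the destination as configured in the BTP cockpit.
destination = {
    "Name": "CloudIntegration_MonitoringDataConsumer",
    "URL": "https://http-inputs-<your-stack>.splunkcloud.com",  # your Splunk tenant
    "AuthHeader": "Splunk <your-HEC-token>",  # "Splunk ", then the HEC token
    "IndexName": "ca_de_ci_index",  # the index created earlier
    "TargetSystem": "Splunk",
}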



Enabling access to the Cloud Integration APIs


In order to access the Cloud Integration APIs that let you enable, disable, and check external logging, you will need to generate a service key with the correct rights.

This is done in multiple steps, explained below.

Create a new role collection


In my experience, I could not add the correct roles to the service key (which we will need later) if my user account did not have these roles.

Hence I created a new "Role Collection" containing the "ExternalLoggingActivate" and "ExternalLoggingActivationRead" roles.

I assigned that role collection to the user who will create the API Service key, which was myself.

All of this can be done within one screen and is pretty easy.



Create an API Service Key


As you know, to access the underlying Cloud Integration APIs, you need to generate a Service Key with the right grant types and roles.

This can be generated easily when you go into the Integration Suite subaccount and click the "Create" button on the top right.


Create a new "service instance".

Select the "Process Integration Runtime" service, select the "api" plan and give it a meaningful name.


In the second step, select the "ExternalLoggingActivate" and "ExternalLoggingActivationRead" roles. Select the "client_credentials" and "password" grant-types.


In case you are interested, here is the JSON format of the settings:
{
  "grant-types": [
    "client_credentials",
    "password"
  ],
  "redirect-uris": [],
  "roles": [
    "ExternalLoggingActivate",
    "ExternalLoggingActivationRead"
  ]
}

Create service key


Now that you have a service instance, you will create a service key that will contain the credentials to call the Cloud Integration APIs.

Click on the 3 dots next to your service instance and select "Create service key".


Give it a meaningful name and click "Create". You can leave all the defaults.



Activate the external logging


The activation of external logging is done via an API call that you will make against the Cloud Integration APIs, using the credentials previously set up. Note that this API is not yet documented in the SAP Accelerator Hub.

Open your preferred API tool, like Postman, to build the POST request against your Cloud Integration management API.

You can get the endpoint URL from the service key previously created:


Now that you have the endpoint, append the following to it:

/api/v1/activateExternalLogging

The tricky part comes with the authorization, which at this point seems to only work with "Password Credentials".

I recommend using the very convenient Postman feature that generates tokens for you once configured.

To do so, select the "OAuth 2.0" authorization type in the Authorization part of your request.

In the "Configure new token" part of the wizard, fill out the details as needed and matching your configuration.

  • The "Grant Type" needs to be "Password Credentials".

  • The "Access Token URL", "ClientID" and "ClientSecret" needs to be copied from your service key.

  • The "Username" and "Password" are the ones you use to login to your BTP.

  • The "Client Authentication" should be sent as "Basic Auth Header".



 

Now let Postman generate the token by clicking the "Get New Access Token" button at the bottom of the page. If something fails, please review your parameters carefully.

Once you have generated the token, send the request to the backend Cloud Integration API.

If everything goes right, you should have enabled external logging in Cloud Integration.
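
If you prefer scripting over Postman, the whole flow fits in a few lines. Here is a minimal sketch in Python with the requests library, assuming the password grant; all URLs and credentials are placeholders to be taken from your own service key:

import requests

# Placeholders: copy these values from your service key.
token_url = "<tokenurl-from-service-key>"
client_id = "<clientid-from-service-key>"
client_secret = "<clientsecret-from-service-key>"
username = "<your-btp-user>"
password = "<your-btp-password>"

# Step 1: fetch an OAuth token via the password grant.
# Client authentication is sent as a Basic auth header.
token_resp = requests.post(
    token_url,
    auth=(client_id, client_secret),
    data={"grant_type": "password", "username": username, "password": password},
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Step 2: POST to the activation endpoint (no payload required).
api_base = "<url-from-service-key>"  # the Cloud Integration API endpoint
activate_resp = requests.post(
    api_base + "/api/v1/activateExternalLogging",
    headers={"Authorization": "Bearer " + access_token},
)
print(activate_resp.status_code, activate_resp.text)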

 

If you are sure that all your parameters are right, and you are still getting an error like the one below, it makes sense to contact support.

Please do so by opening a ticket on the component "lod-hci-pi-ops".


Operations had to change some parameters on my tenant; however, I am not using a productive tenant, so that could have been the origin of the issue.

Check Splunk


If everything went fine, you should now see the Cloud Integration logs in Splunk.

Simply filter on the index you created (for example, index="ca_de_ci_index"): you can see your logs and analyse them as needed.


 

In case you want to deactivate the external logging to Splunk, use the following API call:

/api/v1/deactivateExternalLogging
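
With the token flow from the sketch above, that is just one more request (api_base and access_token as before):

# Same credentials as in the activation sketch above.
requests.post(
    api_base + "/api/v1/deactivateExternalLogging",
    headers={"Authorization": "Bearer " + access_token},
)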

Integration flow configuration


As you may have seen from the documentation, you can define what kind of logs should be sent to Splunk (NONE, INFO, ERROR). That setting applies to your whole Cloud Integration tenant.

To configure this in a more granular way, you can set up external logging at the integration flow level.

To do so, simply go into your Integration Monitor and set the level you require for each integration flow.


 
16 Comments
Sriprasadsbhat
Active Contributor
Hello Sven,

Very helpful feature and neatly documented, kudos to you! Is it possible to use this feature, or enhance it, to provide an option for customers to log their payload attachments (which are critical for most production support cases with very complex iflows) and select Splunk as the target?

If the above feature were enabled, it would be really helpful for customers dealing with data-related issues in productive scenarios (where SAP recommends not logging payload attachments to the MPL, and log level ERROR is also not helpful).

Regards,

Sri
Florian_Kube
Participant
Hi Sven,

thanks for that blog post.

I have some remarks

  1. Could you please add the property "IndexName" to the documentation? This information is missing.

  2. I needed to use "Grant Type: Client Credentials" to obtain a token.

  3. I tried to activate logging via Postman but I get the following error:
    {
      "result": "ERROR",
      "message": "Splunk Error (6): Invalid data format",
      "infos": [
        {
          "description": "Read Tenant Information",
          "result": "SUCCESS",
          "dataSent": "",
          "dataReceived": ""
        },
        {
          "description": "Validating Connection Settings",
          "result": "SUCCESS",
          "dataSent": "",
          "dataReceived": "All Settings Valid"
        },
        {
          "description": "Instantiating Selected Adapter",
          "result": "SUCCESS",
          "dataSent": "",
          "dataReceived": "Adapter Created"
        },
        {
          "description": "Calling OPTIONS on endpoint",
          "result": "SUCCESS",
          "dataSent": "OPTIONS /services/collector/event",
          "dataReceived": "200"
        },
        {
          "description": "Polling Splunk HTTP Event Collector Endpoint",
          "result": "ERROR",
          "dataSent": "POST /services/collector/event",
          "dataReceived": "Splunk Error (6): Invalid data format"
        }
      ]
    }

     

    Is there any way I can see the payload, to maybe see what the real issue is?

svenhuberti
Product and Topic Expert
Hello Sriprasad,

thanks for your kind words!

I cannot really tell you how we will evolve this in the future since I am in Customer Advisory. But it would make sense to log the payload too, for sure. I guess our development team is already looking into this.

For the moment, you can use the "custom headers" (yes, that is a weird name) to store and search payload data. The advantage is that you can control at a fine-granular level what information you store and do not need to worry about data governance.

HTH!

Sven

 
svenhuberti
Product and Topic Expert
Hello Florian,

thanks for your feedback!

To your questions:

  1. I have requested that "IndexName" be added to the documentation, thanks for the remark.

  2. Yes: I am not sure why I needed to use Password Credentials (maybe my special environment), but glad it worked for you with Client Credentials.

  3. This would need to be looked into by our DevOps team. I have seen a couple of errors, but this one I never saw either. Could you please open a ticket on the component "lod-hci-pi-ops"?

BR,

Sven
putnamehere
Explorer

Hello Sven,

thanks for the Post.

Does the activate call "/api/v1/activateExternalLogging" require any body/payload, or is it an empty POST?

Thanks,
Michael

[removed my first question after reading the article once again]

svenhuberti
Product and Topic Expert
Hi Michael,

as far as I have seen and tested, there is no payload required for the POST call to activate the logs.

Best regards!

Sven
Florian_Kube
Participant
Hi svenhuberti,

the issue was that Apache NiFi was modifying the request.

But could you please also add the role "AuthGroup_Administrator" to the documentation? This one is also needed to activate the logging.

Regards Florian
Florian_Kube
Participant
No payload is needed
HarshC
Active Participant
Thanks for the well documented walk through. Is this functionality supported by CPI instances deployed on Neo?
svenhuberti
Product and Topic Expert
jvervenne
Newcomer
Hi Sven,

 

Thanks for the blogpost!

 

Is it possible to register another TargetSystem or is this currently only working with Splunk?

 

Kr,

Jan
tmc20956
Newcomer

Hi Sven

Thank you very much for the very good tutorial.

Currently I get the following feedback in Postman:

{
  "result": "ERROR",
  "message": "LocationID is missing.",
  "infos": [
    {
      "description": "Read Tenant Information",
      "result": "SUCCESS",
      "dataSent": "",
      "dataReceived": ""
    },
    {
      "description": "Validating Connection Settings",
      "result": "SUCCESS",
      "dataSent": "",
      "dataReceived": "All Settings Valid"
    },
    {
      "description": "Instantiating Selected Adapter",
      "result": "ERROR",
      "dataSent": "",
      "dataReceived": "LocationID is missing."
    }
  ]
}

Any idea where the error is coming from?
From our Basis team I have the following statement: "Cloud Connector does not use Location ID".

Thanks and kind regards
Andy

tuh
Explorer
Hi Sven

Thanks for the fine blog and tutorial.

 

Is it possible to modify the output to Splunk so that it is more than just the following fields that are set as output?

CorrelationId: AGWf16PD_JQ_ymi4CeBIzDH4Klyi
DurationMillis: 1185
IntegrationFlowId: HTTP_JWTSENDER
LocalComponentName: CPI_integrations-suite-trial-xxxx
LogEventId: dxaKyhzjPKhox1PViRcB7A
LogEventType: Run
LogLevel: INFO
LogLevelExternal: INFO
LogSentAtMillis: 1704974245227
MessageGuid: AGWf16N8OpS2oaVfUKwcqTO0CjG5
Node: 1
OriginComponentName: CPI_integrations-suite-trial-jsz
PreviousComponentName: CPI_integrations-suite-trial-jsz
ReceiverIds: []
RunId: AGWf16MvzQ9RVJjpWzFEtQvbGFrC
StartTime: 2024-01-11 11:57:23.811
StartTimeMillis: 1704974243811
Status: COMPLETED
StopTime: 2024-01-11 11:57:24.996
StopTimeMillis: 1704974244996
TenantName: integrations-suite-trial-xxxx
TransactionId: dd326811b2b94b359a07e570709cbf97
TransmissionLagMillis: 231

Is it possible to add more specific fields to the output?

Thanks and hope to hear some more about Splunk logging

Torsten
NareshDasika18
Participant
Hello Sven,

Is there any update from your development team on whether Splunk can be used to store the payloads from SAP Integration Suite besides message processing logs?

Regards,

Naresh

 
svenhuberti
Product and Topic Expert
Hello,

as far as I know, adding specific fields is not possible. Not sure about the future plans either.

But since I am not in PM, I'd rather pass that question to sunny.kapoor2.

BR!

Sven
AnujJadhav
Newcomer

Hey,
I am using a Trial account, and in the Role Collections I am not able to see the roles "ExternalLoggingActivate" and "ExternalLoggingActivationRead" as options.

 

Here is a screenshot


What should I do?