
As part of their digital initiatives, enterprises are continuously improving their application landscapes by implementing best-of-breed SaaS applications to meet business objectives. In doing so, integration has become a mainstay and a focus area to ensure a seamless, real-time flow of data across these discrete applications. The idea of this blog is to highlight one such use case and show how enterprises can adopt a multi-cloud strategy to solve a particular problem.


Problem Statement: We are currently working on an HR transformation implementing SuccessFactors, with CPI as our strategic platform for any integration requirements originating out of SuccessFactors. For one of the integration scenarios, we had a requirement wherein the third-party service provider's application publishes real-time events as a webhook with a secret.

Webhooks are simple APIs which provide an efficient mechanism to publish events with a very small payload, and are usually accompanied by secret keys which can be validated on the receiving end. Unfortunately, CPI only supports basic or certificate-based authentication and is not able to validate secret tokens carried in the header.
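
For illustration, the webhook call in such a scenario is just an HTTP POST carrying a small payload and the shared secret in a header. The host, path and body below are made up; the X-Webhook-Secret header is the one we validate later in the API policy:

POST /webhooks/employee-events HTTP/1.1
Host: <your-apim-instance>.azure-api.net
Content-Type: application/json
X-Webhook-Secret: <shared-secret-value>

{ "eventId": "12345", "eventType": "screening.completed" }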

Solution Overview

The design pattern for this scenario is primarily as follows:

  • Expose an API endpoint to which the publishing application can post events with the secret key and payload

  • Manage and validate the authentication secret within the API policies

  • Relay the event payload to the backend CPI endpoint for further processing


This design pattern can be implemented using Enterprise Messaging Webhooks/API Management within SAP Cloud Platform, but in our case we already have Azure as our strategic PaaS and hence decided to build the API layer in Azure.


High Level Process Flow


 

In the sections below, I have tried to capture some detail on the individual components identified in the image above:

Azure Gateways: In simple terms, the gateway provides us with a centralised, secure and scalable front door to manage ingress to enterprise web resources. There are various choices available here, and they could turn out to be multiple blogs in themselves 🙂

Azure Key Vault: Azure Key Vault provides a centralised repository to maintain and protect encryption keys and secrets like passwords, connection strings and web tokens in a secure and version-controlled manner. When orchestrated, these can be managed in isolation and allow us to build robust access policies per application resource or resource group. To further your understanding, please follow this product page https://azure.microsoft.com/en-gb/services/key-vault/. For my use case, I created the following secrets:

  • The secret key to be provided by the event-publishing client

  • The authentication secret for connecting to CPI



Key Vault Secrets


Access to this key vault was secured through Access Policies/Managed Identities for my defined API resource, which had read access only.


Key Vault Access Policies


 

Azure API Management: API Management broadly has the following components:

Front-End: The front end allows us to define the structure of the API: headers, requests, responses and query parameters. Since my scenario was a payload pass-through and the payload was dictated by the event publisher, I only had to define the secret key as an element in the header.


Defining Header
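
If you prefer to maintain the front end as an OpenAPI specification rather than through the portal, the equivalent declaration looks roughly like this (the path and operation are illustrative; only the header parameter mirrors my setup):

paths:
  /events:
    post:
      summary: Receive webhook events
      parameters:
        - name: X-Webhook-Secret
          in: header
          required: true
          schema:
            type: string
      responses:
        '200':
          description: Event accepted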


Inbound Processing: Inbound processing policies could be described as mini workflows in themselves, as they are sequential configuration statements that are executed to alter the behaviour of the API before the request hits the back end. Most of the time, policies are limited to security and authentication and can range from location-based to token-based to MFA-based. However, the policies in the API Management layer can do so much more. Please follow this link https://docs.microsoft.com/en-us/azure/api-management/api-management-policies to understand the various policies available. For comparison, please follow this link https://help.sap.com/viewer/66d066d903c2473f81ec33acfe2ccdb4/Cloud/en-US/c918e2803dfd4fc487e86d0875e... to learn about the policies available in SCP API Management.

In our scenario, the policy was written to cover the following:

  • Read the secret key from the cache if it was already read within the last hour


<policies>
    <inbound>
        <base />
        <!-- check the cache for the secret first -->
        <cache-lookup-value key="secret" variable-name="keyvaultsecretResponse" />


  • Read the secret key from Key Vault if it is not already available in the cache


        <!-- call Key Vault if not found in cache -->
        <choose>
            <when condition="@(!context.Variables.ContainsKey("keyvaultsecretResponse"))">
                <send-request mode="new" response-variable-name="keyvaultsecret" timeout="20" ignore-error="false">
                    <set-url>https://successfactors-keys.vault.azure.net/secrets/experian-secret/?api-version=7.0</set-url>
                    <set-method>GET</set-method>
                    <authentication-managed-identity resource="https://vault.azure.net" />
                </send-request>


  • Transform the response from Key Vault to the secret key: The response from Key Vault is a deep JSON structure, and the transformation below extracts only the secret value into a string variable and stores it in the cache for an hour.


                <!-- transform the response to a string and store it in the cache for an hour -->
                <set-variable name="keyvaultsecretResponse" value="@{ var secret = ((IResponse)context.Variables["keyvaultsecret"]).Body.As<JObject>(); return secret["value"].ToString(); }" />
                <cache-store-value key="secret" value="@((string)context.Variables["keyvaultsecretResponse"])" duration="3600" />
            </when>
        </choose>
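
For reference, the raw response from the Key Vault secrets API looks roughly like this (abridged), which is why the expression above picks out only the value field:

{
  "value": "<the-actual-secret>",
  "id": "https://successfactors-keys.vault.azure.net/secrets/experian-secret/<version>",
  "attributes": { "enabled": true }
}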


  • Validate the secret key presented in the incoming request and prepare the failure response


        <check-header name="X-Webhook-Secret" failed-check-httpcode="401" failed-check-error-message="Not authorized" ignore-case="false">
            <value>@((string)context.Variables["keyvaultsecretResponse"])</value>
        </check-header>


  • Set the back end to the Logic App


        <set-method id="apim-generated-policy">POST</set-method>
        <rewrite-uri id="apim-generated-policy" template="/manual/paths/invoke/?api-version=2016-06-01&amp;sp=/triggers/manual/run&amp;sv=1.0&amp;sig={{sf-bgc-updates_manual-invoke_}}" />
        <set-header id="apim-generated-policy" name="Ocp-Apim-Subscription-Key" exists-action="delete" />
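
For completeness, all the fragments above sit inside the <inbound> section of the policy document; the remaining sections can simply defer to the defaults:

    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>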

 

Back-End: This section allows us to build the handler and provider resources for the API. These can range from Azure services like Logic Apps and Service Bus queues/topics, to RFC function calls wrapped in Logic Apps, to OData services implemented in NetWeaver Gateway. In our case we could have taken the simplistic route of configuring the actual CPI endpoint, but that would have left us with limited capability to cater to failures arising from network and application uptime issues. We decided to wrap the CPI call inside a Logic App to gain alerting, monitoring and re-processing capabilities. The Logic App is a simple three-step workflow (a rough sketch of the workflow definition follows the steps below):

  • Define the trigger: a simple HTTP request trigger that receives the payload relayed from API Management



 

  • Get the CPI authentication key from Key Vault: In this step, the Logic App makes a connection to Azure Key Vault and retrieves the key used to log in to CPI. This requires the Logic App to have a connection to Key Vault using service accounts or managed identities. Logic Apps provide a setting whereby the step content can be hidden so that it is not shown in the run details.






  • Connect to CPI and pass the payload: Dynamic expressions can be used to pass data variables from previous steps. In this case they allow me to pass the authentication key from the previous step and also the entire body from the HTTP trigger step. Pretty handy 🙂
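
Since the workflow is built visually, here is a minimal sketch of the underlying workflow definition for the three steps. It assumes a Key Vault API connection named keyvault, a secret named cpi-auth-secret that holds a ready-to-use Basic credential, and a placeholder CPI endpoint; all of these names are illustrative, not from the actual implementation:

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
      "manual": { "type": "Request", "kind": "Http" }
    },
    "actions": {
      "Get_CPI_secret": {
        "type": "ApiConnection",
        "runAfter": {},
        "runtimeConfiguration": { "secureData": { "properties": [ "outputs" ] } },
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['keyvault']['connectionId']" } },
          "method": "get",
          "path": "/secrets/@{encodeURIComponent('cpi-auth-secret')}/value"
        }
      },
      "Post_to_CPI": {
        "type": "Http",
        "runAfter": { "Get_CPI_secret": [ "Succeeded" ] },
        "inputs": {
          "method": "POST",
          "uri": "https://<your-cpi-host>/http/<your-endpoint>",
          "headers": { "Authorization": "Basic @{body('Get_CPI_secret')?['value']}" },
          "body": "@triggerBody()"
        }
      }
    },
    "outputs": {}
  }
}

The secureData setting on the Key Vault step is what hides the secret from the run history, as mentioned above.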



 

Hope this has been worth your time and provides a little window into the art of the possible with Azure API Management, and how you can mash up multiple services and resources to solve a problem.

 

Regards,

Sitakant

 