
Idea and Scenario


The topic of side-by-side extensions has been around in the SAP ecosystem for quite a while, but the playing field has changed a bit over the last years: some new players have entered the field (e.g. a managed Kyma offering), and the combination (or should I say the embrace) of the SAP stack and non-SAP platforms like Microsoft Azure has also gained attention. With this post I would like to dig into one possible scenario for such an extension, focusing on the “new” players, and see where the pitfalls and issues are, be it because of combining SAP and Microsoft or due to problems in one platform or the other.

The storyline from a business perspective is the following: let us assume a customer order was placed and the article was in stock at that point in time. During the processing of the order, it becomes apparent that the article cannot be delivered in time. Of course, we could now plainly inform the customer about the delay and that is it, but maybe the customer was already subject to several delays in prior orders and complained about that. So it makes sense to check the order history and, if there are delayed orders, whether there have been complaints about them. Based on this analysis we can decide how to proceed and maybe give an additional discount when informing the customer about the delay.

Disclaimer: No idea if this makes perfect sense from a business perspective, but for the further implementation, let us just assume that it makes sense 😊

What would that look like from a system perspective?

  • An SAP Commerce Cloud system raises a “processing” event stating that an order is delayed

  • This event is retrieved by a Kyma function that then orchestrates several steps:

    • First, it makes a call back to Commerce Cloud to get further information about the order, i.e. the ID of the business partner, which is needed to fetch the order history.

    • Then it calls an ABAP system (e.g. Steampunk) to get the order history of the business partner.

    • If the order history contains delayed orders, the function triggers a check whether complaints connected to the orders exist. For that it calls an API that analyzes whether there are complaints for the orders and returns the sentiment of the complaints. We assume that this information is stored in a Microsoft Outlook account. To keep things self-contained, it makes sense to develop this building block in Azure and provide the API via an Azure Function, i.e. a Durable Function.

    • Based on the analysis, the Kyma function can call the Business Rules service to decide upon the next steps.




The overview of the solution looks like this:


In the following sections we will walk through the implementation of the scenario, but first a few words to manage expectations:

  • The code for the sample is available on GitHub (you find the reference in each section), but it is not production ready. There are some rough edges that need to be removed, but I would consider it a good starting point.

  • I do not have access to “real” systems like an SAP S/4HANA system or an SAP Commerce Cloud, and I assume that is also true for some readers. So, I focused on what is possible within the SAP Cloud Platform trial, and for the rest the motto is “fake it until you make it”. As we are mostly dealing with APIs, this restriction does not really matter.
    The story is different for Azure, as there are only “productive” accounts with free tiers.

  • The emails for the complaints are stored in my testing mail account … as mentioned: fake it until you make it.


Let us start the journey … this is the way.

Step 0: Set up Kyma on SAP Cloud Platform


As Kyma is the central hub of our scenario, make sure that you have set up Kyma in your SCP trial. You find some information about that in this blog post https://blogs.sap.com/2020/10/09/kyma-runtime-available-in-trial-and-now-we-are-complete/ or in this video: https://youtu.be/uhkbbH7oS5g

Step 1: Event from the Commerce Cloud and Callback (Mockup)



Components: Varkes, Kyma

Where to find the code: https://github.com/lechnerc77/ExtensionSample-CommerceMock

The initial trigger is the SAP Commerce Cloud event. How do we achieve that without an SAP Commerce Cloud system at hand? The Kyma ecosystem helps us a lot here, as it has Varkes (https://github.com/kyma-incubator/varkes) in its portfolio. Varkes is a framework for mocking applications and providing the necessary tooling for binding the mocked applications to Kyma. Even better, SAP provides prepopulated implementations for some SAP CX systems, including SAP Commerce Cloud, available on GitHub: https://github.com/SAP-samples/xf-application-mocks.

This repository is the baseline for our mock. We make some adaptations, as we do not need the full-fledged mock, and consequently throw out some files from the API folder, reducing the APIs to the commerce-services and events:


So far, we are fine for the event that triggers the scenario. Next, we must tell the mock what to return when the Kyma function calls back to get more information about the delayed order, i.e. the ID of the business partner. To achieve this, we adjust the app.js file. If you look into that file, you will see that there is a customizeMock function that allows us to define what data is returned when a specific route is matched. We code it so that a business partner ID is returned.
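A minimal sketch of the idea, assuming an Express-style app object; the route and payload fields are illustrative, not the exact code from the repository:

```js
// app.js (excerpt): return a fixed business partner ID when the
// Kyma function calls back for details of the delayed order
function customizeMock(app) {
  app.get("/rest/v2/:baseSiteId/users/:userId/orders/:code", (req, res) => {
    res.status(200).json({
      code: req.params.code,          // the order ID from the event
      businessPartnerId: "10100001",  // hard-coded mock value
      status: "DELAYED"
    });
  });
}
```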


Now we need to bring that code into Kyma. First, we must build a Docker image. The repository contains the Dockerfile as well as a Makefile. I assume you use Docker Hub as the image repository, so you must replace the Docker ID in the Makefile and then run

make push-image

After that we create a deployment in Kyma that will serve as our Commerce Cloud Mock system. The corresponding yaml files are available in the deployment folder of the repository:

  • The k8s.yaml file contains the definition of the deployment (here again, replace the Docker ID for the container image with yours) as well as the definition of the service

  • The apirule.yaml file contains the definition of the API rule, sketched below
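For orientation, a Kyma API rule is a custom resource that exposes the service; a minimal sketch, assuming illustrative names, port, and access strategy (the repository's apirule.yaml is authoritative):

```yaml
apiVersion: gateway.kyma-project.io/v1alpha1
kind: APIRule
metadata:
  name: commerce-mock
spec:
  gateway: kyma-gateway.kyma-system.svc.cluster.local
  service:
    name: commerce-mock   # the service defined in k8s.yaml
    port: 8080
    host: commerce-mock   # exposed as commerce-mock.<cluster-domain>
  rules:
    - path: /.*
      methods: ["GET", "POST"]
      accessStrategies:
        - handler: noop   # no extra authentication for the mock
```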


We deploy the app in a dedicated namespace via

kubectl -n <your namespace> apply -f k8s.yaml

and the API rules via

kubectl -n <your namespace> apply -f apirule.yaml

The deployment takes a while, so in the meantime we set up the prerequisites to allow other Kyma apps to consume the events and services of the Commerce Cloud Mock. You find the detailed procedure either in the blog post https://blogs.sap.com/2020/06/17/sap-cloud-platform-extension-factory-kyma-runtime-mock-applications... or in this walk-through video: https://youtu.be/r9mlTXHfnNM

As a rough guide: you must register a Commerce Cloud system in the SAP Cloud Platform:


In this process you get a token. Switch back to the deployed Commerce Cloud Mock app in Kyma and access the connection overview via the link in the API rule:


Here you find a connect button in the upper right corner where you must paste your token. After that, the mock system is connected to SAP Cloud Platform.


The next step is to create a formation in the SAP Cloud Platform Cockpit:


This way we link the system to the subaccount.

Finally, we expose the APIs and events from our mock system via the “Register All” button in the Commerce Cloud Mock dashboard:


Result: the foundation for the first two steps handled by the Kyma function is in place.

In the next step we retrieve the order history from the ABAP system.

Step 2: Retrieve Order History from Backend



Components: ABAP Steampunk (at least I tried), Kyma

Where to find the code (you will see later why there are two repos):

We expose the order history via an OData service made available by the ABAP environment on SAP Cloud Platform, aka Steampunk (at least that was my plan).

Setting up the ABAP environment on SAP Cloud Platform is easy, as a so-called booster is available for that:


Triggering the booster executes the setup, and after the execution you get the service key needed for connecting to your project in Eclipse. For details on the setup see: https://developers.sap.com/tutorials/abap-environment-trial-onboarding.html

We create a simple OData service that only needs to read the data, so no fancy behavior definition is required. Basically, we create a database table with the information, the CDS view, the corresponding service definition, and the service binding as well as the communication scenario.

This is straightforward and we end up with a simple view.
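A sketch of what such a view can look like, with illustrative table and field names (not the exact objects from the project):

```abap
@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Order history of a business partner'
define root view entity Z_I_OrderHistory
  as select from zorder_history
{
  key order_id,
      business_partner,
      order_date,
      delivery_status
}
```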



 

We fill it with some data via the interface if_oo_adt_classrun:


The OData service delivers the expected results:



Next, we create the communication arrangement. For those of you who do not work much with ABAP Steampunk on SAP Cloud Platform trial (like me), I have a little surprise: the creation of a communication arrangement (as described here: https://developers.sap.com/tutorials/abap-environment-communication-arrangement.html) is not possible on trial. The corresponding Fiori apps are not available. This means that we cannot call our service from the outside, as we cannot fetch a token for the call in the intended way. (Certainly, there might be a way to somehow retrieve the token, but that is a heck of a workaround that I did not want to explore.) This is the moment where we take a deep breath (it is okay to swear – be assured I did), calm down, and fake it until SAP makes it.

What do we need? An API that returns the data of an order history. What do we have? Kyma. So, we create another mock, this time for the ABAP backend. I did so by building a simple Go application that returns the data as the ABAP service would have. You find the code, as mentioned in the beginning, on GitHub: https://github.com/lechnerc77/ExtensionSample-ABAPMock

It is a very basic Go program that contains three hard-coded entries and exposes an API with an endpoint called orderhistory. Super simple, just to make things work and get the ABAP impediment out of the way.
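In essence, the program boils down to something like this (field names and port are illustrative; the repository is authoritative):

```go
package main

import (
	"encoding/json"
	"net/http"
)

// Order mimics an entry the ABAP OData service would have returned
type Order struct {
	OrderID         string `json:"orderId"`
	BusinessPartner string `json:"businessPartner"`
	Status          string `json:"status"`
}

// three hard-coded entries, just enough to drive the scenario
var orderHistory = []Order{
	{OrderID: "10001", BusinessPartner: "10100001", Status: "DELAYED"},
	{OrderID: "10002", BusinessPartner: "10100001", Status: "DELIVERED"},
	{OrderID: "10003", BusinessPartner: "10100001", Status: "DELAYED"},
}

func main() {
	http.HandleFunc("/orderhistory", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(orderHistory)
	})
	http.ListenAndServe(":8080", nil)
}
```

The next steps are the same as for the Commerce Cloud Mock deployment: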

  • Build the Docker image (Dockerfile)

  • Push the image to Docker Hub (Makefile)

  • Create a namespace for the mock and deploy it using the deployment.yaml

  • Deploy the API rule to access the API using the apirule.yaml


In contrast to Steampunk we can put in place an OAuth flow to get the token to access the secured endpoint:


For that we set up an OAuth client in Kyma that provides the client ID and the client secret.
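In Kyma, such an OAuth client can be declared as a custom resource; a minimal sketch, assuming illustrative name and scope:

```yaml
apiVersion: hydra.ory.sh/v1alpha1
kind: OAuth2Client
metadata:
  name: abap-mock-client
spec:
  grantTypes:
    - client_credentials
  scope: "read"
  secretName: abap-mock-oauth  # Kyma stores client_id and client_secret here
```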

Achievement “fetch the order history” unlocked, although not in the way planned. As we now have the order history, i.e. the prior orders of the business partner including their state, we can cover the next step, namely checking whether there are any emails connected to the orders and executing a sentiment analysis on them.

Step 3: Get Emails with potential complaints and do sentiment analysis



 

Components: Azure Durable Functions, Microsoft Graph, Cognitive Services

Where to find the code: https://github.com/lechnerc77/ExtensionSample-DurableFunction

We assume that the correspondence with respect to orders is collected in a central Outlook mailbox. We must search for the emails that are related to an order and then trigger the sentiment analysis for them. For the first part we use Microsoft Graph (https://docs.microsoft.com/en-Us/graph/overview), which offers us a central and unified entry point to interact with the Microsoft 365 universe. For the latter we use the Text Analytics service that is part of the Cognitive Services on Azure (https://azure.microsoft.com/en-us/services/cognitive-services/text-analytics/#features). Cognitive Services are pretrained ML services covering several areas like image recognition or, as in our case, text analytics.

We offer an HTTP endpoint for the caller to hand over the order history, orchestrate the steps from above in sequence, and we want to care about servers less, so we will make use of Microsoft's Function-as-a-Service (FaaS) offering aka Azure Functions (https://docs.microsoft.com/en-us/azure/azure-functions/). To be more specific, we use the Durable Functions extension that enables us to model the step sequence in code and still benefit from serverless paradigms. In other words, the extension allows us to bring some kind of stateful, workflow-like behavior to FaaS. If you want to dig deeper into that area, this blog https://blogs.sap.com/2020/02/17/a-serverless-extension-story-ii-bringing-state-to-the-stateless/ and this video https://youtu.be/pRhta19FNIg might be helpful.

Central Part: Azure Durable Function


The central part is a durable function that is triggered by an HTTP request. We secure the starter function via a function access key. This is not the best way, but okay for our proof of concept. You find further details on this topic here: https://docs.microsoft.com/en-us/azure/azure-functions/security-concepts.
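A minimal sketch of such an HTTP starter in Node.js; the orchestrator name is illustrative, and the access key is enforced via authLevel "function" in function.json:

```js
// HttpStart/index.js
const df = require("durable-functions");

module.exports = async function (context, req) {
  const client = df.getClient(context);
  // the order history posted by the caller becomes the orchestration input
  const instanceId = await client.startNew("Orchestrator", undefined, req.body);
  context.log(`Started orchestration with ID = '${instanceId}'.`);
  // hand back the status-query URLs to the caller
  return client.createCheckStatusResponse(context.bindingData.req, instanceId);
};
```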

The durable orchestrator handles two activity functions representing the two steps that we want to execute: the first one searches the emails, and the second one calls the Cognitive Services API. The only additional logic in the orchestrator is that it makes the result of the first activity available to the second one and returns the result of the second one to the caller.
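A sketch of the orchestrator (activity names are illustrative):

```js
// Orchestrator/index.js
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
  const orderHistory = context.df.getInput();

  // step 1: search the mailbox for emails mentioning the orders
  const emails = yield context.df.callActivity("GetEmails", orderHistory);

  // step 2: run the sentiment analysis on those emails
  const sentiment = yield context.df.callActivity("AnalyzeSentiment", emails);

  return sentiment;
});
```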


The business logic is exclusively located in the activities.

Fetching Emails via Microsoft Graph


The first activity function takes the order numbers and calls the Microsoft Graph API to get all emails that contain the order numbers. To make this call possible we must allow the function to access the emails. This is done via an App registration in the Azure Active Directory that contains the permissions:


Be aware that we have no user interaction, so we need an application permission and not a delegated permission. An admin must grant consent to set it active. The app registration provides us the necessary secrets to make the call.

The activity then uses the OAuth client credentials flow to get the bearer token and then reads the emails.


One remark: you will see that the fetching of the bearer token as well as the call to Microsoft Graph is done manually via Axios (https://www.npmjs.com/package/axios). There are libraries that you can use for the authentication part (msal.js – https://github.com/AzureAD/microsoft-authentication-library-for-js) as well as for the Microsoft Graph API (Microsoft Graph SDK – https://github.com/microsoftgraph/msgraph-sdk-javascript), but both do not (yet) support the silent approach via client credentials that we use. Not a big deal, but more work on our side.
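A sketch of the manual flow; the tenant, mailbox, and environment variable names are illustrative assumptions (the repository is authoritative):

```js
// GetEmails/index.js (excerpt)
const axios = require("axios");
const qs = require("querystring");

module.exports = async function (context, orderIds) {
  // 1. client credentials flow against Azure AD
  const tokenResponse = await axios.post(
    `https://login.microsoftonline.com/${process.env.TENANT_ID}/oauth2/v2.0/token`,
    qs.stringify({
      client_id: process.env.CLIENT_ID,
      client_secret: process.env.CLIENT_SECRET,
      scope: "https://graph.microsoft.com/.default",
      grant_type: "client_credentials",
    })
  );
  const token = tokenResponse.data.access_token;

  // 2. search the central mailbox for mails mentioning the order IDs
  const mails = [];
  for (const orderId of orderIds) {
    const result = await axios.get(
      `https://graph.microsoft.com/v1.0/users/${process.env.MAILBOX}/messages`,
      {
        headers: { Authorization: `Bearer ${token}` },
        params: { $search: `"${orderId}"` },
      }
    );
    mails.push(...result.data.value);
  }
  return mails;
};
```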

Sentiment Analysis via Cognitive Services


The second activity takes the emails and calls the Text Analytics service to do the sentiment analysis. For that we create a Cognitive Services resource in the Azure Portal, which then also provides us the keys for accessing the HTTP endpoints.

The activity itself basically calls the endpoint and hands over the emails. We then extract the number of positive, neutral, and negative sentiments from the result and hand it back to the orchestrator. This time we are luckier than in the first activity, as we can make use of the Text Analytics client provided by Microsoft. Here is a sketch of how that can look (endpoint and key variable names are illustrative):
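```js
// AnalyzeSentiment/index.js (excerpt)
const {
  TextAnalyticsClient,
  AzureKeyCredential,
} = require("@azure/ai-text-analytics");

module.exports = async function (context, emails) {
  const client = new TextAnalyticsClient(
    process.env.TEXT_ANALYTICS_ENDPOINT,
    new AzureKeyCredential(process.env.TEXT_ANALYTICS_KEY)
  );

  // analyze the email bodies and count the sentiments
  const documents = emails.map((mail) => mail.bodyPreview);
  const results = await client.analyzeSentiment(documents);

  const counts = { positive: 0, neutral: 0, negative: 0 };
  for (const result of results) {
    if (!result.error && result.sentiment in counts) {
      counts[result.sentiment] += 1;
    }
  }
  return counts;
};
```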


Last, we deploy the durable function to Azure using the Azure Functions extension for VS Code (https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions). That was it: no drama, no unexpected impediments on the Microsoft Azure side of the house.

Now time to glue things together in a Kyma function.

Step 4: Gluing things together in Kyma



Components: Kyma Function

Where to find the code: https://github.com/lechnerc77/ExtensionSample-KymaFunction

We create a dedicated namespace for this function, e.g. cx-extension. To get a connection to the Commerce Cloud Mock, we create a namespace binding to our namespace in the integration section of the Kyma dashboard:


Due to this binding, we can create service instances of the Commerce Cloud events as well as of the Commerce Cloud APIs in our namespace, which we later consume in the Kyma function:


You find a decent description of the procedure in this blog post https://blogs.sap.com/2020/06/17/sap-cloud-platform-extension-factory-kyma-runtime-commerce-mock-eve... or a walk-through in this video: https://youtu.be/r9mlTXHfnNM

Now we can create the function … or can we? Let us recap what the function should do:

  • It is triggered by an event – this is a configuration in the function UI – check

  • It calls the Commerce Cloud Mock API via the service instance that we bind to the function to get the business partner ID – that is a configuration in the function UI – check

  • It calls the ABAP backend (mock) to get the order history – this is an external API, and we need to get the OAuth token first. We could store the corresponding HTTP endpoints as environment variables in the function UI, and we could store the client ID and client secret there too … no, we do not want to do that; that is what Secrets in Kubernetes are for. But we cannot reference them in the UI – investigation needed

  • It calls the Azure Function to get the sentiment analysis results. Same story here: we need the API key, and we want to store it safely – investigation needed


The good news at this point is: we can use secrets and config maps in Kyma Functions. The bad news is: we cannot do that via the UI.

We create a config map containing all the relevant configuration information and a secret for all the confidential information, and deploy both into the namespace of the function. Now we need to reference these values in the function. How can we do that without the UI? The answer: via a deployment.yaml file for the function. We can reference the environment variables there just as for a “regular” deployment.
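A sketch of such a function deployment, assuming illustrative resource names and keys (the repository's deployment.yaml is authoritative):

```yaml
apiVersion: serverless.kyma-project.io/v1alpha1
kind: Function
metadata:
  name: order-delay-handler
spec:
  runtime: nodejs12
  env:
    - name: ABAP_MOCK_URL            # plain configuration from the config map
      valueFrom:
        configMapKeyRef:
          name: extension-config
          key: abapMockUrl
    - name: OAUTH_CLIENT_SECRET      # confidential value from the secret
      valueFrom:
        secretKeyRef:
          name: extension-secret
          key: clientSecret
  source: |
    module.exports = {
      main: async function (event, context) {
        // the complete function code goes here (see the repository)
      }
    }
```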


The only downside is that we will not see them in the function UI after deployment. Using kubectl describe shows you what is really part of your function's environment variables.

You can also put the complete function code in the deployment file, but my plan was to deploy a basic skeleton and adapt the code in the Kyma UI afterwards. This plan changed, as there seems to be a bug in the building/deployment of the function (see https://github.com/kyma-project/kyma/issues/10303) which screws up the environment variables. So we must put the complete code into the yaml file. After successful deployment, the function should be in status “Running”:


Next, we wire up the function with the Commerce Cloud Mock event and the API using the service instances:



One further hint I stumbled across: there seems to be a limitation within the Kyma cluster on trial with respect to available resources (see https://answers.sap.com/questions/13222456/kyma-build-of-function-run-infinitely.html). If your function gets stuck in the status “Build”, you might have too many resources allocated in your Kyma trial. To resolve it, remove some resources.

Now we should be done. Time to try things out. Let us trigger the function via an event from the Commerce Cloud Mock dashboard and see what the log tells us:



Our Kyma function gets triggered by the event:


We get the order history from our mocked ABAP service:


We get the result from our sentiment analysis:


 

So, things finally worked out!


But wait: we wanted to do something with the Business Rules service on SAP Cloud Platform, right?

What about the decision service … well I tried


UPDATE: Thanks to archana.shukla and team (see comments), the issues with the booster could be sorted out on my account, delivering the expected experience from the ABAP Steampunk booster. In addition, the Business Rules service setup changed with respect to the app router (see comment below and the updated help.sap.com section).

So expect some more to come in that area, am I right martin-pankraz 🙂

Ideally, the follow-up actions after the sentiment analysis are determined via a decision service or business rules. SAP Cloud Platform offers such a service, namely the Business Rules service. What is even better, there is a booster available for setting things up, comprising the Workflow and the Business Rules services:


As the booster for ABAP went through quite smoothly, I gave it a go, and what came back is:


Next stop: developers.sap.com to see if there is some tutorial to follow along, and yes, there is (https://developers.sap.com/tutorials/cp-starter-ibpm-employeeonboarding-1-setup.html), but this tutorial also uses the booster ☹.

Next try: help.sap.com (https://help.sap.com/viewer/0e4dd38c4e204f47b1ffd09e5684537b/Cloud/en-US/c045b537db3c4743a5e7d21d798...) 😞


Attention: these sections might be misleading, as they are relevant only for customers with a Portal service (kind of legacy, so to say). So when setting up your Business Rules as of now in the Cloud Foundry environment, just follow the description in the "Initial Setup" section of help.sap.com, ignoring the Portal-related topics.

So my journey with respect to business rules stopped at this point for the moment, but it will be continued.

Summary


Summarizing the journey of a side-by-side extension with Kyma and Microsoft Azure, I would state that building side-by-side extensions with these building blocks is a feasible task. From the SAP side of the house there is quite some value in Kyma. Nevertheless, you cannot achieve everything in Kyma, so taking the best of different worlds can give rise to really cool new extensions, of course especially when bringing in Microsoft Azure (I am biased, okay 😉), and add a lot of value to the business processes of internal and external customers.

Looking more closely at the technical platforms, my take-aways are:

  • Kyma is the most promising environment on SAP Cloud Platform when it comes to building extensions (I leave the commercials aside). The integration with the SAP ecosystem is smooth, but still a bit limited. In addition, there are some pitfalls that I did not expect. They show that the maturity of the platform needs to be further improved, but from my perspective the foundation to do so is there.
    Another important highlight is the documentation on https://kyma-project.io/ and the samples available on GitHub, which support development a lot.

  • Microsoft Azure complements the development with numerous beneficial offerings and a very pleasant development experience. Everything worked as expected; some convenience features would be nice though.
    Overall, mixing Microsoft Azure into side-by-side extensions for SAP is a very valid option that you should consider when designing extensions.
    Documentation … let us put it this way: compared to SAP there are many, many more developers using the platform, so the sources of information are great and so is the official help.

  • The SAP Cloud Platform trial has limitations that you need to be aware of. It depends on the use case whether the offering is sufficient to build real-world proofs of concept or to get your hands dirty with some services (with a low entry barrier).


One thing at the very end: special thanks to

  • the Kyma team, especially marco.dorn, jamie.cawley, and gabbi, who supported me in the SAP Community and behind the scenes. Highly appreciated!

  • martin.pankraz for triggering this project and giving feedback on the story
