Technology Blogs by SAP
martinfrick
Product and Topic Expert
Hey folks!

Another AI blog post, right? Nope, not this time – let's finally stop talking and start walking! We've got some real goodies to share with you, whether you're an SAP customer, an SAP partner, or just one of those strange but adorable enthusiasts like me who can't get enough of SAP.

At TechEd 2023, SAP dropped some major news on the Artificial Intelligence front, including our new friend Joule, the SAP HANA Cloud Vector Engine, and the star of today's show, the generative AI hub in SAP AI Core, wrapped up in a real-world use-case.

Too busy to read but still want the essential bits?
Take a quick peek at our SAP-samples repository and the Generative AI reference architecture.
Make sure you catch the brand-new openSAP course "Generative AI at SAP".


With our fresh-off-the-press reference architecture guidance for Generative AI scenarios, you'll be more than ready to kickstart your own AI journey on SAP BTP. Whether you're building extensions, standalone solutions, or selling your own SaaS apps on SAP BTP, you can count on the generative AI hub to infuse your architecture with AI superpowers.

And it doesn't stop there! You'll also get to leverage best practices like a great SAPUI5 app, a robust SAP HANA Cloud back end, a single- and multitenant CAP application, and an API for your SaaS customers based on the SAP Service Broker Framework. This sample scenario has got it all, folks! Great news: our sample is available for both the SAP BTP, Kyma Runtime and the SAP BTP, Cloud Foundry Runtime!


Generative AI reference architecture for multitenant SAP BTP applications


 

This blog post is gonna shine a spotlight on how customer and partner apps on SAP BTP can take advantage of SAP's freshest service offering: the new generative AI hub as part of SAP AI Core. This is all about helping you add some serious AI capabilities to your single- and multitenant solutions.






If you're looking to take a deep dive into building multitenant applications on SAP BTP, head over to our SAP-samples GitHub repo "Develop a multitenant Software as a Service application in SAP BTP using CAP", which will steer you through the whole process.

To make things a bit easier and to lay down some solid groundwork for all you developers diving headfirst into the SAP AI space, we're gonna show off a few neat AI features. These are part of a new SAP-samples GitHub repository all built around a make-believe travel agency we're calling ThorTours.

And here's the fun part: we're gonna make ThorTours' customer support processes even better using Generative AI, reducing the effort of responding to customers so incoming requests are handled faster, while at the same time boosting quality and consistency through reproducible responses. So hold on to your hats and let's dive in!

 


Customer Service app before the introduction of GenAI Mail Insights


 

Our GenAI Mail Insights sample solution is the new best friend of all ThorTours customer support employees. It's all about giving you top-notch mail insights and automation in a comprehensive (multitenant) SAP BTP solution enhanced with a bunch of cool AI features. How does it do it? It uses Large Language Models (LLMs) via the generative AI hub in SAP AI Core to analyze incoming mails, offering categorization, sentiment analysis, and urgency assessment, and it even extracts key facts to add a personalized touch.

What's the cherry on top? It's the super innovative feature of Retrieval Augmented Generation (RAG) using mail embeddings. This helps figure out how similar a new request is to previously answered mails and infuses those confirmed responses as additional context to reduce hallucinations, ensuring you always get consistent service quality. And it doesn't stop there: it also summarizes and translates incoming mail enquiries. But wait, there's more!

Our sample setup boosts automation by generating potential answers for customer inquiries, enhancing response accuracy and speed like a boss. And the best part? It's not just for the travel industry – it's adaptable to various scenarios using custom schemas. So, whatever your need, we've got you covered!


Customer Service app after the introduction of GenAI Mail Insights


 

Why does this matter, you ask? Well, companies often struggle with customer support headaches, everything from reading through long mails manually to language hurdles and a lack of automation. What our demo solution brings to the table is quicker, more personalized, and more consistent customer service, cost savings, flexibility, and an edge over the competition. And we're not stopping there! In upcoming releases, we're even thinking about an integration with SAP Cloud Solutions such as SAP Concur or a Microsoft Exchange Online inbox. This could supercharge operations and give a boost to data-driven decision-making.

Alright, enough talking for now – let's roll up our sleeves and dive into some of the techie highlights you can uncover in our SAP-samples repository. Ready to kick off with the first steps of processing new mail enquiries? This will lead us straight into our first challenge and the GenAI spotlight of our use-case. First things first, we're going to see how our solution, built on SAP BTP, actually links up to Large Language Models using the generative AI hub capabilities – the latest and greatest service offering as part of SAP AI Core, announced at TechEd 2023.

If you've had a go at existing LLM offerings like Azure OpenAI, you'll find the setup pretty familiar. When an app needs to hook up with a Large Language Model (we're talking Azure OpenAI in our scenario), it connects to the generative AI hub and calls the URL, or what we often call the inference endpoint, of a tenant-specific deployment. Each tenant in a multitenant setup gets its own dedicated SAP AI Core resource group onboarded, which is awesome because it offers the flexibility for individual model deployments and upcoming metering features.


Tenant specific SAP AI Core Resource Groups and LLM Deployments
(Preview Version - subject to change - check latest openSAP course)
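To make that onboarding step a bit more tangible, here's a minimal sketch (not the repository's actual implementation) of creating a dedicated resource group for a new subscriber tenant via the SAP AI Core AI API. The admin endpoint path, the payload, and the environment variable names are assumptions you'd verify against the official AI API reference.

```typescript
// Minimal sketch: create a dedicated SAP AI Core resource group for a subscriber tenant.
// Endpoint path, payload, and environment variable names are assumptions.
import axios from "axios";

const AI_API_URL = process.env.AI_API_URL!;        // AI API URL of the SAP AI Core instance
const UAA_URL = process.env.AI_CORE_AUTH_URL!;     // OAuth token endpoint from the service key
const CLIENT_ID = process.env.AI_CORE_CLIENT_ID!;
const CLIENT_SECRET = process.env.AI_CORE_CLIENT_SECRET!;

// Fetch an OAuth token for SAP AI Core using the client-credentials flow
async function getAiCoreToken(): Promise<string> {
  const response = await axios.post(
    `${UAA_URL}/oauth/token`,
    new URLSearchParams({ grant_type: "client_credentials" }),
    { auth: { username: CLIENT_ID, password: CLIENT_SECRET } }
  );
  return response.data.access_token;
}

// One resource group per subscriber tenant keeps model deployments
// and (future) metering strictly separated
export async function onboardTenantResourceGroup(tenantId: string): Promise<void> {
  const token = await getAiCoreToken();
  await axios.post(
    `${AI_API_URL}/v2/admin/resourceGroups`,
    { resourceGroupId: tenantId },
    { headers: { Authorization: `Bearer ${token}` } }
  );
}
```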


 

When it comes to our sample scenario, we're sending payloads to the inference URL for Chat Completion and for creating Embeddings. This process is a piece of cake because it follows the Azure OpenAI API specification, which means requests and custom modifications are a breeze. Just a heads up though: this is still an early release of the generative AI hub, and we might simplify the integration even more in the future, thanks to the rising popularity of and contributions to open-source resources like LangChain.
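For illustration, a minimal sketch of such a Chat Completion request could look like the following (not the repository's actual code). The api-version value, the AI-Resource-Group header, and the exact deployment URL handling are assumptions to double-check against the generative AI hub documentation.

```typescript
// Minimal sketch: Chat Completion against a tenant-specific deployment in the generative AI hub.
// The payload mirrors the Azure OpenAI API; header names and api-version are assumptions.
import axios from "axios";

export async function chatCompletion(
  deploymentUrl: string,  // inference URL of the tenant-specific deployment
  resourceGroup: string,  // SAP AI Core resource group of the subscriber tenant
  token: string,          // OAuth token for the SAP AI Core instance
  userMessage: string
): Promise<string> {
  const response = await axios.post(
    `${deploymentUrl}/chat/completions?api-version=2023-05-15`,
    {
      messages: [
        { role: "system", content: "You are a helpful travel-support assistant." },
        { role: "user", content: userMessage }
      ],
      max_tokens: 500,
      temperature: 0
    },
    {
      headers: {
        Authorization: `Bearer ${token}`,
        "AI-Resource-Group": resourceGroup,
        "Content-Type": "application/json"
      }
    }
  );
  // Same response shape as Azure OpenAI: the first choice carries the generated answer
  return response.data.choices[0].message.content;
}
```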






To learn more about the generative AI hub (incl. the latest preview) and how SAP is leveraging Generative AI, please check out the latest openSAP course Generative AI at SAP.

For the time being, we've got some handy sample wrappers that make connecting to the generative AI hub deployments super easy. They take care of handling multitenancy and can be swiftly adapted to other available Large Language Models. Plus, these wrappers ensure compatibility with available LangChain features once the LLM object is instantiated. This allows you as a developer to keep your eyes on the prize: the actual business process!


Calling the Chat Completion inference URL of a tenant-specific Deployment
(Implementation subject to change)
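Here's a minimal sketch of what such a wrapper could look like, reusing the hypothetical chatCompletion helper from above and the 2023-era LangChain.js import paths (which may differ in newer versions). Treat it as an illustrative assumption rather than the repository's actual implementation.

```typescript
// Minimal sketch: a custom LangChain LLM wrapper around a generative AI hub deployment.
// Import paths follow the 2023-era LangChain.js layout and may differ in newer versions;
// chatCompletion is the hypothetical helper sketched earlier.
import { LLM, BaseLLMParams } from "langchain/llms/base";
import { chatCompletion } from "./chatCompletion";

interface GenAiHubLLMParams extends BaseLLMParams {
  deploymentUrl: string;  // tenant-specific inference URL
  resourceGroup: string;  // resource group of the current subscriber tenant
  token: string;          // OAuth token for the SAP AI Core instance
}

export class GenAiHubLLM extends LLM {
  private readonly params: GenAiHubLLMParams;

  constructor(params: GenAiHubLLMParams) {
    super(params);
    this.params = params;
  }

  _llmType(): string {
    return "generative-ai-hub";
  }

  // LangChain routes every completion request through _call, so chains,
  // output parsers, and retrievers can be reused unchanged on top of it
  async _call(prompt: string): Promise<string> {
    const { deploymentUrl, resourceGroup, token } = this.params;
    return chatCompletion(deploymentUrl, resourceGroup, token, prompt);
  }
}
```

Once such an LLM object is instantiated per tenant, the rest of the code stays model-agnostic: swapping the underlying deployment only touches the wrapper.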


 

So, we've got a customer mail that's just a big block of text, right? Our goal is to wrangle that into some useful info we can store away in our database. We're using the brainpower of a Large Language Model, LangChain, and custom schemas (courtesy of the zod npm package) to make this happen. It's all about having a flexible and hands-on setup with the Large Language Model.

We're using custom schemas and auto-generated format instructions (that's some prompt engineering magic right there!) to get a nicely structured JSON result. This we can process further and easily store in our SAP HANA Cloud database, with our beloved CAP framework making that task child's play.

The parsing features of LangChain and the custom schema package are like two peas in a pod. They make sure that the LLM response fits into a properly typed object without any further type conversion hassle. Pretty cool, right? Sounds almost too good to be true! Using this kind of custom schema definition allows you to adapt the solution to any other kind of support scenario by simply updating the respective schema details!


Using a custom schema (zod + LangChain) for a programmatic LLM interaction 
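As a stand-in for the screenshot referenced above, here's a minimal sketch of the extraction step, assuming the hypothetical GenAiHubLLM wrapper from before and illustrative schema fields; the sample repository defines its own schema and field names.

```typescript
// Minimal sketch: extract typed mail insights with a custom zod schema and LangChain's
// StructuredOutputParser. Schema fields are illustrative, not the sample's actual data model.
import { z } from "zod";
import { StructuredOutputParser } from "langchain/output_parsers";
import { PromptTemplate } from "langchain/prompts";
import { GenAiHubLLM } from "./GenAiHubLLM";

// Adapting the solution to another support scenario mostly means changing these fields
const mailInsightsSchema = z.object({
  category: z.string().describe("Support category, e.g. booking, cancellation, complaint"),
  sentiment: z.enum(["positive", "neutral", "negative"]).describe("Overall sentiment"),
  urgency: z.number().min(0).max(2).describe("0 = low, 1 = medium, 2 = high"),
  summary: z.string().describe("Short summary of the customer request"),
  keyFacts: z.array(z.string()).describe("Key facts such as booking codes or travel dates")
});

export async function extractMailInsights(llm: GenAiHubLLM, mailBody: string) {
  const parser = StructuredOutputParser.fromZodSchema(mailInsightsSchema);

  // The auto-generated format instructions are plain prompt engineering:
  // they tell the model exactly which JSON structure to return
  const prompt = new PromptTemplate({
    template: "Analyze the following customer mail.\n{format_instructions}\nMail:\n{mail}",
    inputVariables: ["mail"],
    partialVariables: { format_instructions: parser.getFormatInstructions() }
  });

  const response = await llm.call(await prompt.format({ mail: mailBody }));

  // Returns a properly typed object that CAP can then persist in SAP HANA Cloud,
  // e.g. INSERT.into("MailInsights").entries({ ...insights })
  return parser.parse(response);
}
```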


 

Apart from extracting a summary, insights, and relevant key facts from a new customer enquiry, we've also made life easier by automating the drafting of reliable response proposals for new mails. This brings us to our second reveal: a cool mix of Large Language Model (LLM) magic and the capabilities of what's known as Retrieval Augmented Generation (RAG). Also check the following blog post to learn more.

The concept is as straightforward as it is genius – we use an LLM to create a new response for a customer enquiry, but here's the twist: we add in some extra context from previously answered similar mails.

While this might sound like a lot of effort, a very simple implementation is as easy as pie. All you need is access to the generative AI hub to generate so-called embeddings for incoming mails, plus a suitable vector store to hold these embeddings. Embeddings are vector representations of versatile input formats (e.g., text, images, videos) in a high-dimensional space, capturing semantic relationships for natural language processing tasks. The SAP HANA Cloud Vector Engine will soon support this requirement as well!

Once you've got these two features ready to go, you only need a few lines of custom code to spot similar mails in real time and inject the previous answer as additional context – all part of the regular prompt engineering process.


Leveraging the power of Retrieval Augmented Generation
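Below is a minimal sketch of that idea, assuming an Azure-OpenAI-style embeddings deployment, the hypothetical GenAiHubLLM wrapper, and a simplified in-memory store with cosine similarity; a real solution would persist embeddings in a proper vector store such as the SAP HANA Cloud Vector Engine once available.

```typescript
// Minimal sketch: Retrieval Augmented Generation for mail responses.
// Embed the new mail, find the most similar previously answered mail, and inject its
// confirmed answer as grounding context. Endpoint shape and headers are assumptions.
import axios from "axios";
import { GenAiHubLLM } from "./GenAiHubLLM";

interface AnsweredMail {
  body: string;
  confirmedAnswer: string;
  embedding: number[];
}

// Create an embedding via a tenant-specific embedding deployment (assumed URL and headers)
async function embed(deploymentUrl: string, resourceGroup: string, token: string, text: string): Promise<number[]> {
  const response = await axios.post(
    `${deploymentUrl}/embeddings?api-version=2023-05-15`,
    { input: text },
    { headers: { Authorization: `Bearer ${token}`, "AI-Resource-Group": resourceGroup } }
  );
  return response.data.data[0].embedding;
}

// Cosine similarity between two embedding vectors
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, ai, i) => sum + ai * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

export async function answerWithRag(
  llm: GenAiHubLLM,
  embeddingUrl: string,
  resourceGroup: string,
  token: string,
  newMail: string,
  history: AnsweredMail[]
): Promise<string> {
  const newEmbedding = await embed(embeddingUrl, resourceGroup, token, newMail);

  // Pick the most similar previously answered mail as grounding context
  const closest = history
    .map((m) => ({ ...m, score: cosineSimilarity(newEmbedding, m.embedding) }))
    .sort((a, b) => b.score - a.score)[0];

  const prompt =
    `You are a travel-support agent. Draft a reply to the new customer mail.\n` +
    `Use the confirmed answer of a similar past mail as guidance and stay consistent with it.\n\n` +
    `Similar past mail:\n${closest.body}\n\nConfirmed answer:\n${closest.confirmedAnswer}\n\n` +
    `New mail:\n${newMail}\n\nReply:`;

  return llm.call(prompt);
}
```

The interesting part is the prompt: the confirmed answer of the most similar past mail grounds the model, which is exactly what keeps the generated responses consistent.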


 

Last but not least, let us try to handle the language barrier! As you all know, many of our fabulous SAP customers and partners are global rockstars. This means dealing with international customers and team members from various countries. So, getting reliable translations is super crucial, right?






Our advice for top-notch translations in the SAP universe hasn't shifted a bit. The SAP Translation Hub is our go-to recommendation for translating texts and documents in productive SAP scenarios. (Need a refresher? Click here!)

Only for our simple, non-productive demo scenario are we mixing things up a bit and using Large Language Model capabilities – just for the sake of demonstration, mind you. The LLM translates incoming customer enquiries into the user's preferred language. And it doesn't stop there! When you reply to those queries in your comfy working language, guess what? The recipient gets your response plus an automated translation whipped up by the Large Language Model. Another interesting use case for Generative AI, showing the endless possibilities this new technology has to offer.
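Purely to illustrate this demo-only path (remember, SAP Translation Hub remains the recommendation for productive scenarios), such a translation step could be as simple as the sketch below, again assuming the hypothetical GenAiHubLLM wrapper; prompt wording and language handling are illustrative.

```typescript
// Minimal sketch of the demo-only translation step; not intended for productive use.
import { GenAiHubLLM } from "./GenAiHubLLM";

export async function translateMail(
  llm: GenAiHubLLM,
  mailBody: string,
  targetLanguage: string // the support user's preferred working language, e.g. "German"
): Promise<string> {
  const prompt =
    `Translate the following customer mail into ${targetLanguage}. ` +
    `Keep booking codes, names, and dates unchanged.\n\n${mailBody}\n\nTranslation:`;
  return llm.call(prompt);
}
```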


Explore further exciting use-cases and scenarios for Generative AI


 

Hungry for more knowledge? Get the ball rolling by diving into our Generative AI reference architecture and exploring our SAP-samples GitHub repository or the related Discovery Center mission. Once the generative AI hub in SAP AI Core goes public by the end of 2023, you'll be able to navigate our step-by-step guide to deploy our sample solution to your own SAP BTP account.

The generative AI hub as part of SAP AI Core has been generally available since Q4/2023, and the SAP HANA Cloud Vector Engine is planned for the Q1/2024 release (see the Road Map Explorer). Please check the available SAP-samples GitHub repository to learn more!

Big virtual high-five to all the team members who played a part in putting together this sample use-case for TechEd 2023, including kay_, Iyad Al Hafez, Adi Pleyer, Julian Schambeck, anirban.majumdar, and the entire SAP AI Core team around z1133666 and Andreas Roth.

Further links