
Introduction


If you are super excited about upcoming SAP offerings based on generative AI, such as JustAsk and Joule, and can't wait to test their full potential for your use case, believe me, you are not alone; count me in as well.

But in the meantime, why not use generative AI to build interactive dashboards that take a user query in natural language, change the dashboard filters, perform what-if analysis, and give the user insights on changes to the dashboard and KPIs? In this case, the dashboard gives the user suggestions based on applicable tax regulations.


Figure 1: The figure shows the dashboard.

The figure above provides a glimpse of the dashboard, which includes two additional features beyond those of a standard dashboard:

1. Just Query: An input field for the user's queries.

2. Tax Sensei: A feature that displays relevant insights applicable to the KPI and offers suggestions. It leverages the Retrieval Augmented Generation (RAG) framework.

In this blog, we will discuss how to build such powerful dashboards. At the end, there is a demonstration video showcasing all the capabilities discussed earlier.

Excited? Then, let's get started.

Technical Architecture



Figure 2: The figure shows the technical architecture.

This blog is divided into two parts as shown in the above figure:

Part 1: This focuses on the implementation of 'what-if' scenario functionality, which dynamically changes the dashboard in response to user queries.

Part 2: This focuses on the implementation of the Retrieval Augmented Generation (RAG) framework, which provides users with relevant insights. Don't worry, RAG is just a fancy name for giving relevant context to large language models.

Prerequisites:


There are three prerequisites that must be completed before we proceed.

1. OpenAI API Subscription:


Sign up for an OpenAI API subscription to access the GPT-4 Turbo model. You can do so here.

Why GPT-4 Turbo?

Simply because its performance has been the best so far.

Create a key as shown in Figure 3 from here, and make sure to store your API key securely, as it will be used in later steps.


Figure 3: The figure shows creation of API keys on OpenAI website.

2. Create a Pinecone trial account and store sample data in the vector database:


Create a free trial account on Pinecone here. You will already find an API key in your account, but you can easily create a new one if you want, as shown in Figure 4. Use the Pinecone API key to store vectors after embedding the text with the OpenAI Embedding API.

I used the LangChain library in Python to accomplish this, because it simplifies many steps, such as adding the text chunk as metadata of the vector stored in Pinecone.

This process isn't covered in this blog post, as it would require a separate blog in itself, but there are plenty of resources available that explain it in detail; for instance, this article could be a good start.
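That said, here is a minimal sketch of what the ingestion step can look like with LangChain; the file name, index name, chunk sizes, and embedding model are illustrative assumptions, not the exact setup behind this dashboard:

```python
# Minimal ingestion sketch (assumes the langchain-openai, langchain-pinecone,
# and langchain-text-splitters packages, with OPENAI_API_KEY and
# PINECONE_API_KEY set in the environment).
from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Hypothetical source document with the tax regulation text
raw_text = open("tax_regulations.txt").read()

# Split the document into overlapping chunks suitable for retrieval
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_text(raw_text)

# Embed each chunk and upsert it into the Pinecone index; LangChain stores
# the chunk text as vector metadata automatically
vectorstore = PineconeVectorStore.from_texts(
    chunks,
    OpenAIEmbeddings(model="text-embedding-ada-002"),
    index_name="tax-regulations",  # hypothetical index name
)
```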

I would like to mention at this point that the SAP HANA Vector Engine will be available from Q1 2024, which could then become the go-to solution for storing vectors.

 


Figure 4: The figure shows creation of API keys on Pinecone website.

3. Create Custom widgets:


In my previous blogs, I have already demonstrated the impressive capabilities of custom widgets, particularly how they allow you to connect to external services and consume them within SAP Analytics Cloud. The process of creating these widgets remains the same as discussed in those blogs, with minor adjustments to the headers, data format, and response handling to accommodate the requirements of the different API requests. We need three custom widgets to implement the architecture discussed earlier (a sketch of the underlying request follows the list):

1. A custom widget to make POST requests to GPT-4, using the information about the Completion API from here.

2. A custom widget to make POST requests to the OpenAI Embedding API, using the information from here.

3. A custom widget to make POST requests to the Pinecone query endpoint, using the information from here.
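As a rough illustration, here is the kind of request the first widget wraps, shown in Python for readability (inside SAP Analytics Cloud the widget issues an equivalent request in JavaScript); the API key and query are placeholders. The other two widgets follow the same pattern with different endpoints and payloads, as we will see in Part 2:

```python
# Sketch of the POST request behind the Completion API custom widget.
# The API key is a placeholder; endpoint and payload shape follow the
# OpenAI Completion API documentation.
import requests

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer <OPENAI_API_KEY>",
             "Content-Type": "application/json"},
    json={"model": "gpt-4-1106-preview",
          "messages": [{"role": "user", "content": "<user query>"}]},
)
answer = response.json()["choices"][0]["message"]["content"]
```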

Part 1: Applying 'What-If' Scenarios to the Dashboard


In this section, we will discuss how to implement the what-if analysis when the user submits a query. The idea is to use the user query to retrieve information about the relevant filters and KPIs and then use this information to change the dashboard. There are two methods to do this:

1. Prompts:


In this method, we use a prompt to retrieve information about the filters and KPIs from the user query.

I recommend doing some prompt testing to make sure you get the output in the right format. Prompt testing can be done in the OpenAI Playground, as shown in Figure 5.

After prompt testing, we make an API request to the GPT-4 model with information about the filters and KPIs on the dashboard, and ask GPT-4 to answer the user's query in a specific format, such as “Filters = [], KPI = []”, as shown in Figure 6. We then parse the server response and apply the result to the dashboard.

While this method is relatively simple to implement, one downside is potential inconsistency in the output. However, newer models like GPT-4 Turbo tend to deliver more consistent results.


Figure 5: The figure shows prompt method to get structured output.

 


Figure 6: The figure shows API response for prompting method.
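To make this concrete, here is a hedged sketch of the prompting method; the filter and KPI names and the exact prompt wording are illustrative, not the production prompt:

```python
# Sketch of the prompting method: instruct GPT-4 to answer only in the
# "Filters = [], KPI = []" format, then parse that string.
# Filter/KPI names and prompt wording are illustrative.
import re

SYSTEM_PROMPT = (
    "The dashboard has the filters Region and Year and the KPIs Revenue "
    "and Tax Liability. For every user query, respond ONLY in the format: "
    "Filters = [<filters>], KPI = [<kpis>]"
)

def parse_model_output(text: str) -> tuple[list[str], list[str]]:
    """Parse 'Filters = [...], KPI = [...]' into two lists."""
    filters = re.search(r"Filters\s*=\s*\[(.*?)\]", text).group(1)
    kpis = re.search(r"KPI\s*=\s*\[(.*?)\]", text).group(1)
    return ([f.strip() for f in filters.split(",") if f.strip()],
            [k.strip() for k in kpis.split(",") if k.strip()])

# Example: text returned by the Completion API call shown earlier
filters, kpis = parse_model_output("Filters = [EMEA, 2024], KPI = [Revenue]")
# filters == ['EMEA', '2024'], kpis == ['Revenue']
```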

 

2. OpenAI Function Calling:


This method uses the function calling capability of OpenAI. We need to define a function based on the description given in the documentation on the OpenAI website here. When a user makes a query, the large language model (LLM) determines which function to call and responds with a structured output that can be used in the application, as shown in Figure 7. In this case, the output is used to change the dashboard, as discussed in the previous method.

Please check out the blog OpenAI Function Calling: Integrate with SAP BTP destination for external API call; it gives a nice overview of OpenAI function calling.

Essentially, OpenAI function calling enables developers to generate consistent output in response to user queries, making it suitable for building applications on top of LLMs.

 


Figure 7: The figure shows response from Completion API using function calling method.
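As a minimal sketch, a function-calling request could look like the following; the function name set_dashboard_state and its schema are hypothetical, modeled on the OpenAI documentation:

```python
# Sketch of the function-calling method. The function schema is hypothetical;
# the request shape follows the OpenAI function-calling documentation.
import json
import requests

payload = {
    "model": "gpt-4-1106-preview",
    "messages": [{"role": "user",
                  "content": "What if revenue in EMEA drops by 10% in 2024?"}],
    "functions": [{
        "name": "set_dashboard_state",  # hypothetical function name
        "description": "Apply filters and select a KPI on the dashboard",
        "parameters": {
            "type": "object",
            "properties": {
                "filters": {"type": "array", "items": {"type": "string"}},
                "kpi": {"type": "string"},
            },
            "required": ["filters", "kpi"],
        },
    }],
}

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer <OPENAI_API_KEY>"},
    json=payload,
)

# Instead of free text, the model returns the function name plus JSON
# arguments, which the widget can apply to the dashboard directly
call = response.json()["choices"][0]["message"]["function_call"]
args = json.loads(call["arguments"])
# e.g. {"filters": ["EMEA", "2024"], "kpi": "Revenue"}
```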


Part 2: Giving users relevant insights


In this section, we will discuss the implementation of the retrieval augmented generation architecture, which provides users with relevant insights. For example, it may help analyze the influence of new tax regulations on an entity based on its KPIs. The implementation is achieved in three steps as discussed below.

Step 1: Embedding Query


Convert the user query into a vector using the OpenAI Embedding API, with the help of the OpenAI Embedding custom widget. The response is a vector, as shown in Figure 8.


Figure 8: The figure shows the response from the OpenAI Embedding API.
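As a sketch, this step boils down to a single request; the API key is a placeholder, and the model name assumes text-embedding-ada-002:

```python
# Sketch of Step 1: embed the user query. The response carries the vector
# under data[0].embedding (1536 floats for text-embedding-ada-002).
import requests

response = requests.post(
    "https://api.openai.com/v1/embeddings",
    headers={"Authorization": "Bearer <OPENAI_API_KEY>"},
    json={"model": "text-embedding-ada-002",
          "input": "How does the new tax regulation affect my net margin?"},
)
query_vector = response.json()["data"][0]["embedding"]  # list of 1536 floats
```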

Step 2: Vector Search


In this step, we use the vector output from Step 1 to perform a vector search in the Pinecone vector database. The Pinecone custom widget carries out the similarity search and returns the most relevant text as a response, as illustrated in Figure 9.


Figure 9: The figure shows the response from the Pinecone vector DB.
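A hedged sketch of this query follows; the index host and API key are placeholders, and topK is an illustrative choice:

```python
# Sketch of Step 2: similarity search against the Pinecone index.
# The index host and API key are placeholders; topK is illustrative.
import requests

query_vector = [0.0] * 1536  # in practice: the embedding from Step 1

response = requests.post(
    "https://<index-host>.pinecone.io/query",
    headers={"Api-Key": "<PINECONE_API_KEY>"},
    json={"vector": query_vector,
          "topK": 3,                 # return the 3 most similar chunks
          "includeMetadata": True},  # metadata carries the original text
)

# With LangChain ingestion, each match stores its chunk text under
# metadata["text"], so the retrieved chunks can be joined into one context
retrieved_text = "\n".join(
    m["metadata"]["text"] for m in response.json()["matches"]
)
```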

Step 3: Bringing It All Together


At this point, we have gathered everything required to give users personalized insights: the KPIs, the user query, and the relevant tax information from Pinecone. This data is sent to GPT-4 once more, wrapped in a prompt requesting an insight based on the provided information.
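Here is a minimal sketch of this final call, assuming the pieces from the previous steps; the prompt wording is illustrative:

```python
# Sketch of Step 3: wrap the KPIs, the user query, and the retrieved tax
# text into one prompt and request an insight. Wording is illustrative.
import requests

def tax_sensei_insight(user_query: str, kpis: dict, retrieved_text: str) -> str:
    prompt = (
        f"Current KPIs: {kpis}\n"
        f"Relevant tax regulation excerpts:\n{retrieved_text}\n\n"
        f"User question: {user_query}\n"
        "Based only on the information above, give a short insight on the "
        "KPIs and a suggestion for the user."
    )
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": "Bearer <OPENAI_API_KEY>"},
        json={"model": "gpt-4-1106-preview",
              "messages": [{"role": "user", "content": prompt}]},
    )
    return response.json()["choices"][0]["message"]["content"]
```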

And… Here is how it all works together.


Conclusion:


This blog demonstrates how generative AI can be used in SAP Analytics Cloud to create dashboards that offer a great user experience. It discusses the possibility of using generative AI to perform what-if analysis and to give users personalized, relevant insights. The solution discussed in this blog can be applied to any dashboard.

If you found this blog useful, please like this blog post and follow me for content related to SAP Analytics Cloud. If you have any questions or feedback, please leave a comment below.

 

Contributed by: Sebastian Dietz and Guillermo Cobos Laguna