
CONTEXT

Supply chains are complex networks of interconnected entities and processes responsible for the planning, sourcing, manufacturing, transportation, and delivery of goods or services to customers. These networks typically include suppliers, manufacturers, distributors, logistics providers, retailers, and end consumers as their nodes, with goods flowing physically between them. Network performance is assessed on cost efficiency, speed, and the ability to minimize total inventory while matching demand and supply.

 

BUSINESS CHALLENGE

Supply chain networks are periodically analyzed and tweaked at strategic, tactical, and operational levels through the planning processes. Strategic planning aligns long-term resource requirements with long-term demand, tactical planning aligns medium-term resource requirements with medium-term demand, and operational planning does the same over the short term. While the planning runs themselves are automated through heuristics and optimizers, the planning process also requires answering several what, why, and what-if questions. For example: how are the input components and production resources constraining final demand fulfillment, or, more ad hoc, which constraint needs to be alleviated to improve the demand fulfillment of a particular product? Such questions are currently answered using pre-conceived charts, which have shortcomings like:

  • These charts can answer common questions but not ad hoc ones.
  • Contextualizing these charts is cumbersome.
  • The high effort and resource utilization involved pushes stakeholders to rely more on intuition.

In one of the discovery sessions, a major FMCG customer reported that up to 4% of lost sales stem from not finding answers to these questions in time. Answering the questions of supply chain managers has potential not just within the supply chain domain, but also at the business level, by bringing in additional data sources from Procurement and Finance. To cross this major hurdle we turn to the might of Generative AI. The task at hand uses the power of LLMs, specifically Agents, to advise supply chain planners and leaders in building risk-resilient networks. We call this application the “Supply Chain Advisor”. The plan for the Supply Chain Advisor is to build the advisory capabilities incrementally, starting with smaller data sets (with narrow scope) and expanding the scope gradually. The vision for this GenAI-powered application is to augment supply chain optimization with:

  • Intelligent query handling
  • Data-driven decision making and problem resolution
  • Streamlined surrounding business processes
  • Proactive risk management

SAP INTEGRATED BUSINESS PLANNING (IBP) FOR SUPPLY CHAIN

SAP IBP is a cloud-based solution offered by SAP, designed to help businesses streamline and manage their end-to-end supply chain processes. It provides a comprehensive set of planning capabilities across various functions, including demand planning, inventory optimization, production planning, procurement planning, and distribution planning. It is a powerful tool that provides advanced decision support for cross-functional decisions involving finance, sales, and operations. SAP IBP is already used by over 1000 companies across multiple industries, ranging from large MNCs to medium-sized companies with supply networks of varied complexity.

SAP IBP generates various types of logs that provide insights into the planning processes. One of the most important and information-rich of these is the Optimizer Log (OL). This log records which demand is met, which is not, and the reasons for not meeting it. Other important information, like geographic endpoint IDs, resource IDs, and product IDs, helps planners gather the complete picture. Figure 1 shows a typical OL.


Figure 1. Optimizer Log File

The primary information here is the delta by which the target wasn’t achieved, in the “Issue” column, while the “Reason” column explains the non-fulfillment. Multiple reasons can lead to the issue, such as component unavailability and low resource capacity. For example, consider an automotive supply chain, which might need thousands of components like the engine, transmission system, and wheels. The resources could represent the transmission-chassis-final assembly line, paint shops, body shops, engine assembly lines, and so on. The ‘Issue’ here pertains to the demand-supply imbalance for a particular product (a fully assembled vehicle). The other columns in the file contain important metadata like product IDs, location details (target and source locations), the IDs of the resources in consideration, and the date of evaluation.
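To make these column descriptions concrete, here is a purely illustrative sketch of such a row built with Pandas; the column names and values are hypothetical placeholders and do not reflect the actual IBP export format.

import pandas as pd

# Hypothetical, simplified optimizer-log row mirroring the fields described above
ol = pd.DataFrame([{
    "PRDID": "VEH-1001",             # finished product, e.g. a fully assembled vehicle
    "LOCID_SOURCE": "ENGINE_PLANT",  # source location
    "LOCID_TARGET": "DC_WEST",       # target location
    "RESID": "FINAL_ASSEMBLY_1",     # resource in consideration
    "ISSUE": 120,                    # delta by which the demand target was missed
    "REASON": "Low resource capacity",
    "DATE": "2024-04-08",            # date of evaluation
}])
print(ol)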

A typical optimizer log could contain thousands of such rows of supply chain constraints, depending on the product portfolio. To answer the questions of supply chain managers, one has to dig into the log and often augment it with other files containing projected sales prices, granular resource and component information, and other master data. This makes the resource demand for optimizing a large supply chain pronounced, and hence the need for an LLM agentic workflow that can, in essence, mitigate the load on planners.

The entire process of supply chain optimization was envisioned as shown in Figure 2.


Figure 2. Supply Chain Advisor Process Flow

 

LLM AGENTS

Why Use Agents?

To effectively address supply chain optimization queries, it is imperative to parse through extensive log and metadata files. Large Language Models (LLMs), lacking contextual knowledge of custom data, necessitate a mechanism for injecting this data. Retrieval Augmented Generation (RAG) poses challenges, notably its limited effectiveness with structured data, compounded by the time-consuming embedding process and the potential for generating incoherent results. Moreover, ad hoc queries that require a step-by-step approach present additional complexity. Consider a question: “What is the profit lost corresponding to the lost demand in the west region of country X, for week 15 of the year, pertaining to product ID Y?” The action must be broken down into many steps:

“Find the demand lost for given product and region on the date from table 1 – evaluate the value from sales price as lost value from table 2 – find the utilities and resource cost from table 3 that was left unused – calculate the total amount based on the values generated in above steps.”
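As an illustration of how such a decomposition could look in code, here is a hand-written sketch of those steps in Pandas; the file names, column names, and filter values are hypothetical placeholders rather than the actual log schema, and in practice the agent generates equivalent commands on its own.

import pandas as pd

log = pd.read_csv("optimizer_log.csv")         # table 1: unmet-demand rows
prices = pd.read_csv("sales_prices.csv")       # table 2: projected sales prices
resources = pd.read_csv("resource_costs.csv")  # table 3: resource and utility costs

# Step 1: demand lost for the given product, region, and week
mask = (log["PRDID"] == "Y") & (log["REGION"] == "WEST_X") & (log["WEEK"] == 15)
lost_qty = log.loc[mask, "ISSUE"].sum()

# Step 2: value the lost quantity at the projected sales price
price = prices.loc[prices["PRDID"] == "Y", "PRICE"].iloc[0]
lost_value = lost_qty * price

# Step 3: cost of the utilities and resources that were left unused
unused_cost = resources.loc[resources["PRDID"] == "Y", "UNUSED_COST"].sum()

# Step 4: total amount based on the values generated in the steps above
print(lost_value + unused_cost)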

In such scenarios, a solution must exhibit the following traits:

  1. Operate iteratively.
  2. Capture relevant data from various points in the workflow/solution chain.
  3. Operate autonomously.

 

Technology Behind LLM Agents

Given the above considerations, LLM Agents emerge as the preferred choice. Their inherent reasoning abilities, along with the three traits above, align with the demands of navigating complex supply chain optimization challenges.

In essence, agents use LLMs at their core to carry out complex reasoning with multiple other components, in an iterative fashion (depending on the task). Figure 3 explains the idea of LLM agents.


Figure 3. LLM Agent Components

As mentioned, at the core of the agentic workflow is the LLM itself, which does the heavy lifting of processing text expressions and feeding the flow. One must ensure that the LLM’s knowledge is relevant to the task; for example, in a financial agentic flow, an LLM trained on finance terminology and mathematical formulations will perform better than a generic LLM. Around this core, the main components streamline the query-iterate-response cycle.

The Recipe can be thought of as the pattern or orchestration in which tokens flow. Recipes typically follow one of two strategies: Reflection, where a task is performed and the outcome is used to tweak the next steps; or Task Decomposition, where the prescribed steps are curated up front and tokens flow adhering to them.

Memory consists of two main parts: short-term memory, which ensures adherence to the recipe, and long-term memory, which allows downstream steps to pick up relevant information from upstream results.

Tools are the executable functions or sub-flows within the main agentic flow that allow completion of the intermediate tasks.

Knowledge is the data we attach for the agent to perform the task.

In our case of the IBP OL and metadata logs, we use the LangChain CSV Agent. It uses the Python Pandas package to work with CSV files and carry out data manipulation and analysis tasks. To break the approach down: we use the GPT-4 LLM with the CSV agent, where the Knowledge is the set of log files we attach. The agent uses the Python REPL (Read-Evaluate-Print Loop) assistant as the Tool to communicate with the Pandas framework and carry out data manipulation. The Recipe broadly follows the Thought-Action-Input-Observation cycle: the agent first produces a thought about which step to carry out, followed by an action, which is nothing but generating the relevant Pandas command that acts as the input; an observation is then produced, which in turn feeds the next thought. As part of the Proof-of-Concept (PoC), these intermediate thoughts are also surfaced to the user to show how the information is processed.
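For reference, a minimal sketch of this wiring is shown below. It is not the exact PoC code: the file names and the query are placeholders, and it assumes the langchain-openai and langchain-experimental packages are installed with an OpenAI API key configured.

from langchain_openai import ChatOpenAI
from langchain_experimental.agents import create_csv_agent

# GPT-4 as the core LLM of the agent
llm = ChatOpenAI(model="gpt-4", temperature=0)

# The attached CSV files act as the Knowledge; the file names here are placeholders
agent = create_csv_agent(
    llm,
    ["optimizer_log.csv", "sales_prices.csv"],
    verbose=True,               # print the intermediate Thought-Action-Observation steps
    allow_dangerous_code=True,  # required by recent langchain-experimental versions
)

result = agent.invoke(
    {"input": "Which resource constrains demand fulfillment for product Y in week 15?"}
)
print(result["output"])

With verbose=True the intermediate steps are emitted alongside the final answer, which is what allows them to be shown to the user as described above.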

 

Schema Aware Prompting

During the exploration phase of our PoC, we experimented with various prompt formations. A consistent finding was the significant improvement in output quality when the initial prompt included schema information for the CSV files being loaded. This enabled the LLM and agent to gain a deeper understanding of the problem setting, leading to better handling of queries with the provided knowledge. The prompt template passed to the agent in the invoke statement looked roughly like what is shown in Figure 4.


Figure 4. CSV Agent Prompt Template

Another significant observation was that placing the system message towards the end of the template resulted in improved memory performance of the agent. This positioning reinforced the main aim of the query and proved beneficial, particularly because the prompt can become lengthy due to schema information, especially when multiple tables are attached.
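Putting the two observations together, a schema-aware prompt can be assembled roughly as follows. This is an illustrative approximation of the template in Figure 4, with placeholder schema text and wording, and with the instruction block deliberately placed at the end.

# Placeholder schema description; in the application this is fetched per uploaded file
schema_description = """
optimizer_log.csv: PRDID (product), LOCID (location), RESID (resource),
ISSUE (unmet quantity), REASON (cause of non-fulfillment), DATE (planning date)
sales_prices.csv: PRDID (product), PRICE (projected sales price)
"""

user_query = "What is the profit lost for product Y in week 15?"

prompt = (
    f"You are given the following CSV files and their schemas:\n{schema_description}\n"
    f"Question: {user_query}\n"
    # The system-style instruction goes last: with long schema text, placing it at the
    # end helps reinforce the main aim of the query
    "Answer step by step using only the attached files and show your intermediate steps."
)

print(prompt)  # this string is then passed as the input of the agent's invoke call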

 

APPLICATION

The solution follows the architecture diagram shown in Figure 5. The backend is a Python microservice application deployed on SAP BTP; the CSV agent is currently supported only in the Python offering of LangChain. The microservice consists of three main components (a minimal, hypothetical sketch of this layout follows the list):

  1. A mechanism to ingest data, in the form of CSV files uploaded by the user, into the HANA DB as shown in Figures 7 and 8. This component also orchestrates and manages the cache for the application through automated CRUD operations.
  2. The agent script, which fetches the schema description and the user query and prepares the prompt template for the invoke call.
  3. The API application itself.
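For orientation, here is a hypothetical, heavily trimmed sketch of what the API layer of such a microservice could look like with Flask; the endpoint names and the commented-out persist_to_hana and run_csv_agent helpers are placeholders, not the actual implementation.

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/files", methods=["POST"])
def upload_file():
    # Component 1: ingest an uploaded CSV, persist it (e.g. to SAP HANA Cloud),
    # and refresh the application cache
    csv_file = request.files["file"]
    # persist_to_hana(csv_file)  # placeholder for the actual HANA upload / CRUD logic
    return jsonify({"status": "stored", "name": csv_file.filename})

@app.route("/ask", methods=["POST"])
def ask():
    # Components 2 and 3: build the schema-aware prompt and invoke the CSV agent
    question = request.json["question"]
    # answer = run_csv_agent(question)  # placeholder for the agent script
    answer = "..."
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(port=8080)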


Figure 5. Architecture Diagram

The UI is an HTML5 application with the main screen shown in Figure 6. It offers an interactive yet simple-to-navigate screen.


Figure 6. Supply Chain Advisor UI - Main Screen

From the left-hand pane, the user can access older queries. The query bar is at the bottom of the screen, where the user can enter a supply chain optimization question. On clicking the Upload Files button shown in Figure 7, the user can add multiple OL and related files to the HANA instance.


Figure 7. Supply Chain Advisor UI - Upload Files

The Manage Files button is a crucial one: here the user can add, edit, or delete the running files, as well as the schema description, which we have established is crucial to agent output. The action is shown in Figure 8.


Figure 8. Supply Chain Advisor UI - Manage Files

The ability to edit the description allows performance to be tracked and improved on the go, if required. Figure 9 shows the feature at play with a generic OL file.


Figure 9. Supply Chain Advisor UI - Schema Description

To tie the UI and microservice together, we use the Cloud Application Programming (CAP) framework. The CAP app envelopes both components of the Supply Chain Advisor app and helps orchestrate the entire operation.

 

CONCLUSION

The application has undergone testing with OL files, encompassing both typical and ad hoc user queries. The outputs, coupled with intermediate steps, have consistently demonstrated commendable results and explainability, as illustrated in Figure 10. Leveraging a natural language-based platform, the framework facilitates enhanced planning of the supply chain, enabling organizations to navigate complexities with greater insight and efficiency. Currently, the framework is undergoing further validation with customers and diverse datasets to ensure its robustness and applicability across various scenarios and industries.


Figure 10. Supply Chain Advisor - Few Results

We see tremendous potential for this framework to create impact by recovering the sales lost to unresolved supply chain bottlenecks, creating a big delta in the business top line.

 

AUTHORS & ACKNOWLEDGEMENTS

Many thanks to Girikanth Avadhanula and Narendranath Allu for driving this collaboration from the domain side and liaising with customers to validate and propagate the use case. Thanks to @PVNPavanKumar for supporting the project. Thanks to @Aryan_Raj_Sinha for contributing to the UI development.

Authors -

Vedant Gupta, Associate AI Developer – BTP Platform Adoption & Architecture, SAP Labs India

Praveen Kumar Padegal, Development Expert – BTP Platform Adoption & Architecture, SAP Labs India
