kurzdo
Product and Topic Expert
Today I would like to introduce a highly anticipated feature that we have developed specifically for customers with high data volumes who like to operate in the familiar ABAP / BW world.

Cold Storage enablement for SAP BW bridge in SAP Datasphere is available now. 


Our cold store solution, based on the embedded SAP HANA Cloud, data lake relational under SAP HANA Cloud, helps SAP BW bridge customers classify the data stored in a DataStore object (advanced) as warm or cold, depending on the cost and performance requirements for the data.

Depending on this classification and how the data is used, the data is stored in different storage areas.

The following options are available:

  • Standard Tier (WARM): The data is stored in the SAP HANA database of SAP BW bridge, using the native storage extension (NSE).

  • External Tier (COLD): The data is stored externally in a separate database (the embedded SAP HANA Cloud, data lake relational of SAP Datasphere).

  • Reporting Tier (HOT): The data is stored in SAP HANA Cloud in SAP Datasphere.


How does it work? 


In SAP BW bridge, all data is stored in the SAP HANA native storage extension (NSE), which is our warm storage area. SAP BW bridge is not enabled for reporting and acts as a cheaper staging option for all legacy-related sources such as SAP ECC or SAP S/4HANA. If you also license the embedded SAP HANA Cloud, data lake relational under SAP Datasphere as a cold store for SAP BW bridge, you can easily move your data out into the cold store, just as you can with SAP BW/4HANA and SAP IQ as a cold store via Data Tiering Optimization (DTO). SAP Datasphere acts as your hot storage area and reflects your reporting layer for SAP Analytics Cloud.
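
At the SQL level, NSE warm storage corresponds to page-loadable tables or partitions in SAP HANA. The following is a minimal sketch in plain SAP HANA SQL using a made-up table "SALES_HISTORY"; in SAP BW bridge itself these settings are managed by the system, so this is for illustration only.

    -- Illustration only: NSE (warm) storage in plain SAP HANA SQL.
    -- "SALES_HISTORY" is a hypothetical table; SAP BW bridge manages
    -- the load unit of its tables itself.

    -- Mark the whole table as page loadable (warm, served via the buffer cache):
    ALTER TABLE "SALES_HISTORY" PAGE LOADABLE CASCADE;

    -- Or mark only a single partition as warm:
    ALTER TABLE "SALES_HISTORY" ALTER PARTITION 2 PAGE LOADABLE;

    -- Revert to fully in-memory (hot) column storage:
    ALTER TABLE "SALES_HISTORY" COLUMN LOADABLE CASCADE;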


There are two options for moving data into the embedded SAP HANA Cloud, data lake relational: manual or automated. Let's start with the manual procedure.


The cold store option for the respective advanced DataStore Object (aDSO) must be checked in the data tiering properties.


Create partitions based on a field and define the partition granularity of the aDSO. This can be done in the settings of the object.
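
To make the partitioning step concrete: in plain SAP HANA SQL, such year-based partitioning corresponds to a range-partitioned table like the sketch below. Table and field names are made up; the aDSO settings generate the actual partitioning for you.

    -- Illustration: year-based range partitioning in SAP HANA SQL.
    -- "SALES_HISTORY" and "CALYEAR" are hypothetical; in SAP BW bridge
    -- you define the field and granularity in the aDSO settings instead.
    CREATE COLUMN TABLE "SALES_HISTORY" (
      "CALYEAR" INTEGER,
      "AMOUNT"  DECIMAL(15,2)
    )
    PARTITION BY RANGE ("CALYEAR")
    ( PARTITION 2019 <= VALUES < 2020,
      PARTITION 2020 <= VALUES < 2021,
      PARTITION OTHERS );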


Go to the Data Tiering Management settings in the SAP BW bridge cockpit and select your aDSO.


Select the desired partition or year that you want to move to SAP HANA Cloud, Data Lake relational.


Change the temperature to cold and execute the temperature settings in the previous screen.

How to move data into the cold store automatically:


Create a rule for the selected aDSO.


For example, move data that is older than 5 years to the cold store and set the flag "active".
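
As a SQL condition, such a rule boils down to a simple date comparison on the partitioning field. A sketch, assuming the hypothetical field "CALYEAR" from above:

    -- Sketch of the "older than 5 years" rule as a SQL predicate.
    -- The rule moves whole partitions; this only illustrates the cut-off.
    SELECT COUNT(*) FROM "SALES_HISTORY"
      WHERE "CALYEAR" <= YEAR(CURRENT_DATE) - 5;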


Execute the temperature change for the schemas in the previous screen.


With the job scheduler, you can decide whether to start the job immediately or schedule it for later.


Once the job has finished successfully, the status changes to OK.


Afterwards, you can check on the overview page whether all partitions are in order.


Your data has now been successfully moved from SAP BW bridge to SAP HANA Cloud, Data Lake relational.

How to handle temperatures in SAP Datasphere:


To use the cold store for reporting purposes, you have to import the respective remote tables of the aDSO from both SAP HANA Cloud, Data Lake relational and SAP BW bridge.


Remote Table: SAP BW bridge


Remote Table: SAP HANA Cloud, Data Lake relational


In the last step, you have to create a union between your regular SAP BW bridge remote table and the SAP HANA Cloud, Data Lake remote table.
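
In SQL terms, this union is a plain UNION ALL over the two remote tables. A minimal sketch, assuming hypothetical remote table names "BWB_SALES" (SAP BW bridge, warm) and "DLRE_SALES" (data lake relational, cold) with identical structures; in SAP Datasphere you would typically model this as a graphical or SQL view rather than issuing DDL directly:

    -- Minimal sketch of the union view; table and view names are assumptions.
    CREATE VIEW "SALES_ALL_TEMPERATURES" AS
      SELECT "CALYEAR", "AMOUNT" FROM "BWB_SALES"    -- warm: SAP BW bridge
      UNION ALL
      SELECT "CALYEAR", "AMOUNT" FROM "DLRE_SALES";  -- cold: data lake relational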


As you know, our clear recommendation is to replicate data from SAP BW bridge to SAP Datasphere. More information about the layer shift approach can be found under the following link.

For data in the cold store, you have to decide individually which data is really relevant for reporting and whether replicating some partitions makes sense.

In practice, it often turns out that data sets older than 5 years are only very rarely reported on, and such data can be accessed on request.


How to get the cold store enabled for SAP BW bridge? 


Currently, enablement is requested via a ticket. Please refer to SAP Note 3401908 - Cold Store enablement for SAP BW bridge for further information on the procedure.

In the future, we of course plan to automate this process.


Prerequisites


The SAP HANA Cloud, data lake must be provisioned by the customer in advance. This can be done in the flexible tenant configuration.


What about sizing?


The size of your cold store heavily depends on the usage pattern of the data. There are, however, certain guidelines and restrictions that need to be considered. All data that is written to or read from the cold store goes through the SAP HANA Cloud instance of the SAP BW bridge tenant. This means the SAP HANA Cloud instance must be sized to handle all parts of the cold data that are processed in parallel. Please keep in mind that not all query operations can be pushed down to the cold store; in certain cases, SAP HANA Cloud has to read more data (or at a finer granularity) in order to perform the requested computation. Additionally, it is important to consider that changes to the data in the cold store (e.g. updates) are not performed in the cold store itself; instead, the data needs to be reloaded into the warm partition in SAP HANA Cloud, updated, and then moved back to the cold store.


With this in mind, a rough guideline is a warm:cold ratio of at most 1:3 if the data in the cold store is frequently accessed (accepting the performance penalty) and may be updated during its lifetime; for example, 1 TB of warm data would then support at most 3 TB of cold data. Data that is neither changed nor accessed for reporting, such as historic data stored for legal reasons only, can go up to a warm:cold ratio of 1:10. This is, however, the maximum that is supported, and any such sizing should be done with great care and only after thorough consideration of the usage patterns.


Conclusion:


As described above, data can easily be offloaded to SAP HANA Cloud, Data Lake relational as a cheaper storage medium, especially with SAP BW bridge. We have also deliberately modeled this functionality closely on SAP BW/4HANA in order to keep the user experience of both worlds in sync.


Please let me know if you have any questions!


Regards,

Dominik
5 Comments
srinivas4sap
Explorer
Thanks a lot for sharing the latest developments in SAP DSP... It's really great news for high-volume customers.
wounky
Participant

Thanks, kurzdo, it clearly explains how to migrate the DTO into the SAP HANA Cloud embedded data lake relational / Data Lake Relational Engine (DLRE).
Would it also work if Spark & Hadoop were used as cold/raw storage of the data lake instead of relying purely on object storage? Would you say that my understanding is correct that DLRE + Object Store is the go-to for DSP cold storage?

SAP HANA Cloud, Data Lake / Data Lake Files with Apache Spark
"Data lake Relational Engine can read from and write to data lake Files. Data lake Files provides a convenient location for data processing requirements and removes the need for an external object store."

bpoteraj
Explorer
kurzdo thanks for explaining.

Is automated movement of data from warm store to cold store (e.g. more than 5 years old) available for 'core' Datasphere local tables? Or do you require the BW bridge to have this kind of automation?
wounky
Participant

Also interested to know, I think for now you need to write your own procedures based on the API 😕
https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/12b6825ac6d34db9902460f665...


kurianz
Newcomer

Thanks @kurzdo  for sharing this information. 

I have been looking into how to convert an SAP BW 7.5 on HANA system (which also utilizes an SAP IQ database for near-line storage) to SAP Datasphere with SAP BW bridge activated. DAP jobs have been set up on aDSOs in this system.

I have read SAP Note 2469514, which tells me that there is no migration tool to transfer the DAP jobs to SAP BW bridge. What is not clear is whether, in the case of aDSOs (in the source SAP BW on HANA), we need to reload data from the archives to the hot store and then transfer the data from the source SAP BW on HANA to the Datasphere tenant. This would imply that we need to scale up the HANA memory capacity in the source SAP BW beforehand to accommodate the data reloaded from the archives.

I wanted to ask if my understanding is correct, and whether there is any other way to transfer the archived data from SAP IQ directly to SAP BW bridge (warm) or the embedded HANA Cloud data lake relational engine (cold) associated with the Datasphere tenant?