olegbash599
Active Participant

Disclaimer


This article describes the author's approach to fast and consistent database updates. The approach speeds up development without decreasing application robustness.

About SAP LUW


As a rule, the data of a business transaction is updated in several tables (not just one). It is very important to either update all the tables or update none of them (this is called ensuring data consistency). In SAP NetWeaver, to ensure data consistency, database updates are not executed directly but are registered for a separate work process and executed in a single DB LUW. Details of this approach are described in the ABAP help. Technically, there are several approaches for bundling the database update statements of an SAP LUW into a separate DB LUW, but the most popular and long-established one is CALL FUNCTION ... IN UPDATE TASK. Let's see an example of how it works for one table.
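Schematically, the bundling pattern is the following (a minimal sketch; the function module name Z_ANY_UPDATE_FM and its parameter are hypothetical):

" Nothing is written to the database here: the call is only registered
" together with its actual parameter values.
CALL FUNCTION 'Z_ANY_UPDATE_FM' IN UPDATE TASK
  EXPORTING
    it_data = lt_data.

" COMMIT WORK hands all registered calls to the update work process,
" which executes them in one DB LUW; ROLLBACK WORK would discard them.
COMMIT WORK.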

 

Example of using CALL FUNCTION ... IN UPDATE TASK


Suppose we have a table with the following structure.

Structure of table ZTC8A005_SAMPLE (the separate entity)
Field ID      | Data Type | Comment
MANDT         | MANDT     | Key
ENTITY_GUID   | CHAR32    | Key
ENTITY_PARAM1 | CHAR10    | Sample of short char-field
ENTITY_PARAM2 | NUMC10    | Sample of numc-field
ENTITY_PARAM3 | SYUZEIT   | Sample of time-field
ENTITY_PARAM4 | SYDATUM   | Sample of date-field
ENTITY_PARAM5 | TIMESTAMP | Sample of TIMESTAMP (dec-field)
ENTITY_PARAM6 | INT4      | Sample of integer-field

We need to create an update function module that updates the table consistently via the UPDATE TASK mechanism and receives the content of our internal table by value (let me name the function Z_C8A_005_DEMO_UPD_SAMPLE). This means that a separate table type also has to be created. The update function module and the table type could look as shown in the screenshots below (pictures 1, 2, 3).


Picture 1. Attributes of update function module for table ZTC8A005_SAMPLE


 


Picture 2. Import parameters pass by value for update function


 


Picture 3. Separate table type for table ZTC8A005_SAMPLE


 
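For reference, the table type from Picture 3 boils down to the declaration below; in the real scenario it is created in the ABAP Dictionary so that it can be used in the function module interface (a sketch; the WITH DEFAULT KEY addition is an assumption):

" DDIC table type ZTC8A005_SAMPLE_TAB_TYPE expressed in code form:
TYPES ztc8a005_sample_tab_type TYPE STANDARD TABLE OF ztc8a005_sample
                               WITH DEFAULT KEY.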

The code in the update function module itself can be simple. Below is the listing:
FUNCTION z_c8a_005_demo_upd_sample.
*"----------------------------------------------------------------------
*"  IMPORTING
*"    VALUE(IV_UPDKZ) TYPE UPDKZ DEFAULT 'M'
*"    VALUE(IT_SAMPLE) TYPE ZTC8A005_SAMPLE_TAB_TYPE
*"----------------------------------------------------------------------
  DATA lc_modify_tab TYPE updkz VALUE 'M'.
  DATA lc_upd_tab    TYPE updkz VALUE 'U'.
  DATA lc_del_tab    TYPE updkz VALUE 'D'.

  IF it_sample IS INITIAL.
    EXIT.
  ENDIF.

  CASE iv_updkz.
    WHEN lc_modify_tab.
      MODIFY ztc8a005_sample FROM TABLE it_sample.
    WHEN lc_upd_tab.
      UPDATE ztc8a005_sample FROM TABLE it_sample.
    WHEN lc_del_tab.
      DELETE ztc8a005_sample FROM TABLE it_sample.
    WHEN OTHERS.
  ENDCASE.
ENDFUNCTION.

 

A sample call of the update function module could look like this:
DATA lt_sample_tab TYPE STANDARD TABLE OF ztc8a005_sample.

lt_sample_tab = VALUE #(
  ( entity_guid = 'FUNC_GUID_MOD'  entity_param1 = 'CHAR10'  entity_param2 = '0504030201' )
  ( entity_guid = 'FUNC_GUID2_MOD' entity_param1 = '2CHAR10' entity_param2 = '0102030405' )
  ( entity_guid = 'FUNC_GUID2_DEL' entity_param1 = '2CHAR10' entity_param2 = '777909034' ) ).

" The call is only registered here; it runs in the update task at commit.
CALL FUNCTION 'Z_C8A_005_DEMO_UPD_SAMPLE'
  IN UPDATE TASK
  EXPORTING
    it_sample = lt_sample_tab.

CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = abap_true. " wait until the update work process has finished

In a similar way, we can create function modules and table types for any custom table.

What we get:

  1. functions that provide data consistency

  2. some time spent at the keyboard

  3. additional objects in the transport request


But can we provide data consistency without creating additional objects? Yes 🙂


The AnyTab UpdateTask utility helps with that. A detailed description and the utility itself, together with demo reports, are available in the GitHub project AnyTab UpdateTask.

Let us see how the sample with the dedicated update function can be transformed:
DATA lc_db_tab_sample TYPE tabname VALUE 'ZTC8A005_SAMPLE'.
DATA lt_sample_tab    TYPE STANDARD TABLE OF ztc8a005_sample.

lt_sample_tab = VALUE #(
  ( entity_guid = 'FUNC_GUID_MOD'  entity_param1 = 'CHAR10'  entity_param2 = '0504030201' )
  ( entity_guid = 'FUNC_GUID2_MOD' entity_param1 = '2CHAR10' entity_param2 = '0102030405' )
  ( entity_guid = 'FUNC_GUID2_DEL' entity_param1 = '2CHAR10' entity_param2 = '777909034' ) ).

NEW zcl_c8a005_save2db( )->save2db( iv_tabname     = lc_db_tab_sample
                                    it_tab_content = lt_sample_tab )->do_commit_if_any( ).

 

The same class and methods can be used for many tables. Below is an example with several tables.
DATA lc_db_tab_sample    TYPE tabname VALUE 'ZTC8A005_SAMPLE'.
DATA lt_sample_tab       TYPE STANDARD TABLE OF ztc8a005_sample.
DATA lt_sample_empty_tab TYPE STANDARD TABLE OF ztc8a005_sample.
DATA lt_head_tab         TYPE STANDARD TABLE OF ztc8a005_head.
DATA lt_item_tab         TYPE STANDARD TABLE OF ztc8a005_item.
DATA lv_ts               TYPE timestamp.
DATA lo_saver_anytab     TYPE REF TO zcl_c8a005_save2db.

GET TIME STAMP FIELD lv_ts.

lt_sample_tab = VALUE #(
  ( entity_guid = 'ANY_GUID_MOD' entity_param1 = 'CHAR10' entity_param2 = '0504030201'
    entity_param3 = sy-uzeit entity_param4 = sy-datum entity_param5 = lv_ts )
  ( entity_guid = 'ANY_GUID2_MOD' entity_param1 = '2CHAR10' entity_param2 = '0102030405'
    entity_param3 = sy-uzeit entity_param4 = sy-datum entity_param5 = lv_ts )
  ( entity_guid = 'ANY_GUID2_DEL' entity_param1 = '2CHAR10' entity_param2 = '777909034'
    entity_param3 = sy-uzeit entity_param4 = sy-datum entity_param5 = lv_ts ) ).

lt_head_tab = VALUE #(
  ( head_guid = 'ANY_GUID_UPD' head_param1 = 'ANY_GUID_ADD' head_param2 = '9988776655'
    head_param3 = sy-uzeit head_param4 = sy-datum head_param5 = lv_ts )
  ( head_guid = 'ANY_GUID2_UPD' head_param1 = 'ANY_GUID2_ADD' head_param2 = '9988776655'
    head_param3 = sy-uzeit head_param4 = sy-datum head_param5 = lv_ts )
  ( head_guid = 'ANY_GUID_DEL' head_param1 = 'ANY_GUID_ADD' head_param2 = '9988774444'
    head_param3 = sy-uzeit head_param4 = sy-datum head_param5 = lv_ts )
  ( head_guid = 'ANY_GUID2_DEL' head_param1 = 'ANY_GUID2_ADD' head_param2 = '9988774444'
    head_param3 = sy-uzeit head_param4 = sy-datum head_param5 = lv_ts ) ).

lt_item_tab = VALUE #(
  ( head_guid = 'ANY_GUID_UPD' item_guid = 'ANY_ITEM_GUID_ADD' item_param1 = '2CHAR10' item_param2 = '9988776655'
    item_param3 = sy-uzeit item_param4 = sy-datum item_param5 = lv_ts )
  ( head_guid = 'ANY_GUID2_UPD' item_guid = 'ANY_ITEM_GUID2_ADD' item_param1 = '2CHAR10'
    item_param3 = sy-uzeit item_param4 = sy-datum item_param5 = lv_ts )
  ( head_guid = 'ANY_GUID_DEL' item_guid = 'ANY_ITEM_GUID_ADD' item_param2 = '9988776655'
    item_param3 = sy-uzeit item_param4 = sy-datum item_param5 = lv_ts )
  ( head_guid = 'ANY_GUID2_DEL' item_guid = 'ANY_ITEM_GUID2_ADD' item_param1 = '2CHAR10'
    item_param3 = sy-uzeit item_param4 = sy-datum item_param5 = lv_ts ) ).

CREATE OBJECT lo_saver_anytab.

" Each call only registers the table content for the update task.
lo_saver_anytab->save2db( EXPORTING iv_tabname     = lc_db_tab_sample
                                    it_tab_content = lt_sample_tab ).

lo_saver_anytab->save2db( EXPORTING iv_tabname     = 'ZTC8A005_HEAD'
                                    it_tab_content = lt_head_tab ).

lo_saver_anytab->save2db( EXPORTING iv_tabname     = 'ZTC8A005_ITEM'
                                    it_tab_content = lt_item_tab ).

CLEAR lt_sample_empty_tab.
lo_saver_anytab->save2db( EXPORTING iv_tabname     = lc_db_tab_sample
                                    it_tab_content = lt_sample_empty_tab ).

" The database changes happen only after the commit (issued inside
" method do_commit_if_any); an empty table is ignored when committing.
lo_saver_anytab->do_commit_if_any( ).

 


Update



It is also possible not to pass the table name:
DATA lt_sample_tab   TYPE STANDARD TABLE OF ztc8a005_sample.
DATA lo_saver_anytab TYPE REF TO zcl_c8a005_save2db.

lt_sample_tab = VALUE #(
  ( entity_guid = 'ANY_SIMPL_GUID_MOD'  entity_param1 = 'CHAR10'  entity_param2 = '0504030201' )
  ( entity_guid = 'ANY_SIMPL_GUID2_MOD' entity_param1 = '2CHAR10' entity_param2 = '0102030405' )
  ( entity_guid = 'ANY_SIMPL_GUID2_DEL' entity_param1 = '2CHAR10' entity_param2 = '777909034' ) ).

CREATE OBJECT lo_saver_anytab.
lo_saver_anytab->save2db( EXPORTING it_tab_content = lt_sample_tab )->do_commit_if_any( ).

 

The internal table can be typed in several ways:
DATA lt_sample01 TYPE ztc8a005_sample_tab_type. " via a DDIC table type

" recommended:
DATA lt_sample02 TYPE STANDARD TABLE OF ztc8a005_sample. " typed directly via STANDARD TABLE OF <db table name>

" or on the basis of an interface:
INTERFACE ltc_interface4type.
  TYPES ts_data TYPE ztc8a005_sample.
  TYPES tt_data TYPE STANDARD TABLE OF ts_data WITH DEFAULT KEY.
ENDINTERFACE.

DATA lt_sample03 TYPE STANDARD TABLE OF ltc_interface4type=>ts_data.

Details of the ABAP utility are available in the GitHub project AnyTab UpdateTask. Note that the core of the utility and the demo reports are in different packages, so you can keep the demo reports in the development system while using the core utility across the whole system landscape.

I will appreciate any comments, questions, and stars 🙂

Conclusion


I think it is possible to speed up ABAP development with the AnyTab UpdateTask utility. But what do you think? Please share your feedback and thoughts in the comments to this blog post.

Also, please read and review other blogs in the ABAP community, and contribute your comments and answers to questions 🙂
12 Comments
oberon_ntpl
Explorer
Hi Oleg

The pros and cons as I see it:
+ No need to create a separate FM or to modify an existing one each time a new table appears
+ Now you're sure all your data is updated within the LUW; no dumb mistakes, no direct updates, etc., all of which used to come out of laziness
+ The more OOP, the neater, more layered, and more portable the code

- All data is updated through one FM. All you see in the update queue is a bunch of similar calls that are hard to analyse without drilling down into the parameters
- The need to pass the table name as a parameter along with the data. It's not good to make ABAPers care about a type name both at declaration time and at runtime. You can get rid of it, though, by using RTTI to get the table name out of the type of the passed data, making it only necessary that there always be a type directly connected with the DB table name.
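A minimal sketch of that RTTI idea (assuming the passed table's line type is the DDIC structure of the target table):

" Derive the DB table name from the type of the passed internal table:
DATA(lo_tab)  = CAST cl_abap_tabledescr(
                  cl_abap_typedescr=>describe_by_data( lt_sample_tab ) ).
DATA(lo_line) = CAST cl_abap_structdescr( lo_tab->get_table_line_type( ) ).
" Only works when the line type really is the table's DDIC structure:
DATA(lv_tabname) = CONV tabname( lo_line->get_relative_name( ) ).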
olegbash599
Active Participant
Hi Dmitry,

thank you for highlighting the pros and outlining the cons.

As for the cons, let us address them with the help of best practices (as usual):

1) "The need to pass the tablename as a parameter along with the data"
I agree with you, but RTTI will not always provide the correct name of the target database table. Still, the option will be provided for the cases where it is possible. Thank you!

I have created an issue and put it on the project board. It is public for now, so you can check and comment.

2) "you see now in the update queue is a bunch of similar calls that is hard to analyse without drilling down through the parameters"
Yes, you are correct, but this is not an issue of the current project. No one can guarantee that from an update function name alone you will understand everything without drilling down into the parameters. When you analyze the queue, it is normal to check the parameters.

Nevertheless, I have also created an issue for that, but marked it as a nice-to-have add-on. I think it would be better to add an enhancement of transaction SM13 with an option to check parameters to the ABAP wish list and inform the ABAP mentors ( thomas.jung horst.keller ).
Hi

I think your solution will sometimes be useful. A similar idea is used in BOPF, where the FM '/BOBF/CL_DAC_UPDATE' is used to update node tables. But it uses the xstring type to transfer data, obtained by exporting internal tables to a data buffer (EXPORT ... TO DATA BUFFER ...). You are using JSON for this, which is much slower.

I wrote a test program that compares the speed of serialization/deserialization using EXPORT/IMPORT and using the code from your solution. Serialization/deserialization via JSON is about 50 times slower.

I think you could change the type of the parameter with serialized data passed to the update FM from string (JSON) to xstring (DATA BUFFER) without changing the main logic, and you would get increased performance (and decrease the amount of data transferred).

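For comparison, the two serialization routes look roughly like this (a sketch; lt_sample_tab stands for any internal table):

DATA lv_buffer TYPE xstring.

" Binary serialization via the data cluster - fast and compact:
EXPORT tab = lt_sample_tab TO DATA BUFFER lv_buffer.

" JSON serialization via the identity transformation - slower but readable:
DATA(lo_json_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
CALL TRANSFORMATION id SOURCE tab = lt_sample_tab RESULT XML lo_json_writer.
DATA(lv_json) = lo_json_writer->get_output( ). " JSON as xstring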
 
olegbash599
Active Participant
Hi Alexander,

thank you very much! There have been no performance issues in more than 6 years of use. One clever guy said: premature optimization is the root of all evil 🙂

Nevertheless, according to best practices, I have created an issue and put it on the board.

It is public for now, so you can check and comment.
olegbash599
Active Participant

Hi Dmitry,

the option not to pass the table name is now implemented; please check. The documentation and the blog have also been updated.

So you can proceed with this:

DATA lt_sample_tab   TYPE STANDARD TABLE OF ztc8a005_sample.
DATA lo_saver_anytab TYPE REF TO zcl_c8a005_save2db.

lt_sample_tab = VALUE #(
  ( entity_guid = 'ANY_SIMPL_GUID_MOD'  entity_param1 = 'CHAR10'  entity_param2 = '0504030201' )
  ( entity_guid = 'ANY_SIMPL_GUID2_MOD' entity_param1 = '2CHAR10' entity_param2 = '0102030405' )
  ( entity_guid = 'ANY_SIMPL_GUID2_DEL' entity_param1 = '2CHAR10' entity_param2 = '777909034' ) ).

CREATE OBJECT lo_saver_anytab.
lo_saver_anytab->save2db( EXPORTING it_tab_content = lt_sample_tab )->do_commit_if_any( ).

You can use anything as a reference for typing, but the recommended approach is a DDIC reference, because it is more readable and works with standard tooling such as the where-used list and the code scanner.

" recomended:
DATA lt_sample02 TYPE STANDARD TABLE OF ztc8a005_sample. " dynamical typing by STANDARD TABLE OF DB_tabname

Your issue's status is changing to DONE.

If there is no objection within 2 days, it will be closed ... and no cons will remain 🙂

Thank you very much!

olegbash599
Active Participant

Hi Alexander,

According to the tests, it seems that the BOPF-like approach is not robust at all; moreover, broad use of DATA BUFFER does not provide robustness either. The most robust approach is the one based on a transformation.

Changes in the ABAP Dictionary happen quite often in any system, and it is also possible that a data element is changed while something is still being processed. Only a transformation with explicit fields (not in binary mode) provides stable and robust updates, plus a clear picture in debug mode (which is also important).

The main flaws of the BOPF approach: it does not use the full power of EXPORT/IMPORT and has no exception handling (bad practice). EXPORT/IMPORT via a buffer also fails some tests (see below). So JSON seems to be the most robust.

Let's walk through the following test.

Initial states and test-samples.

The initial state of table ZTAB_CHANGABLE
Field name | Data Type | Comment
MANDT      | MANDT     | Key field
FIELD_KEY1 | CHAR80    | Key field
FIELD2     | INT1      |
FIELD3     | CHAR20    |
FIELD4     | CHAR10    |

Test-Sample “like a BOPF”

It can be checked in function /BOBF/CL_DAC_UPDATE. The main idea: deserialization via a data buffer, with no exception handling and no options.

Include /BOBF/LCL_DAC_UPDATEF01 (lines 66-69)

Or here; the sample itself.

Test-Sample “EXPORT/IMPORT with options”

Deserialization also goes via a buffer, but with exception handling and with options for structure changes. I have prepared a function for this here (called Z_CALLFUNC_UPD_BIN).
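Its core could look like this (a sketch of what Z_CALLFUNC_UPD_BIN presumably does; lv_buffer holds the serialized data):

TRY.
    " ACCEPTING PADDING tolerates lengthened fields and new components:
    IMPORT tab = lt_data FROM DATA BUFFER lv_buffer ACCEPTING PADDING.
  CATCH cx_sy_import_mismatch_error INTO DATA(lx_mismatch).
    " An incompatible structure change is handled instead of dumping.
ENDTRY.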

Test-Sample “transformation with JSON with option”

Another function module prepared for test purposes (called Z_CALLFUNC_UPD_JSON)
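Its core could be sketched like this (an assumption about the internals of Z_CALLFUNC_UPD_JSON; lv_json holds the serialized data):

TRY.
    CALL TRANSFORMATION id
         OPTIONS value_handling = 'accept_data_loss' " tolerate shortened fields
         SOURCE XML lv_json                          " JSON input is detected automatically
         RESULT tab = lt_data.
  CATCH cx_transformation_error INTO DATA(lx_trafo).
    " Mismatches surface as catchable exceptions, not runtime errors.
ENDTRY.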

 

Test cases

Each test case should be launched in a new session, with table ZTAB_CHANGABLE in its initial state. The data is serialized and sent to the update function module; then the DDIC change listed below is made. After running the test case, the table is returned to its initial state.

Test case 10 - Increasing the length of an existing field: the data element of FIELD3 is changed from CHAR20 to CHAR30.

Test case 20 - Decreasing the length of an existing field: the data element of FIELD3 is changed from CHAR20 to CHAR5.

Test case 30 - Adding a new field to the table: a field FIELD5 is added (data element SYINDEX).

Test case 40 - Removing a field from the table: field FIELD4 is removed.

Test case 50 - Changing from integer to char with no length change: field FIELD2 is changed from INT1 to CHAR3.

 

The testing report itself is also available in ABApTestinator.

Result table (you can check it yourself); columns are the test cases from above:

Test sample                              | 10            | 20            | 30          | 40            | 50
1. Like a BOPF                           | Runtime Error | Runtime Error | Data Parsed | Runtime Error | Runtime Error
2. EXPORT/IMPORT with options            | Data Parsed   | Not parsed    | Data Parsed | Data Parsed   | Not parsed
3. Transformation with JSON with options | Data Parsed   | Data Parsed   | Data Parsed | Data Parsed   | Data Parsed

 

So it is not reasonable to implement that approach: there are no performance issues (performance is quite sufficient), but the risk of a bad update would increase.

 

shais
Participant
A very detailed analysis.

Could you please explain what the exact test case is?

How/Why wouldn't the data structure match the dictionary DB table?
olegbash599
Active Participant

There are actually 5 test cases here 🙂

The general idea is to test the situation where you make changes in the system and transport them to the target system. Three approaches to deserialization during UPDATE TASK processing were analyzed.

So, the answer to this question:

> How/Why wouldn't the data structure match the dictionary DB table?

Because somebody has made changes for business requirements and transported them into the target system. At the moment of transport, the structures may differ between work processes.

shais
Participant
I wonder if such an extreme case is really feasible (outside of debugging).
olegbash599
Active Participant
Change is the only constant 🙂 So the system should be prepared for as many changes as possible.
shais
Participant
I agree, but what you are describing may actually affect any update request in the system (not just those of your solution), and I have never run into such an issue in the past.
olegbash599
Active Participant
Yes, but only AnyTab UpdateTask will deal with the situation correctly 🙂

> I have never run into such issue in the past.

Actually, to know whether you have had such an issue, you would need to analyze every field in the table. Nobody will inform you that this specific error has happened. Problems with the data later in the process would be the only sign that something has gone wrong, and the reasons for that can span a wide range. The described group of test cases is one of them.