Hello, I have a question about what happens when an error occurs during a data load in SAP Datasphere. If we use a data flow to load data into a DSP local table, does all of the processed data operate within a single database transaction, and if it fails, is ...
I have been contributing to the Datasphere community for a few years now. As one of the users of the first version of DSP, having participated in betas such as Data Flows, I have tried to help all those who had doubts about the product, and of course...
Hi, After having created dozens of websites, and a couple of communities with thousands of users, I find myself unable to use SAP Community fluently. I want to find information on how to use a particular functionality of an SAP product. There is no "ch...
I am having a problem trying to delete rows from a local table with the data editor. I always get INTERNAL SERVER ERROR. After some research, I think the problem is that the table has a field with a technical name that is "GROUP", and I guess that the ...
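If the hypothesis is right, the failure would come from GROUP being a reserved SQL keyword that the generated statement does not quote. A minimal sketch of the mechanism (using SQLite here, not HANA/Datasphere, so the exact error differs, but reserved-word behavior is analogous):

```python
import sqlite3

# Hypothetical illustration: a column named GROUP breaks generated SQL
# unless the identifier is quoted, because GROUP is a reserved keyword.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE t ("GROUP" TEXT, val INTEGER)')
conn.execute("INSERT INTO t VALUES (?, ?)", ("A", 1))

# Unquoted identifier: syntax error, since GROUP is reserved.
try:
    conn.execute("DELETE FROM t WHERE GROUP = 'A'")
except sqlite3.OperationalError as e:
    print("unquoted fails:", e)

# Quoted identifier: the delete works as expected.
conn.execute('DELETE FROM t WHERE "GROUP" = ?', ("A",))
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])
```

If that is indeed the cause, the workaround would be renaming the field or waiting for the editor to quote technical names properly.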
Hello, Yes, I dream of a "parameterizable task chain", where I can feed a variable with, for example, a company code, and have that value passed as a parameter to the data flows, which pass it to the views and finally to the data sources. Now that ...
I think the question is what the system considers a transaction: the whole process, or each one of the batches. From the tests, it is clear to me that the whole process is not managed as a single DB transaction. Actually this is fine...
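The difference between the two scopes can be sketched as follows (assumed behavior for illustration, simulated with SQLite rather than the actual Datasphere engine): with per-batch commits, a failure mid-load leaves the earlier batches persisted.

```python
import sqlite3

# Sketch: per-batch commits. A failure in batch N rolls back only batch N;
# batches 1..N-1 were already committed and survive.
def load_in_batches(conn, rows, batch_size, fail_at=None):
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        for i, r in enumerate(batch, start=start):
            if fail_at is not None and i == fail_at:
                conn.rollback()          # undo only the current, uncommitted batch
                raise RuntimeError(f"simulated failure at row {i}")
            cur.execute("INSERT INTO tgt VALUES (?)", (r,))
        conn.commit()                    # per-batch commit: earlier work persists

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt (v INTEGER)")
try:
    load_in_batches(conn, list(range(30)), batch_size=10, fail_at=25)
except RuntimeError:
    pass
# Two full batches (20 rows) survived the failure in the third batch.
print(conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0])  # 20
```

If the whole process were one transaction, the same failure would leave zero rows in the target.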
@TuncayKaraca ok, I'll do the tests. TL;DR: Commits are done for each block defined by the fetch size. Test definition: The test consists of loading a CSV with 30,000 rows by means of a Data Flow, and provoking an error in the middle of the process, to see...
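Under that conclusion, the number of surviving rows is predictable from the fetch size alone. A quick sketch of the expected arithmetic (the fetch size and failure point below are assumed values for illustration, not the ones from the test):

```python
# Expected outcome under per-batch commits: every fully processed block of
# `fetch_size` rows is committed; the block containing the failure is lost.
fetch_size = 1000
failing_row = 15_500                               # error hits mid-batch
committed = (failing_row // fetch_size) * fetch_size
print(committed)  # 15000 rows persisted; the failing block is rolled back
```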
@TuncayKaraca it's a Data Flow thing. Python itself has no limit (beyond addressable memory). This is something I have been able to verify myself since the first version of the DFs. You can test it using the following code:

def transform(data):
    data['row_num...
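The original post is cut off mid-statement. A plausible completion (an assumption on my part, since Data Flow script operators work on pandas DataFrames) would number the rows passing through the transform:

```python
import pandas as pd

# Hypothetical completion of the truncated transform: add a running row
# number column to each batch handed to the script operator.
def transform(data):
    data['row_number'] = range(1, len(data) + 1)
    return data

df = transform(pd.DataFrame({'val': ['a', 'b', 'c']}))
print(df['row_number'].tolist())  # [1, 2, 3]
```

Watching where the numbering restarts in the target table would then reveal how the data is batched.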