
I dream of a "Parameterized" Task Chain

XaviPolo
Active Contributor

Hello,

Yes, I dream of a "parameterizable" task chain, where I can feed a variable with, for example, a company code, and have that value passed as a parameter to the data flows, which then pass it on to the views and finally to the data sources.

Now that I've woken up from my dream 😂, I have to see how I can emulate this (something so basic in an ETL) in Datasphere.

Since Task Chains have no parameters, and parameterized views cannot be used in Data Flows, it gets complicated.

I have thought of using a local table as a way to pass parameters between views. That means the task chain has a data flow that inserts the company code to be loaded into this table, and the next data flows and views in the task chain read that table to know what they need to load (roughly as in the sketch below).
BUT how do I manage multiple executions? For example, while one company is loading, another execution tries to load a different company, and so on.
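
A rough sketch of what I mean (table, column and source names are invented for illustration):

  -- helper table that holds the "parameter" for the current load
  CREATE TABLE ETL_PARAMS (PARAM_NAME NVARCHAR(30), PARAM_VALUE NVARCHAR(10));

  -- first data flow of the task chain writes the company code to be loaded
  INSERT INTO ETL_PARAMS VALUES ('COMPANY_CODE', '1000');

  -- later views / data flows join or filter on it to know what to load
  SELECT s.*
    FROM SALES_SOURCE s
    JOIN ETL_PARAMS p
      ON p.PARAM_NAME = 'COMPANY_CODE'
     AND s.COMPANY_CODE = p.PARAM_VALUE;

If a second task chain inserts '2000' while the first one is still running, both executions see both rows, which is exactly the problem.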

In SAP Data Services (SAP's on-premise ETL) we can use the JOB_RUN_ID, which is a unique identifier of the job execution. If we save the job_run_id together with the value in a table, every step of that job can read the table correctly by filtering on job_run_id. So process 1 loads one company and process 2 loads another, and everything works fine.
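
For comparison, the Data Services pattern would look roughly like this (illustrative SQL only; in Data Services the run id comes from the built-in job_run_id function, shown here simply as literals):

  -- same helper table, but keyed by the run identifier
  CREATE TABLE ETL_PARAMS (JOB_RUN_ID INTEGER, PARAM_NAME NVARCHAR(30), PARAM_VALUE NVARCHAR(10));

  -- run 4711 loads company 1000 while run 4712 loads company 2000 in parallel
  INSERT INTO ETL_PARAMS VALUES (4711, 'COMPANY_CODE', '1000');
  INSERT INTO ETL_PARAMS VALUES (4712, 'COMPANY_CODE', '2000');

  -- every step of run 4711 filters on its own run id and never sees the other run's value
  SELECT s.*
    FROM SALES_SOURCE s
    JOIN ETL_PARAMS p
      ON p.JOB_RUN_ID   = 4711
     AND p.PARAM_NAME   = 'COMPANY_CODE'
     AND s.COMPANY_CODE = p.PARAM_VALUE;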

BUT in Datasphere there is no JOB_RUN_ID, or at least there is no SQL function to get it and use it in a data flow or view, so I can't tell different processes or executions apart. I tried using the SESSION context and other HANA DB functions, but without success.
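
Just to illustrate the kind of probes I mean (standard HANA SQL, shown only as examples; none of them gives me a usable run identifier from inside a data flow or view):

  SELECT CURRENT_CONNECTION FROM DUMMY;                  -- connection id of the database session
  SELECT SESSION_CONTEXT('APPLICATIONUSER') FROM DUMMY;  -- session variable set by the client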

Any ideas, inspiration, experience, etc. with doing something like this in Datasphere? Am I missing something?

Thanks!

MKreitlein
Active Contributor

Hello XaviPolo,

I think this is the right space for your dream 😉

https://influence.sap.com/sap/ino/#/campaign/3433/ideas-all

BR, Martin

XaviPolo
Active Contributor
Yes @MKreitlein, I have already registered a few 🙂 But I need a short-term solution for something that, from my point of view, is basic in an ETL. Right now I have projects with a very significant proliferation of objects due to the lack of this functionality, and the maintenance is going to be a headache. That's why I have some urgency. Regards,
SiarheiShydlou
Explorer

Hello Xavi,

The behavior you described can be achieved using Data Flow input parameters. This means that you will need to create a separate Data Flow for each parameter default value and then combine them in a Task Chain.

For the corresponding ETL logic to be reusable across the different Data Flows, you can create a SQL view (see the sketch below).
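
A minimal sketch, assuming a sales scenario (object names are invented; in Datasphere the shared logic would be a SQL view and the fixed filters would sit in the individual Data Flows):

  -- reusable view with the common ETL logic, without any company filter inside
  CREATE VIEW V_SALES_COMMON AS
  SELECT COMPANY_CODE, MATERIAL, AMOUNT
    FROM SALES_SOURCE;

  -- Data Flow "Load company 1000" filters V_SALES_COMMON with its input parameter default '1000',
  -- Data Flow "Load company 2000" does the same with '2000', and the Task Chain runs both.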

XaviPolo
Active Contributor

You cannot use a view with parameters in a data flow, so you cannot pass the parameter from the DF to the view.

You could use the parameter in the DF to filter a view without parameters, but then you lose functionality and control over where the filter is applied, and you risk, for example, the filter not being pushed down to the remote source, creating performance issues, etc.

The ability to pass input parameters (IPs) between views allows exactly this: controlling WHERE the filter is applied and facilitating the PUSHDOWN. Since this is not supported in the DF, all these advantages, which are necessary in many ETLs, are lost.
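
To make the pushdown point concrete (illustrative, simplified SQL; :IP_COMPANY_CODE stands for the view's input parameter, object names are invented):

  -- filter inside a parameterized view: the predicate can travel down to the remote source
  SELECT COMPANY_CODE, MATERIAL, AMOUNT
    FROM REMOTE_SALES
   WHERE COMPANY_CODE = :IP_COMPANY_CODE;

  -- filter applied only in the data flow, on top of a view without parameters:
  -- the remote source may be read in full before the filter is applied
  SELECT *
    FROM V_SALES_ALL_COMPANIES
   WHERE COMPANY_CODE = '1000';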

Regards,