Dear all,
The instance created on GCP set up two storages: (1) CLOUD_STORAGE, with the description Google Cloud Storage, which works fine; and (2) DI_DATA_LAKE, where the artefact producer stores the model.
I noticed the DI_DATA_LAKE was created as S3 c...
Dear all,
We tried to create a simple training pipeline (in a trial account) as in the image above using the Python producer template, but we get the error below.
"Graph failure: operator.com.sap.system.python3Operator:python3operator1: Error while...
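For context, that graph failure is raised from the callback the Python3 operator registers on its input port. Below is a minimal sketch of the shape such a script takes; the real `api` object is injected by the DI runtime, so the `_ApiStub` class, the `on_input` function, and the port names here are purely illustrative stand-ins, not SAP's implementation:

```python
# Minimal sketch of a Python3 operator script in SAP Data Intelligence.
# The real `api` object is injected by the DI runtime; the stub below is a
# hypothetical stand-in so the example runs standalone.

class _ApiStub:
    """Illustrative stand-in for the runtime-injected `api` object."""

    def __init__(self):
        self._callbacks = {}
        self.sent = []

    def set_port_callback(self, port, callback):
        # In DI, api.set_port_callback registers `callback` for `port`.
        # An exception raised inside the callback is what surfaces as
        # "Error while executing callback registered on port ...".
        self._callbacks[port] = callback

    def send(self, port, data):
        # In DI, api.send writes `data` to the named output port.
        self.sent.append((port, data))

    def _deliver(self, port, data):
        # Local test helper only; not part of the real API.
        self._callbacks[port](data)

api = _ApiStub()

def on_input(data):
    # Any unhandled exception raised here produces the graph failure
    # quoted above; the operator itself does not catch it.
    api.send("output", data.upper())

api.set_port_callback("input", on_input)
```

So when debugging this error, the first place to look is the body of the registered callback (here `on_input`), since any exception it raises is reported against the port rather than the offending line.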
Dear all,
We managed to create an instance for SAP DI using my paid GCP (Google) account, and we are able to work on the system through RDP as an administrator.
Now, we would like our team members to work on the same instance to create a POC, as we have a...
Dear all,
I tried to create an SAP DI instance in GCP with my GCP service account, and SAP CAL returned an error message saying I need to increase my quota to 26 vCPUs, beyond the 24 vCPUs available in GCP.
Anyone who has created the instance succ...
Hi Dimitri, I tried the same as you said above, but we get the error message below in the pipeline on the python3 operator. Graph failure: operator.com.sap.system.python3Operator:python3operator1: Error while executing callback registered on port...
Hi Dimitri, thank you for your continued support. It is the same issue: making the DI_DATA_LAKE connection in the trial version work with the SDL type. The standard artefact producer points to SDL, which is in DI_DATA_LAKE, so we are trying the same. I just mentioned tha...
Dimitri, thank you for pointing out the correct tab. Unfortunately, the JSON file looks OK with "type":"SDL" and I didn't have to change it. So the problem remains: we can't produce our model and are stuck with our first simple pipeline. (1) Please let u...
Dear all, we faced the same issue in our trial version. We are not able to use the DI_DATA_LAKE connection, and we don't have an 'option' (the button is not displayed) to correct the connection by exporting and importing it, as mentioned in one of the posts, as in ...