This is the second part of a two-part tutorial. In this part we take the trained model that we created in the first part, upload it to Machine Learning Foundation, and use it. We will create three Python files, one that carries a bunch of helper func...
This is the first part of a two-part tutorial. In this part we create a TensorFlow model and save the trained model. In the second part we will take our saved model, upload it to Machine Learning Foundation, and use it. Our model will be a three-layer...
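As a rough illustration of what such a first part involves, here is a minimal sketch of building a small three-layer Keras model; the layer sizes, input dimension, and loss are illustrative assumptions, not the tutorial's actual values:

```python
import numpy as np
import tensorflow as tf

# Minimal sketch of a three-layer feed-forward model.
# Sizes (64, 32, 3) and the input dimension 4 are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Sanity check: one forward pass on random data
probs = model.predict(np.random.rand(2, 4).astype("float32"))
print(probs.shape)  # (2, 3)
```

After training with `model.fit(...)`, the model can be persisted with `model.save(...)` for the upload step in part two.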
Hallo, does anyone have experience with external context mapping? My example terminates with 'cannot create nodes, no mapping defined yet', but I do not know how to create the mapping between the inner and the outer component dynamically. Thanks, Oliver
Hallo, does anyone know how to display SAP icons on a Web Dynpro screen? I know that they are on the J2EE server, but I do not know how to reach them. Thanks, Oliver
To be honest, I had to ask a colleague to support me, as I hadn't used Keras before.
You need to access the TensorFlow layer and get the data from there. This can be done e.g. like this:
input_tensor_info = tf.saved_model.utils.build_tensor_info(mode...
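To make the idea above concrete, here is a minimal sketch of getting at the underlying TensorFlow tensors of a Keras model; the model itself is an assumption for illustration. With the TF 1.x `saved_model` API, these are the tensors one would pass to `tf.saved_model.utils.build_tensor_info(...)` as in the snippet above:

```python
import tensorflow as tf

# Illustrative Keras model; layer sizes are assumptions for this sketch.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])

# Keras exposes the model's underlying TensorFlow tensors directly:
input_tensor = model.inputs[0]
output_tensor = model.outputs[0]
print(input_tensor.shape, output_tensor.shape)
```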
Hallo Dhanu, you can use a type descriptor: DATA: my_name TYPE abap_abstypename.
my_name = cl_abap_classdescr=>get_class_name( me ).
Please see also the class documentation and the documentation about the <i>RTTS - Run Time Type Services</i>. Regards,...
Hallo Palani, for historical reasons not all clusters used in HCM have a type that is defined in the DDIC. At least the payroll clusters do. But usually there are includes which describe the data structure. There is a naming convention for the include name...