Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
former_member439824
Contributor
When you want to expose data residing in an SAP HANA database to the outside world, the recommended best practice is to use OData.
Recently, SAP started promoting a new Cloud Application Programming model (CAP). The SAP Cloud Application Programming model is a framework of languages, libraries, and tools for building enterprise-grade services and applications. It guides developers along a ‘golden path’ of proven best practices and a great wealth of out-of-the-box solutions to recurring tasks.

CAP-based projects benefit from a primary focus on the domain, instead of delving into overly technical disciplines.

In this blog, I will use the SAP Cloud Application Programming Model to create tables on SAP HANA Cloud and expose these tables as OData services. With this method, data is exposed using OData v4.0, as opposed to the traditional xsodata method, where data is exposed using OData v2.0.

References:
Combine CAP with SAP HANA Cloud to Create Full-Stack Applications
SAP Experience academy (SAP Internal)
CAP Getting started guide



Prerequisites



Get your development environment ready


When your SAP HANA Cloud instance is set up and you are ready to start, open the Subscriptions section of your subaccount and click on SAP Business Application Studio.


Create a new Dev Space.


Select the SAP Cloud Business Application Template and provide a name for your Dev Space.


Wait until the status changes from Starting to Running, then click on the tile with the Dev Space name. In the background, the Dev Space has been prepared with all of the necessary components that you would otherwise have to install on your laptop, such as Node.js and the CDS tooling.



Create a CAP Project from a template


Now that Business Application Studio is started, configured, and ready for use, go to the Welcome tab and click on "Create project from template".


Select the @sap/cap Project template.


Check the hana box in order to include SAP HANA-related features in your project.


Behind the scenes, your project will be generated. Once complete, a pop-up message box appears in the bottom right offering to open a workspace with your project. Click on the Open in New Workspace button.


The editor will reopen in a new workspace, and now you can start creating.
Notice the blue bar at the bottom of your screen indicating that no Cloud Foundry space has been set yet.

Click on this bar to connect Business Application Studio to the space where you want to deploy your OData service.


 

Enter the Cloud Foundry endpoint, then enter your credentials and select the space in which you want to work.



Create your database model


Now that Business Application Studio is connected to your Cloud Foundry space, let's create the objects in your database model.

From the file explorer on the left, right-click on the db folder and create a new file ending with .cds .
I call mine schema.cds. The name of the file can be anything; this file will define all of the objects (tables, views) deployed in your HANA database.


Within the new schema.cds, create your first CAP structure:
namespace scp.cloud;

using {
  cuid,
  sap.common
} from '@sap/cds/common';

entity SafetyIncidents : cuid {
  title       : String(50)   @title : 'Title';
  description : String(1000) @title : 'Description';
}


In this example, we are defining the namespace scp.cloud.
We then import the library @sap/cds/common and use its cuid aspect, which automatically defines an ID key column for us in the entity SafetyIncidents. Learn more about aspects in the CAP documentation.
An entity defined in CAP will be deployed as a table in your database.
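
For reference, the cuid aspect that ships with @sap/cds/common is roughly equivalent to the following definition (a simplified sketch, not the exact library source):

// Simplified sketch of the cuid aspect from @sap/cds/common
aspect cuid {
  key ID : UUID; // CAP generates a UUID value for each new record
}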

Open a terminal window by going to the Top Menu and selecting Terminal -> New Terminal.


Within your project folder, execute the command npm install


Now we will introduce you to a very useful command: cds watch.
Whenever you feed your project with new content, for example by adding or modifying .cds, .json, or .js files, the server automatically restarts to serve the new content.

Execute cds watch in the terminal window.

As long as that command is running, each time you change your project files, the changes are picked up and the service is automatically restarted.



After a few seconds, the cds watch command generates your OData service.
It also creates the table from schema.cds in an SQLite database within your development environment.
Click on the button Expose and Open to see if your initial empty service gets rendered in the browser window.


It is still empty now.



Expose an entity as an OData service


Now that we have the SafetyIncidents entity defined, we can easily add a service definition to expose it as an OData service. Let's do that now! Create a new file within the srv folder called incidentService.cds.


Within incidentService.cds, enter the following code:
using scp.cloud from '../db/schema';

service IncidentService {
  entity SafetyIncidents as projection on cloud.SafetyIncidents;
}


The first line references the schema.cds file we created earlier. The service block then exposes cloud.SafetyIncidents through an OData service called IncidentService.


If you closed your preview tab, you can always re-open it by opening View: Find Command and searching for the command Ports: Preview. This will open a preview of the currently exposed ports.


Now let's insert some data into our table. Start by creating a new folder called data within the db folder.


Within that folder, create a file called scp.cloud.SafetyIncidents.csv with the following entries:
ID;title;description
067460c5-196c-4783-9563-ede797399da8;Broken machine;The printing machine is leaking
efec3e9f-ceea-4d17-80a7-50073f71c322;Software bug;The computer is on fire

The file name has to match the namespace (scp.cloud) and the name of the entity (SafetyIncidents) into which you want to insert data.


Double-check, as shown in the screenshot, that the data folder sits under the db folder, that the file name is spelled correctly, and that the column names in the CSV file match the entity definition.

If cds watch is still running, stop it and execute cds run in the terminal to ensure the data is imported into your SQLite table.

The message > filling scp.cloud.SafetyIncidents from db/data/scp.cloud.SafetyIncidents.csv tells you that data is being imported.


Now that it's running, you can open the service and click on the SafetyIncidents entry; you should see the following data:


You now have a table deployed in the SQLite database within your development environment, filled with some test data. This table is exposed through an OData service that can be accessed from the outside through REST calls.
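
As a quick sanity check, you can also call the service with any HTTP client instead of the browser. Here is a sketch using curl, assuming the default local port 4004 used by cds watch and a service path derived from the service name (check the index page at the root URL if your path differs):

# List all safety incidents as OData v4 JSON
curl "http://localhost:4004/incident/SafetyIncidents"

# Read a single incident by its UUID key (taken from the CSV file above)
curl "http://localhost:4004/incident/SafetyIncidents(067460c5-196c-4783-9563-ede797399da8)"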

Deploy your data model and your OData service to SAP BTP


Now that you have the backend services running on SQLite in a local environment, it's time to get this project running on SAP HANA Cloud.

Quick recap



  • A schema for the incident management application has been created (schema.cds)

  • A service definition has been added to expose the correct entities (incidentService.cds)

  • The SQLite Node module lets us run the application connected to SQLite with data loaded into a table


Prepare your project for HANA Cloud


On SAP HANA Cloud, CDS models are deployed through the hdbtable and hdbview formats instead of hdbcds. Edit your package.json to set the deploy-format to hdbtable.
Add the following line to the "cds" section of package.json:
"hana" : { "deploy-format": "hdbtable" }

Your code should be similar to this screenshot:
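
If you prefer text over screenshots, the relevant part of package.json might look roughly like the sketch below. The dependency names and versions are placeholders from the project template and will differ in your project; only the "hana" entry inside the "cds" section matters here:

{
  "name": "cap_project",
  "dependencies": {
    "@sap/cds": "^4",
    "express": "^4"
  },
  "cds": {
    "hana": {
      "deploy-format": "hdbtable"
    }
  }
}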




Build your project


In the Node.js world, there is an environment variable called NODE_ENV. Until now, we have been using the "development" environment; it is time to switch that variable to "production", which affects the way CDS behaves. In order to deploy your project to SAP BTP, run the following commands from the terminal window.

  1. Stop your running cds process with CTRL+C if it's already running.

  2. Execute: export NODE_ENV=production

  3. After this command runs successfully, execute: cds build/all --clean



This command will build all of the relevant HANA artifacts and place them in a newly created folder called gen. If you expand it, you should see two folders, db and srv. As you might expect, if you drill into the db folder you will see the HANA database artifacts, and the srv folder contains new files as well.
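
For orientation, the generated folder looks roughly like this; the exact file names depend on your CAP version, and the table artifacts are named after the CDS entities with dots replaced by underscores:

gen/
  db/
    package.json
    src/gen/
      SCP_CLOUD_SAFETYINCIDENTS.hdbtable
      ...
  srv/
    package.json
    ...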

Create your HDI Container and deploy objects


Once the build process has completed, you will execute three commands in succession in order to (1) create the HANA Deployment Infrastructure (HDI) container on Cloud Foundry, (2) deploy the HANA artifacts, and (3) deploy the service artifacts.
Notice that in your terminal, the build process tells you which command you need to run in order to create the HDI container.

Execute the following command: cf create-service hana hdi-shared cap_project-db
(The creation of the container can take a few minutes; wait for it to finish before executing the next step!)

This will create an HDI container called cap_project-db.

Note: In the screenshot below, I run the command cf create-service hanatrial hdi-shared cap_project-db, which means I am actually deploying on "HANA as a Service".
Use the hana parameter to deploy on HANA Cloud.
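
If several databases run in the same Cloud Foundry space, you can point the HDI container at a specific instance with the -c parameter, following the pattern from the SAP documentation referenced in the comments below (the GUID is a placeholder you need to replace):

cf create-service hana hdi-shared cap_project-db -c '{"database_id":"<hana-db-instance-guid>"}'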


Execute the following command: cf push -f gen/db -k 256M


This will deploy the generated hdbtable and hdbview objects to your HDI container.
This deployment can take a couple of minutes.


Execute the following command: cf push -f gen/srv --random-route -k 320M

This will deploy the Node.js application exposing your OData service.


If all three of these commands executed correctly, you should see a route specified towards the bottom of the terminal window. The --random-route option directs the process to create a random URL.

Once you find the route name that was generated uniquely for you, you can paste that URL into a browser to validate that it is running and available on the internet.


Open a web browser and paste your newly created route; when you open your entity, you should see a familiar screen that looks like this. This fully deployed service is now available on the internet and uses SAP HANA Cloud as its persistence layer.
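
If you missed the route in the push output, cf apps lists your deployed applications together with their routes, and you can then query the service from the command line as well. The host name and service path below are placeholders; use the route printed for your app and the path shown on your service index page:

cf apps

curl "https://<your-random-route>/incident/SafetyIncidents"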


Good job, you just deployed an OData service on SAP HANA Cloud!

Explore further


If you want to build a more complex OData service, here are some ideas to get started. I will use this service in my next blog, where I create a Fiori UI for an application in which users can report safety incidents. Explore the entity definitions below and the CAP documentation to learn how CAP makes your life easier.

Replace your schema.cds file with the following code:
namespace scp.cloud;

using {
  cuid,
  managed,
  sap.common
} from '@sap/cds/common';

entity SafetyIncidents : cuid, managed {
  title                  : String(50)                    @title : 'Title';
  category               : Association to Category       @title : 'Category';
  priority               : Association to Priority       @title : 'Priority';
  incidentStatus         : Association to IncidentStatus @title : 'IncidentStatus';
  description            : String(1000)                  @title : 'Description';
  incidentResolutionDate : Date                          @title : 'ResolutionDate';
  assignedIndividual     : Association to Individual;
  incidentPhotos         : Association to many IncidentPhotos
                             on incidentPhotos.safetyIncident = $self;
  incidentHistory        : Association to many IncidentHistory
                             on incidentHistory.safetyIncident = $self;
}

entity Individual : cuid, managed {
  firstName       : String @title : 'First Name';
  lastName        : String @title : 'Last Name';
  emailAddress    : String @title : 'Email Address';
  safetyIncidents : Association to many SafetyIncidents
                      on safetyIncidents.assignedIndividual = $self;
}

entity IncidentHistory : cuid, managed {
  oldStatus      : Association to IncidentStatus @title : 'OldCategory';
  newStatus      : Association to IncidentStatus @title : 'NewCategory';
  safetyIncident : Association to SafetyIncidents;
}

entity IncidentPhotos : cuid, managed {
  @Core.IsMediaType : true imageType : String;
  @Core.MediaType : imageType image : LargeBinary;
  safetyIncident : Association to SafetyIncidents;
}

entity IncidentsCodeList : common.CodeList {
  key code : String(20);
}

entity Category : IncidentsCodeList {}
entity Priority : IncidentsCodeList {}
entity IncidentStatus : IncidentsCodeList {}

Replace your incidentService.cds with the following code:
using scp.cloud from '../db/schema';

service IncidentService {

  entity SafetyIncidents as projection on cloud.SafetyIncidents
    { *, assignedIndividual : redirected to Individual };
  entity Individual as projection on cloud.Individual
    { *, safetyIncidents : redirected to SafetyIncidents };
  entity SafetyIncidentsNoImages as projection on cloud.SafetyIncidents
    { ID, createdAt, priority, incidentStatus, description };
  entity IncidentPhotos as projection on cloud.IncidentPhotos
    { *, safetyIncident : redirected to SafetyIncidents };
  entity IncidentHistory as projection on cloud.IncidentHistory
    { *, safetyIncident : redirected to SafetyIncidents };
  entity IncidentsByCategory as select from cloud.SafetyIncidents
    { count(ID) as categories : Integer, key category } group by category;

  @readonly entity Category as projection on cloud.Category;
  @readonly entity Priority as projection on cloud.Priority;
}
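
With this richer model, standard OData v4 query options work out of the box. Here are a couple of sketches, again assuming a locally running cds watch and a service path derived from the service name:

# Safety incidents with their assigned individual expanded
curl 'http://localhost:4004/incident/SafetyIncidents?$expand=assignedIndividual'

# Incident counts per category from the IncidentsByCategory view
curl 'http://localhost:4004/incident/IncidentsByCategory'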

Here are a few more advanced examples of using CAP to develop applications:

Maxime SIMON
34 Comments
Murali_Shanmu
Active Contributor
0 Kudos
Very useful. Thanks for sharing.
thomas_jung
Developer Advocate
0 Kudos
You said you are deploying this to HANA Cloud, but in your HANA HDI container instance creation you are using the hanatrial service:

cf create-service hanatrial hdi-shared cap_service-db

The hanatrial service is actually the older HANA as a Service offering, not HANA Cloud. Actually, the hana service in the create-service command should be used for the HANA Cloud trial as well. It is not just for production as you stated in this blog post.
former_member439824
Contributor
0 Kudos
Hello Thomas, thanks for the comment.

As you said, I deployed my HDI container on HANA as a Service as I did not have a HANA Cloud instance running.
I updated the blog to use the hana service in the create-service command, in order to stick to a HANA Cloud tutorial.

 
former_member666403
Active Participant
0 Kudos
Hi Maxime,

If you want to try it out with SAP HANA Cloud, you can sign up for trial. Alternatively, reach out to me and I can share more info.
stevenspronk
Explorer
0 Kudos
Hi Maxime,

 

Fantastic blog! It explains how to easily create a CAP-based application with a HANA Cloud DB. I have one remark:

When creating the HDI service you say:

Execute the following command: cf create-service hana hdi-shared cap_service-db

However, the name of the service should be cap_project-db.

You may want to change that.

Thanks!
JuanDK78
Participant
0 Kudos

When executing the create-service command, wouldn't it create an HDI instance under the SAP HANA Service rather than SAP HANA Cloud?

cf create-service hana hdi-shared cap_service-db

We are in a migration scenario, so I want to make sure the container is bound to the SAP HANA Cloud service.

The SAP documentation indicates this pattern should be used:

cf create-service hana hdi-shared SERVICE_INSTANCE -c '{"database_id":"hana-db_service_instance_guid"}'

https://help.sap.com/viewer/db19c7071e5f4101837e23f06e576495/LATEST/en-US/2863434ddda042b8b8011a3f24...

 

 

former_member711830
Discoverer
0 Kudos

Hi Maxime,

The blog is really wonderful. Could you please share sample data (CSV files for all tables) for the incident example?

 

Thanks

a srikanth

former_member439824
Contributor
0 Kudos
Thanks, I corrected the mistake
former_member439824
Contributor
0 Kudos
The -c parameter allows you to define the database ID if you have several databases running in the same space.
janithi
Participant
0 Kudos
How do I expose HANA Cloud DB tables and CDS views as XSOData services?
former_member1760
Participant
0 Kudos
This was very helpful 🙂 One question though, as we are in the process of implementing something similar. Just to have a clear idea about the possibilities after exposing the OData service: can it be consumed as a destination in Cloud Foundry and used in another app created on BAS?
former_member439824
Contributor
0 Kudos
Yes. After you deploy your OData service, it can be consumed via destinations in Cloud Foundry and from other apps.
0 Kudos

Hi,

I am trying to acquire data in SAP Analytics Cloud through OData services. I have carried out all the required steps in this blog. I wanted to know which authentication option (Basic Authentication, OAuth, no authentication) we need to choose when connecting it to SAC. Also, is the Data Service URL the same as the random URL generated through --random-route?

This is the dialog box in SAC 

former_member439824
Contributor
0 Kudos
CDS views need to be exposed as OData services through the ABAP layer of S/4HANA.
This blog does not cover S/4HANA.
former_member439824
Contributor
0 Kudos

Hello, in this blog I did not cover authentication. If you replicated my steps, authentication is not required to access your OData service.

If you want to set up authentication, you need to use token-based authentication (XSUAA).

0 Kudos
Hi, thank you for the reply. Could you please also tell me whether you have an idea about the Data Service URL?
Is it the same as the random URL generated through --random-route?


Is the "Application Routes" the Data Service URL to be used ?

former_member439824
Contributor
Yes, this is correct, as explained in the "Create your HDI Container and deploy objects" section of the blog.
0 Kudos

Thank you for the blog post. All steps were successful for me except for deployment of tables and views.

0 Kudos
Hi Ruchi,

Could you please check whether the HANA database instance in which your HDI container is deployed is running? I got the same error, and it worked for me once I started my database instance.
0 Kudos
My database is running. The HDI container was created; however, the tables/views were not deployed.
0 Kudos
Thanks for sharing the blog post. You have explained it well.

I am getting an error while running the command below.
cf push -f gen/srv --random-route -k 320M

Error details:
npm ERR! code ENOTFOUND
npm ERR! syscall getaddrinfo
npm ERR! errno ENOTFOUND
npm ERR! network request to http://nginx-redirector.repo-cache.svc.cluster.local/repository/appstudio-npm-group/yallist/-/yallis... failed, reason: getaddrinfo ENOTFOUND nginx-redirector.repo-cache.svc.cluster.local
npm ERR! network This is a problem related to network connectivity.
npm ERR! network In most cases you are behind a proxy or have bad network settings.
npm ERR! network
npm ERR! network If you are behind a proxy, please make sure that the
npm ERR! network 'proxy' config is set properly. See: 'npm help config'
npm ERR! A complete log of this run can be found in:
npm ERR! /tmp/cache/final/.npm/_logs/2022-02-09T01_13_22_945Z-debug.log
**ERROR** Unable to build dependencies: exit status 1
Failed to compile droplet: Failed to run all supply scripts: exit status 14
Exit status 223
Cell ********************** stopping instance *************************
Cell ********************** destroying container for instance *************************
BuildpackCompileFailed - App staging failed in the buildpack compile phase
FAILED
former_member439824
Contributor
0 Kudos
It seems to be a network connectivity error. Check that your terminal can reach the Cloud Foundry endpoint where you are trying to deploy the app: use "cf login" to log in, then "cf services" to list the available services.

If either command does not work correctly, check your network connectivity.

If both commands work, you should be able to use "cf push" to deploy apps. If you still cannot deploy your app, post details about your issue on answers.sap.com.
john_knight00
Explorer
Hi Vikas,

I had exactly the same problem when pushing the srv module. The solution was to add the following section to the package.json file:

  "engines": {
"node": "12.22.7",
"npm": "6.14.15"
}

You can get both versions by running these commands:

  • npm -v

  • node -v


Actually, the Node version in my case was 14.17.6, but when placing that value in the package.json file, the push command would crash and it offered me these versions:
**ERROR** Unable to install node: no match found for 14.17.6 in [12.22.6 12.22.7 14.18.0 14.18.1 16.10.0 16.11.1]

I picked the nearest version (i.e. 12.22.7).

After editing the package.json file, execute "cds build/all --clean". The push command should now work properly.

omoliu
Explorer
0 Kudos
Hi Juan,
Thanks for your comment soooo much!

I had exactly the same error and now it's been solved!

Anna
0 Kudos
Hi All,

 

I am also facing a similar issue, and even after trying every solution provided here, it's not working. Any help please?

 

Thanks

Kapil
former_member439824
Contributor
0 Kudos
Post details of your issue on https://answers.sap.com/index.html in order to get help from the community.
You can also raise an incident on the SAP portal as a customer/partner to get enterprise support.

Link this blog as a reference and explain at which step you are facing issues.
0 Kudos

..

jcabral
Explorer
0 Kudos
Hi Maxime, I'm having some trouble getting to the end of this blog. Could you give me some advice?

 

While running this command:
cf push -f gen/srv --random-route -k 320M

 

The service fails to start, and in cf logs --recent I can see the following error:
ERR [ERROR] Unable to require required package/file passport

 

I searched for this error but couldn't find anything that addresses this same issue.

 

Thanks in advance.

Juan.
former_member439824
Contributor
0 Kudos
From the error, it seems the Node.js package 'passport' cannot be imported.

Can you check whether the cds build/all --clean command correctly imported passport into your node_modules folder?

Dependencies are listed in the package.json file. If it is not imported correctly, a possible reason might be that the package.json is not up to date; you might need to increase the version of the 'passport' package.
jcabral
Explorer
0 Kudos
Thanks Maxime... it was a bit more tricky than that.

Only way I could make it work was:

  1. Run "npm add passport" in terminal

  2. Adding xs-security.json and modifying mta.yaml and package.json to use JWT and XSUAA as auth methods and services according to the project in the following link: https://github.com/SAP-samples/fiori-tools-samples/tree/3e7787764635f7454cc3b479af014926c91276e4/cap...

  3. Building the .mtar from the mta file and deploying the generated .mtar


After doing all that, the service app is correctly deployed and started. From that point on, any modification to the database or service can be pushed with the "cf push -f ..." command.

 

I have no idea why SAP made this so complicated... this is by no means easy to do or solve for a person who is new to CAP.
aoyang
Contributor
0 Kudos
My Node version is 14.17.6 and my npm version is 8.13.2, but configuring these values did not solve the issue. Strangely, I tried your versions and the push was successful. I don't know why, but thank you :)
yasuyukiuno
Active Participant
0 Kudos
Nice article!

After pressing the connect bar, I entered the CF API endpoint, but there is no response when I press the enter key.
Is there any solution?

mariana-nagy
Newcomer
0 Kudos

Hi Siddharth, Maxime,

I am facing a similar issue with importing data into SAC through the deployed OData services. I used the HANA Academy generator to create a simple CAP project with user authentication (no authorization) and successfully deployed the services.

I am also able to use Postman and test the OData service successfully; however, I have not had any luck connecting from SAP Analytics Cloud to my OData service with OAuth 2.0 Client Credentials.

I am getting an error "Connection to server failed: oData services"

Could you please share how you were able to create this import connection in SAP Analytics Cloud?

Thank you

 

Prashant_1986
Explorer
0 Kudos
Hi Simon,

I have tried exposing a calculation view through OData services using the CAP model. Yes, it is a good approach with a lot of flexibility, but while testing the POST method I am facing some issues: I am not able to see the inserted records in the calculation view.

I know I am not giving you any steps, but do we have any option to insert new records through the OData service of a calculation view? Also, if possible, we can connect via my Gmail.