Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

SAP Developer Challenge - APIs - Task 4 - Discover the Date and Time API Package

qmacro
Employee

(Check out the SAP Developer Challenge - APIs blog post for everything you need to know about the challenge to which this task relates!)

It's time to take a look at what the SAP Business Accelerator Hub has to offer in terms of APIs.

Background

The SAP Business Accelerator Hub (formerly known as the SAP Business API Hub) is the central location for all things APIs.

While APIs are the key resources there, the hub contains other types of resource too, including events, integrations, business process content, and much more.

But we're interested in APIs, which are generally organized by product and, within a product, by package. For example, the SAP Business Technology Platform product has APIs, Events, Integrations, and Business Process and Workflow Management resources.

Selecting the APIs category, you'll see that the APIs are organized into categories (such as SOAP, OData V2, OData V4 and REST) and presented in packages. You can see this clearly on the packages page for the SAP Business Technology Platform product:

sap-btp-api-packages.png

Once you choose an API package, you will find one or more APIs, each of which has one or more endpoints, usually arranged into endpoint groups.

You may find that this diagram, taken from the Learning about the API structure, authentication and use section of Exercise 05 - Preparing to call a Core Services API in the Hands-on with the btp CLI and APIs SAP CodeJam material, serves as a helpful example of this:

+-------------+
|             |
| API Package |      Core Services for SAP BTP
|             |
+-------------+
       |
+-------------+
|             |
|     API     |      Entitlements Service
|             |
+-------------+
       |
+-------------+
|             |
|    Group    |      Regions for Global Account
|             |
+-------------+
       |
+-------------+
|             |
|  Endpoint   |      /entitlements/v1/globalAccountAllowedDataCenters
|             |
+-------------+

This example is from the Core Services for SAP BTP API package.

Your task

Your task is to find and explore the Date and Time API (which lives in its own dedicated package) on the SAP Business Accelerator Hub. It's in the REST category and has an API specification available in both YAML and JSON formats. Specifically, you should download the API specification in either of those two formats (your choice).

You should then write a script to parse the downloaded specification and determine the API endpoints ("paths") that have both of the following properties:

  • the endpoint is accessible with the HTTP GET method
  • responses returned from the endpoint are in JSON

You should take that list of paths, sort them, and join them together with colons into a single string, with no spaces. That single string is the value that you should then send to the hash function; post the resulting hash as a new reply to this discussion thread, as described in Task 0 - Learn to share your task results.

For example, if you determined that the endpoint paths that fit these conditions are:

  • /endpointOne
  • /endpointTwo
  • /anotherEndpoint

then the string you should produce and send for hashing would be:

/anotherEndpoint:/endpointOne:/endpointTwo
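To make the steps concrete, here's a minimal Python sketch (not an official solution) of the filter-sort-join pipeline. It assumes a Swagger 2.0-style JSON spec in which each path's GET operation lists its response types in a produces array; the filename date-time-api.json is hypothetical, so use whatever name you saved the downloaded spec under.

```python
import json

def matching_paths(spec: dict) -> str:
    """Sorted, colon-joined paths whose GET operation produces JSON."""
    matching = [
        path
        for path, ops in spec["paths"].items()
        # Keep only paths with a GET operation whose "produces"
        # list includes the JSON MIME type.
        if "application/json" in ops.get("get", {}).get("produces", [])
    ]
    return ":".join(sorted(matching))

# Hypothetical usage with the downloaded spec file:
#   with open("date-time-api.json") as f:
#       print(matching_paths(json.load(f)))
```

Run against the real downloaded spec, this prints the single string to send to the hash function.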

Hints and tips

In order to download API specifications from the SAP Business Accelerator Hub, you'll need to be logged in. You must download them manually via the browser (rather than retrieving them via another HTTP client).

Response representations are declared using MIME types, and the MIME type for JSON is application/json.

If you want to parse YAML, and you're a JavaScript fan, you may find the js-yaml module useful.

For discussion

The API specification is available in two formats - JSON and YAML. Often for APIs there is also a third format - EDMX. Why isn't that available here?

Which API specification format did you choose, and why? Do you prefer parsing YAML or JSON?

How did you apply the endpoint conditions described above, and in what language?

133 REPLIES

abdullahgunes
Galactic 3
2a23f73d8853d5eedef0239a8d303c15a9be84a57abd2e26794a638724aaf0f2


That was fast! 🙂

Afenna
Galactic 1

b2d99205831548e9de1cd05ae4f8dfb2a8b5ab2e231d8906f069ec5b99c890e1

bztoy
Galactic 4

657f3187f8af8cb9beddbf3c39466a945e228082395189d0cac2542a90a39468

I took a short break from my daily job to do this assignment, so I submitted my answer with one solution first; I'll cross-check and add more stuff later.

The API specification is available in two formats - JSON and YAML. Often for APIs there is also a third format - EDMX. Why isn't that available here?

I remember that we had a discussion about this in the b2b hands-on and you already gave us the answer, but my bad, I've forgotten it already. I will re-run that session again. 😂

IMO, the EDMX is written in full technical detail, and CDS can help compile the API into the EDMX format if needed.

Which API specification format did you choose, and why? Do you prefer parsing YAML or JSON?

In coding I prefer JSON because I'm familiar with the tools for working with it, but for reading, YAML is very nice.

How did you apply the endpoint conditions described above, and in what language?

I've done it with a bash script.

[Updated #1] I didn't read the instructions carefully and wrote a script that produced a wrong hash value. I fixed it with two versions of the code (hopefully I have the right answer now 😅).

I would like to share something that I've learned here.

task-4.png

Method 1: using jq functionalities

1. I transform the JSON input file into a new JSON structure with to_entries and map, as in the following code:

jq '.paths | to_entries | map({endpoint: .key, produce: .value.get.produces[0]})' < "$api_spec_json_file"

The above statement creates a JSON array of objects with two key fields, "endpoint" and "produce", like below:

{
    "endpoint": "/getTimezoneFromLocation",
    "produce": "application/json"
}

The key player is the map function, which helps transform the individual objects back into an array with new user-defined keys.

2. Then I use map, select, sort and join to produce a string of endpoints separated by : with the command below:

echo "$transform_json" | jq -r '. | map(select(.produce=="application/json").endpoint) | sort | join(":")'

I use the -r option to ask jq to return a raw string, pick only the records I want with select (this was my mistake in the first answer), sort the output, and then join the entries together with the join function.

If anyone is interested and would like to learn more about this topic, I highly recommend this blog by DJ Adams, which I found via a Google search. 😄😁

 

Method 2: Use jq and POSIX commands

Please note that I am a new UNIX/Linux user (but I really like it). If you're wondering why I use these commands instead of others, it's because I don't know the others yet 😅 (I keep learning Linux every day).

1. I extract the list of endpoints with the following chain of commands:

jq .paths < "$api_spec_json_file" | jq 'keys' | paste -s | sed 's/"//g' | sed 's/\[//' | sed 's/\]//' | xargs | sed -e 's/[[:space:]]\/*/\//g'

jq 'keys' helps me extract the keys from the JSON string, and as you can see, I have to do things like character replacement manually myself.

I get back a string of all API endpoints (the first-level keys of .paths) separated by commas, as below:

/getCountryDateFormat,/getIetfDetailsFromTimezone,...

2. I read the above string into an array:

IFS=', ' read -r -a array <<< "$api_endpoints"
3. I loop through all the items, then get item 0 of the produces array for each particular endpoint from the input file (again).

Yes, reading the data from the file multiple times doesn't look sensible, but I had to use this approach because of an issue with the endpoint names.

Each API endpoint name starts with a special character (/), so I have to escape it with double quotes before I can read that JSON property via jq. In my case, the variable key holds the endpoint name with the escape characters.

However, if I want to chain jq commands, they go inside a pair of single quotes (as far as I've learned from the official documentation). I found that I couldn't chain the property name in the variable key (which contains double quotes) with other jq functions, because the mix of single and double quotes creates invalid syntax.

        key="\"$element\""
        data=$( jq -c -r .paths."$key".get.produces < "$api_spec_json_file" | awk '{printf $0}' | sed 's/\["//g' | sed 's/"\]//g' )

I might have done it wrong and will look into it later, but my workaround was to ask jq to do just one thing (no quotes needed) and then pipe the output to other POSIX commands like awk and sed to do the job instead, as you can see in the command above.

The variable data contains the type of data the API will send back; I use an if statement to collect the endpoints that return application/json into the output string before calling the hash API to generate the final output.

This was my journey with bash and jq in this task. 😊


I will have fun with this assignment in TypeScript and Go this weekend.

Thanks,

Wises

Thanks for sharing your thoughts here, @bztoy . If you (and others) remember, EDMX is used for specific types of RESTful APIs ... (that's a clue btw ;-))

Regarding your comments on YAML vs JSON, I agree - reading and writing YAML is (for me) a little easier on the eye and brain (did you know that the official YAML website has its main contents ... in YAML? 🙂)

qmacro_0-1691758795363.png


But the biggest plus for YAML is that you can have comments, plus there's a reference mechanism too so you don't have to repeat data that you want to use in different areas.
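To illustrate that reference mechanism with a hand-written snippet (not taken from the Date and Time spec): a YAML anchor (&) names a node once, and an alias (*) reuses it elsewhere:

```yaml
# Anchor: define the value once...
defaults: &common-produces
  - application/json

getCountryDateFormat:
  produces: *common-produces   # ...and alias it wherever it's needed again

getTimezoneFromLocation:
  produces: *common-produces
```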

(Also, a very minor brag: I was in the room when YAML was coming into the world. I was at a Perl conference way back (over 20 years ago now, gosh!) when YAML's co-creator Ingy came up with what became YAML as we know it today, more or less during a conversation / session we were having there.)

That said, I like the machine-like simplicity and rigidity of JSON, and of course I like JSON as I can parse it with my current favourite programming language jq!


 

(did you know that the official YAML website has its main contents ... in YAML?

This is really cool. 😎

 

But the biggest plus for YAML is that you can have comments,

I do agree that one of JSON's drawbacks is that comments are not supported.

 

plus there's a reference mechanism too so you don't have to repeat data that you want to use in different areas.

This is new to me; I've only just learned about it. It sounds fantastic, I have to learn YAML right now. 😉

 

That said, I like the machine-like simplicity and rigidity of JSON, and of course I like JSON as I can parse it with my current favourite programming language jq!

😎

 

Thanks DJ for the knowledge you spread around the blog (as always).

AAncos
Galactic 2

9aabbb9c6c040e33e1ad591d7b85152ef8c4606d19b8ef10e08c81561b76098f

SSchuck
Employee

d2f7ac29c29841c3e387b756732d1a8a96fd1998f8a4e6b6478d2592b2a78057

govardhansahil
Galactic 3

2baef2a2ace253804005db6fc4b6727268c77a040b323e82c0806573f9be6692

harsh_itaverma
Galactic 4

7b367390432c1f903a4e3c1c9b3ef2b7eae7b1cb28dec72d452515f881476827

Petchimuthu_M
Employee

adefda62027144dde101ca42a94b85cfad5202d2ef648479389371f0b37da39a

emiliocampo
Galactic 3

08efec35700dfb2ed380274ffd439b535b6a9e242ad6834839d5633e768e915c

prachetas
Galactic 2

7dd0abf3ef14706bf4db5c73135599fa249cc2890d36774a23b08591763591e5

ajmaradiaga
Employee

3bc8f1632c1fbaa7881dec56cbda74f7dd769665bf3456ff425af0c7803c14fd

I tried using ChatGPT again... it does produce a string in the specified format, but it includes all paths. I did a manual check and had to remove the paths that don't produce application/json.


I'm absolutely fascinated by ChatGPT's ability to do what it does here - what are your thoughts generally on this?

berserk
Galactic 3

Loving this mini-learning series by the way @qmacro.


💙 Thanks @berserk ! BTW, be sure to move the hash in your reply to a separate reply, otherwise it won't be counted (remember: hash replies should just contain the hash and nothing else at all). 

berserk
Galactic 3

ac120142b7ada67a18025e64b5244ee1187be16d7527834c1338229cb724a904

RaulVega
Galactic 3

aa577776e777a0b574d6c38b74cc374b8d72e2debc61997868b543e9fab2ab5f

harsh_itaverma
Galactic 4

The API specification is available in two formats - JSON and YAML. Often for APIs there is also a third format - EDMX. Why isn't that available here?

This is all I could understand from the internet and connect with the earlier tasks (I could be completely wrong).
The API specification document would have been available in EDMX format if the API supported more than just the standard HTTP data exchange verbs like GET, POST, and so on.
For example, in our first task, when we listed the entity sets, we used the service document representation of the API and got all the entity sets. But when we had to find the product using NorthBreeze for each one of us, we used the metadata document to understand the "Action". The API specification can be exposed in two ways: the service document (here: JSON, YAML, using the Atom protocol) and the XML document (here: EDMX), which describes the data model [for OData we have an entity data model, and EDMX is the XML representation of it].

Which API specification format did you choose, and why? Do you prefer parsing YAML or JSON?
JSON, as I found it pretty easy to parse and use.
I did try using YAML but was getting errors in BAS while loading it.

How did you apply the endpoint conditions described above, and in what language?
I used JavaScript, and the approach was:
1. Load the JSON representation

2. Extract "paths" and, using Object.entries(), parse each entry into an array

3. Check whether there is a 'get' value and whether the 'produces' array has an 'application/json' entry, and push all such path values into an array

4. Use the array function sort

5. Use the array function join with ':' as the separator to get a single string out of it


I would say that EDMX is unavailable because it is not an OData service.

Note: If you only have an EDMX, you can easily convert it to an OpenAPI spec using the OData OpenAPI converter.


True that.

Date and Time is just a REST API and not an OData API, hence no EDMX.

While replying I got more intrigued about why OData doesn't have just EDMX, the way SOAP has just WSDL. For REST APIs we have YAML and JSON formats, and for OData we have JSON and YAML too, in addition to EDMX.

And the OpenAPI converter answers that!

Wow, not seen "WSDL" for a long time. It brings back some "interesting" memories of the dim and distant days when we fought with SOAP interoperability issues. And WSDL just led (IMHO) to WS-Deathstar in the end 😉

Nice detail @harsh_itaverma ! Interesting thoughts about the EDMX, but I have to agree with @ajmaradiaga in that it's because basically the API is not an OData API and doesn't support the metadata document standard.

stickman_0x00
Galactic 3

aba788fea9a26dda48e272bb7ce0c667755781ab8cd8d076b5e26f0ddc8d3164

seVladimirs
Galactic 4

a260add4e4133701ef30dbac19e10f76ea4d173b9f73c6696d32c08398617099

OK, so many different variants, but my answer matches only @ajmaradiaga's so far. Hey all, did you do the sorting? 🙂

Odd, I tried with the CommunityID of a few others, and it does match...

My bad, you are right - something was wrong with my GET request.


I'm one of those who submitted a wrong hash in their first answer 😂

SandipAgarwalla
Galactic 4

1c00e1b31cdde0c81ec90ff480db1aa6451d361cfa13ba5d697d88ff1dfa85ee

dan_wroblewski
Employee

edce6e2e5dbf75febed7a190c3534e6aac23c0ba02cde109eb879dc8dfa5a37e




--------------
See all my blogs and connect with me on Twitter / LinkedIn

dan_wroblewski
Employee

A little rusty, since I'm now immersed in SAP Build but here was my quick routine.

import json

f = open ('DateAndTime.json', "r")
data = json.loads(f.read())
f.close()

paths = data["paths"]
mypaths = []
for path in paths: 
    if ("get" in paths[path].keys()):
        if ("application/json" in paths[path]["get"]["produces"]):
            mypaths.append(path)
print(':'.join(mypaths))

I am assuming that "produces" will always be a list, and that the challenge meant that the path had a GET method and that GET method produced JSON.

Still, I hope I got it right.

P.S.: I actually did it manually first in order to submit something quickly, and then created a script to parse the JSON.

I think you forgot to sort the values 🤔


It was good that I published my code 😺 Thanks for correcting me!


Thanks for sharing the code, @dan_wroblewski !

To your comment "I am assuming that 'produces' will always be a list": there is a way to find out for certain.

Can anyone here in this thread (perhaps apart from @ajmaradiaga ;-)) tell us how? Where would you look to find out?

 

@qmacro : I too got stuck on this while writing the JavaScript function. I checked it before I proceeded.

1. The @Produces annotation specifies the list of the media types produced by a particular API or class.

2. In the JSON file, even if there is a single entry, it is returned as an array.

Also, on the API reference screen, under Responses, the response content type's value is bound to 'produces' and it is a drop-down; being a UI developer, I was pretty sure it had to be a list! 🙂
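As a defensive-coding footnote (an illustrative sketch, not part of anyone's posted solution): if you didn't want to rely on produces always being a list, a tiny helper can normalize both shapes before checking for the JSON MIME type:

```python
def produces_json(operation: dict) -> bool:
    """True if the operation declares application/json among its response types.

    Accepts either a list of MIME types or (defensively) a single string.
    """
    produces = operation.get("produces", [])
    if isinstance(produces, str):
        # Wrap a bare string so membership testing is a list lookup,
        # not an accidental substring match.
        produces = [produces]
    return "application/json" in produces
```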