Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

May Developer Challenge - SAP AI Services - Week 3

noravonthenen
Developer Advocate

CHECK OUT WEEK 4 OF THIS CHALLENGE

Welcome to week 3 of the AI Developer Challenge! I am so excited about every single submission! Now, I don't know about you, but in my family we love avocado! My youngest especially could eat it all the time. Let's use the Data Attribute Recommendation service to build an avocado price prediction engine! With Data Attribute Recommendation you can train regression as well as classification models, and you can use SAP AI Launchpad, Postman, Swagger, or the AI API Python SDK to implement your use case. Data Attribute Recommendation requires you to specify a model template or a business blueprint, which determines what kind of algorithm is used.

For this challenge we will use the regression model template to implement an avocado price predictor! If you do not like avocado, feel free to come up with another use case and search for a fitting dataset (e.g. here).

This week we start by preparing and uploading our dataset schema and our dataset! For that we will use the AI API, which is a standardized way to interact with different AI runtimes and services such as SAP AI Core or Data Attribute Recommendation.
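For anyone curious what the token step of the Postman collection does under the hood, here is a minimal Python sketch of the client-credentials request it sends. This is an illustration only: the URL and credentials are placeholders for the values from the uaa section of your service key, and the snippet just assembles the request without sending anything.

```python
import base64
import urllib.parse

# Placeholders: take these from the uaa section of your DAR service key.
AUTH_URL = "https://<subdomain>.authentication.<region>.hana.ondemand.com"
CLIENT_ID = "<clientid>"
CLIENT_SECRET = "<clientsecret>"

def build_token_request(auth_url, client_id, client_secret):
    """Assemble the XSUAA client-credentials token request
    (equivalent to the 'Get XSUAA OAuth Token' Postman request)."""
    token_url = auth_url.rstrip("/") + "/oauth/token"
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": "Basic " + creds,
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({"grant_type": "client_credentials"})
    return token_url, headers, body
```

Sending `body` as a POST to `token_url` with those headers returns a JSON document whose `access_token` field is the bearer token the other requests need.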

  1. Go to your BTP trial account and create a DAR instance (you can use the booster): https://developers.sap.com/tutorials/cp-aibus-dar-service-instance.html
  2. Download this dataset
  3. Download the postman collection for the Data Attribute Recommendation service and import the folder to Postman
  4. Go to the environment tab on the left and add the information from your DAR service key from step 1.
    1. authUrl = url from the uaa section in your .json file
    2. url = url from your .json service key
noravonthenen_7-1715759911268.png
  5. Make sure the correct environment is selected, then go back to the Collections tab and create a token with the Get XSUAA OAuth Token GET request in the Authorization folder. If everything is set up correctly, it will take the url, username, and password from the environment variables you configured in step 4. It will also assign the token to the correct variable in the environment for further use. Repeat this step whenever the token has expired.
  6. Create a dataset_schema.json file
  7. Go back to the Environments tab and assign a folder name (e.g. avocado_price_predictor), a datasetSchemaFileRegression name (e.g. dataset_schema.json), and a datasetFileRegression name (e.g. avocado_data.csv). Also assign scenarioIdRegression = '35cd9fbc-7290-4042-b6df-178d74c67363' or find the correct Scenario ID here.
  8. Upload a dataset schema with the PUT Upload Dataset Schema Regression request and make sure to upload your dataset_schema.json
  9. Upload this dataset with the PUT Upload Dataset Regression request
  10. Create a dataset schema artifact with the POST Create Dataset Schema Artifact Regression request (the required values were assigned automatically to your environment)
  11. Create a dataset artifact with the POST Create Dataset Artifact Regression request (the required values were assigned automatically to your environment)
  12. Learn more about AI API and the underlying concepts such as artifacts
  13. Submission is again a screenshot of the last POST request and the correct returned result:
noravonthenen_6-1715759819554.png
  14. Stay tuned for next week to train your model!
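For step 6, a dataset_schema.json for the avocado dataset could be generated like this. The features/labels layout follows the DAR dataset-schema format, but please double-check the exact field names and the types (NUMBER, CATEGORY, TEXT) against the DAR documentation for your service version; the column names come from the dataset's CSV header, and the schema name is just an example.

```python
import json

# Sketch of a DAR dataset schema for regression: every input column is a
# feature, and the target column (AveragePrice) is the label.
schema = {
    "features": [
        {"label": "Date", "type": "CATEGORY"},
        {"label": "TotalVolume", "type": "NUMBER"},
        {"label": "PLU4046", "type": "NUMBER"},
        {"label": "PLU4225", "type": "NUMBER"},
        {"label": "PLU4770", "type": "NUMBER"},
        {"label": "TotalBags", "type": "NUMBER"},
        {"label": "SmallBags", "type": "NUMBER"},
        {"label": "LargeBags", "type": "NUMBER"},
        {"label": "XLargeBags", "type": "NUMBER"},
        {"label": "type", "type": "CATEGORY"},
        {"label": "year", "type": "NUMBER"},
        {"label": "region", "type": "CATEGORY"},
    ],
    "labels": [
        {"label": "AveragePrice", "type": "NUMBER"},
    ],
    "name": "avocado-schema",
}

# Write the file you will upload in step 8.
with open("dataset_schema.json", "w") as f:
    json.dump(schema, f, indent=2)
```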

Week 1 challenge

Week 2 challenge

27 Replies

geek
Participant

geek_0-1715790561667.png

Update: For anybody else who's not a regular Postman user, installing the Desktop Agent fixes this. 

After much wailing and gnashing of teeth:

geek_0-1715871963374.png

Vitaliy-R
Developer Advocate

I am not a regular user of Postman, but I am a regular user of BAS and VS Code, so I tried the Postman extension in VS Code and it worked for this exercise 🤓

moh_ali_square
Participant

Hi, 

Update: I got it right. Thanks to Mio for the help.

DAR week 3 submission

 

I updated the environment with the value of scenarioIdRegression.

Collection environment variables

I had an issue with the last two steps: I uploaded the dataset schema and the dataset, but there is an API error related to the scenarioId!

Did I miss anything?

Below is a screenshot of my submission, and another screenshot shows that the dataset was uploaded. When I tried to upload it again, I got a message that it is already there.

Post request for dataset

 

PUT request for dataset

MioYasutake
Active Contributor

My submission for week 3.

MioYasutake_0-1715804377888.png

I encountered the same error as @moh_ali_square posted. As the document says "scenarioId refer to pre-existing values", I tried to find where the scenarioId is defined. I found it by calling the following endpoint. I hope this is the right approach. 

MioYasutake_1-1715804574225.png
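@MioYasutake's approach of listing the scenarios can also be scripted. Below is a hedged sketch: the response shape ({"count": ..., "resources": [...]}) is assumed from the AI API list convention, so verify it against the body your endpoint actually returns; the sample uses the regression scenario ID given in the challenge post.

```python
import json

# Example response in the assumed AI API list shape.
sample_response = json.loads("""
{
  "count": 1,
  "resources": [
    {"id": "35cd9fbc-7290-4042-b6df-178d74c67363",
     "name": "Data Attribute Recommendation - Regression"}
  ]
}
""")

def find_scenario_id(response, keyword="regression"):
    """Return the id of the first scenario whose name contains the keyword,
    or None if no scenario matches."""
    for scenario in response.get("resources", []):
        if keyword.lower() in scenario.get("name", "").lower():
            return scenario["id"]
    return None
```

The returned ID is what goes into the scenarioIdRegression environment variable.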

 

 

For anyone trying to follow this:


10. Create a dataset schema artifact with the POST Create Dataset Schema Artifact Regression request (the required values were assigned automatically to your environment)


The scenarioId is not automatically assigned.

@MioYasutake thanks a lot for catching that! I updated it in the post 🙂

Nagarajan-K
Explorer

NagarajanK_0-1715878716059.png

Thanks @MioYasutake for the tip in your response which helped in moving forward. 

Alpesa1990
Participant

My submission for this task.

Alpesa1990_0-1715959713145.png

 

emiliocampo
Explorer

Submission for week 3

emiliocampo_0-1716059889352.png

 

M-K
Explorer

Here's my submission:

2024-05-20 22_51_04-Window.png

xavisanse
Active Participant

CameronWilson
Explorer

Jordi_C
Explorer

Week 3 done and dusted!

 

week3.png

Hira
Explorer

Hi @noravonthenen ,

Hope you are doing well. 

Below is the screen shot for Week 3 exercise. 

Hira_0-1716365935791.png

 

martaseq
Associate

Here's my submission for this week.

Postman AI API.png

Mikkelj
Explorer

My submission - not sure what I did, but looking forward to week 4 🙂

Mikkelj_0-1716462351542.png

 

thomas_mller13
Participant

Hi Nora,

thank you for your interesting articles. They are really helpful.

Anyway, to be honest, I don't understand this dataset: 

Date,AveragePrice,TotalVolume,PLU4046,PLU4225,PLU4770,TotalBags,SmallBags,LargeBags,XLargeBags,type,year,region

  • Why is the date needed? The function cannot depend linearly on date.
  • What unit is this average price? What is PLU*?
  • How is the algorithm handling the categorical variables?
  • Why is the year needed? That is actually the same question as for the date. Are these fields ignored by the model?

Of course, the purely technical aspects of applying an ML model can be challenging. But in my opinion, deciding which model is suitable and how to handle the data is much more challenging. Therefore, it would be interesting to learn a little bit more about these aspects.

Thanks and best regards,

Thomas

@thomas_mller13  All very good questions!

1. Here you can get some more info on the dataset; I would assume US dollars.

2. Quick google search on PLU: "A PLU code, or price look up code, is a 4 or 5 digit code that is unique to a particular produce item; based upon the commodity, the variety and the size group; and will typically appear on a small sticker that is applied to the individual piece of fresh produce."

3. In week 4 you will see that we actually get feature importance scores from the model, and that we do not need all the features from the dataset.

4. Good catch! That is one of the benefits of using the DAR service: you do not need to encode categorical text values yourself. DAR does that automatically for you.
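To illustrate what "encoding categorical text values" means (the work DAR does for you automatically), here is a toy one-hot encoding of the dataset's type column in plain Python. DAR's internal encoding may well differ; this only shows the general idea of turning category strings into numeric columns.

```python
# Toy rows using the dataset's categorical `type` column.
rows = [{"type": "conventional"}, {"type": "organic"}, {"type": "conventional"}]

# One binary column per distinct category value, in a fixed (sorted) order.
categories = sorted({r["type"] for r in rows})
one_hot = [[1 if r["type"] == c else 0 for c in categories] for r in rows]

# categories == ['conventional', 'organic']
# one_hot    == [[1, 0], [0, 1], [1, 0]]
```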

Vitaliy-R
Developer Advocate

I am glad I could try out and practice the Postman extension in VS Code. Too bad the same extension is not available on https://open-vsx.org/ to try it in the SAP BAS.

 

VitaliyR_1-1716564448116.png

gphadnis2000
Participant

 

my submission for week 3

gphadnis2000_0-1716643246188.png

 

Ruthiel
Product and Topic Expert

Hello @noravonthenen ,

 

Unfortunately, I'm blocked in the first step!
Don't know why yet:

Ruthiel_0-1716672934192.png

 

sabarna17
Contributor

AI-API 🔀 

sabarna17_0-1716840977140.png

 

jasperdebie
Explorer

My Submission for week 3:

jasperdebie_0-1716884234486.png

 

sainithesh21
Active Participant

Here is my submission for Week3 

Week3.png

Ruthiel
Product and Topic Expert

Here is my last result:

Ruthiel_0-1717023523656.png

 

hariharan_s
Discoverer

My submission for Week 3! - "Artifact acknowledged"

hariharan_s_0-1717054668429.png