Technology Blogs by Members
Clen
Dear community,

 

Some days ago, I started working on a PoC for a CI/CD Azure pipeline for Integration Suite content. The very first step was to learn how the Piper commands work.

For more information about project “Piper” and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD. For more about CI/CD for SAP Integration Suite, see CI/CD for SAP Integration Suite? Here you go!

And of course, the blog post that inspired me to write this one: Working with Integration Suite Piper commands.

 

Let's follow the steps below:

  1. Creating Azure DevOps Project

  2. Configure Cloud Integration API service key in Azure DevOps as security credentials

  3. Creating a new pipeline project in Azure DevOps

  4. Running pipeline project and verifying results

  5. Conclusion


 

Creating Azure DevOps Project

Let's create the project "SCP-Pipeline" and a new repo "Garage.SAPCI.PoC", and finally clone it in VS Code.



 

 

Configure Cloud Integration API service key in Azure DevOps as security credentials

Get the service key, as explained in step 2 of the blog post.

Below are two steps to perform with the service key file.

  • Convert the service key payload from JSON to a JSON string. I found this online tool useful for this task.

  • Adjust the JSON string by changing the double quote (") at the beginning and the end to an apostrophe (') and removing each escape backslash (\). The adjusted JSON string should look like the one below. With this small change, Piper sends the complete client secret in the HTTP request, not just the part before the $ character, which could otherwise cause an HTTP 401 error.


'{"oauth":{"clientid":"xx-xxxxxxx-85yy-zzz-a56b-xxxxxx99!a99999|it!a99999","clientsecret":"x1x1x1x1x-9z9z9z-9y9y9-9x9x9-9x9x9x9x9x$Mi_xxxxYYYYYYzzzzzzzzzz-xxxxxxYZ=","tokenurl":"https://xxxxxxyztrial.authentication.us10.hana.ondemand.com/oauth/token","url":"https://xxxxxxyztrial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com"}}'
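The two steps above can also be scripted instead of done by hand. A minimal Python sketch (the service-key values are placeholders, not a real key):

```python
import json

# Hypothetical service-key payload -- placeholder values, not real credentials.
service_key = {
    "oauth": {
        "clientid": "xx-client-id",
        "clientsecret": "x1x1-secret$tail-after-dollar",
        "tokenurl": "https://example.authentication.us10.hana.ondemand.com/oauth/token",
        "url": "https://example.it-cpitrial05.cfapps.us10-001.hana.ondemand.com",
    }
}

# Compact the JSON (no whitespace, no escape backslashes) and wrap it in
# apostrophes instead of double quotes, matching the format shown above.
compact = json.dumps(service_key, separators=(",", ":"))
credentials = "'" + compact + "'"
print(credentials)
```

Stripping the apostrophes and parsing the result with `json.loads` gives back the original payload, which is a quick way to confirm the string is still valid JSON.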

 

Create a variable "CREDENTIALS" and add the modified JSON string payload as its value.
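The apostrophes matter because the pipeline later hands $(CREDENTIALS) to a bash script step: in a double-quoted shell context, everything after the $ in the client secret is treated as a variable name and expanded away, which truncates the secret. A small demo of the effect (assumes bash is available; the secret value is a placeholder):

```python
import subprocess

secret = "part-before$PART_AFTER"

# Double quotes: bash expands $PART_AFTER (unset -> empty string),
# truncating the secret -- the cause of the HTTP 401 mentioned above.
double = subprocess.run(
    ["bash", "-c", f'echo "{secret}"'],
    capture_output=True, text=True,
).stdout.strip()

# Single quotes (apostrophes): the secret passes through intact.
single = subprocess.run(
    ["bash", "-c", f"echo '{secret}'"],
    capture_output=True, text=True,
).stdout.strip()

print(double)  # part-before
print(single)  # part-before$PART_AFTER
```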


 

Creating a new pipeline project in Azure DevOps

The important part is to download Piper from GitHub; you can then use the Piper executable in your jobs. In the example below, the first job downloads Piper and puts the executable into a cache. The next job retrieves Piper from the cache and performs an action on Cloud Integration content.

 

a.- Get piper and put the executable into a cache.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- main

jobs:
- job: downloadPiper
  pool:
    vmImage: ubuntu-latest
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.199.0/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'

 

b.- Piper command to deploy an iFlow
- job: deployiFlow
  dependsOn: downloadPiper
  variables:
  - group: development
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: deploy iflow
  - script: |
      bin/piper integrationArtifactDeploy --verbose --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"

Running pipeline project and verifying results

Below are the results of the pipeline execution.


The deployment status is also visible in SAP CI Web UI -> Manage Integration Content.


You can combine these Piper commands to build more complex scenarios.


 

Conclusion

Finally, with the above instructions, we can run Piper commands from a Microsoft Azure DevOps pipeline. Based on this, we can create more complex scenarios.

I hope you find this blog post useful. You are very welcome to share feedback or thoughts in the comment section. And thanks mayurbelur.mohan for supporting me on this journey!

Related to this topic, you can also find Q&A and post questions under the tags DevOps, SAP Integration Suite, SAP BTP, and Cloud Foundry environment.
2 Comments
Muniyappan
Hi Alex, Thanks for the blog.

 

We are looking to download the iFlow and commit it to the repository. How do you do that? I mean, where would you provide the branch name and commit message?

I have tried this configuration, but I cannot see any files getting committed to the folder.


I tried repository/root dir/sub folder and rootdir/subfolder; it did not work.

 


 

If we are downloading, what is the file extension? Is it a zip or a folder?

Hi Alex, it's a very nice blog.

Is it possible to deploy integration flows at the package level?

Example: if one integration package contains 20 iFlow artifacts, it would be difficult to deploy them one by one. So is there any way to achieve deployment at the integration package level, so that one job does the whole deployment in one go?

Thanks & Regards

Bhanu 

 
