
Introduction


As indicated in our previous blog post, the Amazon Web Services adapter was added to the family of adapters of SAP Cloud Integration in January 2021.

Amazon Web Services (AWS) is Amazon’s cloud web hosting platform that offers flexible, reliable, scalable, and cost-effective solutions. AWS provides a variety of basic abstract technical infrastructure and distributed computing building blocks and tools.

Amongst the many services provided by AWS, the Amazon Web Services adapter currently supports the following four services:

  • S3 (Amazon Simple Storage Service)


Amazon Simple Storage Service is known as the storage for the Internet. It is designed to make web-scale computing easier for developers. Amazon S3 has a simple web services interface that you can use to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its global network of websites. The service aims to maximize the benefits of scale and to pass those benefits on to developers.
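To make the storage model concrete, here is a minimal sketch of storing and retrieving an object with the AWS SDK for Python (boto3). The bucket name, key, and region are placeholders, and the adapter performs the equivalent calls for you under the hood; the sketch only illustrates the service itself.

```python
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Store any amount of data under a key in a bucket...
s3.put_object(
    Bucket="my-example-bucket",
    Key="orders/Example.json",
    Body=b'{"orderId": 42}',
)

# ...and retrieve it again from anywhere on the web.
response = s3.get_object(Bucket="my-example-bucket", Key="orders/Example.json")
print(response["Body"].read())
```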

  • SQS (Amazon Simple Queue Service)


Amazon Simple Queue Service (Amazon SQS) offers a secure, durable, and available hosted queue that lets you integrate and decouple distributed software systems and components.
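Conceptually, a producer and a consumer never talk to each other directly; they only share the queue. A minimal boto3 sketch of that decoupling could look as follows (the queue URL and region are placeholders):

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-central-1")
queue_url = "https://sqs.eu-central-1.amazonaws.com/123456789012/demo"

# A producer enqueues a message...
sqs.send_message(QueueUrl=queue_url, MessageBody="hello from the producer")

# ...and a decoupled consumer polls for it later.
result = sqs.receive_message(
    QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10
)
for msg in result.get("Messages", []):
    print(msg["Body"])
```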

  • SNS (Amazon Simple Notification Service)


Amazon Simple Notification Service (Amazon SNS) is a web service that manages the delivery or sending of messages to subscribed clients. In Amazon SNS, there are two types of clients, namely publishers (or producers) and subscribers (or consumers). Publishers communicate asynchronously with subscribers by producing and sending a message to a topic. Subscribers can consume or receive the message via one of the supported protocols (Amazon SQS, HTTP/S, email, SMS, Lambda).
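For illustration, here is a hedged boto3 sketch of the publisher side; the topic ARN and region are placeholders, and the fan-out to subscribers is handled entirely by SNS:

```python
import boto3

sns = boto3.client("sns", region_name="eu-central-1")
topic_arn = "arn:aws:sns:eu-central-1:123456789012:demo-topic"

# The publisher sends one message to the topic; SNS fans it out
# asynchronously to every subscriber (SQS, HTTP/S, email, SMS, Lambda).
sns.publish(
    TopicArn=topic_arn,
    Subject="Order created",
    Message="Order 42 was created",
)
```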

  • SWF (Amazon Simple Workflow Service)


The Amazon Simple Workflow Service (Amazon SWF) enables the building of applications that can coordinate work across distributed components. In Amazon SWF, a task represents a logical unit of work that is performed by a component of your application. Coordinating tasks across the application involves managing inter-task dependencies, scheduling, and concurrency following the logical flow of the application.
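As a rough illustration, the following boto3 sketch starts a workflow execution; it assumes a domain, workflow type, and task list have already been registered in SWF, and all names are placeholders:

```python
import boto3

swf = boto3.client("swf", region_name="eu-central-1")

# Kick off a workflow execution; SWF then coordinates the scheduling of
# the individual tasks across your distributed workers.
run = swf.start_workflow_execution(
    domain="demo-domain",
    workflowId="order-42",
    workflowType={"name": "ProcessOrder", "version": "1.0"},
    taskList={"name": "order-task-list"},
    input='{"orderId": 42}',
)
print(run["runId"])
```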

High-level features:


The Amazon Web Services adapter provides you with the following features:

Sender Adapter:

  • AWS S3 to read files from the AWS S3 service:

    • Support for patterns in filename.

    • Possibility to archive processed files to the same bucket or to a different bucket as part of the post-processing step.

    • Possibility to sort files based on Filename, Filesize, and Timestamp.

    • Capability to retrieve additional metadata maintained for the file on the S3 bucket. Multiple attributes can be retrieved at the same time.

    • Support for Server-Side Decryption.

    • Most properties support dynamic properties and headers.

    • Functionality to generate a pre-signed URL (see the sketch below).
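For reference, two of the S3 capabilities called out above, metadata retrieval and pre-signed URL generation, map to the following AWS SDK calls in a minimal boto3 sketch (bucket, key, and region are placeholders):

```python
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Retrieve the custom metadata maintained for a file on the bucket;
# all user-defined attributes come back at the same time.
head = s3.head_object(Bucket="my-example-bucket", Key="orders/Example.json")
print(head["Metadata"])

# Generate a pre-signed URL that grants time-limited read access.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-example-bucket", "Key": "orders/Example.json"},
    ExpiresIn=3600,  # valid for one hour
)
print(url)
```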





  • AWS SQS to read a message from an AWS SQS queue:

    • Support for Standard and FIFO queues.

    • Possibility to keep or delete the message from the queue after reading, as part of the post-processing step.

    • Capability to retrieve additional metadata maintained for a message on the SQS queue. Multiple attributes can be retrieved at the same time (see the sketch below).
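In AWS SDK terms, the read and post-processing options above correspond roughly to the following boto3 sketch (queue URL and region are placeholders):

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-central-1")
queue_url = "https://sqs.eu-central-1.amazonaws.com/123456789012/demo"

# Read a message together with its metadata and custom attributes.
result = sqs.receive_message(
    QueueUrl=queue_url,
    AttributeNames=["All"],          # system metadata (timestamps, counts, ...)
    MessageAttributeNames=["All"],   # custom message attributes
    MaxNumberOfMessages=1,
)
for msg in result.get("Messages", []):
    print(msg["Body"])  # the actual processing step would go here
    # Post-processing: delete the message, or skip this call to keep it
    # in the queue (it becomes visible again after the visibility timeout).
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```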




Receiver Adapter:

  • AWS S3 to push files into the AWS S3 service:

    • Option to append timestamp and messageId to the file name during the creation process.

    • Option to select a storage class.

    • Different handling options for existing files in the S3 bucket.

    • Option to upload attachments to the S3 bucket.

    • Server-Side Encryption.

    • Possibility to add multiple custom metadata to the file while storing it.

    • Capability to read a particular file from the S3 bucket using the Read operation of the receiver adapter (see the sketch below).
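For reference, here is what the write and read options above correspond to at the AWS API level, as a minimal boto3 sketch (bucket, key, metadata values, and region are placeholders):

```python
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Push a file with a storage class, server-side encryption, and
# multiple items of custom metadata.
s3.put_object(
    Bucket="my-example-bucket",
    Key="outbound/Example.json",
    Body=b'{"orderId": 42}',
    StorageClass="STANDARD_IA",
    ServerSideEncryption="AES256",
    Metadata={"source": "cpi", "scenario": "demo"},
)

# The Read operation of the receiver adapter corresponds to fetching a
# particular file back out of the bucket.
obj = s3.get_object(Bucket="my-example-bucket", Key="outbound/Example.json")
print(obj["Metadata"])
```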





  • AWS SQS to send a message to an AWS SQS queue:

    • Support for Standard and FIFO queues.

    • Option to add multiple message attributes while writing a message to a queue.

    • For a Standard queue, the option to provide delay seconds to postpone the delivery of the message to consumers.

    • For a FIFO queue, the option to provide a message deduplication ID and a message group ID (see the sketch below).
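The following boto3 sketch illustrates the difference between the two queue types described above (queue URLs, account number, and attribute values are placeholders):

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-central-1")

# Standard queue: delay delivery of the message by 30 seconds and
# attach a custom message attribute.
sqs.send_message(
    QueueUrl="https://sqs.eu-central-1.amazonaws.com/123456789012/demo",
    MessageBody='{"orderId": 42}',
    DelaySeconds=30,
    MessageAttributes={
        "source": {"DataType": "String", "StringValue": "cpi"},
    },
)

# FIFO queue: group and deduplication IDs control ordering and
# exactly-once delivery within the deduplication window.
sqs.send_message(
    QueueUrl="https://sqs.eu-central-1.amazonaws.com/123456789012/demo.fifo",
    MessageBody='{"orderId": 42}',
    MessageGroupId="orders",
    MessageDeduplicationId="order-42",
)
```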





  • AWS SNS to push real-time notification messages to interested subscribers over multiple delivery protocols:

    • Support for Standard topics.

    • Option to provide an identical payload for all consumers.

    • Option to provide a custom payload for different consumers.

    • Option to format the response in XML and JSON formats.

    • Option to provide multiple message attributes (see the sketch below).
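As a hedged illustration of the identical-versus-custom payload options and of message attributes, the equivalent SNS API call could look like this in boto3 (topic ARN and values are placeholders):

```python
import json

import boto3

sns = boto3.client("sns", region_name="eu-central-1")

# MessageStructure="json" lets each delivery protocol receive its own
# payload; the "default" entry is used for protocols not listed explicitly.
sns.publish(
    TopicArn="arn:aws:sns:eu-central-1:123456789012:demo-topic",
    MessageStructure="json",
    Message=json.dumps({
        "default": "Order 42 was created",
        "email": "Dear subscriber, order 42 was created.",
        "sqs": '{"orderId": 42}',
    }),
    MessageAttributes={
        "eventType": {"DataType": "String", "StringValue": "ORDER_CREATED"},
    },
)
```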





  • AWS SWF to provide control over implementing tasks and coordinating them:

    • Support for multiple operations that can be selected from a pre-defined list.

    • Option to determine the request and response format. Currently, JSON is supported.




We will keep expanding the list of supported services and features in the coming releases, and we encourage you to submit any new feature suggestions. For that, you can use the Influence session on SAP Integration Suite at:

https://influence.sap.com/sap/ino/#/campaign/2282

Let us next explore an example scenario that uses the Amazon Web Services adapter.

Example Scenario:

Let us use a simple S3-based integration scenario that illustrates the use of the Amazon Web Services adapter. Assume that you have an AWS application that writes files to an S3 bucket. In AWS, an S3 bucket is a container that can store objects; it also helps in organizing the Amazon S3 namespace and managing access control. To keep things simple: in a bucket, you can store objects, which include folders and files.

As part of your integration scenario, you want to read a file from your S3 bucket, archive it, and push it to an SQS queue. AWS SQS provides a message queuing service that enables you to decouple your applications. The figure below represents the integration scenario to be built.


To achieve this, follow the steps below:

Step 1: For the sender side, create a folder and add files to the S3 bucket


The first step is to ensure that the bucket, the required folder, and the files exist in S3. For that, proceed as follows:

  • Log in to your AWS account and select the S3 service.





  • Create a Bucket.



Name your bucket and click “Create bucket”.

  • Create a folder and upload a file to simulate the AWS application that places a file. In our case, we placed a file named “Example.json”. (If you prefer to script this step, see the sketch below.)
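If you prefer to script this setup instead of using the console, a minimal boto3 sketch could look as follows; the bucket name and region are placeholders, and the local file Example.json is assumed to exist:

```python
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Create the bucket (the LocationConstraint is required outside us-east-1).
s3.create_bucket(
    Bucket="my-example-bucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# "Folders" in S3 are simply key prefixes, so uploading a file under a
# prefix creates the folder implicitly.
s3.upload_file("Example.json", "my-example-bucket", "inbound/Example.json")
```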



We have set up the sender side in AWS. To connect SAP Cloud Integration to AWS S3, you will need to capture the following details:

  • Region of the bucket.

  • Bucket Name.

  • Access Key.

  • Secret Key.


Please note these details for later.

Step 2: For the receiver side, create an SQS queue


The next step will be to ensure that an SQS Queue is available in AWS. For that, proceed as follows:

  • From your AWS account, select the Simple Queue Service.





  • In the next screen, click “Create queue” and specify its name. As the figure below shows, I have created a queue named “demo”. Note that this is a standard queue. (If you prefer to script this step, see the sketch below.)
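Again, if you prefer scripting over the console, a minimal boto3 sketch for this step could be (the region is a placeholder):

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-central-1")

# Create a standard queue named "demo" and note its URL for later use.
queue = sqs.create_queue(QueueName="demo")
print(queue["QueueUrl"])  # contains the region, account number, and queue name
```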



We have set up the receiver side in AWS. To connect SAP Cloud Integration to AWS SQS, you will need to capture the following details:

  • Region of the queue.

  • Account Number.

  • Queue Name.

  • Access Key.

  • Secret Key.


Now we need to create the Integration Flow in SAP Cloud Integration.

Step 3: Create your Integration Flow in SAP Cloud Integration


Create an Integration Flow with a sender and a receiver system representing AWS S3 and AWS SQS, respectively. See below.


Note that to keep things simple this is a pass-through scenario that does not include any mapping or transformation.

On the sender side, you will need to select the AmazonWebServices adapter and S3 as the protocol.


The figure below shows its configuration. You will need to re-use the information that you saved in Step 1. Make sure to select the correct region where the bucket was created.


Note that for both the Access Key and the Secret Key, you will need to create a Secure Parameter in SAP Cloud Integration via the Security Material. In our case, we created a Secure Parameter named “AWS_Tester_AccessKey” for the Access Key and one named “AWS_Tester_SecretKey” for the Secret Key.


We also need to configure the Processing tab. Specify the name of the directory and the pattern to be used for the file name, as shown below.


Now we can configure the receiver adapter to use the SQS service and write the file to the queue.


The figure below shows its configuration. You will need to re-use the information that you saved in Step 2. Make sure to select the correct region where the queue was created. Furthermore, you should specify the AWS account number and the queue name as shown below.


Finally, let us deploy our Integration Flow. Below is an example of a run of the Integration Flow, as seen in the Monitor.


Lastly, you can check that a message has arrived in the queue, as seen below.


Et voilà, congratulations. Be on the lookout for the next blog about the Amazon Web Services adapter.
9 Comments
deanmg1
Hi john_bilay ,

Thanks, this is very informative, and it's great that CPI is bringing these adaptors 'in house'.

We have set up an SQS adaptor but very quickly hit a processing limit. As per the available settings, there is no way to achieve parallel processing, unless I'm mistaken?

We currently retrieve 10 entries at 1000 ms intervals; processing time using an individual adaptor is sitting at 67 minutes for 10,000 records, which is really slow.

 

We are working on a requirement where we have an NFR to adhere to 300 TPS. We are light years away from that.

 

Any idea if there is a way to optimise this drastically?

 

PS: we have also tried multiple iflows, each with their own adaptor connecting to the same queue. 4 adaptors bring processing down to 4m45s for 10,000 records. Still a country mile from 300 TPS.

 

Thanks!

Dean

 
Hi Dean

Let us look together at your scenario; I will reach out. I have been able to process messages at a far greater speed than that. I would suggest looking at the following aspects:

- The nature of your queue: is it an SQS FIFO queue, a normal queue, or an async queue?

- The number of available nodes

- Is the iflow of a sync nature? For example: are you returning an ack to the queue?

- Are the messages already available in the queue, or are they being written (by a third-party system) as you are reading?

Thanks
John
andy_dingfelder3

Hi john_bilay ,

thanks for the blog. Very interesting. I'm wondering whether the AWS adapter is available in the trial environment of SAP Integration Suite as well?

Till now I couldn't find the adapter there. 🙁

Thanks,

Andy

andy_dingfelder3

Hi,

 

in the meantime I've found a very detailed how-to from Amazon which describes the usage of the AWS adapter.

https://aws.amazon.com/de/blogs/awsforsap/integrating-sap-systems-with-aws-services-using-sap-busine...

Andy

Tri
Hi john_bilay,

Thanks for a detailed blog.

One question: how do we troubleshoot the adapter?

My case is reading files from a sub-directory of an S3 bucket. For example, S3bucket.s3/Folder/SubFolder.




However, I get this error when deploying:
[CONTENT][CONTENT_DEPLOY][RuntimeError] : {"message":"EXCEPTION","parameters":["com.rojoconsultancy.amazonwebservices.exceptions.AwsException: Fail to list directory (Folder/SubFolder/): "]}

What went wrong here?

Thanks.

Minh
Hi J.Bilay,

I need to push files onto an AWS server using an ABAP-only program.

Could you please advise?

Thanks.
former_member736850
Hi john_bilay,

 

Thanks for the detailed blog. I have one question.

Is there a way to have a scheduler for the S3 bucket? E.g., if I have to poll only on Sundays, how can we achieve that?

 

Regards,

Eniyan.
M4rtin
Hi,

 

did anybody test the adapter with the recently announced Apache update?

 

Kind regards,

Martin
MChrist

Hi Martin,  

do you mean the upgrade to Camel runtime 3.14? Yes, I did some first successful tests (reading data from S3 buckets).

You need to download a new adapter version for Camel 3.x from the marketplace and deploy it on the test tenant from SAP.  

BR, Marcel
