Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
miroll
Product and Topic Expert

Scenario Overview


Single Sign-On is one of the most convenient features for users. However, this convenience usually comes with considerable integration effort and implementation challenges, especially when different Identity Providers (IdPs) and trust setups are involved.

This blog post aims to guide you through one of these scenarios: enabling principal propagation from Azure AD as our IdP to an on-premise SAP system through an API on SAP API Management.

For this we'll build on the great work done by mraepple, which he presented in his blog post on principal propagation in a multi-cloud solution between Microsoft Azure and SAP BTP. Our scenario is more specialised in terms of the xsuaa connection on the SAP BTP side, and it adds an additional component in between, SAP API Management, which will play an important role here.

The challenge we faced with the integration was that clients would call our endpoints, exposed by API Proxies in SAP API Management, with an OIDC token issued by Azure AD. In this form, however, the token can't be processed by the xsuaa and therefore cannot be used for accessing the on-premise connection. For the purpose of this blog we assume that the principal propagation setup from the BTP CF environment to the on-premise system is already configured and tested. The main task is to integrate this into the flow of our API Proxy and exchange the Azure AD token for a token trusted by our xsuaa.

Solution Overview


The following picture gives an overview of the intended setup of the solution and the request flows in between:


Solution Overview and sequence


We use the approach explained in the blog mentioned above to exchange the incoming OIDC token for a SAML assertion. That assertion can then be used to acquire a valid JWT from our xsuaa instance. Thanks to the principal propagation setup, we can use that token to forward the initial request to the backend system through the SAP Cloud Connector.
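As a rough sketch, the two token requests of this exchange look like the following. The helper functions and parameter values here are illustrative only and simply mirror the form parameters of the policies shown later in this post; they are not a ready-made implementation:

```javascript
// Step 1: body for the Azure AD On-Behalf-Of request (incoming JWT -> SAML assertion)
function buildOboBody(clientId, clientSecret, resourceId, incomingJwt) {
  return [
    "grant_type=" + encodeURIComponent("urn:ietf:params:oauth:grant-type:jwt-bearer"),
    "client_id=" + encodeURIComponent(clientId),
    "client_secret=" + encodeURIComponent(clientSecret),
    "resource=" + encodeURIComponent(resourceId),
    "requested_token_use=on_behalf_of",
    "requested_token_type=" + encodeURIComponent("urn:ietf:params:oauth:token-type:saml2"),
    "assertion=" + encodeURIComponent(incomingJwt)
  ].join("&");
}

// Step 2: body for the xsuaa token request (SAML assertion -> xsuaa JWT)
function buildSamlBearerBody(samlAssertion) {
  return [
    "grant_type=" + encodeURIComponent("urn:ietf:params:oauth:grant-type:saml2-bearer"),
    "assertion=" + encodeURIComponent(samlAssertion)
  ].join("&");
}
```

Both requests are plain `application/x-www-form-urlencoded` POSTs; in the API Proxy they are issued by the ServiceCallout policies described below.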

For this connection we use the On-Premise Connectivity Plan of the API Portal service as described in the help page of SAP API Management. After setting it up and creating a service key to get the credentials we're ready to implement the API Proxy and its flow in the API Portal.

Prerequisites


To enable the flow outlined above we need access to both the Azure AD instance and our SAP API Management API Portal instance. Of course these access permissions might be distributed across different people. As per Martin's blog linked above, this is (to my knowledge) currently only possible with Azure AD and its token exchange feature.

To create the needed on-premise-connectivity instance of the API Management and its service key you'll also need the according permission and entitlement in your subaccount.

The most important step in the preparation is to set up the trust between the Azure AD instance and our xsuaa instance as described in Martin's blog!

The complete list:

  • Azure AD instance and access to the service key creation

  • Trust setup between the Azure AD instance and our xsuaa instance

  • SAP API Management API Portal instance and Developer permissions on it

  • on-premise-connectivity instance service key or permission and entitlement to create it

  • (optional) an on premise system to test the setup with 😉


SAP API Management Artefacts


Since we don't want to add the same steps for every API Proxy that wants to access On-Premise resources with Principal Propagation, we'll implement it as a shared API Proxy similar to the one mentioned in this blog. This way it can be called from the other API Proxies in order to retrieve the valid token.

Preparation


For the needed calls to Azure AD we'll first create a relatively simple API Provider with the configuration to connect to the Azure AD instance:


Azure AD API Provider


Second, we'll create an encrypted Key Value Map for the credentials we need. It contains entries of three categories:

  • The shared secret to use the Shared Flow (see above)

  • Credentials from the on-premise-connectivity service instance key

  • Credentials from Azure AD



Key Value Map with needed credentials


Sorry for the poor ordering of the key value map entries 🙂

API Proxy


Now for the API Proxy itself. The API Proxy we're going to create is a so-called loopback API which does not actually forward the request to a target endpoint - all logic is done in Service Callouts. For this we create a blank API Proxy with just one resource: /token. The actual work is then done in the policies of the API Proxy.

Let's look at the overview first - there are two flows in use: the PreFlow of the ProxyEndpoint, which checks whether the caller has our shared secret, and the token flow of the ProxyEndpoint, where we do the actual token conversion. The former can be taken from the blog linked above and is not part of this post.
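For completeness, the shared-secret comparison at the heart of such a PreFlow check could be sketched like this. This is a minimal illustration with made-up names; in a real JS policy the two values would come from the request header and the key value map via `context.getVariable`:

```javascript
// Hypothetical helper: compares the secret sent by the caller against
// the expected secret from the key value map.
function isAuthorized(providedSecret, expectedSecret) {
  if (!providedSecret || !expectedSecret) return false;
  if (providedSecret.length !== expectedSecret.length) return false;
  // compare every character instead of exiting early,
  // to avoid leaking the mismatch position via timing
  var diff = 0;
  for (var i = 0; i < providedSecret.length; i++) {
    diff |= providedSecret.charCodeAt(i) ^ expectedSecret.charCodeAt(i);
  }
  return diff === 0;
}
```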

In total we use 8 policies - 7 on the incoming request and 1 on the outgoing response:


Policies in token flow


The first JS policy extracts the incoming JWT token issued by Azure AD and stores it in a variable:
// extract auth header to retrieve JWT Token
var reqAuth = context.getVariable("request.header.Authorization");

if (!reqAuth) throw 'Missing Auth Header';

var authParts = reqAuth.split(" ");

// note: compare the scheme itself, not its negation
if (authParts[0].toLowerCase() !== "bearer" || authParts.length !== 2) throw 'Only OAuth Bearer Tokens Accepted';

if (!authParts[1] || !/^[a-zA-Z0-9\-_]+?\.[a-zA-Z0-9\-_]+?\.([a-zA-Z0-9\-_]+)?$/.test(authParts[1])) throw 'Only valid OAuth Bearer Tokens Accepted';

context.setVariable("private.jwttoken", authParts[1]);

Next we extract all needed values from our previously created encrypted key value map. For convenience, each local variable in the flow is named after its key in the key value map ("private.<key value map key>").
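For reference, the retrieval could look like a KeyValueMapOperations policy along these lines. The map identifier and the key names below are assumptions based on the naming convention just described, not the exact values used in our setup:

```xml
<!-- hypothetical sketch: map identifier and key names are assumptions -->
<KeyValueMapOperations mapIdentifier="PrincipalPropagationCredentials" async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
	<Get assignTo="private.aad_client_id">
		<Key><Parameter>aad_client_id</Parameter></Key>
	</Get>
	<Get assignTo="private.aad_client_secret">
		<Key><Parameter>aad_client_secret</Parameter></Key>
	</Get>
	<Get assignTo="private.uaa_client_id">
		<Key><Parameter>uaa_client_id</Parameter></Key>
	</Get>
	<Get assignTo="private.uaa_client_secret">
		<Key><Parameter>uaa_client_secret</Parameter></Key>
	</Get>
	<Scope>environment</Scope>
</KeyValueMapOperations>
```

Since the map is encrypted, the values are only resolved at runtime and are never visible in the trace when assigned to "private." variables.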

Using the credentials retrieved from the key value map and the incoming JWT, we now query the token endpoint of the Azure AD instance to exchange our JWT for a SAML assertion. This corresponds to the "Request SAML assertion from AAD with ObO flow" request in the Postman collection of Martin's blog. The ServiceCallout policy makes use of the credentials and details of the Azure AD instance we just retrieved from the encrypted key value map. We're using the API Provider we previously created in the preparation as the HTTPTargetConnection of our request.
<ServiceCallout async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
<Request>
<Set>
<Headers>
<Header name="Authorization">Bearer {private.jwttoken}</Header>
<Header name="Content-Type">application/x-www-form-urlencoded</Header>
<Header name="Accept">*/*</Header>
<Header name="Accept-Encoding">gzip, deflate, br</Header>
</Headers>
<FormParams>
<FormParam name="grant_type">urn:ietf:params:oauth:grant-type:jwt-bearer</FormParam>
<FormParam name="client_id">{private.aad_client_id}</FormParam>
<FormParam name="client_secret">{private.aad_client_secret}</FormParam>
<FormParam name="resource">{private.aad_resource_id}</FormParam>
<FormParam name="requested_token_use">on_behalf_of</FormParam>
<FormParam name="requested_token_type">urn:ietf:params:oauth:token-type:saml2</FormParam>
<FormParam name="assertion">{private.jwttoken}</FormParam>
</FormParams>
<Verb>POST</Verb>
</Set>
</Request>
<!-- the variable into which the response from the external service should be stored -->
<Response>private.azuread.response</Response>
<!-- The time in milliseconds that the Service Callout policy will wait for a response from the target before exiting. Default value is 120000 ms -->
<Timeout>30000</Timeout>
<HTTPTargetConnection>
<APIProvider>INT0201_Azure_AD</APIProvider>
<Path>/{private.aad_tenant_id}/oauth2/token</Path>
</HTTPTargetConnection>
</ServiceCallout>

To retrieve the SAML assertion from the response we use the ExtractVariables policy.
<ExtractVariables async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
<JSONPayload>
<Variable name="private.azuread.access_token" type="string">
<JSONPath>$.access_token</JSONPath>
</Variable>
</JSONPayload>
<Source>private.azuread.response.content</Source>
</ExtractVariables>

In addition to retrieving the SAML assertion, we also need to create credentials for our ServiceCallout to the xsuaa instance, which will be the last of our service calls. For this we use the third set of variables retrieved from the key value map: the credentials of our on-premise-connectivity service instance. We encode them with the BasicAuthentication policy using the client id and secret.
<BasicAuthentication async='true' continueOnError='false' enabled='true' xmlns='http://www.sap.com/apimgmt'>
<Operation>Encode</Operation>
<IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
<User ref='private.uaa_client_id'></User>
<Password ref='private.uaa_client_secret'></Password>
<AssignTo createNew="true">sapapim.auth</AssignTo>
</BasicAuthentication>

Now we're ready for the last call, which will give us the token to be used when connecting to an on-premise system.
<ServiceCallout async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
<Request>
<Set>
<Headers>
<Header name="Authorization">{sapapim.auth}</Header>
<Header name="Content-Type">application/x-www-form-urlencoded</Header>
<Header name="Accept">*/*</Header>
<Header name="Accept-Encoding">gzip, deflate, br</Header>
</Headers>
<FormParams>
<FormParam name="grant_type">urn:ietf:params:oauth:grant-type:saml2-bearer</FormParam>
<FormParam name="assertion">{private.azuread.access_token}</FormParam>
</FormParams>
<Verb>POST</Verb>
</Set>
</Request>
<Response>sapapim.oauthresponse.token</Response>
<Timeout>30000</Timeout>
<HTTPTargetConnection>
<URL>https://{private.uaa_token_domain}</URL>
<SSLInfo>
<Enabled>true</Enabled>
<ClientAuthEnabled>false</ClientAuthEnabled>
<KeyStore/>
<KeyAlias/>
<TrustStore/>
</SSLInfo>
</HTTPTargetConnection>
</ServiceCallout>

The SAML assertion of the previous step is passed to the xsuaa instance which (because of the previously set up trust relationship) returns a valid JWT token for this xsuaa instance.

Lastly we retrieve the token from the response and assign the corresponding fields to our response so the caller can use it. The latter is done in an AssignMessage policy on the response flow of the token flow.
<ExtractVariables async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
<JSONPayload>
<Variable name="private.btp.access_token" type="string">
<JSONPath>$.access_token</JSONPath>
</Variable>
</JSONPayload>
<Source>sapapim.oauthresponse.token.content</Source>
</ExtractVariables>

<AssignMessage async="false" continueOnError="false" enabled="true" xmlns='http://www.sap.com/apimgmt'>
<Set>
<Payload contentType="application/json" variablePrefix="@" variableSuffix="#">{"auth_token":"@private.btp.access_token#"}</Payload>
</Set>
<IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
<AssignTo createNew="false" type="response">response</AssignTo>
</AssignMessage>

Usage


So how do you use the newly created Shared Principal Propagation API Proxy? You can simply integrate it as a ServiceCallout into any other API Proxy on your instance, e.g. like this:
<ServiceCallout async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
<Request>
<Set>
<Headers>
<!-- the incoming Authorization header already contains the "Bearer " prefix -->
<Header name="Authorization">{request.header.Authorization}</Header>
<Header name="secret">{private.shared.secret}</Header>
</Headers>
<Verb>POST</Verb>
</Set>
</Request>
<Response>sapapim.accessToken</Response>
<Timeout>15000</Timeout>
<LocalTargetConnection>
<Path>/v1/shared/principalpropagation/token</Path>
</LocalTargetConnection>
</ServiceCallout>

Of course you can also use the Copy functionality of the AssignMessage policy to create the request for the Service Callout. /v1/shared/principalpropagation in this case is the base path of the Shared Principal Propagation API Proxy I created, and you need to append the resource /token as well.
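In the calling API Proxy you then still need to pull the token out of the callout response before forwarding the request to the target. A minimal sketch, assuming the callout response variable `sapapim.accessToken` from above (the variable name `sapapim.btpToken` is made up for illustration):

```xml
<ExtractVariables async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
	<JSONPayload>
		<Variable name="sapapim.btpToken" type="string">
			<JSONPath>$.auth_token</JSONPath>
		</Variable>
	</JSONPayload>
	<Source>sapapim.accessToken.content</Source>
</ExtractVariables>
```

The extracted token can then be set as `Bearer {sapapim.btpToken}` in the Authorization header of the request to the on-premise target.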

Summary


Using the great blog by Martin as our blueprint, we applied the principle to our specialised use case in SAP API Management. Since SAP API Management provides a dedicated service plan for on-premise access, the on-premise-connectivity plan, we don't need to dynamically set our target xsuaa instance but can always use the one connected to that instance.

Outlook


Of course there are things that could be improved about this setup, such as error handling for malformed or illegitimate requests. If you spot anything, feel free to point it out in the comments!

Thanks


At this point I'd also like to thank tobias.pahlings for his help with all of the security knowledge needed for this integration.
15 Comments
maxi1555
Contributor

Hi Niklas,

I do not see the reason for introducing step "4" in your scenario. Why do you want/need to get the SAML from Azure AD? You are creating a big dependency on that IdP and impacting the performance of the API.

From my point of view you just need to generate the SAML directly in SAP APIM. With that you are adding a layer of trust between SAP APIM & XSUAA that can always be reused and is under your control, and of course you are improving the performance of your APIs a lot (you save one extra call, and you do not need to maintain credentials in your SAP APIM).

 

Max.

miroll
Product and Topic Expert

Hi Maximiliano,

We actually considered what you describe, but there is a huge downside to that approach: if you generate the SAML in SAP APIM you would need the corresponding certificate to sign the assertions. Having that certificate in the tenant would also mean that you could basically create a SAML assertion for any user you like - not just your own.

With the approach we chose we delegate that to the Azure AD since it is the central IdP in use in our scenario. It can validate the JWT token and will only give us a SAML for the user with a valid JWT token.

For a more detailed explanation tobias.pahlings can help as he is the security expert in our setup. He's also the one who advised against generating the SAML in our SAP APIM instance for the reasons mentioned.

BR,
Niklas

MartinRaepple
Active Participant
Hi Max,

I agree with Niklas. Generating tokens (such as SAML assertions) should remain the responsibility of the Identity Provider. And don't underestimate the effort of generating tokens. Besides finding a secure place for the credentials for all cryptographic operations (signature, encryption), SAML assertions in particular bear a certain complexity due to their flexible content model and federation concepts.

From an architectural perspective, creating a dependency to a central identity provider is actually a prerequisite to achieve single sign-on and consistent user management in a distributed system scenario such as the one Niklas is describing here or I've been working on.

Best regards

Martin
maxi1555
Contributor
Hi Niklas,

That "downside" that you mentioned is the main purpose of the "SAML Policies" in SAP API Management.

I would say that step "4" is there because someone wanted it there, but it is not really necessary. I will not open a debate here 🙂

 

Max.
maxi1555
Contributor
Hi Martin,

In the real world (not on paper) this is too expensive from the API performance point of view; following this concept you must make an extra HTTP call for every API call where you need principal propagation.

From my point of view the level of trust between "SAP APIM & XSUAA" must be the same as between "SAP Cloud Connector & backend servers"; it's exactly the same scenario.

 

Max.
MartinRaepple
Active Participant
Hi Max,

looking more closely at the message flow in the diagram of Niklas' scenario, I am actually wondering if the following sequence would make more sense (and also resolve possible performance issues): Let the Application Client do steps 1, 3, 4 and 5. After that, the Client has a valid JWT token from XSUAA which can be used to authenticate and authorize step 2 when calling the API Proxy. Assuming that the API Proxy is a stateless component, caching of the (SAML) token is not an option to improve performance. The Application Client might be better suited here and could use the refresh token from XSUAA to request a new access token once it expires. miroll tobias.pahlings What do you think?

Best regards

Martin
TobiasPahlings
Participant
Hi Max,

although I agree with your statement that it is expensive performance-wise, from a security and risk perspective APIM and Cloud Connector are two very different systems with completely different purposes.

Cloud Connector (as well as an IdP like Azure AD) has built-in functionality to create tokens like the JWT or the X.509 certificate. These applications were designed with security in mind, while the average API flow in APIM typically does not have a threat model or regular security tests.

Therefore, the goal is to keep the attack surface as low as possible by leveraging security functionality that is available out of the box.

Keep in mind that you can do many things wrong when creating SAML tokens and validating JWTs by hand. As a general rule, it is never advisable to write your own security functions unless you are an expert.

The risk of storing a universal token that grants you full access to all your connected backend systems in an environment where many people have access is typically not worth the benefits. But of course there are other methods to limit the scope or build a similar scenario without the need to reach out to Azure. These measures would then change the risk perspective, but this depends on many factors.

BR
Tobias
maxi1555
Contributor

Hi Tobias,

Not sure if you have experience with SAP APIM, but in SAP API Management:

  • The attack surface is the "PreFlow of the ProxyEndpoint"; this is the place where you must put all your policies to protect your APIs - in Niklas' scenario, the validation of the OIDC token
  • The safe zone is the "PreFlow of the TargetEndpoint"; this is the place where you know that authentication has already been validated, and in the case of "user context" APIs (where principal propagation is required) you must have a UPN.
    • You can get this UPN from very different places; in Niklas' scenario it is inside the OIDC token, and that UPN is the value that must be propagated to XSUAA via SAML
    • XSUAA will validate that the SAML assertion is "trusted" and will extract the UPN, generating another token (JWT or SAML, depending on your configuration) for SAP Cloud Connector
    • SAP Cloud Connector will validate that the token issued by XSUAA is "trusted" and will extract the UPN, generating a short-lived X.509 certificate for the backend servers (again depending on your configuration)
    • The backend server will validate that the X.509 certificate issued by SAP Cloud Connector is "trusted" and finally map the CN of the certificate to a username.

As you can see nobody is talking about "storing a universal token that grant you full access to all your connected backend systems", we are talking about principal propagation here 😉

Max.

maxi1555
Contributor

Hi Martin,

Two things:

  1. SAP APIM is able to cache whatever you need, but XSUAA will not accept the same SAML token twice.
  2. It's a really bad idea to expose the XSUAA service to "public/external apps".

I do not think that the entire flow must change here; the only part that is "questionable" is step "4", and of course it can be improved from the security point of view:

In this scenario:

  • How do you know that the real client application is making the request and not someone else who has access to the OIDC token?
  • How do you know that the issuer of the OIDC token is allowed for the client application? (In real life you could have more than one IdP, and you would only allow consumption of the APIs to specific client apps presenting an OIDC token from a specific issuer.)
  • How do you know that the OIDC token was issued for that client application and not for another application in your landscape? (In real life the IdP issues OIDC tokens for different applications in your landscape, and people could try to reuse them.)
  • How can you improve the performance of your API calls if SAP APIM must validate the OIDC token every time it receives an API call?

You can take a look here if you are interested to know the answers.

Max.

 

TobiasPahlings
Participant
Hi Max,

I am not sure if I understand your scenario correctly, but the problem lies in the generation of the SAML token inside the APIM. This forces you to have the required key material to sign the SAML token inside the service - that is what I meant by the "universal token". Maybe my choice of words was a bit misleading here.

If I now compare Niklas' scenario to the one you propose: Niklas uses an external service to do the validation and generation of the tokens. This moves the requirement to store security-relevant material out of the APIM and into a service that is dedicated to such operations.

I am not saying that the Azure AD scenario is always the best way of doing it, and depending on your scenario it might definitely be problematic performance-wise, but it is also a method where you do not need to trust all your developers inside the APIM. Another example: you do not need to roll over all your keys after someone with access to the APIM leaves your company. (And I know that this is never done at companies regardless 😉.)

I am just arguing that if you are storing such strong credentials (signing keys) inside the APIM (or any other platform accessed on a regular basis) you need to make absolutely sure that the tokens can only be used in the scenarios you intend them to be used in.

I definitely do not want to sound like a security absolutist here, as I know the challenges and also the need for compromises between security and functionality. The only thing I want to point out is that you need to consider very carefully whether the risks are greater than the benefit. (And this is definitely very dependent on the type of application you want to support.)

I hope that clarifies my standpoint a bit.

BR

Tobias
MartinRaepple
Active Participant
Hi Max,

XSUAA's token endpoint is public by default (as well as other endpoints for SAML/front-channel SSO). Is there a way to make them "private" in your subaccount and put API management in front of it?

Step 4 in the token exchange flow requires an authorized OAuth client in Azure AD to request the SAML assertion from Azure AD for the user. This requires the Azure AD admin to register an application in Azure AD and create a secret. Both values must be known (and securely stored) by the client application. If the client application runs on BTP, the destination service serves as a good place to store such values, for example.

When requesting the SAML assertion, Azure AD verifies the incoming OAuth access token. Only an access token issued by itself (i.e. the Azure AD tenant identified by the iss(uer) attribute) will be accepted. The access token also contains an aud(ience) attribute which identifies the Client Application by its client id. If the request is not authorized by the same client id (and valid secret), Azure AD will reject it. In other words: Only the client application the initial OAuth access token has been issued for can also request a SAML assertion from Azure AD for the authenticated user. More details on this flow in Azure AD can be found here.
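To illustrate, the iss/aud checks described above boil down to something like the following sketch (a hypothetical helper using Node.js Buffer for decoding; signature verification is deliberately omitted here, although Azure AD of course verifies the signature as well):

```javascript
// Decodes the payload of a JWT WITHOUT verifying its signature -
// only suitable for illustrating the claim checks, not for production.
function decodePayload(jwt) {
  var parts = jwt.split(".");
  if (parts.length !== 3) throw new Error("not a JWT");
  // base64url -> base64
  var b64 = parts[1].replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(b64, "base64").toString("utf8"));
}

// Accept the token only if it was issued by the expected tenant (iss)
// for the expected client application (aud).
function isAcceptedToken(jwt, expectedIssuer, expectedAudience) {
  var claims = decodePayload(jwt);
  return claims.iss === expectedIssuer && claims.aud === expectedAudience;
}
```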

Regarding the performance concerns, I'd recommend to let the Client Application orchestrate the initial token exchange flow (AAD access token -> SAML assertion -> XSUAA access token + refresh token) as mentioned in my previous comment. The client could use the XSUAA access token to authorize calls to BTP services/APIs until it expires, and then use the refresh token to obtain a new access token from XSUAA.

Thanks for sharing your blog post. I took a look at it, and if I understood it correctly, you are using a slightly different approach:

  • The (Client) Application uses the SAML response message (and not just a SAML assertion) from the initial SAML-based front-channel SSO to request the access token from APIM. RFC 7522 (SAML Bearer Grant Type) defines a SAML assertion to be used for this request, but I think this is only a minor issue.

  • More critical to me seems to be the fact that the SAML response (and the included SAML assertion) is actually consumed twice: First time by the client application for front-channel SSO, and then next to request the access token for calling the API on BTP. I was wondering how you handle the different (SAML) audiences? In a real-world setup, the client application and the downstream API/backend service will be different SAML service providers, and the SAML response sent by Azure AD will be issued for the (Client) application only (identified by the Audience and Recipient URL in the SAML Assertion). How did you solve this in your blog post setup?


Best regards

Martin
SebastianSeiler
Explorer

Dear miroll

thank you for the great blog.
In our scenario we have a slightly different environment baseline:

  • SAP API Management (NEO)
  • SAP Identity Authentication (used as proxy with delegation to Azure AD configured)

We have no physical user store in IAS. This is still managed by Azure AD while the IAS delegates the authentication requests from different SAP Cloud applications to the Azure AD.

Now I'm wondering how an authentication process for end users in API Management can be achieved with a set-up like this in place. Our challenge is that we need to give end users access to our SAP back-end APIs (OData) via API Management without having a redundant user store in either API Management or IAS. Does any kind of blog or documentation exist, or do you have a recommendation?

Best regards,
Sebastian

miroll
Product and Topic Expert
Hi Sebastian,

I'm sorry for the delay in my response.

You can use the same approach as described in the blog above, with one difference that Neo gives you: xsuaa is not involved in Neo, so you can directly pass the SAML token, once you have it, to the backend integration of SAP API Management instead of going through the xsuaa and on-premise-connectivity instances.

Regarding the IAS question: IAS can also issue JWT tokens if configured as an OpenID Connect provider, but it currently does not provide a token exchange endpoint analogous to the one of Azure AD described in this blog.

Therefore it would probably be best to follow the approach described here in the blog, using an incoming Azure AD JWT to get the SAML assertion and proceeding with the backend integration of SAP API Management. This way you would still have all users in your Azure AD without redundancy and also comply with your policy of having the APIs OAuth-authenticated.

Best regards,
Niklas
SebastianSeiler
Explorer

Hi miroll,

let me specify our requirement.
Basically, we want to expose the same API proxy (e.g. for Business Partner, Sales Order, etc.) for two different use cases, but with OAuth in place from the application perspective:

The first use case is straightforward: a third-party application delivers credentials, gets the access token from the token endpoint and consumes the API proxy, which consumes the back-end APIs via a technical user:

Authentication with OAuth & Technical User

The second use case includes the fact that an end user in the third-party application has already been authenticated by Azure AD. In this case, we need to propagate the user via API Management to the back-end in order to access the APIs with the individual user (or SAML assertion), not a technical user:

Authentication with OAuth & Individual User


So I don't want to create two different API proxies for each of those cases but provide one that can handle both scenarios. Maybe the assertion part is not the supposed or recommended solution. So I wonder how this can be achieved?

Best regards,
Sebastian

miroll
Product and Topic Expert

Hi sebastian.seiler,

From my perspective the thing you need to change is the way applications receive their OAuth token. In your diagrams the token is issued by the APIM if I understand this correctly. My advice would be not to handle this yourself but to have them retrieve the token from Azure AD. This would bring you the advantage of already having all users (personal & technical) in there and a single central place issuing OAuth tokens which can identify the user.

If you do not want to change the current flows (e.g. endpoints in the applications to retrieve the token), you can still change the behaviour of that flow to point to the token endpoint of your Azure AD instead. The consuming applications could keep the same endpoint and the token handling would still be done by the Azure AD.

Once you have the Azure AD OAuth token coming in for every request - regardless of the intent (technical or personal access) - you can handle all requests equally: you retrieve the user SAML assertion (personal or technical) from Azure AD via the token exchange endpoint as described above and continue with the on-premise integration.

If you additionally want to use the API key (not the secret!), e.g. to apply quota limits or measure usage, you can still have the applications send the API key in addition to the OAuth token.

I hope this explanation helps understand where I would go with this setup.

Best regards,
Niklas