former_member195766
Active Participant
With the 2020.03 wave for Fast Track tenants and the 2020 Q2 QRC for quarterly-release tenants, SAP Analytics Cloud is introducing one of its most requested features: scheduling of stories and analytic applications. We call it Schedule Publications. With it, you can schedule a story or an analytic application with a recurrence and distribute it as a PDF over email to a number of different SAC and non-SAC recipients. You can even include a customised message in the email for each schedule, as well as a link to the online view of the story, which recipients can use to check the latest online copy of the story or analytic application.

Note:

  • Schedule Publications is available only for SAP Analytics Cloud tenants hosted in AWS data centers (Cloud Foundry based) and Microsoft Azure based data centers.

  • In China, it is supported on Alicloud based SAC data centers.

  • A minimum of 25 SAC licenses is required to use this feature; the total can be made up of any combination of SAC license types.


You can create a one-time schedule or a recurring one with a defined frequency such as hourly, daily, or weekly.

Before we dive into the details of what you need and how to create your first schedule, let's go through some of the terminology used within this document:

  1. What is the Schedule Publications engine?
    The Schedule Publications engine is the part of SAP Analytics Cloud, hosted in the cloud, that is responsible for executing the schedule, generating the publications, and sending them to the different recipients over email.

  2. What is a publication?
    A publication is the end output of a story or analytic application that the Schedule Publications engine generates.
    With the first cut of this release, PDF is the only publication format supported.

  3. What is the destination?
    The destination is where a publication generated by the Schedule Publications engine is delivered to the recipients.
    With the first cut of the release in wave 2020.03 and 2020 Q2 QRC, email is the only destination supported.

  4. What is recurrence?
    You can define a recurrence pattern for your schedule; the Schedule Publications engine takes care of running it in the background and distributing the publications to the different recipients.
    With the first cut of the release in wave 2020.03 and 2020 Q2 QRC, the supported recurrence patterns are hourly, daily, and weekly (a small illustrative sketch of how such a pattern expands into concrete run times follows after this list).

  5. What are SAC users and non-SAC users?
    Users who are registered in the same SAC tenant are SAC users; users who are not part of that list are non-SAC users, that is, external recipients.

  6. What about the authorisation of the publications?
    The schedule owner's authorisations are applied for all recipients of the publication, which means the content sent to the different users is exactly what the schedule owner would see for the same story and views (using bookmarks).
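To make the recurrence idea from point 4 concrete, here is a minimal, purely illustrative Python sketch (this is not an SAC API; the function name and parameters are assumptions) showing how an hourly, daily, or weekly pattern expands into the concrete run times the engine fires on:

    from datetime import datetime, timedelta

    # Illustrative only: NOT an SAC API, just a sketch of how a simple
    # hourly / daily / weekly recurrence expands into concrete run times.
    STEP = {
        "hourly": timedelta(hours=1),
        "daily": timedelta(days=1),
        "weekly": timedelta(weeks=1),
    }

    def expand_recurrence(start: datetime, pattern: str, occurrences: int) -> list:
        """Return the run times for the given pattern, starting at 'start'."""
        step = STEP[pattern]
        return [start + i * step for i in range(occurrences)]

    # Example: a weekly schedule starting 1 June 2020 at 08:00, repeated 4 times.
    for run in expand_recurrence(datetime(2020, 6, 1, 8, 0), "weekly", 4):
        print(run.isoformat())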


 

SAC data centers where Schedule Publications is supported:

Schedule Publications is available in tenants provisioned on top of the SAP Cloud Platform Cloud Foundry environment. The list is below:

  • AP11-sac-sacap11 CF
  • AP12-sac-sacap12 CF
  • BR10-sac-sacbr10 CF
  • CA10-sac-sacca10 CF
  • CN40-sac-saccn40 CF
  • EU10-sac-saceu10 CF
  • EU20-sac-saceu20 CF
  • JP10-sac-sacjp10 CF
  • US10-sac-sacus10 CF
  • US20-sac-sacus20 CF



Yes, the latest addition to the list above is the Alicloud data center, delivered with a patch update starting with version 2020.8.11 for QRC-based tenants and with versions 2020.11.2 and 2020.12.0 for Fast Track tenants. Depending on which group you belong to and your version, you will get this update on your tenant.

 

What do you need to get started?

First, Schedule Publications needs to be enabled at the tenant level by your organisation's SAP Analytics Cloud admin. To do so, log in as an admin, go to System -> Administration, and enable the toggle

"Allow Schedule Publications"

If you want your schedules to be sent to non-SAC users as well as SAC users, also enable the toggle

"Allow Schedule Publication to non-SAC users"



 

Schedule Publications is not enabled by default for all users in your organisation; your admin needs to assign the right to the roles of the users who should be able to create schedules. To do so, in the SAC application menu go to Security -> Roles and open any existing role to which you would like to add the Schedule Publications right.

 

How to create Schedule Publications?

You can create a schedule if (a minimal illustrative check is sketched after this list):

  • You are a BI Admin or an Admin. By default, these roles come with the Schedule Publication permission.

  • The Schedule Publication permission has been assigned to a custom role that is assigned to you.

  • You have Save or Save As permission on the story, in addition to the Schedule Publication permission.
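Put together, the conditions above boil down to a simple check. The following is a minimal, purely illustrative Python sketch (not an SAC API; the function and role names are assumptions) that encodes them:

    # Illustrative sketch only (not an SAC API): encodes the conditions above.
    def can_create_schedule(roles: set,
                            has_schedule_publication_permission: bool,
                            can_save_story: bool) -> bool:
        # Admins and BI Admins get the Schedule Publication permission by default;
        # otherwise it must come from a custom role.
        has_permission = bool(roles & {"Admin", "BI Admin"}) \
            or has_schedule_publication_permission
        # In addition, Save or Save As permission on the story is required.
        return has_permission and can_save_story

    # Example: a custom role carrying the permission plus Save access.
    print(can_create_schedule({"Custom BI Author"}, True, True))   # True
    print(can_create_schedule({"Viewer"}, False, True))            # False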


Once a user has been granted access to create schedules,

  • Select the story or analytic application under Browse Files (using the checkmark) and then choose Share -> Schedule Publication.


 

  • Alternatively, open the story, go to the Share option, and select Schedule Publication.



 

  • Once the Schedule Publication dialog box opens, enter the details as required. Let me elaborate on the different options here.

    Name: Provide a name for your schedule.

    Start: Provide a start date and time for your schedule. You can also add recurrence details by selecting "Add Recurrence". Under Recurrence, you can define the pattern (hourly, daily, or weekly), the number of times it should repeat, and the end-of-recurrence details.

    Topic: This is the subject of the email that will be delivered to the recipients.

    Message: This is the message body of the email that will be sent to the recipients.

    Include Story Link: If you select this checkbox, a link to the story or analytic application is included in the email. If you personalise the publication by selecting a bookmark to be delivered (see below), the link to that personalised bookmark view is embedded instead.

    Distribution: Here you define the view of the story that should be delivered to the recipients. With the help of the bookmarks available for stories, you can personalise the delivery so that different users or teams receive different views of the same story. If your story already has multiple bookmarks, each relevant for different users or teams, you can reuse them; otherwise, create one. The advantage of bookmarks is that you can build a unique personalised view by applying different filter options. For example, for a story covering multiple states you can create different bookmarks for various combinations of states, as well as other dimension combinations.


 

  • Distribution (continued): You can create one or more views (the story's default view or different bookmarks) to be delivered to different SAC users or teams. Let's focus on one view and go through all its options. Next to the down arrow, double-click "View1" and provide a name for your view; the screenshot below names it "California Sales Managers".

  • SAP Analytics Cloud Recipients: Click the person icon and select the SAC users or teams who should receive the publication.

  • Non-SAP Analytics Cloud Recipients: These are users who are not part of the SAC user list or the SAC tenant. You can include them by manually typing their email addresses. Under the default SAC licensing, you can enter a maximum of 3 non-SAC recipients per view.

  • Story View: Choose the story or bookmark view you want to deliver to the recipients above. You can choose between Original Story, Global Bookmarks, and My Bookmarks. The authorisation applied to the publication is that of the schedule owner, and exactly this view is delivered to the different recipients.

  • File Name: The name of the publication file that will be delivered to the recipients.

  • PDF Settings: Select this option to define the PDF settings, such as which pages you want to deliver, the grid settings for selecting columns and rows, and whether to insert an appendix with metadata about the story.

Once you have entered all the details, that's it! Click OK and your schedule is created. To summarise what the dialog captures, a minimal illustrative sketch follows.
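The sketch below models the information a schedule definition captures, mirroring the dialog fields described above. It is purely illustrative Python (this is not an SAC API; all class and field names are assumptions):

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    # Illustrative model of what the Schedule Publication dialog captures.
    # NOT an SAC API; all class and field names are assumptions.

    @dataclass
    class DistributionView:
        name: str                           # e.g. "California Sales Managers"
        sac_recipients: List[str]           # SAC users or teams
        non_sac_recipients: List[str]       # external email addresses
                                            # (max 3 per view under default licensing)
        story_view: str = "Original Story"  # or a Global / My bookmark
        file_name: str = "publication.pdf"  # name of the delivered PDF

    @dataclass
    class SchedulePublication:
        name: str
        start: datetime
        recurrence: Optional[str] = None    # "hourly", "daily" or "weekly"
        topic: str = ""                     # email subject
        message: str = ""                   # email body
        include_story_link: bool = True     # embed a link to the online view
        views: List[DistributionView] = field(default_factory=list)

    schedule = SchedulePublication(
        name="Weekly Sales PDF",
        start=datetime(2020, 6, 1, 8, 0),
        recurrence="weekly",
        topic="Sales overview",
        message="Please find the latest sales overview attached.",
        views=[DistributionView("California Sales Managers",
                                ["sales_managers_ca_team"],
                                ["partner@example.com"])],
    )
    print(schedule.name, len(schedule.views), "view(s)")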



How do I view and modify the schedules I have created?

You can view the schedules you created in the Calendar view: go to the SAC application menu and select Calendar, and you will see the schedule right there. If it is a recurring schedule, you will see it against the multiple dates and times defined by the schedule owner. You can also modify a single occurrence or the entire series: select the occurrence in the calendar view, and a panel opens on the right where you can make your changes.

When the scheduled time arrives, the Schedule Publications engine picks up the job, creates the publications, and sends them to the defined recipients as email attachments. The maximum delivery size allowed per email, including attachments, is 12 MB (an illustrative size check is sketched below). Schedule Publications is a resource-intensive task: the engine hosted in SAP Analytics Cloud does a variety of jobs in the background to create the publications and deliver the emails, so out of the box with the standard licensing you get a limited number of schedules.
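Since delivery is capped at 12 MB per email including attachments, here is a quick, purely illustrative sketch of the kind of size check involved (the function name, file path, and overhead value are assumptions, not part of SAC):

    import os

    MAX_EMAIL_SIZE_MB = 12  # stated limit per email, including attachments

    def fits_in_one_email(pdf_path: str, overhead_mb: float = 0.5) -> bool:
        """Rough check: does the exported PDF (plus an assumed message
        overhead) stay under the 12 MB delivery limit?"""
        size_mb = os.path.getsize(pdf_path) / (1024 * 1024)
        return size_mb + overhead_mb <= MAX_EMAIL_SIZE_MB

    # Example (hypothetical file name):
    # print(fits_in_one_email("california_sales_managers.pdf"))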


 

 

Scheduling a story based on a live connection

When scheduling a story based on a live connection, you need to ensure that certain settings are enabled and configured correctly; if they are not, the schedule will fail.

  • Make sure that the “Allow live data to leave my network” switch is enabled under System -> Administration -> Datasource Configuration.

  • Go to the connection on which the story is created and configure the advanced settings.

  • How to configure the advanced settings is described in Live Data Connections Advanced Features Using the SAPCP Cloud Connector.

  • Once this advanced feature configuration has been completed successfully, you can schedule a story created on this connection (an illustrative pre-flight checklist is sketched after this list).
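The prerequisites above lend themselves to a simple pre-flight checklist before you create the schedule. The sketch below is illustrative only; it does not read any SAC settings (no such API is shown in this blog), it just encodes the checklist in Python:

    # Illustrative pre-flight checklist (not an SAC API): the booleans stand
    # for the settings described above, which you verify manually in the tenant.
    def live_schedule_prerequisites_met(allow_live_data_to_leave_network: bool,
                                        advanced_features_configured: bool,
                                        cloud_connector_trust_set_up: bool) -> bool:
        checks = {
            "'Allow live data to leave my network' enabled (System -> Administration)":
                allow_live_data_to_leave_network,
            "Advanced Features configured on the live connection":
                advanced_features_configured,
            "SAPCP Cloud Connector trust set up to the on-premise system":
                cloud_connector_trust_set_up,
        }
        for description, ok in checks.items():
            print(("OK      " if ok else "MISSING ") + description)
        return all(checks.values())

    # Example: everything configured except the Cloud Connector trust.
    print(live_schedule_prerequisites_met(True, True, False))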


 

Please refer to the other related blogs for more information (this space will be updated as and when more blogs appear):

Number of  Publications available under your current SAC licenses

How to schedule an Analytics application Publications

Defining Prompt values while creating or editing Schedule Publications in SAP Analytics Cloud

FAQ’s on Scheduling Publications in SAP Analytics Cloud

Schedule Publications in SAP Analytics Cloud

Manage all schedules created by different users under one view 

Video 

42 Comments
RoberR
Newcomer
Hi, do you know the release date of version 2020.03? Thank you, regards
kgrzymala3
Explorer

Hello Karthik,

Thank you for your blog; the availability of this feature only on AWS was the main takeaway for me.

Do you think the Schedule Publications feature could be enhanced for scenarios where the recurrence is more ad hoc, based on exceptions in the data, something like scheduled alerts?

I have in mind one of my enhancements:

https://influence.sap.com/sap/ino/#/idea/244108

Regards,

Krzysztof

 

former_member195766
Active Participant
Hi Krzysztof,

That is part of the product plan for future enhancements. For now, I can't give you a timeline, but that is the direction we will be moving in once we have introduced a number of other features here.
former_member195766
Active Participant
Hi Rober,

You should already be running the 2020.03 wave if you are on the Fast Track mode of SAC (bi-weekly release model). If you are on a quarterly release cycle, the highest wave you can be on as of now is 2020.02; hence for you, this feature should become available with Q2 QRC in the May/June 2020 timeframe.
former_member186271
Participant
Hi Karthik,

Thanks for sharing this article.

May I know if there is a plan to add this function to NEO? If not, is there a technical limitation?

 

Regards,

Jeff
former_member195766
Active Participant
Sorry, we don't have plans to bring this to NEO. As you rightly guessed, there are technical limitations with the NEO platform itself: it does not provide the Docker support required for scheduling.
Vibhav
Participant

Hello Karthik,

Very informative blog! I am unable to see the “Allow Schedule Publications” and “Allow Schedule Publication to non-SAC users” options under Administration > System Configuration using an Admin role (full privileges). Our SAC version is 2020.6.3 and we are on a Cloud Foundry tenant.

Are there any additional system configurations that must be performed to be able to access scheduling?

Regards,

Vibhav

former_member195766
Active Participant
Hi Vibhav,

Could you please raise an incident with SAP Product Support? As you are in a Cloud Foundry environment, it should be supported in your wave 2020.6.

 
deepusasidhar
Participant
Hello Karthik,

Is the SAP Analytics Cloud Agent a prerequisite for using scheduling? We do have the default Cloud Agent configured, but the schedule fails with an error.

Thanks.
barnaby
Explorer
I have the same question. SAP support is a bit slow to respond.
former_member195766
Active Participant
Hi Deepu,

For live-connected data sources, additional configuration is required; documentation on how to set it up is available.

For acquired data sources, no additional configuration is required.

 

 
subash
Explorer


 

Can you put a few words on this specifically? We also have a few questions on this topic.
former_member195766
Active Participant
Hi Subash,

The Schedule Publications engine is hosted in the cloud. If your story uses live-connected data models, the data needs to be proxied to this engine to generate the publications, and this happens in the background. This toggle needs to be enabled by your admin, and it helps your admin know when this is needed.

Along with this toggle, you also need additional configuration for live-connected data models so that, while generating the publication, the engine can seamlessly log in on behalf of the schedule owner, generate the publication, and send it to the different recipients.

Configure Your On-Premise Systems to Use the SAPCP Cloud Connector:

https://help.sap.com/doc/00f68c2e08b941f081002fd3691d86a7/2020.10/en-US/3bdb65253c8046b2b8234c554072...

 

Set Up Trust Between SAPCP Cloud Connector and Your On-Premise ABAP Systems (BW or S/4HANA)

https://help.sap.com/doc/00f68c2e08b941f081002fd3691d86a7/2020.10/en-US/80140fff3260494fb8eb45ca4c2d...
Hello Karthik,

 

Is it planned to extend this feature to SAC tenants not based on AWS data centers (i.e. eu1)?

 

Regards,
Henry_Banks
Product and Topic Expert
Hi sbshsbsh

long time!

My understanding is that this 'scheduling/publications engine' (which renders the online stories to PDF output) needs to 'crunch' the data locally in SAC to produce its output.

Much like when SAC needs to extrapolate a predictive forecast against a live connection, an admin setting like this needs to be toggled on (else the feature won't work) to allow the required data points (actual values, not just metadata!) to be transferred to our cloud service so that we can run an algorithm against the data.

Cheers, H
Henry_Banks
Product and Topic Expert
Hi uriign

Unfortunately,  it is my understanding that this feature cannot and will never be 'backported' to Neo (SAP DC) because it does not benefit from the containerized architecture on which this new extension was built.

Regards,
H
former_member4998
Active Contributor
Hello Karthik,

Can we schedule jobs to import data into SAC with custom settings? Below are the details.

SAC schedule jobs: start every month on workday 8 @ 7.30 PM IST and end on the last workday of the month @ 7.30 PM IST (e.g. for Aug 2020: 12 Aug 2020 @ 7.30 PM IST to 31 Aug 2020 @ 7.30 PM IST).

Thanks, Sreeni
former_member195766
Active Participant
Hi Sreenivasulu,

This should be possible once we have monthly recurrence included. This is still in the roadmap and, once rolled out, it should work.

For now, you can try the daily recurrence and schedule across a number of days as required.

 
former_member195766
Active Participant
Hi Oriol,

As Henry mentioned, NEO has technical limitations and hence we are not planning this there. Because we need a Cloud Foundry based data center, we were able to offer it in China with Alicloud.

Once SAC is available on Azure, our next plan would be to support Schedule Publications in Azure-based SAC tenants, as Azure is Cloud Foundry ready.
Thank you Karthik for the blog. A couple of questions when you get a chance; could you please help?

  1. In SAC scheduling, is it possible to schedule the reports based on the fiscal calendar instead of the standard calendar?

  2. The user wants to run certain reports based on the fiscal month. The custom table resides in HANA Cloud, so whenever we run the report we need to identify the current date, pass it to the custom table in HANA Cloud, and get the fiscal period. Is this possible in scheduling, or is there another way we can do this in SAC/HANA Cloud?


Kindly let me know when you have a moment.

 

Regards

Ramakrishnan Ramanathaiah
former_member195766
Active Participant
Hi Ramakrishnan,

Currently, the recurrence is on an hourly/daily/weekly basis and not based on a fiscal calendar, which would require custom calendar support. Could you add this to the Influence portal as a requirement, with more details on your use cases and some examples?

 

For the second point, it's not possible to pass parameters at runtime.
Hi Karthik,

I have followed the prerequisites mentioned in the blog, but my schedule for an analytic application based on a BW live data connection constantly fails with the error "Couldn't fetch artifact from story exporter service".

"Error: We couldn't complete scheduling the publication because of an error while running the script."
former_member195766
Active Participant

Hi Geetha,

I hope you have configured the SAPCP Cloud Connector properly; if it is not configured properly, it can cause errors.

Configure Your On-Premise Systems to Use the SAPCP Cloud Connector:

https://help.sap.com/doc/00f68c2e08b941f081002fd3691d86a7/2020.10/en-US/3bdb65253c8046b2b8234c554072...

 

Set Up Trust Between SAPCP Cloud Connector and Your On-Premise ABAP Systems (BW or S/4HANA)

https://help.sap.com/doc/00f68c2e08b941f081002fd3691d86a7/2020.10/en-US/80140fff3260494fb8eb45ca4c2d...

 

 

I suggest raising an incident so that our Product Support can take a look at the issue, debug it, and get you a resolution.

Hi Karthik,

Thank you for your reply. We do not have the SAPCP Cloud Connector installed and configured on-premise. It would be good to know if there is any workaround or alternative for publishing stories/analytic applications without the on-premise Cloud Connector.
SayedZubair
Active Participant

Hi karthik.kanniyappan 

I had a couple of questions while scheduling a publication on a story based on a live connection to S/4; it would be helpful if you could clarify.

  1. The publication runs with the same set of prompt values as in the story, even though a new set of prompt values is defined while creating the publication? (Reason being: I am getting a blank PDF/PPT even though I give different prompt values.)
  2. How can the dates for these prompt values be changed dynamically in the publication? If we simply set fixed dates, this is of no use. Attaching a screenshot of the publication prompt values I am using. Can you please check whether there is any way to change the values dynamically, mainly the dates? Any leads would be helpful.

Regards,

Sayed Zubair

 

anna389
Participant
Hi karthik.kanniyappan

thanks for this great blog!

I have one question regarding the authorizations during a scheduled publication.

If I create a publication, every recipient will receive a PDF with my view of the data.

Is there any possibility to restrict this? I would like to send a specific PDF to every recipient, which only includes the data they are authorized for.

Thanks

Max

Hi Karthik,

Thanks for the great article on publications. We have data access control enabled in our model. If I want to schedule a publication to a group of users who have access to this model, will the publication send out the restricted data to the users I am sending the report to?

Thank you

 

P.S. I just noticed Max posted a similar question on the restricted data.

former_member701856
Participant
Hi,

Is schedule publication based on a live connection available only for on-premise SAP HANA systems?

I have a SAPCP HANA service instance running in a Cloud Foundry environment and I would like to set up a schedule publication in SAC using an active live connection established with my HDI Container (which consumes data from my HANA service database).

Best regards.
SayedZubair
Active Participant
Hi,

 

How can changes be made (like adding a new recipient, changing the time, etc.) for the complete recurrence? As of now I can see it is possible to do this for an individual occurrence. Can't this be done in one go for the complete recurrence?

 

Regards,

Sayed Zubair
danielsagrera
Newcomer
Hi Marcos,

I am facing the same scenario and still have not found any clue. Did you solve the situation?

Thanks.

Daniel
former_member701856
Participant
Hi Daniel,

 

I have not found a solution yet. As the Cloud Connector is a mandatory prerequisite for using the schedule publication feature with a HANA live connection, it seems that it is not possible to do such a configuration in a SAPCP HANA service instance running in a Cloud Foundry environment.

 

Best regards.
former_member195766
Active Participant
Hi Sayed,

Thanks for your patience, as I am so delayed in responding here.

You can modify not just an occurrence but the whole recurrence in the same Calendar menu: at the top right of the panel where you see the schedule information, choose the recurrence icon, then edit and save.
former_member195766
Active Participant
Hi Marcos,

Could you raise an incident so that Product Support can take a look at it?
SayedZubair
Active Participant
Thanks karthik.kanniyappan . No worries. I got it. Thanks for your response.

 

Regards,

Sayed Zubair
makoum
Member
Hi everyone,

Is it possible to send in another format instead of PDF, like Excel for example?
wzuluaga
Explorer
Hi,

Is there a similar functionality for Embedded Analytics only? We need a similar feature that allows us to schedule and share a story via email with SAC and non-SAC users.

Regards,

Wil
rphss
Participant
Hi

We are using a tunnel connection for our live data, and with a tunnel connection the "Advanced Features" mentioned in your blog are not available. Is tunnel supported?

The publication fails with errors, even though we are using the same admin user (able to refresh the story) and all settings mentioned above.

Any idea?
Publication Bursting would be the answer, however it is still not enabled for Analytic Applications.

 
Still no response on this, clients are asking the same
rpuranik
Participant
When is Publication Bursting scheduled to be released, and will it work for analytic applications too? Are there any limitations on how many teams or users the bursting can go to?
animikhc
Explorer
HerveKauffmann
Advisor
Hello,

Thank you for your very good blog. Is it still relevant, given that you wrote it in 2020?