Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
AmandaJMurphy
Product and Topic Expert

 

Purpose


This blog outlines how customers and partners can use the File Upload Custom Widget to call our new Data Import Service API and upload fact data from an MS Excel worksheet or CSV file to a Public Version, or to their own Private Version, within a Story that has Optimized Story Experience enabled.

The sample code may be downloaded and customized further to meet specific customer needs.

Check out our short video containing the File Upload Custom Widget highlights.

Prerequisites

  1. The File Upload Custom Widget may be installed on an SAP Analytics Cloud, Cloud Foundry environment.
  2. SAP Analytics Cloud revision 2023 Q4 or higher is required.
  3. The model must be the new model type that supports both measures and accounts.
  4. The story must have Optimized Story Experience enabled.
  5. Only one widget may be used per page.
  6. The widget must be installed separately on each tenant.
  7. Importing to dimensions that contain hierarchies is not currently supported.
  8. Only upload of fact data is supported.


Overview

The steps to upload fact data from an MS Excel worksheet or CSV file to a Public Version, or to a user-owned Private Version / Public Edit Version, can be broken down into the following eight steps:

  1. Prepare the source data.
  2. Assign authorizations.
  3. Install the File Upload Custom Widget.
  4. Add the File Upload Custom Widget to an Optimized Design Experience Story.
  5. Create an explicit Private Version (optional).
  6. Configure the File Upload Custom Widget.
  7. Run the import data job.
  8. Review the results.

Let’s take a closer look at each step.

1. Prepare the source data.

The MS Excel worksheet or CSV file should contain one column for each dimension and measure in the model.  Each column should have a name in the header row, which is used later for mapping.

In the following example, the first 5 columns in this worksheet represent dimensions that exist within a Planning model in SAP Analytics Cloud.  From column 6 onwards, the columns are pivot columns, as they consist of fact data keyed by date.

Figure 1: Sample Excel Worksheet Extract

It is helpful to assign a name to the worksheet and to make a note of the pivot column number that contains your fact data, as this information is used later in the Job Settings section of the File Upload properties panel.

In this example, column 6 is the starting pivot column number containing the fact data.
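As an illustration, the expected wide layout can be sketched with Python's csv module (the column names and values below are hypothetical, standing in for the dimensions shown in Figure 1):

```python
import csv
import io

# Hypothetical sample mirroring the layout described above: five
# dimension columns followed by monthly fact ("pivot") columns.
header = ["Entity", "Account", "Product", "Region", "Version",
          "202301", "202302", "202303"]
row = ["Entity A", "Account 1", "PRD0001", "EMEA", "Forecast",
       "50", "20", "30"]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerow(row)

# Column numbers in the Job Settings panel are 1-based, so the first
# fact column here gives Pivot Column Start = 6.
pivot_start = header.index("202301") + 1
print(pivot_start)  # 6
```

The same 1-based column number is what you later enter as Pivot Column Start.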

2. Assign authorizations.

Supported models are Planning and Analytic models that contain both measures and accounts.

Three personas are involved in this process:

  1. The Administrator, who installs the custom widget.
  2. The Story Designer, who creates the story and configures the widget. This user requires design rights for the story or application.
  3. The BI or Planner end user, who requires READ and MAINTAIN rights on the model.


Figure 2: Read and Maintain Access Rights on the Model


3. Install the File Upload Custom Widget.

Download the File Upload Custom Widget json and zip files from the file-upload-widget versions folder in GitHub.  Be sure to review the README.md prior to installation.

In SAP Analytics Cloud, select Stories and choose the Custom Widgets tab.  Press + to open the Upload File dialog.

 

Figure 3: Add Custom Widgets


Choose the File Upload Widget json and zip files within the Upload File dialog and select OK.

Figure 4: Upload File Dialog


Your File Upload Widget should be ready for use within an Optimized Design Experience Story.

4. Add the File Upload Custom Widget to an Optimized Design Experience Story.


Create or open an Optimized Design Experience Story.

Under the View section of the Toolbar, select the Left-Side Panel icon to open the Assets Panel.

Expand Custom Widgets in the panel and select the File Upload Widget v*.*.*.

Drag the widget onto the workspace.

Expand the width of the widget to leave sufficient space to view job status information.

Figure 5: Add File Upload Widget to Workspace


5. Create an explicit Private Version. (Optional)


This step is for Planners who wish to upload fact data to their own Private Version.  Alternatively, an existing Public Version may be used.

Within the Optimized Design Experience Story create an explicit Private Version.

See Creating Blank Public or Private Versions in SAP Analytics Cloud Help for details on Version Management.

6. Configure the File Upload Custom Widget.


The next stage is to configure the File Upload Custom Widget, in advance of running the data import job.

Select the widget on the dashboard to highlight it.

Under the View section of the Toolbar, select the Right-Side Panel icon to open the File Upload builder panel.

Under Model Selection, choose the model that the fact data will be uploaded to.

Under Import Type, there are two options to choose from:

A. Fact Data.

Choose Fact Data in the drop-down to upload data to Public Versions.

B. Private Fact Data.

Choose Private Fact Data in the drop-down to upload data to Private Versions.  Note that if the Story Designer has chosen “Private Fact Data” and the end user selects a Public Version in the upload, the data will be written to the Public Edit Version.

Figure 6: Private Fact Data Import Type


Select Open Data Mappings Dialog from Data Mappings, then either enter the column names from the file directly or choose Upload Template within the Data Mappings dialog.

Choose the source worksheet name from the drop-down list.

Automatic mapping of columns should appear in the dialog.  Carefully review these mappings and configure further mappings to the corresponding worksheet columns, as required.

Click Save when mappings have been successfully made.

Figure 7: Data Mappings Dialog Containing Worksheet Name


Select Default Values to optionally configure default values. The default value is used for any measure or dimension that has not been manually mapped, or when the uploaded file does not contain a value for it.

When the version is specified as a default value, or is mapped, this version is displayed automatically as an option to users when running the upload job.  If the version is not mapped in the file and no default value is set, the end user can select the version during the upload.
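The resolution order described above can be sketched as a small helper (the function and version names are hypothetical, for illustration only):

```python
# Hypothetical helper mirroring the version resolution described above:
# a version mapped from the file wins, then the configured default value,
# and only when neither is present does the end user pick one at upload.
def resolve_version(mapped, default, user_choice):
    return mapped or default or user_choice

print(resolve_version(None, "Budget", "Forecast"))  # Budget
print(resolve_version(None, None, "Forecast"))      # Forecast
```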

Next, select Job Settings.

Choosing the Reverse Account Sign checkbox reverses the sign of all fact data imported into models containing an Account dimension with an account type of Income (INC) or Expense (EXP).  This option is available for Fact Data only.

Choosing the Execute With Failed Rows option writes only the correct records to the specified version.  Leaving this option unchecked means that fact data is uploaded only when all records pass the correctness checks.
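The two commit modes can be sketched as follows (a toy simulation with hypothetical rows, not the widget's actual code):

```python
# Sketch of the two commit modes described above.
def rows_to_write(rows, execute_with_failed_rows):
    valid = [r for r in rows if r["valid"]]
    if execute_with_failed_rows:
        return valid                                   # write correct records only
    return valid if len(valid) == len(rows) else []    # all-or-nothing

rows = [{"id": 1, "valid": True}, {"id": 2, "valid": False}]
print(len(rows_to_write(rows, True)))   # 1
print(len(rows_to_write(rows, False)))  # 0
```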

Choosing Ignore Additional Columns ignores columns in the file that are not required during the import process.

Choosing the Enable Pivot Settings option expands the pivot settings.  This is a similar concept to using Pivot Tables in MS Excel.

Enter the Pivot Column Start number (the first worksheet column containing fact data), along with the Pivot Key and Pivot Value.

For example, in the below screenshot, Pivot Column Start is 6, as this is the first column containing fact data. Pivot Key is Date and Pivot Value is Cost.
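Conceptually, the pivot settings unpivot the wide fact columns into rows, much like an MS Excel Pivot Table in reverse. A stdlib Python sketch with hypothetical data (Pivot Column Start = 6, Pivot Key = Date, Pivot Value = Cost, as in the example above):

```python
# Wide layout: dimensions first, then fact columns from Pivot Column
# Start onward (hypothetical data).
header = ["Entity", "Account", "Product", "Region", "Version",
          "202301", "202302", "202303"]
row    = ["Entity A", "Acct 1", "PRD1", "EMEA", "Forecast", 50, 20, 30]

pivot_start = 6  # 1-based, first fact column

# Each fact column becomes one record: dimensions plus a
# (Pivot Key, Pivot Value) pair, here Date and Cost.
dims  = dict(zip(header[:pivot_start - 1], row[:pivot_start - 1]))
facts = [dict(dims, Date=h, Cost=v)
         for h, v in zip(header[pivot_start - 1:], row[pivot_start - 1:])]
print(facts[0]["Date"], facts[0]["Cost"])  # 202301 50
```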

 

Figure 8: Job Settings Sample Configuration


The Enable Date Format Settings option allows custom date formatting to be applied to the fact data.  If unchecked and no value is specified, the date format in the source file must match the date format in the SAP Analytics Cloud model. For example, if the ‘Date’ dimension in the model uses the format ‘YYYY-MM’, then the date format of the source data must also be ‘YYYY-MM’.

Checking the Enable Date Format Settings box provides additional flexibility by allowing the administrator to specify the date format used in the source file, which is then converted to the date dimension configuration of the SAP Analytics Cloud model.  This option is available for Fact Data only.
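As an illustration of the kind of conversion this enables (the source format 'DD.MM.YYYY' below is a hypothetical choice, not a setting taken from the widget):

```python
from datetime import datetime

# Hypothetical example: the source file uses 'DD.MM.YYYY' while the
# model's Date dimension expects 'YYYY-MM'.
def convert(source_value, source_fmt="%d.%m.%Y", model_fmt="%Y-%m"):
    return datetime.strptime(source_value, source_fmt).strftime(model_fmt)

print(convert("15.01.2023"))  # 2023-01
```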

Figure 9: Custom Date Format Settings

7. Run the import data job.
Save the story and switch to Optimized View Mode.

The Upload Data button should contain a blue arrow to indicate that the widget is enabled.

Figure 10: Upload Data Widget Appearance in Story Optimized View Mode


Click the Upload Data action button and select the Source file.

From the Target Version drop-down, choose the Public or Private Version that the fact data should be uploaded to, or confirm the default version pre-selected in Default Values of the File Upload panel, along with the source file and worksheet name.

Any errors found in the worksheet are displayed in the Upload Data dialog.

Figure 11: Upload Data Dialog Containing Sample Error Messages


In the above example, a different worksheet was selected than the one previously mapped, resulting in errors.

Once the errors in the source worksheet are resolved, choose Run to upload data to the specified version.

Figure 12: Upload Data Dialog Containing No Errors

Figure 13: Running Import Job


The File Upload widget provides an indication of a successful import of fact data.

Figure 14: Import Success Status


Select the information icon to view more details on the successful or failed import.

For failed records, a CSV file containing a rejection reason for each failed row can be downloaded and reviewed to help resolve errors in the source file.
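A quick way to triage such a rejection file is to count the distinct reasons. The _REJECTION_REASON column name appears in real job output; the rest of this sample is hypothetical:

```python
import csv
import io
from collections import Counter

# Hypothetical sample of the downloaded rejection CSV.
rejects = """Entity,Account,Date,_REJECTION_REASON
Entity A,Acct 9,202301,Invalid column value for: Account
Entity B,Acct 1,2023-01,Row is missing values for primary key - Date
Entity C,Acct 9,202301,Invalid column value for: Account
"""

reasons = Counter(r["_REJECTION_REASON"]
                  for r in csv.DictReader(io.StringIO(rejects)))
for reason, count in reasons.most_common():
    print(count, reason)
```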

Figure 15: Download CSV Option for Failed Import Analysis

Troubleshooting Tips:

  • Expand the Jobs Timeline and select "Copy Invalid Rows URL".  Paste into a new browser page to see failed record information.  
  • A commonly experienced error is "Temporary storage does not contain valid records [1011]".  To help diagnose this error further, a useful method is to switch the import option from 'Fact Data' to 'Private Fact Data'.  Additional error information is generally provided with this option, through a download file containing the failed records and a corresponding failure explanation for each row.
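For reference, the invalid-rows URL follows the job URL pattern visible in the job status JSON. A small sketch (the tenant host and job ID below are placeholders, and the exact path may vary by tenant version):

```python
# Builds the invalid-rows URL from the job URL pattern seen in the job
# status JSON; host and job ID are placeholders.
def invalid_rows_url(tenant_host, job_id):
    return f"https://{tenant_host}/api/v1/dataimport/jobs/{job_id}/invalidRows"

print(invalid_rows_url("mytenant.hcs.cloud.sap",
                       "7ec26c9a-4643-43f7-919c-7ce5f4416d27"))
```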

8. Review the results.

Following a successful import into a Public or Private Version, it should now be possible to see the same fact data from the source worksheet within SAP Analytics Cloud.

Figure 16: Empty Private Version

Figure 17: Private Version Containing Uploaded Fact Data

It is now possible to analyse, compare and maintain this fact data within this Private Version as desired.

Conclusion

The File Upload Custom Widget solution provides an example of how a custom widget can be created and used to upload fact data into SAP Analytics Cloud.  The sample code, found in the file-upload-widget GitHub folder, may be downloaded and customized further to meet specific customer needs.  A standard solution within the product is planned.  Follow the SAP Analytics Cloud Roadmap for details of when this solution will be made available in the product.

Useful Links

Check out our technical blog on how to further develop custom widgets: File Upload Widget - How to develop custom widgets with the react framework

See SAP Analytics Cloud Custom Widget Developer Guide, to learn about how to extend Custom Widgets within SAP Analytics Cloud

See the Custom Widget Developer Guide section Learn about the restrictions in custom widgets

For more information on Version Management see Create, Publish, and Manage Versions of Planning Data within SAP Analytics Cloud Help.

See Enable Optimized Story Experience within SAP Analytics Cloud Help for guidelines on how to enable Optimized Design Experience for Stories.

Attributes of an Account Dimension within SAP Analytics Cloud Help provides information on Account Types in a model.

Object Type Permissions within SAP Analytics Cloud Help outlines the Maintain permission meaning.

Custom Widgets code can be found at Custom Widget Samples.

Find out more about our Data Import Service API.

SAP Analytics Cloud Roadmap

 

Find some more great content on SAP Analytics Cloud Custom Widget Framework below:

How to use Linked Analysis with Custom Widget

Announcing Custom Widgets Data Binding in SAP Analytics Cloud – Analytics Designer

Demystifying Custom Widgets for SAP Analytics Cloud

SAP Analytics Cloud – Widget Analysis

Unifying Story and Analytic Application with the New Optimized Story Experience in SAP Analytics Clo...

Bringing ChatGPT in SAP Analytics Cloud using Custom Widget

Leveraging ChatGPT to Create Custom Widgets for SAP Analytics Cloud

Quickly integrate SAP Analytics Cloud with other systems using custom widgets

 
If you have any questions, feel free to comment below or post a question to our SAP Analytics Cloud Questions & Answers forum.

Thanks for reading!

121 Comments
saurabh_03
Explorer
0 Kudos
Hi gerd.schoeffl ,

 

I am getting the below error when I try to use the widget in a story. This widget was transported between environments.


 

But when I directly upload it in the environment, I am able to use the widget in the story.

Does this mean we cannot transport this widget, and it needs to be uploaded in each environment separately?

 

Regards,

Saurabh
AmandaJMurphy
Product and Topic Expert
0 Kudos
Hi Saurabh,

Yes, the widget should be installed separately for each environment and not transported.

Best regards,

Amanda
hugh_gledhill
Participant
Hi amanda.murphy gerd.schoeffl ,

In testing this widget I've faced a few issues which I hope you can help with, detailed below.

For reference I am testing in a "new model" type planning model, containing an account dimension and 2 measures. Data locking is enabled, but unlocked for the region I am uploading to. Data access controls on dimensions are also enabled, but as owner of the model I have full access rights. Data privacy is not enabled.

Thanks,

Hugh

 

#1 Target Version selection

The "Target Version" dropdown menu presented to the user after uploading a file appears to be ignored during the upload. Data is uploaded to the version specified in the data file or in the default values, regardless of the "Target Version" selection. If no version is specified in the file or the default values, the user is presented with the following error in the import dialogue.


#2 Private Fact Data

Import Type "Private Fact Data" does not seem to work, whether using an explicit private version or a public edit version. I have successfully imported the same data file using "Fact data" import type, so I assume the issue is related to the version selection. I receive the error message and JSON stats below.


{
  "jobStatus": "READY_FOR_VALIDATION",
  "jobStatusDescription": "Job 7ec26c9a-4643-43f7-919c-7ce5f4416d27 for modelID Cho0no834lkriif0ptbleedf7 has been created successfully and validated. Data has been pushed to temporary storage. Push more data or run job now.",
  "importType": "privateFactData",
  "runJobURL": "https://XXXXXXX.hcs.cloud.sap/api/v1/dataimport/jobs/7ec26c9a-4643-43f7-919c-7ce5f4416d27/run",
  "jobPropertiesURL": "https://XXXXXXX.hcs.cloud.sap/api/v1/dataimport/jobs/7ec26c9a-4643-43f7-919c-7ce5f4416d27",
  "additionalInformation": {"totalNumberRowsInJob": 4}
}
sfilreis_18
Explorer
0 Kudos

I am having difficulty with the Pivot.

Made it simple: I have EVERY dimension except Date in the columns.  The last 12 columns are the months of the year, in YYYYMM format.  Note that all the dimensions have to be in EXACTLY THE SAME ORDER as listed in the mapping and default values wizards.  However, "Date" is the second item, and if I leave it blank, everything shifts over into that space; if I put Date in the mapping, I get errors on the Date columns, even though the pivot is configured:

The file is super simple too, as I am evaluating the best solution process here: (the upload with the above pivot did not include col B, so the 14 start for the pivot was correct)
If I do not include the column Date I get all the data shifted over into the wrong dimension.  If I include the column Date I am told there is not data in any of the date columns at the end.
What is the proper protocol to prepare the data when the pivot is used?
Stephen
gerd_schoeffl
Product and Topic Expert
0 Kudos
Hi Stephen,

Could you please try with a file that does not contain a date column at all? In the mapping screen just leave the mapping for date and measure empty and do not set any default value. Then use the option 'pivot', adapt the column number for the pivot start accordingly, and use pivot key and value as set above.

Best regards

Gerd
sfilreis_18
Explorer
0 Kudos
I get "KEY DIMENSION - DATE HAS NO VALUE and the Data Columns are "unknown - 202301" , etc
AmandaJMurphy
Product and Topic Expert
0 Kudos
Hi Stephen,

As Gerd mentioned, please remove the Date column from your csv file and don't map in the default values.  Leave Pivot Column Start at 14.  For columns 202301, 202302 & 202303 enter 0 into the cells of the csv rather than leaving empty, re-save and re-try the operation.

Regards,

Amanda
sfilreis_18
Explorer
0 Kudos

no luck.  I get two errors:

Row has more columns than expected

Row is missing values for primary key - Date

 

If I uncheck ignore additional rows I only get the second error 

Like it does not even recognize the pivot settings.

 

 

sfilreis_18
Explorer
0 Kudos
tweaked and now works!  This is an amazing addition to the toolbox!
gerd_schoeffl
Product and Topic Expert
Dear All,

We have narrowed the issue down to the fact that the version dimension has been called 'CATEGORY', a term that has a special meaning in SAC but is nonetheless allowed as the name of the version dimension in the model. We are looking into the issue, but for the time being I would strongly recommend not using the name 'Category' for the version dimension in the model.

BR Gerd
AmandaJMurphy
Product and Topic Expert
0 Kudos
Delighted to hear this is resolved now for you Stephen!

Best regards,

Amanda
saurabh_03
Explorer
0 Kudos
Hi Amanda,

Can we still transport a story including the File Upload widget, with all settings retained across tenants?

Regards,

Saurabh
gerd_schoeffl
Product and Topic Expert
0 Kudos
Hi Saurabh,

You have to install the widget on every tenant but then you can just transport the story including the instance of the widget in the story.

Best regards

Gerd
hugh_gledhill
Participant

Hi amanda.murphy gerd.schoeffl

What is the expected behaviour of the file upload where there are multiple records in the data file for the exact same target data point?

I would want them to be aggregated, but it looks to me as if the last record wins and the earlier records are ignored. This is using import method "Update".

Thanks,

Hugh

AmandaJMurphy
Product and Topic Expert
0 Kudos
Hi Hugh,

To aggregate your values, the "Append" option is suitable.  See the section on Update and Append in Model Job Settings of the Data Import Service API documentation.

"... Update ... runs an upsert on the target model. If a row in the imported data matches the dimensions of a row in the target model, that row is updated. If there is no match, the source row is inserted as a new row.

Alternatively, you can set the import method to Append. In this case, the values of measures for matching rows are summed to the row in the target model. If there is no match the row is inserted as a new row."
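The two import methods quoted above can be sketched as a toy simulation (hypothetical rows keyed by their dimension tuple; the real service operates on the target model):

```python
# Sketch of Update (upsert) vs Append (sum) semantics.
def import_rows(model, rows, method):
    for key, value in rows:
        if method == "Append" and key in model:
            model[key] += value   # matching rows are summed
        else:
            model[key] = value    # upsert: incoming value replaces
    return model

key = ("Entity A", "Account 1", "202301")
print(import_rows({key: 100}, [(key, 50)], "Update")[key])  # 50
print(import_rows({key: 100}, [(key, 50)], "Append")[key])  # 150
```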


Hope this helps.

Regards,

Amanda
hrishikesh11
Discoverer
0 Kudos

My Excel file has 65550 rows, and there are many combinations whose data hits one target location. If we do it from the data management section, it aggregates the values and gives the correct output. I configured the data import widget with the append setting and tried loading the data; only 61108 rows are loaded, i.e. multiple combinations hitting the same target are not aggregated. The widget only considers the last value in that case, even though it is set to append.
The update and append methods are working exactly the same way.

hugh_gledhill
Participant
Hi amanda.murphy

Thanks for your reply. I tested the Append method and it appears to have the same issue, i.e. the last record wins and the earlier records are ignored.

For example, consider a csv file containing 3 repeated rows with different values:

Entity   | Account   | Time   | Value
Entity A | Account 1 | 202301 | 50
Entity A | Account 1 | 202301 | 20
Entity A | Account 1 | 202301 | 30

I would expect these values to be aggregated before being imported into the model, same as if you were running a normal import job in the model data management tab:

Entity   | Account   | Time   | Value
Entity A | Account 1 | 202301 | 100

Then the value would either update or append any existing records in the model, depending on the model job settings.

However, what actually seems to happen is that the last record wins and the earlier records are ignored:

Entity   | Account   | Time   | Value
Entity A | Account 1 | 202301 | 30

This happens for both update and append import methods.

Thanks,

Hugh
bpoteraj
Explorer
0 Kudos
amanda.murphy great explanation.

When testing the widget on our model (which has both accounts and multiple measures), we were not able to populate only one selected measure - we had to input zeros for the other measures, while we would prefer that they stay null.

Is there a way to solve this?
AmandaJMurphy
Product and Topic Expert
0 Kudos
Thanks Hugh for the feedback.  I have reported this to our Development team to investigate.

Regards,

Amanda
AmandaJMurphy
Product and Topic Expert
0 Kudos
Thanks Hrishikesh for the feedback.  I have reported this to our Development team to investigate further.

Regards,

Amanda
gerd_schoeffl
Product and Topic Expert
Hi bpoteraj.pwc

Currently you cannot exclude columns in the target structure from the file upload, i.e. each dimension and measure in the target has to be filled either from the file or from the default values. Thus you cannot leave measures 'untouched'. If they do not have a corresponding column in the file, you have to fill them with a default value (e.g. 0). Here are two options to overcome this:

  • Create a specific target model, used only for the upload, that contains just the dimensions/measures you want to fill, and then do a cross-model copy into the 'real' model.

  • Use a specific dimension member (e.g. a specific version) that is only used for the upload. Once data is uploaded and checked for correctness, copy the data over to the 'proper' version.


Best regards

Gerd
ilter_yilmazli
Explorer
0 Kudos
amanda.murphy thanks a lot for this great widget. I can see that with the Q3 2024 update, this widget will be replaced with native functionality, which is great. But until then, we still need to use this functionality.

Today we encountered an issue when we tried to upload an .xlsx file with special characters. The widget can't handle them and turns them into garbled characters, which is incompatible with our master data.

It seems the built-in "csv" converter of the widget uses the standard ".csv" converter that we have on our computers. But if I convert the file to a "UTF-8" CSV myself and use this file for the upload, it works like a charm.

Do you think this "csv" converter library can be updated or changed to use the standard "UTF-8" conversion, so users can still upload an XLSX file instead of a CSV?

We're going to create a ticket for this problem but wanted to give an update from here as well.

 

Original file


Original xlsx


 

XLSX file upload


Failed import


 

UTF-8 csv file upload

 


Successful Import


 
eya_mahdhaoui
Explorer
0 Kudos

Hello amanda.murphy gerd.schoeffl ,

After running the import job I get a "Temporary storage does not contain valid records [1011]" error. It only happens for end users – works fine with admin role.

I appreciate your help.

Thanks,

Eya

gerd_schoeffl
Product and Topic Expert
0 Kudos
Hello e_mahdhaoui ,

Is there a list of rejected records? If so, what is the reason they have been rejected?

Best regards

Gerd
eya_mahdhaoui
Explorer
0 Kudos

Hello gerd.schoeffl  ,

We don't get the list of rejected records. Only end users encounter this error message after running the import job, even though they used the same file as the admin. No rejects occur with the admin role.

Thanks,

 

Eya

AmandaJMurphy
Product and Topic Expert
0 Kudos
Hi Ilter,

Thanks for the feedback.  As this is a customizable feature, I would suggest referring to our technical blog  File Upload Widget – How to develop custom widgets with the react framework

Our technical team may be able to provide some further guidance here if required.

Best regards,

Amanda
AmandaJMurphy
Product and Topic Expert
0 Kudos
Hi @hugh_gledhill @hrishikesh11,

Clarification on the Append method behaviour is outlined in the following SAP Note:

3420998 - SAP Analytics Cloud Data Import Service API Append Method Behaviour

Kind regards,

Amanda
hugh_gledhill
Participant
Hi amanda.murphy ,

Thanks for the update. To confirm: this is the intended behaviour, and SAP has no plan to change it to match the normal data import process?

Thanks,

Hugh
VB09104
Active Participant
0 Kudos

Hello,

I am getting the error 'Request body data incorrect or missing (1202)'. I couldn't find anything related to this message. Can you please guide me?

Thanks, Vikas

ginot
Discoverer
0 Kudos

Problem solved

0 Kudos

Hi,

I am using this widget and it works as expected, until I upload a file where some empty rows contain tab characters. They appeared due to additional formatting rules applied to the rows; however, in Excel the cells themselves look empty.

For example, this is how it looks when the data is copied from Excel to Notepad:

aleksandra_orlovskaya_0-1706701939985.png

During the upload it says 3 rows were found, the process goes to the onFailure flow, and getJobFailedRowCount() returns zero.

And the Job status remains:

"jobStatus":"READY_FOR_DATA","jobStatusDescription":"Job ... for modelID ... has been created successfully and is waiting for data".

So, the rows are empty for the user but not explicitly empty for the system.
The process goes to failure but, nevertheless, these lines are not recognized as erroneous and therefore cannot be skipped during processing, even if the Execute With Failed Rows option is enabled.

I would appreciate it if anyone could support.
Deleting these rows in Excel is not an option 🙂

Thanks,
Aleksandra



 

gerd_schoeffl
Product and Topic Expert
0 Kudos

Hello @aleksandra_orlovskaya ,

Unfortunately the copy to Notepad shows that the lines are NOT empty, and I guess this would also be the case if you look at the data in CSV format. Our file upload will take any line that is not empty as a potential line for the upload. So I fear you will have to make sure that the line is truly empty...

 

Best regards

Gerd

eya_mahdhaoui
Explorer
0 Kudos

Hello @AmandaJMurphy @gerd_schoeffl ,

We've resolved the issue.

Our attempt to grant access to users with restricted privileges led to an error due to their limited access to just one model. To address this, we granted "full access" to all models.

However, we are hesitant to extend "full access" to users. We speculate that there may be another model (Buffer model) associated with the import job that we can provide access to, rather than granting full access.

Could you help us find the buffer model to grant access to, or suggest a different solution?

Thanks.

Kind regards,

Eya

annettetrost
Discoverer
0 Kudos

Hi amanda.murphy gerd.schoeffl,

I am encountering a strange behaviour regarding the version selection.

The Import Type is Set to Private Fact Data and there is no default value for the version, so I would expect the user to get a Popup for selecting the version.

annettetrost_0-1707383874575.png

annettetrost_1-1707383885065.png

 

But when starting the Upload there is neither a Popup nor a text displaying the version. 

annettetrost_2-1707383991490.png

 

 

Any idea what could be the reason for this? 

PS: If I run the upload it fails and the rejection reason looks like this:

annettetrost_3-1707384191766.png

 

Best Regards

Annette

 

 

gerd_schoeffl
Product and Topic Expert
0 Kudos

Hello @annettetrost ,

Have you maybe mapped the version from the file? If so could you delete this mapping and try again?

Many thanks and best regards

Gerd

gerd_schoeffl
Product and Topic Expert
0 Kudos

Hello @eya_mahdhaoui ,

Have you granted full access to the TARGET model of the upload? In my opinion that should be the only one that is checked during the upload, and I do not see why full access should be necessary for the others... What was the situation in your case?

Best regards

Gerd

eya_mahdhaoui
Explorer
0 Kudos

Hello @gerd_schoeffl,

Yes, we granted full access to the TARGET Model, and we provided Full Data Access in the Task Role Configuration.

However, when we granted full access to the TARGET Model without providing Full Data Access in the Task Role Configuration, it did not work.

Conversely, when we provided Full Data Access in the Task Role Configuration without granting full access to the TARGET Model, it worked.

This indicates that Full Data Access in the Task Role Configuration gives access to all models in the environment, which is not recommended.

Kind Regards,

Eya

annettetrost
Discoverer
0 Kudos

Hello @gerd_schoeffl 

 

Thanks for the answer. I thought the version was not mapped, but now I discovered that there were some empty columns with empty headers in the file, and the version was mapped to an empty column instead of not being mapped at all.

Best Regards

Annette 

Tobias_Binkert
Explorer
0 Kudos

Hello,

 

I am also facing the issue that I receive error messages like:

Temporary storage does not contain valid records [1011] 

& that rows are rejected for no obvious reason.

However, the model is a measures-and-accounts model, I am the owner with permission for the model, and I don't have any data access restrictions.

When checking the jobs timeline information I receive the message listed below, which points out that the members I am trying to use are not valid. However, I checked this a hundred times with different formatting and tried to upload as CSV or Excel:

"_REJECTION_REASON": "Invalid column value for: IMPL_COABudget, IMPL_BudgetHierarchy"

What could be the reason for this? The import works if I set the member e.g. to unassigned (#).

Is it about hierarchies or because those two are public dimensions?

@AmandaJMurphy @AmandaJMurphy 

Your help is highly appreciated

 

AmandaJMurphy
Product and Topic Expert

Hello @Tobias_Binkert ,

The Data Import API supports Public Dimensions; however, it currently does not support hierarchies.  There are plans to support this in the future.

Best regards,

Amanda

Tobias_Binkert
Explorer
0 Kudos

Hello @AmandaJMurphy,

thank you very much for your answer. It would be great if this feature is enabled soon.

 

KR

0 Kudos

Hi all,

Last year, when I was trying this widget out I was able to import the provided .json and .zip files to SAC / Stories / Custom Widgets.

I am now trying to import the updated version of the files; however, there is no .zip file anymore. Instead there is a .yaml file, in addition to the .json file. When trying to import the custom widget files, I am only asked to load the .json file, so I assume the provided .yaml file needs to go somewhere else? Where do I need to import the .yaml file?

Without the .yaml file, I am getting the following error when trying to import the .json file:

Privete_member__916507_0-1708696879190.png

Thanks!

gerd_schoeffl
Product and Topic Expert

Hi,

Not sure where you found this file. If you open the GitHub repository you will find a folder 'versions'. There you will find the .json and the .zip file you need to download and install.

https://github.com/SAP-samples/analytics-cloud-custom-widget/tree/main/Custom-Widget-Samples/file-up...

Best regards,

Gerd

0 Kudos

Thanks Gerd,

I think I was confused and was trying to import the API files instead of the widget files.

The feature is really helpful, however the 'readme' file indicates that 'SAP has no obligation to provide support or maintenance for the samples contained in this repository', which means that we cannot recommend this to clients for now, despite the great functionality. Does anyone have any idea if there are plans for this to be integrated in the official SAC offering? 

Many thanks

ginot
Discoverer
0 Kudos

Hi @gerd_schoeffl ,

Is there a way to retrieve the log of files that have been imported and display them in a story?

Regards,

gerd_schoeffl
Product and Topic Expert
0 Kudos

Hi @ginot 

Unfortunately this is not possible.

Best regards

Gerd

gerd_schoeffl
Product and Topic Expert
0 Kudos

Hi @Privete_member__916507 ,

Actually, we are currently working on an 'in product' solution for the file upload. I hope we can soon give you some more details on when this solution will finally be shipped.

Best regards

Gerd

scontran01
Newcomer
0 Kudos

Hi All,

If I insert the custom widget more than once in a story page, to map different files and rules, then when I execute the widget it seems that SAC does not consider the settings in the widget parameters and raises an error.

Does this happen only on my tenant?

Thanks

Stefano

AmandaJMurphy
Product and Topic Expert
0 Kudos

Hello @scontran01 ,

Could you please review the prerequisites to using the Custom Widget within the blog, as they include the following:

  • One widget per page may be used.
  • Widget should be installed separately on each tenant.

See more details in the Custom Widget Developer Guide section Learn about the restrictions in custom widgets.

Hope this helps.

Kind regards,

Amanda

VishnuTeja78
Discoverer
0 Kudos

Hi Amanda  and Gerd,

I am facing an issue and receiving the error message:

"Every row in temporary storage is invalid. Please use /invalidRows on the import job to see the reason [1033]"

I am testing it on a New model with a very limited set of dimensions and trying to upload only 1 value.

ENTITY  | ACCOUNT | PRODUCT | TIME   | Ver      | VALUE
REG0001 | DRV0100 | PRD0001 | 201801 | Forecast | 200

 

Model:

VishnuTeja78_0-1712561438856.png

What could be the reason for this? Can you please help me.

Thanks,

Vishnu