I am curious: how many people were able to complete validation in the two weeks we were provided?
We did not. We were able to complete all test cases in week 1, and I reviewed and approved the results over the weekend. We handed the validation packet to the reviewers (QA, IT, Dept Dir) on Monday, but, due to scheduling issues, we were only able to obtain 2 of the 3 approvals to implement by Friday evening.
Personally, I would prefer four weeks to validate but wanted to poll the group to see if others had the same issue.
We decided on a compromise: we test new functionality and the LMS processes critical to our training needs. (e.g., we validate three reports instead of 40, skip supervisor tasks, classes, etc.) The end result is 45 test cases focusing on the types of items, curricula, and APs we use.
Our issue this time was not executing the test cases; we finished that in one week. Instead, we could not obtain the "approval to implement" from all of our stakeholders in the remaining week. Each is tasked with reviewing our paperwork, signing off, and forwarding to the next stakeholder.
We were able to get it done, but it required complete alignment among all stakeholders going into the upgrade weekend on the timeline and on when approvers would need to be available for sign-off. We also use a risk-based approach: in OQ we test only the changes that impact aspects of the system we actually use, followed by very abbreviated regression testing in PQ. We write our own test scripts based on the test cases in the Validation toolkit. But it's painful. The only thing saving us right now is that we haven't implemented mobile functionality yet. Once we do, 2 weeks for validation will be impossible.
I second the request to have 4 weeks (minimum) for validation. 2 weeks is unreasonable.
We use systems called Veeva and Kneat for documentation and approvals, so for me approvals are not a big deal, but getting the business to execute the scripts is a major challenge.
Also, would you mind sharing your list of OQ/PQ scripts? I am trying to build a baseline set of scripts that we need to execute each year, apart from the new features/changes.
Appreciate your help.
I agree with Ward on the timing. We had to scramble too, and it's not worth it. Eventually something has to give. Even so, we don't test the entire system: our approach is risk-based, focused on deltas, core features, and regression testing. We operate in a regulated environment, so needless to say, quality and robust scripting/testing are critical, and if we don't get enough time, that's likely the first thing to suffer, unfortunately.
I vote for at least 4 weeks of notice for quality output.
We were able to complete risk-based testing based on the change notices. However, we found that these are not complete or accurate. For thorough validated testing, 2 weeks is not enough at all, especially since SAP is not responding promptly to tickets. For the validated user community, I would suggest at least 6 weeks, because if the upgrade causes issues with interfaces, you need more time to re-design, build, and test them. That cannot be done in 2 weeks.
I agree with the comments below: 2 weeks to complete validation of the system is not enough time. We were able to complete it using a risk-based approach based on the changes; however, 2 weeks is not enough for proper validation when there are major changes that need to be incorporated. At a minimum, 4 weeks are needed for proper validation to ensure accuracy and complete functionality.
We originally raised concerns about this year's upgrade based on past experience with issues appearing at the last minute, and it happened again. Fortunately, they moved the upgrade up to give us 4 weeks; however, a last-minute issue appeared just like in the past couple of years, so we handled it the same way. We closed out validation of the b2105p24 upgrade with the workaround before they deployed b2105p28, which contained a large part of the changes, and we treated b2105p28 as a separate patch, focusing only on testing the bug fix and our critical custom extensions and connectors; otherwise we wouldn't have been able to complete the validation. 4 weeks was tough, and I am constantly worried about what happens if one of our custom extensions or connectors doesn't work, as the response time may not be quick enough and we have to constantly escalate. I would prefer not to work as if there's a fire.
I agree with everything stated before, and would add just one more thing for consideration: the HXM upgrade schedule and its impact on the validated LMS release. This year SAP heard our feedback and moved the 3rd patch to allow more time for testing and LMS validation, but what wasn't taken into consideration is the risk of having the HXM upgrade so shortly before the LMS production go-live. If my memory serves me well, the exact same situation happened last year, when a misalignment between HXM and LMS versions caused last-minute issues.
It feels like those two elements are developed in silos, with no impact analysis, internal alignment, or integration testing.
If HXM upgrade cycle is not adjusted, we are likely to face the same problem next year.
We have a process and form for database validation that is used only for major upgrades where our upgrade team deems it necessary, based on the changes identified in the SAP documentation. The form documents the data type, the specific record reviewed, and the success/failure of that data point. It is very time-consuming, and we are only able to validate a small percentage of our records, as we do not archive anything and have been using the LMS since 2001 (and migrated many records during the original implementation). Still, it does provide some confidence in the data management of a major upgrade.