Human Capital Management Blogs by Members
Gain valuable knowledge and tips on SAP SuccessFactors HCM suite and human capital management market from member blog posts. Share your insights with a post of your own.
aasthamalik
Participant
Data Migration, popularly known as Data Upload, is one of the key components of a successful SuccessFactors go-live when the legacy data resides in other systems and has to be imported into SuccessFactors.

After working on a dozen SuccessFactors implementations, I finally got to work closely and extensively on Data Migration.

Based on my experience, and assuming everyone is familiar with the overall Data Migration process, I will focus on the key considerations and areas that need our attention.

 

 Where is the Plan?

It all starts with a plan. You must have detailed planning of the data migration activities for everyone in the team; planning the activities will give you good insight into the hows and whats.



Once you have defined the phases, milestones and key dates for the Data Migration plan (illustrated in the image above), discuss it with the client and finalize it as a collaborative decision. The timelines will depend on the source system from which you start migrating the data. Keep in mind that these timelines may shift slightly over time, based on the project situation.

 

 Where are the Data Templates?

   

Keep your Data Templates ready!

You must be aware of the system configuration and the data import templates. On this project we had separate teams for setting up the system configuration (Functional Team) and for managing the data (Data Team). In such cases it is very important for the data team to coordinate and work collaboratively with the functional team so that all configurations are ready on time. This matters because the configuration typically drives the templates that support the data uploads.

 

 What are the Language Packs?

Multiple language packs add a little more to the data migration effort. Know which language packs are in the implementation scope and get the data in the required languages. This may require some additional documentation and activities, but that is fine, as it all counts towards successful uploads for every language pack. You can take two kinds of approach: upload the data for all language packs in a single go, or segregate the activities and dates for uploading the data for each pack (a rough sketch of the segregated approach follows below). The choice depends entirely on the mutual understanding between you and the client.
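If you go with the segregated approach, it can help to split one combined extract into per-locale files so that each upload date has its own self-contained file. This is a rough sketch under assumed file and column names, not a SuccessFactors-specific format:

```python
# Rough sketch of splitting a combined translation extract into per-locale files;
# the file name and the "locale" column are assumptions for illustration.
import pandas as pd

translations = pd.read_csv("translations_all_locales.csv")   # hypothetical combined extract

for locale, rows in translations.groupby("locale"):
    out = f"import_{locale}.csv"
    rows.drop(columns="locale").to_csv(out, index=False)
    print(f"Prepared {out} with {len(rows)} rows")
```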

 

 What is the list of Data Templates & How do we get the Data?

Once you understand the different data templates, walk the client through them field by field. Since it is the client's data, no one understands the 'what' better than they do, while the 'how' lies with us.

You can prepare the list of data templates/files that need to be uploaded and get it confirmed by the client.

You can also prepare supporting documents that explain the technical specifications for each template (field by field), which will help the client prepare the data in the required format. There will be scenarios where consultants need to support the data migration by transforming the data, i.e. converting it from the source file into the SuccessFactors template, as in the sketch below. The client should stay involved so they are satisfied with the data quality at their end.
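As an illustration of that transformation step, here is a minimal sketch that converts a legacy extract into a template-shaped file. The file names, the field mapping and the date format are assumptions for illustration; the real template columns come from your configured instance:

```python
# Minimal sketch of a source-to-template transformation; all names are illustrative.
import pandas as pd

# Hypothetical mapping of legacy column names to template column names
FIELD_MAP = {
    "EMP_NO": "person-id-external",
    "FIRST_NAME": "first-name",
    "LAST_NAME": "last-name",
    "HIRE_DT": "start-date",
}
TEMPLATE_COLUMNS = ["person-id-external", "first-name", "last-name", "start-date"]

source = pd.read_csv("legacy_extract.csv")           # extract delivered by the client
target = source.rename(columns=FIELD_MAP)            # apply the agreed field mapping
target = target.reindex(columns=TEMPLATE_COLUMNS)    # enforce the template column order
# The date format is an assumption; confirm the expected format against the actual template.
target["start-date"] = pd.to_datetime(target["start-date"]).dt.strftime("%m/%d/%Y")
target.to_csv("basic_import.csv", index=False)       # hand this back for client review
```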

 

 What about the Field Mappings?

Since I have already mentioned data transformation, do not forget to create a data mapping sheet with all the relevant details. This document should contain the mapping between the source and target fields. It will be of great help to you as well as to the client, as it gives a clear picture of how the data moves from any source into the SuccessFactors templates.



Also, keep the mappings updated with the latest configuration; a simple check like the one below can flag gaps. Performing this activity requires a deep understanding of the source extracts.
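One way to keep the mapping sheet honest is to validate it against the current templates whenever the configuration changes. A small sketch, assuming a hypothetical mapping_sheet.csv with source_field and target_field columns and a header-only copy of the template file:

```python
# Sketch of validating the mapping sheet against the latest template columns;
# file and column names are assumptions for illustration.
import pandas as pd

mapping = pd.read_csv("mapping_sheet.csv")                       # source_field, target_field, notes
template_fields = pd.read_csv("basic_import_template.csv", nrows=0).columns

# Template fields with no source mapping yet (candidates for defaults or client input)
unmapped = set(template_fields) - set(mapping["target_field"])
# Mappings that point at fields removed by a configuration change
stale = set(mapping["target_field"]) - set(template_fields)

print("Template fields without a mapping:", sorted(unmapped))
print("Mappings pointing to removed fields:", sorted(stale))
```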

 

 How to handle the Reference Values?

Reference values play a critical role in preparing the data upload files. When you map the data fields from source to target, DO NOT forget to also map the reference values (consider the source, target and configured values together). If the reference values in the upload files do not match the configuration in the system, the upload will fail because of the value mismatch, and you will then have to compare the data in the file with the values present in the target system.



Here, the reference values are the picklist values; a pre-upload check like the sketch below can catch mismatches before they turn into upload errors.
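For example, a small script can compare the values in the prepared file against the configured picklist values before you attempt the upload. The file names, the picklist ID and the field names below are assumptions for illustration:

```python
# Sketch of a pre-upload picklist check; all names are illustrative assumptions.
import pandas as pd

picklists = pd.read_csv("picklist_export.csv")        # e.g. columns: picklistId, external_code
upload = pd.read_csv("job_info_import.csv")           # the prepared upload file

configured = set(
    picklists.loc[picklists["picklistId"] == "employee-class", "external_code"]
)
bad_rows = upload.loc[
    ~upload["employee-class"].isin(configured),
    ["person-id-external", "employee-class"],
]

if bad_rows.empty:
    print("All employee-class values match the configured picklist.")
else:
    print("Rows that would fail on upload due to a value mismatch:")
    print(bad_rows)
```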

 

 What to do if the data is missing in the source?

There might be data that is not available in the legacy extracts but, when compared with the target system configuration, corresponds to mandatory fields. In such cases you have two options:



 

 How to handle the issues?

While uploading the data into SuccessFactors, you might get a few errors. Capture them in a tracker where you can monitor each issue properly. A proper issue tracker helps you assign issues within the team and identify the root causes, and you can always see the open items and the progress of their resolution.



Generally, the client uses a specific tool for this, and it varies from client to client; the sketch below shows the kind of information worth capturing either way.
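Whatever the tool, the useful minimum is the same: each error, where it came from, who owns it, the root cause and its status. A minimal sketch that appends upload errors to a CSV tracker (the column set is an assumption, not a prescribed format):

```python
# Sketch of logging upload errors into a simple CSV tracker; columns are illustrative.
import csv
from datetime import date

errors = [
    {"file": "job_info_import.csv", "row": 42, "message": "Invalid picklist value"},
]

with open("issue_tracker.csv", "a", newline="") as fh:
    writer = csv.DictWriter(
        fh,
        fieldnames=["logged_on", "file", "row", "message", "owner", "root_cause", "status"],
    )
    if fh.tell() == 0:                      # brand-new tracker: write the header first
        writer.writeheader()
    for err in errors:
        writer.writerow(
            {**err, "logged_on": date.today().isoformat(),
             "owner": "", "root_cause": "", "status": "Open"}
        )
```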

 

 How to be sure if the Data is correct?

It is very important to have the client check the data regularly within each mock cycle: as soon as you receive the extracts, after you complete the data transformation from source to target templates, and after you upload the data into the system. When you enter the cut-over period and have completed all the data uploads as per the plan, ask the client to validate the data in the system before the project is declared live. This serves as the client's own final check on the uploaded data; they may highlight issues (if any), and you should be ready to support. After all, you want the project to go live on time. A simple reconciliation like the one below helps with these checks.
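Here is a hedged sketch of such a reconciliation between the prepared file and a post-upload export from the target system; the file names and the key column are assumptions:

```python
# Sketch of a post-upload reconciliation; file names and key column are illustrative.
import pandas as pd

prepared = pd.read_csv("basic_import.csv")            # the file you uploaded
exported = pd.read_csv("system_export.csv")           # an export/report from the target system

key = "person-id-external"
missing_in_system = set(prepared[key]) - set(exported[key])
unexpected_in_system = set(exported[key]) - set(prepared[key])

print(f"Prepared rows: {len(prepared)}, rows found in system: {len(exported)}")
print(f"Records missing after upload: {sorted(missing_in_system)}")
print(f"Records in system but not in the file: {sorted(unexpected_in_system)}")
```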



 

 When to stop the Data Extraction?

As you progress towards go-live, agree on a cut-off date for extracting the data from the legacy system. The cut-off date means that data is extracted from legacy only up to that date, so that the most up-to-date data is uploaded in the final stage.



This not only helps the data team upload the most accurate data but also lets the client access the latest data. The cut-off date should be agreed mutually with the client, as the employees may need to stop using the legacy system after it; a small filter like the one below makes the cut-off easy to enforce.
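Enforcing the cut-off in the transformation itself avoids accidentally loading late changes. A sketch, assuming the legacy extract carries a last-changed timestamp column (the column name and the date are illustrative):

```python
# Sketch of applying the agreed cut-off date to a legacy extract; names are illustrative.
import pandas as pd

CUT_OFF = pd.Timestamp("2024-03-31")                  # hypothetical date agreed with the client

extract = pd.read_csv("legacy_extract.csv", parse_dates=["LAST_CHANGED_ON"])
in_scope = extract[extract["LAST_CHANGED_ON"] <= CUT_OFF]
late_changes = extract[extract["LAST_CHANGED_ON"] > CUT_OFF]

in_scope.to_csv("final_load_extract.csv", index=False)
print(f"{len(late_changes)} records changed after the cut-off; agree how to handle them after go-live.")
```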

 

 How to manage your Team?

Team management is another area that will help you achieve your goals. The roles and responsibilities of each team member should be clear, and you must ensure that everyone in the team works together. Also, do not forget to hold regular status calls to track progress. If there is an on-shore/off-shore model, managing the teams well is what lets you meet the timelines. Since the data work is tedious and requires utmost concentration, always try to keep the team positive and motivated.


That's my team... Where is yours?


 

Honestly, working extensively with data has been a great learning experience for me, and seeing the client's satisfaction at the end is the biggest achievement of all.

Right??

9 Comments
nishantagaur
Explorer
Great work Aastha!

 
aasthamalik
Participant
Thank you Nishant
Former Member
The blog provides a Data Migration overview in a very simple way!
aasthamalik
Participant
Thanks Paras! I am glad this blog has helped.

 
former_member84399
Participant
Aastha, thank you for this, very useful. Is there any documentation you can point me to describing what data needs migrating for each of the SF modules? For example, we are at the discovery phase for Performance and Goals, and we need to understand what data we need to migrate from our ERP HR system to the cloud.

 

Thank you

Andreas
aasthamalik
Participant
Hi Andreas, I think you can find some documents on this website or on Partner Edge, shared by experienced people from the SuccessFactors community and from SAP respectively. In the meantime, here are some key considerations based on my experience:

  1. Try to understand which part of the legacy data must be included in PM/GM (ask the client about this and make suggestions).

  2. Try to understand which data is important for the client, and prioritize it together with the client.

  3. Map the source data fields to the fields available in the respective SuccessFactors data templates; this will help you identify the target portlets for the Goal Plan and legacy ratings, or whether a custom portlet is needed to meet the client's requirements.

  4. Finally, you can run the ETL cycle for the Data Migration.


I hope this helps.

Thanks.
former_member610912
Discoverer
Nice blog on Data Migration... Thanks! 🙂
NarenNathawat
Explorer
Very simple to understand and logical in flow. Thank you for sharing.
Good Work Aastha Malik,

Indeed, it is informative, and a quick and simple way to understand and carry out the Data Migration in a commendable way.

 

Thank you

Gowri Sankar