Data Migration
Created By: Siddhant Patni
Agenda
• Introduction
• Data Migration Methods
  1. DATA LOADER
  2. API-INTERFACE
  3. SEEDED INTERFACE
• Exercise
INTRODUCTION
• Data migration is a vital phase of any project.
• This phase not only consumes a lot of time and effort, but is also one of the deciding factors in delivering an error-free system.
• It is equally important in the project life cycle because it rarely runs under favourable conditions.
• Time pressure and a lack of resources constrain both the functional and technical teams at some point, causing either errors or missed deadlines.
DATA MIGRATION METHODS
1. DATA LOADER
2. API-INTERFACE
3. SEEDED INTERFACE
Difference
DATA LOADER
• Low volume
• Slow method
• High monitoring
• Can be used on almost all Forms
SEEDED INTERFACE
• High volume
• Limited Forms
• No custom validation
• Seeded, i.e. not created by us
API-INTERFACE
• High volume
• Limited to availability of public APIs
• Custom validations
• Created by us
1. DATA LOADER
• Used for migrating small and manageable data sets.
• Not useful for large volumes of data.
• Time consuming (i.e. a relatively slow method of migration).
• Requires monitoring while loading.
• A saviour where an API or interface is not available.
• Can be used to load data on almost all forms/Java pages.
DEMO FOR DATA LOADER
• Whenever we create multiple records for a particular form in the system, we use similar system commands (like Tab, Space, Enter, data entry, etc.) which are repeated for each record we enter.
• Data Loader runs these repetitive commands and enters the data on our behalf, based on the file we create.
Data Loader – Creating Responsibility
1. Responsibility Form (blank)
2. Data Load File
3. Responsibility Form (after Data Load)
2. API-INTERFACE: Introduction
Excel Template
• The first step is to prepare the data (Excel sheet) for uploading into the stage table.
• The data shall be validated thoroughly before uploading.
Stage Table
• Secondly, the data shall be imported into the stage table.
• Through a concurrent request, the imported data are processed against custom validations.
Interface Table
• Successfully validated (processed) data are ready to be imported into the system.
Import Items
• The Import Items request is required to be submitted with the relevant parameters.
Interface Errors
• Failed records are required to be analyzed, corrected and uploaded again.
2. API-INTERFACE (Step 1: Excel Template)
• Data are uploaded to the stage table from the Excel template.
• Ensure the Excel column headings are the same as the stage table columns.
• All mandatory fields are required to be provided (i.e. they should not be left blank).
• Before uploading, all data must be validated in Excel.
Let's Start
Ensure all mandatory columns are present in the Excel.
For the easiest and safest import, ensure all columns in the Excel are named exactly as the column headers in the table.
For batching the items or processing only selected items, provide the relevant Set Process ID. We shall pass this while importing the items to the base table.
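As a quick cross-check before finalizing the sheet, the stage table's column names can be pulled from the Oracle data dictionary so the Excel headings can be matched one-to-one. A minimal sketch, assuming the stage table used later in this deck (XXEVO_INV_ITEM_UPLOAD_STG) and access to ALL_TAB_COLUMNS:

-- List the stage table's columns so the Excel headings can be matched exactly.
SELECT column_name, data_type, nullable
  FROM all_tab_columns
 WHERE table_name = 'XXEVO_INV_ITEM_UPLOAD_STG'
 ORDER BY column_id;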
2. API-INTERFACE (Step 2: Stage Table)
• Firstly, data are uploaded into the stage table relevant to the form.
• The stage table has all mandatory and other fields from the base table.
• All the columns/fields of the stage table are mapped to the base table columns.
• Custom validations are in place for validating the data against system data.
Step 2 – Stage Table flow (diagram): Toad Import Wizard → Import Data → Stage Table → Process Stage Table Data → correct records proceed, failed records are corrected and re-imported.
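For orientation, below is a purely hypothetical, trimmed-down sketch of what such a custom stage table might contain; the real XXEVO_INV_ITEM_UPLOAD_STG definition will differ, and every column name here is illustrative only:

CREATE TABLE xxevo_inv_item_upload_stg (
  segment1           VARCHAR2(40),    -- item code, mapped to the base/interface SEGMENT1
  description        VARCHAR2(240),   -- item description
  organization_code  VARCHAR2(3),     -- target inventory organization
  template_name      VARCHAR2(30),    -- item template to apply
  set_process_id     NUMBER,          -- batch identifier used to select records for processing
  process_flag       VARCHAR2(1),     -- e.g. N = new, E = error, P = processed (illustrative)
  error_message      VARCHAR2(4000)   -- populated by the custom validation program
);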
Step 2.1 -- Toad
In Toad, to upload data to the target stage table, we click on Import Table Data; in our example the table is “XXEVO_INV_ITEM_UPLOAD_STG”.
To start the import into the stage table, you need to follow the navigation shown.
Step 2.2 -- Import Table Data
Select the relevant schema.
Select the Object Type as Tables.
Select the table name as the Object Name.
Depending on your preference for Commit Mode, select Never Commit, Commit After Each Record, or One Commit After All Records.
Press Show Data.
Clicking Show Data displays all data in the selected stage table.
Ensure that the stage table is empty, so as to avoid confusion or wrong data being uploaded into the system.
Once there is no table data, click Execute Wizard to upload the data into the stage table.
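A minimal sanity check (and optional clean-up) that can be run before starting the wizard, assuming the stage table name used in this demo:

-- Confirm the stage table is empty before importing.
SELECT COUNT(*) FROM xxevo_inv_item_upload_stg;

-- If stale rows remain from an earlier run, clear them (commit only after review):
-- DELETE FROM xxevo_inv_item_upload_stg;
-- COMMIT;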
Step 2.3 -- Import Wizard
Once you enter the Import Table Data wizard, kindly follow the instructions below.
Select your file type.
Import Wizard (cont.)
Browse and select your file.
Select the row from which your data starts. E.g. in our case the first row contains the headers, so we shall start from row 2.
Similarly, you can also give a last row to define the range of rows to be imported.
Import Wizard (cont.)
On pressing Next, the data from the Excel is displayed.
To ensure an error-free mapping, we can use the AutoMap option.
This only works if all our Excel column names are the same as the column headers in the table.
As seen, since we started with row 2, the system automatically confirms the AutoMap using row 1.
Import Wizard (cont.)
Press Next till you reach the above screen, then press Execute. All rows in the Excel shall get loaded, which can be confirmed by querying the stage table after the execution is concluded.
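A quick confirmation query once the wizard finishes, assuming the same stage table; the SET_PROCESS_ID column name is an assumption based on the batching parameter described earlier:

-- Confirm the wizard loaded the expected number of rows per batch.
SELECT set_process_id, COUNT(*) AS loaded_rows
  FROM xxevo_inv_item_upload_stg
 GROUP BY set_process_id;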
Step 2.4 -- Process Stage Table Records
• Once data is uploaded into the stage table, it is further required to be processed with the custom validations, which is done with a concurrent program.
• The concurrent program name for item creation is “Evosys INV Item Migration”.
• Records processed successfully are moved to the interface table.
Process Stage Table Records (Contd..)
• The CP “Evosys INV Item Migration” is required to be submitted with the parameter Set Process ID.
• Only those records get processed which have the mentioned Process ID.
• Successful records are moved to the interface table “MTL_SYSTEM_ITEMS_INTERFACE”.
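The same concurrent program can also be launched from the database with the standard FND_REQUEST API instead of the SRS form. A minimal sketch follows; the application short name 'XXEVO' and program short name 'XXEVO_INV_ITEM_MIG' are assumptions (only the user-facing program name appears in this deck), and the apps-context IDs are placeholders:

DECLARE
   l_request_id NUMBER;
BEGIN
   -- Initialize the applications context (user / responsibility / application IDs are placeholders).
   fnd_global.apps_initialize(user_id => 1234, resp_id => 20634, resp_appl_id => 401);

   -- Submit "Evosys INV Item Migration" with the Set Process ID parameter.
   -- Program/application short names below are assumed; check FND_CONCURRENT_PROGRAMS for the real ones.
   l_request_id := fnd_request.submit_request(
                      application => 'XXEVO',
                      program     => 'XXEVO_INV_ITEM_MIG',
                      description => NULL,
                      start_time  => NULL,
                      sub_request => FALSE,
                      argument1   => '101');   -- Set Process ID for this batch
   COMMIT;
   dbms_output.put_line('Request ID: ' || l_request_id);
END;
/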
2. API-INTERFACE (Step 3: Import Items)
• Records in the interface table are processed with the concurrent program “Import Items”.
Item Import
Once it is confirmed that the load into the interface table is successful, we need to log in to the Oracle system.
In this case, as we are importing items, we shall go to the Inventory responsibility.
Our import shall be concluded by the navigation shown.
Item Import
Now we need to give the relevant parameters for the concurrent program.
As seen, we can run it for All Organizations or only the profile organization.
We can ensure validation before import.
We can instruct whether to process the items or just validate them.
We can instruct whether the processed items should be deleted from the interface table or kept in successful status.
The Process Set determines which items will be processed. Using this, we can process items in batches. Even sample data processing can be done using this parameter.
Now we select the transaction type as either Create or Update.
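Before and after submitting Import Items, the interface rows can be checked directly in SQL; a quick sketch against MTL_SYSTEM_ITEMS_INTERFACE (the PROCESS_FLAG meanings are summarized from common usage, e.g. 1 = pending, 7 = imported; confirm against your documentation):

-- Review what is waiting in the item interface for a given Process Set.
SELECT segment1, organization_id, transaction_type, process_flag
  FROM mtl_system_items_interface
 WHERE set_process_id = 101          -- the Set Process ID used for this batch (example value)
 ORDER BY segment1;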
Error Processes
All attributes that are kept under either Update or Create follow the same fundamentals as our core system.
If, functionally, the update is not allowed, then the import will run into an error here as well.
E.g. if the item is not created in the master org and we run a create for a child org, it will run into an error.
One such example can be seen below.
All such errors shall be visible in the table MTL_INTERFACE_ERRORS.
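The failed rows and their messages can be pulled with a join between the item interface and MTL_INTERFACE_ERRORS; a sketch using the commonly used TRANSACTION_ID link (verify the join and flag values on your instance):

-- List items that failed Import Items, together with the error text.
SELECT msii.segment1,
       msii.organization_id,
       mie.table_name,
       mie.error_message
  FROM mtl_system_items_interface msii,
       mtl_interface_errors       mie
 WHERE mie.transaction_id = msii.transaction_id
   AND msii.process_flag  = 3;       -- 3 = validation failed (typical value; confirm locally)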
Questions!!
Exercise
• An Excel sheet has been attached with some columns from the master item table.
• You need to update segment1 (i.e. the Item Code) and create new items in the system using this method.
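Once the exercise items are imported, a simple check against the item master base table confirms they exist; the item codes below are placeholders:

-- Verify the exercise items landed in the item master base table.
SELECT segment1, organization_id, creation_date
  FROM mtl_system_items_b
 WHERE segment1 IN ('ITEM001', 'ITEM002')   -- replace with the item codes from your sheet
 ORDER BY segment1, organization_id;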
Thank You!!!
Thanks also to Mr Prakash Dudhat for his time and effort on the KT.