Sawla Polytechnic College

LEARNING GUIDE # 09
Module Title: - Monitor and Support
Data Conversion

MODULE CODE: ICT DBA4 09 0711


SCIC College | INFORMATION SHEET | UNIT MODULE | Database Administration Level IV | Monitor and Support Data Conversion

SYMBOLS
These symbols are located at the left margin of the module. They illustrate the actions to be taken or the resources to be used at a particular stage in the module.

 LO (Learning Outcome)
 Self-Check
 Answer Key
 Resources
 Reading Activity
 Assessment
 Remember/Tips
 Use Computer
 Practice Task
 Safety

Monitor and Support Data Conversion | Year 2021



Learning Outcomes

1. Monitor data conversion
2. Support data conversion

LO 1: Monitor data conversion

1.1 Overview of data conversion
Data conversion is the conversion of computer data from one format to another. Throughout a
computer environment, data is encoded in a variety of ways. For example, computer hardware is
built on the basis of certain standards, which may require that data contain, for example, parity
bit checks. Similarly, the operating system is predicated on certain standards for data and file
handling, and each computer program handles data in its own manner. Whenever any one of
these variables is changed, data must be converted in some way before it can be used by a
different computer, operating system or program. Even different versions of these elements
usually involve different data structures. For example, changing bits from one format to another,
usually for the purpose of application interoperability or the ability to use new features, is simply
a data conversion.

Data conversion may be as simple as converting a text file from one character encoding
to another, or more complex, such as converting office file formats, or converting image
and audio file formats.
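For instance, the simple case of a character-encoding conversion can be sketched in a few lines of Python. The file names, sample bytes, and encodings below are illustrative, not part of the module:

```python
# Convert a text file from one character encoding to another.
from pathlib import Path

def convert_encoding(src: str, dst: str,
                     src_enc: str = "latin-1",
                     dst_enc: str = "utf-8") -> None:
    """Read src in src_enc and rewrite it as dst in dst_enc."""
    text = Path(src).read_text(encoding=src_enc)
    Path(dst).write_text(text, encoding=dst_enc)

# Example: a Latin-1 file containing the byte 0xE9 ("é")
Path("legacy.txt").write_bytes(b"caf\xe9")
convert_encoding("legacy.txt", "converted.txt")
print(Path("converted.txt").read_bytes())  # b'caf\xc3\xa9' (UTF-8)
```

The same one-character value is represented by different bytes in the two encodings, which is exactly the kind of variation data conversion must bridge.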

There are many ways in which data is converted within the computer environment. The
conversion may be seamless, as in the case of upgrading to a newer version of a computer
program. Alternatively, it may require processing by a special conversion program, or it may
involve a complex process of going through intermediary stages, or complex "exporting" and
"importing" procedures, which may involve converting to and from a tab-delimited or
comma-separated text file. In some cases, a program may recognize several data file formats at
the data input stage and may also be capable of storing the output data in a number of different
formats. Such a program may be used to convert a file format. If the source or target format is
not recognized, a third program may sometimes be available that permits conversion to an
intermediate format, which can then be reformatted using the first program. There are many
possible scenarios.
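The tab-delimited to comma-separated round trip mentioned above can be sketched with Python's standard csv module. The file names and sample rows here are illustrative:

```python
# Convert a tab-delimited export to a comma-separated file, a common
# intermediate step when two programs share no direct format.
import csv

# Create a small sample tab-delimited source file.
with open("export.tsv", "w", newline="") as f:
    f.write("id\tname\n1\tAlice\n2\tBob\n")

# Re-parse it and write the same rows out comma-separated.
with open("export.tsv", newline="") as src, \
     open("import.csv", "w", newline="") as dst:
    reader = csv.reader(src, delimiter="\t")
    writer = csv.writer(dst)
    writer.writerows(reader)

print(open("import.csv").read())
```

Because the reader and writer are decoupled, the same pattern handles other delimiters or quoting conventions by changing only the dialect parameters.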

Validation of data accuracy and integrity


Data integrity

In computing, data integrity refers to maintaining and assuring the accuracy and consistency of
data over its entire life-cycle, and is an important feature of a database or RDBMS. Data
warehousing and business intelligence in general demand the accuracy, validity and correctness
of data despite hardware failures, software bugs or human error. Data that has integrity is
maintained identically during any operation, such as transfer, storage or retrieval.

All characteristics of the data, including business rules, rules for how pieces of data relate, dates,
definitions and lineage, must be correct for its data integrity to be complete. When functions
operate on the data, the functions must ensure integrity. Examples include transforming the data,
storing history and storing metadata.

Databases

Data integrity contains guidelines for data retention, specifying or guaranteeing the length of
time data can be retained in a particular database. It specifies what can be done with data
values when their validity or usefulness expires. In order to achieve data integrity, these rules are
consistently and routinely applied to all data entering the system, and any relaxation of
enforcement could cause errors in the data. Implementing checks on the data as close as possible
to the source of input (such as human data entry) causes less erroneous data to enter the system.
Strict enforcement of data integrity rules keeps error rates low, saving time otherwise spent
troubleshooting and tracing erroneous data and the errors it causes in algorithms.
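Checking data at the point of entry can be sketched as a small validation routine. The field names and rules below are invented for illustration; a real system would apply its own business rules:

```python
# Reject bad records at the point of entry, before they reach storage.
import re
from datetime import datetime

def validate_record(rec: dict) -> list:
    """Return a list of error messages; an empty list means the record is clean."""
    errors = []
    if not rec.get("customer_id", "").isdigit():
        errors.append("customer_id must be numeric")
    try:
        datetime.strptime(rec.get("order_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("order_date must be YYYY-MM-DD")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+", rec.get("email", "")):
        errors.append("email is malformed")
    return errors

good = {"customer_id": "42", "order_date": "2021-03-15", "email": "a@b.com"}
bad  = {"customer_id": "x1", "order_date": "15/03/2021", "email": "nope"}
print(validate_record(good))  # []
print(validate_record(bad))   # three error messages
```

Rejecting the bad record here, rather than after it has propagated into other tables, is the point the paragraph above makes about checking close to the input source.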


Data integrity also includes rules defining the relations a piece of data can have to other pieces
of data, such as a Customer record being allowed to link to purchased Products, but not to
unrelated data such as Corporate Assets. Data integrity often includes checks and corrections for
invalid data, based on a fixed schema or a predefined set of rules; an example is textual data
entered where a date-time value is required. Rules for data derivation are also applicable,
specifying how a data value is derived based on an algorithm, contributors and conditions. They
also specify the conditions under which a data value may be re-derived.

Types of integrity constraints

Data integrity is normally enforced in a database system by a series of integrity constraints or
rules. Three types of integrity constraints are an inherent part of the relational data model: entity
integrity, referential integrity and domain integrity.

 Entity integrity concerns the concept of a primary key. Entity integrity is an
integrity rule which states that every table must have a primary key and that the
column or columns chosen to be the primary key should be unique and not null.
 Referential integrity concerns the concept of a foreign key. The referential
integrity rule states that any foreign-key value can be in only one of two states.
The usual state of affairs is that the foreign-key value refers to a primary-key
value of some table in the database. Occasionally, and this will depend on the
rules of the data owner, a foreign-key value can be null. In this case we are
explicitly saying either that there is no relationship between the objects
represented in the database or that this relationship is unknown.
 Domain integrity specifies that all columns in a relational database must be
declared upon a defined domain. The primary unit of data in the relational data
model is the data item. Such data items are said to be non-decomposable or
atomic. A domain is a set of values of the same type. Domains are therefore pools
of values from which the actual values appearing in the columns of a table are drawn.
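All three constraint types can be demonstrated against any relational database. The sketch below uses Python's built-in SQLite module, with illustrative table names:

```python
# Demonstrate entity, referential, and domain integrity with SQLite.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")   # SQLite enforces FKs only if asked
con.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,          -- entity integrity
        name        TEXT NOT NULL
    );
    CREATE TABLE purchase (
        purchase_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer,  -- referential integrity
        amount      NUMERIC CHECK (amount > 0)    -- domain integrity
    );
""")
con.execute("INSERT INTO customer VALUES (1, 'Alice')")
con.execute("INSERT INTO purchase VALUES (10, 1, 25.00)")  # accepted

try:
    con.execute("INSERT INTO purchase VALUES (11, 999, 5.00)")  # no such customer
except sqlite3.IntegrityError as e:
    print("rejected:", e)

try:
    con.execute("INSERT INTO purchase VALUES (12, 1, -5.00)")   # violates CHECK
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Both bad inserts are rejected by the database itself, which is the centralized enforcement the next section argues for.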


If a database supports these features, it is the responsibility of the database to ensure data
integrity as well as the consistency model for data storage and retrieval. If a database does not
support these features, it is the responsibility of the applications to ensure data integrity while
the database supports the consistency model for data storage and retrieval.

Having a single, well-controlled, and well-defined data-integrity system increases:

 stability (one centralized system performs all data integrity operations)
 performance (all data integrity operations are performed in the same tier as the
consistency model)
 re-usability (all applications benefit from a single centralized data integrity
system)
 maintainability (one centralized system for all data integrity administration)

As of 2012, since all modern databases support these features (see Comparison of relational
database management systems), it has become the de facto responsibility of the database to
ensure data integrity. Outdated and legacy systems that use file systems (text, spreadsheets,
ISAM, flat files, etc.) for their consistency model lack any kind of data-integrity model. This
requires organizations to invest a large amount of time, money, and personnel in building data-
integrity systems on a per-application basis that effectively just duplicate the existing data-
integrity systems found in modern databases. Many companies, and indeed many database
systems themselves, offer products and services to migrate outdated and legacy systems to
modern databases to provide these data-integrity features. This offers organizations substantial
savings in time, money, and resources because they do not have to develop per-application data-
integrity systems that must be re-factored each time business requirements change.

Accuracy

The fundamental issue with respect to data is accuracy. Accuracy is the closeness of the results
of observations to the true values, or to values accepted as being true. This implies that
observations of most spatial phenomena are usually considered only to be estimates of the true value. The

difference between observed and true (or accepted as being true) values indicates the accuracy of
the observations.

Types of accuracy

Basically two types of accuracy exist. These are positional and attribute accuracy.

1. Positional accuracy is the expected deviance in the geographic location of an object from its
true ground position. This is what we commonly think of when the term accuracy is
discussed. There are two components to positional accuracy: relative and absolute
accuracy. Absolute accuracy concerns the accuracy of data elements with respect to a
coordinate scheme, e.g. UTM. Relative accuracy concerns the positioning of map features
relative to one another.
Often relative accuracy is of greater concern than absolute accuracy. For example, most GIS
users can live with the fact that their survey township coordinates do not coincide exactly with
the survey fabric; however, the absence of one or two parcels from a tax map can have
immediate and costly consequences.

2. Attribute accuracy is equally as important as positional accuracy. It also reflects estimates
of the truth. Interpreting and depicting boundaries and characteristics for forest stands or soil
polygons can be exceedingly difficult and subjective, as most resource specialists will attest.
Accordingly, the degree of homogeneity found within such mapped boundaries is not nearly
as high in reality as it would appear to be on most maps.
Data accuracy and data integrity

What is the difference between data accuracy and data integrity?

Data accuracy is getting the exact data. Data integrity is making sure that this data is received
completely and correctly.

Data conversion tools

A natural play for integrators is to add new storage frames to support data growth while also
consolidating and eliminating older, more expensive storage subsystems. It lowers customer

operating costs and at the same time increases reseller revenue. However, customers often shy
away from consolidation because they fear large, complex and disruptive data migrations.
Successful storage integrators can address this concern by offering data migration services based
on solid methodology and great tools.

Here's my unscientific take on the best data migration tool in each of five categories. To be
considered, the tool must be optimized for one-time data relocation from one storage device to
another, with an emphasis on heterogeneous replication. Special emphasis is given to data
migration tools that don't have to become a permanent part of the infrastructure.

Category 1: Host-based file-level migration

The winner in this category is rsync. This open source tool has been around for a long time and
distinguishes itself by being very simple yet powerful, and totally host- and storage-agnostic.
rsync is very flexible and can be adapted to almost every data migration need, but it shines
especially brightly with largely static unstructured content.

Category 2: Host-based block-level migration

With large structured files like databases, block-level migration tools make the most sense. I'm
going to cop out here and not name a specific tool but instead a spectrum of tools: Host-based
volume managers are often overlooked as a data migration tool, yet they provide a powerful way
to non-disruptively migrate data from one storage array to another. Most operating systems
already have a capable volume manager that is heterogeneous and already installed.

Category 3: Network-based file-level migration

Sometimes the data migration simply can't be done on the host. This is especially true when a lot
of hosts access the same data, as happens with NAS arrays. The winner in this category is EMC's
Rainfinity. This NAS virtualization appliance can be inserted into the data path between the
servers and the storage array, orchestrate non-disruptive migrations and then slip quietly back out
of the data path.


Category 4: Network-based block-level migration

Storage area networks (SANs) were once just a place to route servers to disk. These days they
have become much more sophisticated, and intelligent fabric services are not only possible, they
are commonplace. Brocade's Data Migration Manager (DMM) is a SAN-based heterogeneous
data migration tool that leverages the company's AP7600 intelligent SAN device. Migrating
logical unit numbers (LUNs) from one storage array to the next is possible with several SAN-
based tools, but this one is different because it moves the data online and doesn't have to take
control of the LUNs.

Category 5: Array-based block-level migration

It is nearly impossible for an array-based data migration tool to be heterogeneous -- unless the
array itself is heterogeneous. Hitachi Data Systems' (HDS) Universal Replicator can migrate data
that is both internal and external to HDS Universal Storage Platform (USP) arrays. This type of
replication works well if the customer already has, or is moving toward, an HDS USP array and
the hosts cannot support the workload required to move the data.

There you have it, five storage replication tools in five separate categories. Solution providers
who can wrap data migration services around a few of these tools will bring more value to their
customers and more revenue to themselves.

Transferring data

Data transfer is the physical transfer of data (a digital bit stream) over a point-to-point or
point-to-multipoint communication channel.

Importing and Exporting Data

Importing data is the process of retrieving data from sources external to Microsoft® SQL
Server™ (for example, an ASCII text file) and inserting it into SQL Server tables. Exporting data
is the process of extracting data from an instance of SQL Server into some user-specified format
(for example, copying the contents of a SQL Server table to a Microsoft Access database).


Importing data from an external data source into an instance of SQL Server is likely to be the
first step you perform after setting up your database. After data has been imported into your SQL
Server database, you can start to work with the database.

Importing data into an instance of SQL Server can be a one-time occurrence (for example,
migrating data from another database system to an instance of SQL Server). After the initial
migration is complete, the SQL Server database is used directly for all data-related tasks, rather
than the original system. No further data imports are required.

Importing data can also be an ongoing task. For example, a new SQL Server database is created
for executive reporting purposes, but the data resides in legacy systems updated from a large
number of business applications. In this case, you can copy new or updated data from the legacy
system to an instance of SQL Server on a daily or weekly basis.

Usually, exporting data is a less frequent occurrence. SQL Server provides tools and features that
allow applications, such as Access or Microsoft Excel, to connect and manipulate data directly,
rather than having to copy all the data from an instance of SQL Server to the tool before
manipulating it. However, data may need to be exported from an instance of SQL Server
regularly. In this case, the data can be exported to a text file and then read by the application.
Alternatively, you can copy data on an ad hoc basis. For example, you can extract data from an
instance of SQL Server into an Excel spreadsheet running on a portable computer and take the
computer on a business trip.

SQL Server provides tools for importing and exporting data to and from a variety of data
sources, including ASCII text files, ODBC data sources (such as Oracle databases), OLE DB
data sources (such as other instances of SQL Server), and Excel spreadsheets.
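A minimal sketch of importing a text file into a database table is shown below in Python, with SQLite standing in for SQL Server. A real SQL Server import would typically go through a driver such as ODBC, or bulk tools like bcp or BULK INSERT; the file name and schema here are illustrative:

```python
# Import a comma-separated ASCII text file into a database table.
import csv
import sqlite3

# Create a small sample source file.
with open("products.txt", "w", newline="") as f:
    f.write("sku,price\nA100,9.99\nB200,14.50\n")

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE product (sku TEXT PRIMARY KEY, price REAL)")

# Parse the file and bulk-insert the rows.
with open("products.txt", newline="") as f:
    rows = [(r["sku"], float(r["price"])) for r in csv.DictReader(f)]
con.executemany("INSERT INTO product VALUES (?, ?)", rows)

print(con.execute("SELECT COUNT(*) FROM product").fetchone()[0])  # 2
```

Note the explicit cast of the price column; type conversion at load time is part of the import, not an afterthought.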

Additionally, SQL Server replication allows data to be distributed across an enterprise, copying
data between locations and synchronizing changes automatically between different copies of
data.


Data Transformation Services

Data Transformation Services, or DTS, is a set of objects and utilities that allow the automation
of extract, transform and load (ETL) operations to or from a database. The objects are DTS
packages and their components, and the utilities are called DTS tools. DTS was included with
earlier versions of Microsoft SQL Server and was almost always used with SQL Server
databases, although it could be used independently with other databases.

DTS allows data to be transformed and loaded from heterogeneous sources using OLE DB,
ODBC, or text-only files, into any supported database. DTS can also allow automation of data
import or transformation on a scheduled basis, and can perform additional functions such as
FTPing files and executing external programs. In addition, DTS provides an alternative method
of version control and backup for packages when used in conjunction with a version control
system, such as Microsoft Visual SourceSafe.

Data Transformation Services (DTS) is a set of tools that lets you quickly and easily move and
manipulate data. If you do any work with current versions of SQL Server, you have probably
used the DTS wizard to import or export data from SQL Server into other data sources.

Need for Data Transfer

Today's IT environment is very diverse. Most companies store their data in multiple relational
database management systems (RDBMS). The most popular RDBMS on the market are
Microsoft SQL Server, Oracle, Sybase, and DB2. Many organizations also store some of their
data in non-relational formats such as mainframes, spreadsheets, and email systems. Smaller
databases are commonly built and maintained using one of the desktop RDBMS, such as
Microsoft Access. Despite the fact that data is disseminated among multiple data stores, the
organization still has to operate as a single entity. Therefore, there needs to be a way to relate and
often interchange data among various data stores.


The need to exchange data among multiple systems has been around for a long time. Prior to
DTS's debut in SQL Server 7.0, the only tool for importing and exporting data to and from SQL
Server was the Bulk Copy Program (BCP). This command-line utility is relatively
straightforward (although somewhat cryptic) to use and offers fair performance. However, the
capabilities of BCP are quite limited: you can only export data from SQL Server into a text file
or import from a text file into SQL Server.

Another predecessor of DTS was the SQL Server object transfer utility, which let you transfer
database objects and data between two SQL Servers.

It's easy to guess that neither BCP nor the object transfer utility could sufficiently serve data
exchange needs. Many companies spent top dollar to create their own custom tools for
transferring and transforming data among various sources.

Advantages of DTS over Its Predecessors

DTS has a number of advantages over its predecessors.

 With DTS, you can import data from any data source for which you have an OLE DB or
ODBC provider. Whether your company stores its data in relational databases such as
Oracle or in a non-relational format such as email stores or Lotus Notes, DTS can handle
importing such data. While moving data, you can also manipulate it and store it in
the desired format.
 DTS capabilities are not limited to data transfer. DTS also provides an excellent way to
automate some of the administrative tasks. For instance, you can import the data from an
external data source to a staging table, call a stored procedure to give the imported data
the particular shape you need, and then kick off an Analysis Services cube processing
task—all from the same DTS package. If necessary, you can also send an email
notification to the responsible employee in case of a package failure.
 DTS probably wouldn't be as popular if it did not have a very nice, intuitive user interface.
You can create a package from the DTS Designer, using the DTS wizard, or through
code. The DTS Designer can be accessed by expanding the Data Transformation
Services folder in Enterprise Manager, right-clicking on Local Packages, and selecting
New Package. The DTS wizard can be accessed in several different ways; the easiest is
selecting Import and Export Data from the SQL Server menu. The wizard lets you answer
a few simple questions and gets you well on the way to developing your packages. The
DTS Designer lets you pick from a list of tasks and then customize each task for your
needs.
 Perhaps one of the best things about DTS is that it is extensible. SQL Server 7.0 provided
only eight built-in tasks. SQL Server 2000 provides 19 built-in tasks, which in most cases
will be more than sufficient. Each of these tasks can be customized through the DTS
Object Model. In addition, you can develop your own custom tasks and register them
with SQL Server.
 SQL Server also provides a way to secure your DTS packages. You can set a user
password and an owner password to each package. The users can only execute the
package, whereas the owner can make changes to the package.
 Last but not least, DTS comes free with any edition of SQL Server 2000 (developer,
desktop, standard, or enterprise). In fact, you don't even have to have SQL Server
installed on your computer to run DTS; you can use DTS to transfer data among
non-SQL Server data sources just as well.
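The staging pattern described above (land raw data in a staging table, reshape it, then load the target table) can be sketched outside DTS as well. The Python/SQLite sketch below imitates the flow of a simple DTS package; it is not DTS itself, and the table names and transformations are illustrative:

```python
# A minimal staged extract-transform-load flow: staging table -> final table.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE staging (raw_name TEXT, raw_amount TEXT);
    CREATE TABLE sales (name TEXT, amount REAL);
""")

# Step 1: import raw external data into the staging table as-is.
con.executemany("INSERT INTO staging VALUES (?, ?)",
                [("  alice ", "100.0"), ("BOB", "250.5")])

# Step 2: transform (trim and normalise names, cast amounts) and load.
con.execute("""
    INSERT INTO sales
    SELECT lower(trim(raw_name)), CAST(raw_amount AS REAL) FROM staging
""")

# Step 3: in a real DTS package, a failure above might trigger an email task.
for row in con.execute("SELECT * FROM sales ORDER BY name"):
    print(row)
```

Keeping the raw data untouched in staging makes the transform repeatable and auditable, which is why the staging-table step is worth the extra copy.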

Data Conversion Plan and Verification Document

The Data Conversion Plan and Verification Document contains a line entry for each table that the
upgrade is:

 updating
 inserting rows into
 deleting rows from

Each table listed in this document must be investigated and verified.

The purpose of the conversion plan is to reflect the status of the verification of each table. It
should also serve as an overall view of the verification work that needs to be done after each test
move.


Guidelines for verifying data conversion and tracking results:

For each table on the conversion plan:

 Determine if the table will be used in the new version of PeopleSoft, using the
following options. Each module team will need to determine the appropriate level
of validation for its tables.
 Review the table ‘before’ and ‘after’ count reports. These reports are located in
the PS Upgrade Document repository for Conversion. A separate ‘before’ and
‘after’ table count document will exist for each test move. If the counts are zero,
or very small, it’s possible the table is not used.
 Review the Conversion.Script.Extract.txt report also located in the PS Upgrade
Document repository for Conversion. This document contains the INSERT,
UPDATE and DELETE statements from the conversion scripts. The report is
sorted alphabetically by table name. This report should be viewed using TextPad.
 Investigate the purpose of the table and determine how it is used within your
module’s overall functionality.
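The 'before' and 'after' count comparison above can be automated with a short script. The table names and counts below are illustrative, not taken from a real count report:

```python
# Compare 'before' and 'after' row counts per table and flag differences
# that need to be investigated and explained.
before = {"PS_JOB": 1042, "PS_NAMES": 987, "PS_TMP_LOAD": 987}
after  = {"PS_JOB": 1042, "PS_NAMES": 985, "PS_TMP_LOAD": 0}

def diff_counts(before: dict, after: dict) -> dict:
    """Return {table: (before, after)} for every table whose count changed."""
    return {t: (before[t], after[t])
            for t in sorted(before) if before[t] != after.get(t)}

for table, (b, a) in diff_counts(before, after).items():
    print(f"{table}: {b} -> {a}  (investigate and explain)")
```

A count that drops to zero (like the temporary table here) is expected; an unexplained small drop on a real table is exactly the kind of difference the plan requires a module team to chase down.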

Updating the Conversion Plan


If the table will not be used in the new version of PeopleSoft, update the conversion plan
by placing an ‘NA’ in the status column for each test move. This will indicate that
verification work will not need to be done for this table after each test move nor during
the production move.

If the table is a temporary table that is used by PS during conversion, but is not the final
destination for data, update the conversion plan by placing an ‘NA’ in the status column
for each test move. Temporary tables do not need to be verified after each test move nor
during the production move.

If the table will be used in the new version of PeopleSoft, the validity of the data in
the table must be verified. The following are options for doing this validation.

Each module team will need to determine the appropriate level of validation for its
tables.
 Review the ‘before’ and ‘after’ table count reports stored in the PS Upgrade
Document repository for Conversion. Investigate and explain differences.
 Review the Conversion.Script.Extract.txt report, also located in the PS Upgrade
Document repository for Conversion. This document contains the INSERT,
UPDATE, and DELETE statements from the conversion scripts. The report is
sorted alphabetically by table name. This report should be reviewed using
TextPad.
 Using SQLPlus, visually compare a subset of the table’s data in the current and
new version of PeopleSoft. Explain differences.
 Verify the table values by bringing up pages that use the table, running reports
that use the table, entering data against the table, etc.
 Document your analysis and results in your module’s subdirectory of the PS
Upgrade Document repository. Store any queries or sql used to verify data in
these subdirectories.

If data problems are found, several things must be done:

 Write a script to fix the data in YYDevX, YYWORK1, and YYSTRSX.
 Write a script that will be included in the Test Moves and will cause the table’s
data to be converted properly. This script will be run with each test move and the
final move to Production.
 Work with the Upgrade Conversion Technical Lead to get your script
incorporated into the test moves.

Verification with each test move


Verification of these tables will need to be done after each test move. For most tables, if
they were correct after the initial test move, they will be correct after subsequent test
moves. However, the conversion scripts do get changed for various reasons. Therefore,
the validity of converted data must be ensured after each test move. The amount of effort

involved in verifying the data will be considerably less in subsequent test moves than that
required in the initial verification. These subsequent checks will just make sure that
nothing ‘got broken’ during the test move. Remember that the final move to
Production will be done within a limited time span. Therefore, each module must
come up with efficient methods for verifying its data during the final conversion.

It’s especially important to ensure UMICH-written scripts work properly with each test
move.

Instructions for updating fields

Status:
For each table on the conversion plan, place a ‘C’ (Complete) in the status column for
each table that has been verified. Once the analysis on a table has begun, place an ‘S’
(Started) in the status column. A Status column exists for each test move and for the final
move to Production. Many module teams will not begin data verification until after Test
Move 2. In these cases, the status column for Test Move 1 should be left blank.

Outstanding Problems:
Place comments related to any problems found in the Outstanding Problems column.

Line for each Table:
Initially, only the PS scripts that modify tables are included in the conversion plan. To
fix problems found in the converted data, module teams may develop new scripts that
need to be included in the test moves. If this is done, a line item needs to be added to the
conversion plan for each script added, because each script will need to be verified after
each test move.


Deadline for Data Verification:

All data needs to be verified and all UMICH scripts need to be written/tested for
inclusion in Test Move 3.

Data Conversion Plan

DATA ITEM DESCRIPTION

1. DELIVERABLE NAME: Data Conversion Plan

2. DELIVERABLE NUMBER: To be determined

3. DESCRIPTION/PURPOSE

The Data Conversion Plan shall describe the preparation for, delivery of, and confirmation of the
successful conversion and all associated processes and interfaces.
4. CONTENT REQUIREMENT

The following describes the minimum required content of the deliverable. Any changes to
content must be approved by the state in advance.

The Data Conversion Plan shall include the following:

 Cover/title page.

 Document revision history.

 Table of contents.

 An introduction that includes the document’s purpose, suggested audience, and listing of key
terms.

 An executive summary of the document’s content.

 An overview of the activities and services that the Contractor will provide, the assumptions
on which the Plan is based, and the roles and responsibilities for individuals and
organizations involved in the conversion effort.
 Data Conversion Objectives: This section shall describe the Objectives to be addressed in the
data conversion from both paper documents and electronic data.


o Paper documents: Identify the approximate number of records or documents to be converted. Identify the source of the records or documents and the contact point for obtaining the paper documents.
o Legacy systems: Describe the existing systems that will be replaced or impacted. Describe the scope of the data conversion for each system replaced or impacted.
o Error Resolution: Describe the procedure(s) used to identify, resolve, and document errors.
o Archived Data: Describe the impact on archived data.

 Data Conversion Strategy: Describe the conversion effort. Any conventions needed to
understand the overall conversion method shall be presented or referenced. Graphic
illustrations of interrelationships are required.
o Major Systems Involved. Identify the source systems, electronic and hardcopy, that are involved. Identify the goals and issues for each source system.
o Locations Involved. Identify the locations involved, and the part each location plays in the conversion effort.
o Conversion Method. Describe any automated method of conversion that requires
minimal intervention from State staff and how hardcopy records will be converted,
validated, and loaded into the new system. If part or all of the conversion method
depends upon system states or modes, this dependency shall be indicated. Any
conventions needed to understand the overall conversion method shall be presented or
referenced.
o Conversion Security. Describe what security measures will be enforced regarding
data sensitivity issues.
o Conversion Control. Describe the means to centrally control the conversion of selected groups (such as conversion of a single organization versus all organizations at once) to one or more functions at a time, or at various times.
o Conversion Reporting. Describe the mechanism for identifying and reporting
conversion errors.


o Conversion Reconciliation. Describe the method to reconcile converted data and differentiate between converted data and new system data.
o Conversion Reversal. Describe the capability to automatically reverse or undo a conversion by conversion group, as well as for individuals who move from a converted organization to a non-converted organization.
o Conversion Staffing. Describe the needed roles, and number of staff needed for
conversion.
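As an illustration of the reconciliation and reporting points above, a minimal sketch might compare source and destination row counts and report any tables that do not balance. The table names and counts below are invented; a real run would query the legacy and new systems.

```python
# Sketch of conversion reconciliation: compare expected (source) row
# counts against actual (converted) row counts and report mismatches.

def reconcile(source_counts, converted_counts):
    """Return (table, expected, actual) for each table whose counts differ."""
    errors = []
    for table, expected in source_counts.items():
        actual = converted_counts.get(table, 0)
        if actual != expected:
            errors.append((table, expected, actual))
    return errors

source = {"EMPLOYEES": 1200, "DEPARTMENTS": 45}
converted = {"EMPLOYEES": 1198, "DEPARTMENTS": 45}

for table, expected, actual in reconcile(source, converted):
    print(f"{table}: expected {expected} rows, loaded {actual}")
    # prints "EMPLOYEES: expected 1200 rows, loaded 1198"
```

A report like this gives the module teams a concrete list of tables whose conversion must be investigated before the next test move.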
 Data Conversion Preparation and Procedures: Describe the preparation and procedures for, at a minimum: a) activities required to perform file balancing and control, with an estimate of associated staffing requirements; b) parallel file maintenance procedures and controls; c) special conversion training, such as conversion data entry, file balancing and control; and d) the number and type of support staff and required time frames.
o Source Specifications: Identify the file and/or database name and description, data
source, file structure, conversion rules, dependencies, access requirements, data
format, and conversion acceptance criteria for each source.
o Destination Specifications. Identify the name, data source, access requirements, and
data format for each destination.
o Intermediate Processing Requirements. Identify the cleansing, validating, and
initiating requirements.
o Data Element Mapping. Provide a mapping of the source to destination, considering
intermediate processing requirements.
o Data Conversion Tools and Scripts. Identify the necessary tools and scripts to
perform data conversion, intermediate data processing, and loading cleansed data into
the destination data repository. Include both automated procedures (conversion
programs) and manual procedures (data entry procedures). Define each script
necessary.
o Testing. Identify conversion verification procedures and activities required for system testing. Identify the testing of tools and scripts, and the validation and verification of resulting test data, in preparation for data loading.


o Timeline. Describe the schedule of activities, beginning shortly after contract award and completing conversion at implementation.
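The source-to-destination data element mapping with intermediate cleansing described above can be sketched as follows. The field names and cleansing rules are invented for illustration only; a real mapping would come from the Source and Destination Specifications.

```python
# Hypothetical data element mapping: each legacy source field maps to a
# destination field plus an intermediate cleansing rule.

MAPPING = {
    "EMP_NAME": ("full_name", str.strip),                  # trim stray whitespace
    "HIRE_DT":  ("hire_date", lambda v: v.replace("/", "-")),  # normalize date separators
}

def convert_record(source_row):
    """Apply the field mapping and cleansing rules to one source record."""
    dest = {}
    for src_field, (dest_field, cleanse) in MAPPING.items():
        dest[dest_field] = cleanse(source_row[src_field])
    return dest

row = {"EMP_NAME": "  Abebe K. ", "HIRE_DT": "2021/03/15"}
print(convert_record(row))
# → {'full_name': 'Abebe K.', 'hire_date': '2021-03-15'}
```

Keeping the mapping in one table-like structure makes it easy to review against the Data Element Mapping deliverable and to extend as new cleansing rules are discovered during the test moves.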
 Decommissioning Legacy Systems. Describe at a minimum: a) the method and procedures
needed to decommission existing legacy systems after the successful implementation of the
new system; b) the impact of decommissioning to all locations.
 Legacy System Updates. Provide information about any updates to legacy systems that will
remain after the production implementation.

5. PREPARATION INSTRUCTIONS AND APPLICABLE STANDARDS

The Contractor shall refer to the OSI Style Guide for format and preparation guidelines.

Self-Check 1


 Answer the questions on the following questionnaire; provide the answer sheet to your
trainer.
 Check your answers by looking at the feedback sheets; ask for the assistance of the
trainer whenever necessary.
Questions | Satisfactory Response (YES / NO)

 The trainee should answer the following questions. Write "True" if the statement is correct and "False" if it is not.
1. Data conversion is the conversion of computer data from one
format to another.
2. Data conversions may be as simple as the conversion of a text file
from one character encoding system to another, or as complex as
the conversion of office file formats.
3. Data integrity refers to maintaining and assuring the accuracy
and consistency of data over its entire life-cycle, and is an
important feature of a database or RDBMS system.
4. Accuracy is not the closeness of results of observations to the
true values or values accepted as being true.
5. Attribute accuracy is not equally as important as positional
accuracy.
6. Data Transferring: is the logical transfer of data (a digital bit
stream) over a point-to-point or point-to-multipoint
communication channel.
7. Importing data is the process of retrieving data from sources
external to Microsoft® SQL Server™ (for example, an ASCII
text file) and inserting it into SQL Server tables.

 The trainee’s underpinning knowledge was


[ ] Satisfactory [ ] Not satisfactory


 Feedback to Trainee:

Trainee’s Signature: Date:


Instructor’s Signature: Date:

Answer Key


1. True
2. True
3. True
4. False
5. False
6. False
7. True

Performance Criteria


Assessment Criteria | Satisfactory Response (YES / NO)

The trainee will be assessed through the following criteria:
 Answered all the interview questions clearly
 Performed all activities accordingly
 Followed all instructions in the activities

Trainees’ Performance is:


[ ] Satisfactory [ ] Not Satisfactory

Feedback to Trainee:

Trainee’s Signature: Date:


Instructor’s Signature: Date:

