BBARMKUMOD4

The document outlines the essential operations of data processing and analysis in research, focusing on editing and coding. Editing involves correcting errors in collected data to ensure accuracy and consistency, while coding assigns symbols to responses for efficient categorization. Additionally, it discusses the concept of standard error in sampling and the factors influencing sample size determination for effective research outcomes.


Processing Operations: Editing, coding

The data, after collection, have to be processed and analysed in accordance with the outline laid down for the
purpose at the time of developing the research plan. This is essential for a scientific study and for ensuring
that we have all relevant data for making contemplated comparisons and analysis. Technically speaking,
processing implies editing, coding, classification and tabulation of collected data so that they are amenable
to analysis. The term analysis refers to the computation of certain measures along with searching for
patterns of relationship that exist among data-groups. Thus, “in the process of analysis, relationships or
differences supporting or conflicting with original or new hypotheses should be subjected to statistical tests
of significance to determine with what validity data can be said to indicate any conclusions”.

With this brief introduction concerning the concepts of processing and analysis, we can now proceed with the
explanation of all the processing operations.

1. Editing: Editing of data is the process of examining the collected raw data (especially in surveys) to detect errors and omissions and to correct these where possible. In practice, editing involves a careful scrutiny of the completed questionnaires and/or schedules. Editing is done to ensure that the data are accurate, consistent with other facts gathered, uniformly entered, as complete as possible, and well arranged to facilitate coding and tabulation. With regard to the points or stages at which editing should be done, one can distinguish field editing from central editing. Field editing consists in the review of the reporting forms by the investigator, completing (translating or rewriting) what was recorded in abbreviated and/or illegible form at the time the respondents' answers were taken down. This type of editing is necessary because individual writing styles can often be difficult for others to decipher, and it should be done as soon as possible after the interview, preferably on the same day or the next. While doing field editing, the investigator must restrain himself and must not correct errors of omission by simply guessing what the informant would have said had the question been asked. Central editing should take place once all forms or schedules have been completed and returned to the office. It implies that all forms get a thorough editing by a single editor in a small study, or by a team of editors in the case of a large inquiry. The editor(s) may correct obvious errors, such as an entry in the wrong place, or an entry recorded in months when it should have been recorded in weeks. In the case of inappropriate or missing replies, the editor can sometimes determine the proper answer by reviewing the other information in the schedule; at times, the respondent can be contacted for clarification. If an answer is inappropriate and the editor has no basis for determining the correct response, the answer must be struck out and an editing entry of 'no answer' recorded. All obviously wrong replies must be dropped from the final results, especially in the context of mail surveys.
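The central-editing checks described above (obvious errors, internal consistency, marking unusable replies as 'no answer') can be sketched in code. This is a minimal illustration using hypothetical survey fields (`age`, `tenure_months`) and plausibility rules invented for the example, not a prescribed procedure.

```python
def edit_record(record):
    """Return an edited copy of a survey record, marking unusable replies as 'no answer'."""
    edited = dict(record)
    # Obvious error: age recorded outside a plausible range (e.g., unit mix-up).
    if not (15 <= edited.get("age", -1) <= 100):
        edited["age"] = "no answer"
    # Consistency check: job tenure cannot exceed the respondent's working life.
    if isinstance(edited.get("age"), int) and \
            edited.get("tenure_months", 0) > (edited["age"] - 15) * 12:
        edited["tenure_months"] = "no answer"
    return edited

raw = [
    {"age": 34, "tenure_months": 60},
    {"age": 250, "tenure_months": 12},   # implausible entry, struck out
    {"age": 22, "tenure_months": 600},   # inconsistent with other facts gathered
]
cleaned = [edit_record(r) for r in raw]
```

Note that, in the spirit of the text, the sketch never guesses a replacement value: a reply it cannot verify is replaced with 'no answer' rather than an invented figure.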
2. Coding: Coding refers to the process of assigning numerals or other symbols to answers so that responses can be put into a limited number of categories or classes. Such classes should be appropriate to the research problem under consideration. They must also possess the characteristic of exhaustiveness (i.e., there must be a class for every data item) and that of mutual exclusivity, which means that a specific answer can be placed in one and only one cell in a given category set. Another rule to be observed is unidimensionality, by which is meant that every class is defined in terms of only one concept. Coding is necessary for efficient analysis; through it, the many replies are reduced to a small number of classes which contain the critical information required for analysis. Coding decisions should usually be taken at the stage of designing the questionnaire. This makes it possible to precode the questionnaire choices, which in turn helps computer tabulation, as one can key the data directly from the original questionnaires. In the case of hand coding, some standard method may be used: one such method is to code in the margin with a coloured pencil; another is to transcribe the data from the questionnaire to a coding sheet. Whatever method is adopted, one should see that coding errors are altogether eliminated or reduced to the minimum.
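A coding scheme with the properties named above can be sketched as follows. The response wording and code numbers here are hypothetical; the point is that every answer receives exactly one code (mutual exclusivity), and a catch-all 'other' code keeps the scheme exhaustive.

```python
# Hypothetical precoded categories for a yes/no/don't-know question.
CODES = {"yes": 1, "no": 2, "don't know": 3}
OTHER = 9  # catch-all class, so that every data item has a class (exhaustiveness)

def code_response(answer):
    """Map a raw answer to exactly one numeric code."""
    return CODES.get(answer.strip().lower(), OTHER)

responses = ["Yes", "no", "Don't know", "maybe"]
coded = [code_response(r) for r in responses]
```

Because the lookup table assigns each normalised answer a single code, no response can fall into two classes at once, which is exactly the mutual-exclusivity rule the text describes.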

Concept of standard error


The primary objective of statistical inference is to generalise from a sample to the population of which the sample is a part. Standard error reflects (1) error of sampling and (2) error of measurement. The standard error of a statistic can be described as the standard deviation of that statistic computed over a large number of samples randomly drawn from the same population; for the mean, it is the standard deviation of all the sample means. Let us try to understand standard error with the help of an example. Suppose a researcher wants to carry out research on the organisational citizenship behaviour of employees in public sector banks. Since it is not possible to study the whole population, the researcher randomly takes around 1% of the population as the sample. Another researcher carrying out a similar study also draws a 1% sample, and if many more researchers were carrying out such a study, each of them would likewise draw a 1% sample from the whole population. The difficulty is that the 1% sample drawn by each researcher differs from the others: although each 1% sample is a representation of the (heterogeneous) population, the samples are not identical. Each researcher will compute a mean and standard deviation for his or her respective sample. One might expect these means and standard deviations to be the same, because the samples have been randomly drawn from the same population; in reality, however, the mean of one sample may be lower or higher than the mean computed for another sample. The higher the standard error, the less the likelihood that a sample is representative of the population. Thus, the differences between the samples need to be close to zero for us to be confident that they represent the population.
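The many-researchers thought experiment above can be simulated directly. The sketch below, using an artificial normal population with assumed parameters (mean 50, standard deviation 10), draws many random samples, computes each sample's mean, and checks that the spread of those means matches the theoretical standard error of the mean, sigma divided by the square root of n.

```python
import math
import random
import statistics

random.seed(42)
# Artificial population; mean 50 and SD 10 are assumptions for the demo.
population = [random.gauss(50, 10) for _ in range(100_000)]
n = 100  # size of each researcher's sample

# Each "researcher" draws a random sample and computes its mean.
sample_means = [statistics.mean(random.sample(population, n)) for _ in range(500)]

# Standard error of the mean: the SD of the sample means ...
empirical_se = statistics.pstdev(sample_means)
# ... which should approximate sigma / sqrt(n).
theoretical_se = statistics.pstdev(population) / math.sqrt(n)
```

The two quantities come out close to each other (near 1.0 with these parameters), illustrating that the sample means scatter around the population mean by about sigma over root n, and that a larger n shrinks that scatter.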
Sample size and its determination

In sampling analysis the most ticklish question is: what should be the size of the sample, i.e., how large or small should 'n' be? If the sample size ('n') is too small, it may not serve to achieve the objectives; if it is too large, we may incur huge costs and waste resources. As a general rule, the sample should be of an optimum size, i.e., it should neither be excessively large nor too small. Technically, the sample size should be large enough to give a confidence interval of the desired width, and as such the size of the sample must be chosen by some logical process before the sample is taken from the universe. The size of the sample should be determined by the researcher keeping in view the following points:

(i) Nature of universe: The universe may be either homogeneous or heterogeneous in nature. If the items of the universe are homogeneous, a small sample can serve the purpose; if the items are heterogeneous, a large sample would be required. Technically, this can be termed the dispersion factor.

(ii) Number of classes proposed: If many class-groups (groups and sub-groups) are to be formed, a large sample would
be required because a small sample might not be able to give a reasonable number of items in each class-group.

(iii) Nature of study: If items are to be intensively and continuously studied, the sample should be small. For a general
survey the size of the sample should be large, but a small sample is considered appropriate in technical surveys.

(iv) Type of sampling: Sampling technique plays an important part in determining the size of the sample. A small
random sample is apt to be much superior to a larger but badly selected sample.

(v) Standard of accuracy and acceptable confidence level: If the standard of accuracy or the level of precision is to be kept high, we shall require a relatively larger sample. To double the accuracy at a fixed significance level, the sample size has to be increased fourfold.
(vi) Availability of finance: In practice, the size of the sample depends upon the amount of money available for the study. This factor should be kept in view while determining the size of the sample, for large samples increase the cost of sampling estimates.

(vii) Other considerations: The nature of the units, the size of the population, the size of the questionnaire, the availability of trained investigators, the conditions under which the sample survey is being conducted, and the time available for completion of the study are a few other considerations to which a researcher must pay attention while selecting the size of the sample.
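Several of the points above, in particular (i) dispersion and (v) the fourfold rule, come together in the standard formula for the sample size needed to estimate a mean: n = (z × sigma / e)², where z is the value for the chosen confidence level, sigma the population standard deviation, and e the acceptable margin of error. The figures below (sigma = 12, e = 2) are illustrative assumptions, not values from the text.

```python
import math

def sample_size(z, sigma, e):
    """n = (z * sigma / e) ** 2, rounded up to a whole number of units."""
    return math.ceil((z * sigma / e) ** 2)

z = 1.96        # 95% confidence level
sigma = 12.0    # assumed population standard deviation (dispersion factor)
n1 = sample_size(z, sigma, e=2.0)
n2 = sample_size(z, sigma, e=1.0)  # halving e, i.e., doubling the accuracy
```

Because e appears squared in the denominator, halving the margin of error roughly quadruples n, which is the fourfold rule stated in point (v); likewise a larger sigma (a more heterogeneous universe) demands a larger sample, as point (i) says.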
