RPE Notes Module 5
Databases
1. Indexing databases
2. Citation databases: Web of Science, Scopus….
Research Metrics
1. Impact factor of journal as per journal citation report, SNIP, SJR, IPP, Cite Score
2. Metrics: h-index, g-index, i-10index, altmetrics
Learning Resource
Indexing
https://blog.scholasticahq.com/post/index-types-for-academic-journal/
https://libguides.fau.edu/c.php?g=325509&p=2182113
https://www.atlantis-press.com/industry-affiliations/indexing-databases
http://olddrji.lbp.world/aboutindexing.aspx
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4800951/pdf/IJOrtho-50-115.pdf
https://www.mdpi.com/ethics#4
5.1 Indexing databases
● Indexing is the process of creating indexes for record collections. Having indexes allows
researchers to more quickly find records for specific individuals; without them,
researchers might have to look through hundreds or thousands of records to locate an
individual record.
● An index also refers to an alphabetical list of terms, definitions, topics, etc. that efficiently guides readers to the desired information within the content. Indexing facilitates the organization of literature in a manner that makes the document of interest easily identifiable by readers.
● How is indexing done? The indexer usually receives a set of page proofs for the journal at the same stage when the document is undergoing final proofreading. The indexer uses the page proofs to make a list of headings and subheadings (the terms to appear in the index) and the location of each pertinent reference. After completion of the rough index, it is edited for structure, clarity and consistency, formatted to specifications, proofread and submitted to the client as a final soft copy. The time needed for indexing depends on the length of the manuscript: the more content it has, the longer indexing takes.
● Why Indexing?
○ The function of an index is to give users systematic and effective shortcuts to the
information they need
○ Indexes are needed for any information collection, except the very smallest
● Benefits of Indexing
○ Researchers gain access to the most recent literature, even if it has not yet been
indexed by other sources
○ Automatic set-up of holdings means zero administration
○ Faster results with fewer headaches through automatic e-journal results included
with every database search
○ Keeps users on top of their areas of interest with a single place to manage Journal
Alerts and Search Alerts
○ Provides an organized way of accessing the literature
● Challenges in Indexing
○ Scope of coverage depends on a library or institution's subscription; their terms
may not provide complete coverage or access to full text articles
○ Database access usually requires a subscription or an affiliation to an institution;
they are not free
○ A simple keyword search tends to yield too many results or items that may not be
relevant to your topic
○ A more specific subject or field search usually gives fewer results than a keyword search
○ Sometimes using truncation or limiters can disable other search features,
depending on the database
Nowadays, the credibility of our research depends largely on how wide an audience it reaches; the era of digitization and open access (OA) has further enhanced this reach.
Concept of Citation
● A formal reference to a published or unpublished source that you consulted and obtained
information from while writing your research paper
● Citation means when one paper explicitly refers to another paper with reference given in
bibliography
● Major performance indicator: Reflects Impact and quality of research
● Symbolizes conceptual association of scientific ideas
● A citation typically includes: the author's name, date of publication, title of the work being cited, title of the journal, volume and issue numbers, page numbers, and DOI
Importance of Citation
Self Citation
● Self-citation is when an author cites his or her earlier research works in a forthcoming paper
● Self-citation rate in % (author) = (no. of self-cited papers (own or co-authored) / total no. of citations) * 100
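A minimal Python sketch of this calculation, assuming hypothetical citation counts; the function name and the numbers are illustrative only:

```python
def self_citation_rate(self_cited: int, total_citations: int) -> float:
    """Self-citation rate (%) = self-cited papers (own or co-authored)
    divided by total citations, times 100."""
    if total_citations == 0:
        return 0.0
    return self_cited / total_citations * 100

# Hypothetical author: 12 self-citations out of 240 total citations
print(self_citation_rate(12, 240))  # 5.0 %
```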
Citation Databases
● Citation databases are collections of referenced papers/ articles/ books and other material
entered into an online system (database) in a structured and consistent way
● Scopus:
○ Scopus-Elsevier is a source-neutral abstract and citation database which was
launched in 2004
○ Content: Health Sciences, Life Sciences, Physical Sciences, Social Sciences
○ Peer reviewed journals, book series, trade publications
○ All journals covered in Scopus database are reviewed each year to ensure high
quality standards
○ Scopus gives 4 types of quality measure: h-index, CiteScore, SJR (SCImago
Journal rank), SNIP (Source Normalized Impact per Paper)
● Web of Science (WoS):
○ Global citation database; World’s first citation index
○ Developed from the citation indexing work of Dr Eugene Garfield
○ Powerful research engine connecting academics, government and millions of
researchers
○ WoS provides access to 3 major citation indexes: Sciences, Social Sciences, and Arts & Humanities
○ WoS core collection consists of 4 online databases: SCIE (Science citation index
expanded), SSCI (Social sciences citation index), AHCI (Arts & Humanities
citation index), ESCI (Emerging sources citation index)
○ Researchers can use Master Journal List, a free tool which helps in navigating all
titles currently covered in WoS; also helps librarians to keep track of publication
landscape
● Google Scholar:
○ Google Scholar Citations is free of charge
○ Provides information about an author's citations by tracking online journals, book chapters, conference papers, web pages and so on
○ Easy to set up if one has an existing google account
○ Tracks academic articles, theses and book titles towards citation metrics
○ Helps in locating relevant data for researchers in a scientific way via advanced
search option
○ Provides in-depth details pertaining to a document
○ An individual scholar can also set up his/her own Google Scholar profile
○ Limitation: Fails to recognize and exclude predatory sources
● Citeseer
○ The first digital library and search engine to provide an autonomous citation
indexing system which indexes academic literature in electronic format
Comparison of citation databases
● Scopus: developed/owned by Elsevier (Netherlands); databases covered include Medline, Embase, Geobase, Biobase
● Web of Science: developed/owned by Thomson Scientific and Health Care Corporation (USA); databases covered include SCIE, SSCI, AHCI, ESCI
● Google Scholar: developed/owned by Google Inc. (USA); sources covered include PubMed, OCLC FirstSearch
SNIP (Source Normalized Impact per Paper)
● SNIP measures citations received relative to the citations expected for the subject field
● Makes cross-discipline comparisons between journals easier
● Published twice a year, and looks at a 3-year period
● The underlying ratio is the number of citations given in the present year to publications of the past three years, divided by the total number of publications in those three years; this raw ratio is then normalized by the field's citation potential
● Source Normalized Impact per Paper (SNIP) measures contextual citation impact by
weighting citations based on the total number of citations in a subject field. The impact of
a single citation is given higher value in subject areas where citations are less likely, and
vice versa.
● Source Normalized Impact per Paper (SNIP) is a sophisticated metric that intrinsically
accounts for field-specific differences in citation practices. It does so by comparing each
journal's citations per publication with the citation potential of its field, defined as the set
of publications citing that journal.
● Calculated value of SNIP = A/B, where
○ A = the journal's raw impact per paper (RIP), i.e. its citation count per paper
○ B = the citation potential in the journal's subject field
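A toy Python sketch of the SNIP ratio described above (raw impact per paper divided by the field's citation potential); the function names and the journal figures are assumptions for illustration, not real Scopus data:

```python
def raw_impact_per_paper(citations_this_year: int, papers_prev_3_years: int) -> float:
    """RIP/IPP: current-year citations to the previous three years' papers,
    divided by the number of papers published in those three years."""
    return citations_this_year / papers_prev_3_years

def snip(rip: float, field_citation_potential: float) -> float:
    """SNIP = RIP divided by the citation potential of the journal's subject field."""
    return rip / field_citation_potential

# Hypothetical journal: 900 citations this year to 300 papers from the
# previous three years, in a field whose citation potential is 2.5
rip = raw_impact_per_paper(900, 300)   # 3.0
print(snip(rip, 2.5))                  # 1.2
```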
SJR (SCImago Journal Rank)
● The SCImago Journal Rank (SJR) indicator is a measure of the scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where the citations come from
● It is also a prestige metric based on the idea that "all citations are not created equal." With SJR, the subject field, quality and reputation of the journal have a direct effect on the value of a citation
● A journal's SJR indicator is a numeric value representing the average number of weighted
citations received during a selected year per document published in that journal during
the previous three years, as indexed by Scopus. Higher SJR indicator values are meant to
indicate greater journal prestige.
● If scientific impact is considered related to the number of endorsements in the form of
citations a journal receives, then prestige can be understood as a combination of the
number of endorsements and the prestige or importance of the journals issuing them. The
SJR indicator assigns different values to citations depending on the importance of the
journals where they come from. This way, citations coming from highly important
journals will be more valuable and hence will provide more prestige to the journals
receiving them. The calculation of the SJR indicator is similar to the Eigenfactor score,
with the former being based on the Scopus database and the latter on the Web of Science
database. There are some differences between the two, though.
● The SJR indicator computation is carried out using an iterative algorithm that distributes prestige values among the journals until a steady-state solution is reached. The algorithm begins by assigning an identical amount of prestige to each journal; then, using an iterative procedure, this prestige is redistributed in a process where journals transfer their achieved prestige to each other through citations. The process ends when the difference between journal prestige values in consecutive iterations falls below a minimum threshold. The computation proceeds in two phases: (a) the computation of Prestige SJR (PSJR) for each journal, a size-dependent measure that reflects the whole journal's prestige, and (b) the normalization of this measure to achieve a size-independent measure of prestige, the SJR indicator (a simplified sketch of this iteration is given after the formula below)
● SJR is calculated as:
○ SJR of current year = A/B, where
○ A = Total no. of weighted citations received in the given year (to documents published in the previous 3 years)
○ B = No. of documents published in the previous 3 years
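A highly simplified Python sketch of the iterative prestige redistribution described above, assuming a made-up three-journal citation matrix; it omits the size normalization and other refinements of the real SJR computation:

```python
import numpy as np

# Hypothetical citation matrix: C[i, j] = citations from journal i to journal j
C = np.array([[0, 4, 1],
              [2, 0, 3],
              [5, 1, 0]], dtype=float)

# Each journal passes its prestige to the journals it cites, in proportion
# to how often it cites them (row-normalized citation shares)
shares = C / C.sum(axis=1, keepdims=True)

n = C.shape[0]
prestige = np.full(n, 1.0 / n)      # start: identical prestige for every journal

for _ in range(1000):
    new_prestige = shares.T @ prestige
    converged = np.abs(new_prestige - prestige).max() < 1e-9
    prestige = new_prestige
    if converged:                   # steady state reached
        break

print(prestige)  # size-dependent prestige values (rough, unnormalized PSJR analogue)
```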
● IPP - Impact Per Publication: Also known as RIP (raw impact per publication), the IPP is used in calculating SNIP and is essentially SNIP without the field normalization. IPP is the number of current-year citations to papers from the previous 3 years, divided by the total number of papers published in those 3 previous years.
Cite Score
● A relatively new metric that helps researchers track journal performance and make decisions
● CiteScore is the number of citations received by a journal in one year to documents
published in the three previous years, divided by the number of documents indexed in
Scopus published in those same three years
● Cite Score value = A/B, where
○ A = Citations received by a journal in one year to documents published in the
three previous years
○ B = Number of documents indexed in Scopus published in those same three years
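A small worked example of this ratio in Python, with made-up citation and document counts:

```python
def cite_score(citations: int, documents: int) -> float:
    """CiteScore = citations received in one year to documents of the previous
    three years / documents indexed in Scopus in those same three years."""
    return citations / documents

# Hypothetical journal: 1500 citations this year to 500 documents
# published (and indexed in Scopus) in the previous three years
print(cite_score(1500, 500))  # 3.0
```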
Author-level Metrics
h-index
● Introduced by Jorge Hirsch in 2005; also known as Hirsch index or Hirsch number
● The h-index is a number intended to represent both the productivity and the impact of a
particular scientist or scholar, or a group of scientists or scholars (such as a departmental
or research group).
● The h-index is calculated by counting the number of publications for which an author has
been cited by other authors at least that same number of times.
● For instance, an h-index of 17 means that the scientist has published at least 17 papers
that have each been cited at least 17 times. If the scientist's 18th most cited publication
was cited only 10 times, the h-index would remain at 17. If the scientist's 18th most cited
publication was cited 18 or more times, the h-index would rise to 18.
● h-index = the number of publications (h) with a citation count greater than or equal to h; for example, 15 publications each cited 15 times or more gives an h-index of 15.
● Part of the purpose of the h-index is to eliminate outlier publications that might give a
skewed picture of a scientist's impact. For instance, if a scientist published one paper
many years ago that was cited 9,374 times, but has since only published papers that have
been cited 2 or 3 times each, a straight citation count for that scientist could make it seem
that his or her long-term career work was very significant. The h-index, however, would
be much lower, signifying that the scientist's overall body of work was not necessarily as
significant.
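A short Python sketch of the h-index rule described above; both citation lists are hypothetical:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical authors:
print(h_index([25, 17, 17, 9, 4, 1]))  # 4: four papers have at least 4 citations each
print(h_index([9374, 3, 2]))           # 2: one heavily cited outlier still yields a low h-index
```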
g-index
● Proposed by Leo Egghe in 2006 as a refinement of the h-index: the g-index is the largest number g such that an author's g most-cited papers have together received at least g² citations, so highly cited papers carry more weight than they do in the h-index
Altmetrics
● The term refers to alternative, non-traditional metrics that capture the online attention a research output receives, for example mentions in social media, news outlets, blogs and policy documents, as well as downloads and saves in reference managers, complementing citation-based measures