Ranking Methodology

General considerations

The SCImago Institutions Rankings (SIR) is a classification of academic and research-related institutions ranked by a composite indicator that combines three different sets of indicators based on research performance, innovation outputs and societal impact measured by their web visibility.

It provides a friendly interface that allows the visualization of any customized ranking from the combination of these three sets of indicators. Additionally, it is possible to compare the trends for individual indicators of up to six institutions. For each large sector it is also possible to obtain distribution charts of the different indicators.

For comparative purposes, the value of the composite indicator has been set on a scale of 0 to 100. However, the line graphs and bar graphs always represent ranks (lower is better, so higher values indicate worse performance).

In the 2023 edition, the way of numbering the ranks of the institutions changed. The new numbering system assigns the same rank (position) to institutions that are tied on the value of their indicator, and then skips as many ranks as there are tied institutions. In this way, the number of ranks coincides approximately with the number of institutions. This change has been applied retrospectively to previous editions to allow realistic comparison and observation of the evolution of institutions over time.
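The tie-aware numbering described above is commonly known as standard competition ranking. A minimal sketch (the function name and example scores are ours, not from SIR):

```python
def competition_ranks(scores):
    """Return competition ranks for scores (higher is better).

    Tied values share the same rank; the next distinct value's rank
    skips by the number of tied entries, so ranks track positions.
    """
    ordered = sorted(scores, reverse=True)
    rank_of = {}
    for position, value in enumerate(ordered, start=1):
        # The first occurrence of a value fixes the rank for all its ties.
        rank_of.setdefault(value, position)
    return [rank_of[s] for s in scores]

# Two institutions tied at 85.5 both get rank 2; the next rank is 4.
print(competition_ranks([90.0, 85.5, 85.5, 80.1]))  # [1, 2, 2, 4]
```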

NEW
For the SIR 2024 edition, the Societal Factor has been modified to include three new indicators that reflect more specifically the societal impact achieved by an institution: the generation of new knowledge related to the Sustainable Development Goals defined by the United Nations, the participation of women in research processes, and the use of research results in the creation or improvement of public policies. The definition of each indicator and its weight within the composite indicator can be consulted in the indicators section.

SCImago Standardization: in order to achieve the highest level of precision for the different indicators, an extensive manual process of disambiguation of institutions' names has been carried out. The development of an assessment tool for bibliometric analysis aimed at characterizing research institutions involves an enormous data-processing task related to the identification and disambiguation of institutions through the institutional affiliation of the documents included in Scopus. The objective of SCImago, in this respect, is twofold:

  1. Definition and unique identification of institutions: the drawing up of a list of research institutions in which every institution is correctly identified and defined. Typical issues in this task include institutional mergers, segregations, and name changes.
  2. Attribution of publications and citations to each institution: we take into account the institutional affiliation of each author in the 'affiliation' field of the database. We have developed a mixed (manual and automatic) system for assigning affiliations to one or more institutions, as applicable, as well as for identifying duplicate documents with the same DOI and/or title.

Thoroughness in the identification of institutional affiliations is one of the key values of the standardization process, which guarantees, in any case, the highest possible level of disambiguation.

Institutions can be grouped by the countries to which they belong. Multinational institutions (MUL) which cannot be attributed to any country have also been included.

The institutions marked with an asterisk consist of a group of sub-institutions, identified with the abbreviated name of the parent institution. The parent institutions show the results of all of their sub-institutions.

Institutions can also be grouped by sectors (Universities, Health, Government, etc.).

For ranking purposes, the calculation is generated each year from the results obtained over a five-year period ending two years before the edition of the ranking. For instance, if the selected year is 2024, the results used are those from the five-year period 2018-2022. The only exception is the web indicators, which are calculated for the last year only.
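The window rule above can be expressed as a small helper (the function name is ours):

```python
def publication_window(edition_year):
    """Return the (start, end) publication years used by an edition.

    The window ends two years before the edition year and spans five
    years, e.g. the 2024 edition uses publications from 2018-2022.
    """
    end = edition_year - 2
    return (end - 4, end)

print(publication_window(2024))  # (2018, 2022)
```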

The inclusion criteria are:

The source of information used for the innovation indicators is the PATSTAT database.

The sources of information used for the indicators for web visibility are Google and Semrush.

The Unpaywall database is used to identify Open Access documents.

Altmetrics from PlumX Metrics and Mendeley are used for the Societal Factor.

The Overton database is used to identify documents cited in policy documents.

From now on, the SIR is a LEAGUE TABLE. The aim of the SIR is to provide a useful metric tool for institutions, policymakers and research managers to analyze, evaluate and improve their activities, outputs and outcomes.

The Best Quartile is the best quartile achieved by the institution within its country, comparing the quartiles based on the overall indicator, the research factor, the innovation factor and the societal factor.

Indicators

Indicators are divided into three groups intended to reflect the scientific, economic and social characteristics of institutions. The SIR includes both size-dependent and size-independent indicators, that is, indicators that are and are not influenced by the size of the institution. In this manner, the SIR provides overall statistics of the scientific publications and other output of institutions, while also enabling comparisons between institutions of different sizes. Keep in mind that, once the final indicator has been calculated from the combination of the different indicators (each assigned a different weight), the resulting values are normalized on a scale of 0 to 100.

Score Indicators

Factor            Indicator                                  Weight
Research (50%)    Normalized Impact (NI)                     13%
                  Excellence with Leadership (EwL)            8%
                  Output (O)                                  8%
                  Scientific Leadership (L)                   5%
                  Not Own Journals (NotOJ)                    3%
                  Own Journals (OJ)                           3%
                  Excellence (Exc)                            2%
                  High Quality Publications (Q1)              2%
                  International Collaboration (IC)            2%
                  Open Access (OA)                            2%
                  Scientific Talent Pool (STP)                2%
Innovation (30%)  Innovative Knowledge (IK)                  10%
                  Patents (PT)                               10%
                  Technological Impact (TI)                  10%
Societal (20%)    Altmetrics (AM)                             3%
                  Web Size (WS)                               3%
                  Authority Score (AScore)                    3%
                  Sustainable Development Goals (SDG)         5%
                  Female Scientific Talent Pool (FemSTP)      3%
                  Impact in public policy - Overton (OV)      3%
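A hedged sketch of how the weights combine into the 0-100 composite score. The SIR text gives the weights but not the exact per-indicator normalization, so the min-max scaling used here is an assumption for illustration only:

```python
# Weights from the factor table (they sum to 1.0: 50% + 30% + 20%).
WEIGHTS = {
    "NI": 0.13, "EwL": 0.08, "O": 0.08, "L": 0.05, "NotOJ": 0.03,
    "OJ": 0.03, "Exc": 0.02, "Q1": 0.02, "IC": 0.02, "OA": 0.02,
    "STP": 0.02, "IK": 0.10, "PT": 0.10, "TI": 0.10, "AM": 0.03,
    "WS": 0.03, "AScore": 0.03, "SDG": 0.05, "FemSTP": 0.03, "OV": 0.03,
}

def composite(raw_by_institution):
    """raw_by_institution: {name: {indicator: raw value}} -> {name: score}.

    Each indicator is min-max scaled across institutions (an assumed
    normalization), weighted, summed, and rescaled to 0-100.
    """
    names = list(raw_by_institution)
    scores = {n: 0.0 for n in names}
    for indicator, weight in WEIGHTS.items():
        values = [raw_by_institution[n][indicator] for n in names]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0  # avoid division by zero when all are equal
        for n in names:
            norm = (raw_by_institution[n][indicator] - lo) / span
            scores[n] += weight * norm
    return {n: round(100 * s, 1) for n, s in scores.items()}
```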

Research:

  1. Normalized Impact (Leadership Output) (NI): Normalized Impact is computed over the institution's leadership output using the methodology established by the Karolinska Institutet in Sweden, where it is named "Item oriented field normalized citation score average". The normalization of the citation values is done on an individual article level. The values (in decimal numbers) show the relationship between an institution's average scientific impact and the world average set to a score of 1; i.e., an NI score of 0.8 means the institution is cited 20% below world average and 1.3 means the institution is cited 30% above average (Rehn and Kronman, 2008; González-Pereira, Guerrero-Bote and Moya-Anegón, 2011; Guerrero-Bote and Moya-Anegón, 2012). Size-independent indicator.
  2. Excellence with Leadership (EwL): Excellence with Leadership indicates the amount of documents in Excellence in which the institution is the main contributor (Moya-Anegón, et al., 2013). Size-dependent indicator.
  3. Output (O): total number of documents published in scholarly journals indexed in Scopus (Romo-Fernández, et al., 2011; OECD, 2016). Size-dependent indicator.
  4. Not Own Journals Output (NotOJ): number of documents not published in the institution's own journals (i.e., journals published by the institution itself). Size-dependent indicator. Added in 2019 edition.
  5. Own Journals (OJ): number of journals published by the institution (publishing services). Size-dependent indicator. Added in 2019 edition.
  6. International Collaboration (IC): Institution's output produced in collaboration with foreign institutions. The values are computed by analyzing an institution's output whose affiliations include more than one country address (Guerrero-Bote, Olmeda-Gómez and Moya- Anegón, 2013; Lancho-Barrantes, Guerrero-Bote and Moya-Anegón, 2013; Lancho-Barrantes, et al., 2013; Chinchilla-Rodríguez, et al., 2010; 2012). Size-dependent indicator.
  7. High Quality Publications (Q1): the number of publications that an institution publishes in the most influential scholarly journals of the world. These are those ranked in the first quartile (25%) in their categories as ordered by SCImago Journal Rank (SJRII) indicator (Miguel, Chinchilla-Rodríguez and Moya-Anegón, 2011; Chinchilla-Rodríguez, Miguel, and Moya-Anegón, 2015). Size-dependent indicator.
  8. Excellence (Exc): Excellence indicates the amount of an institution’s scientific output that is included in the top 10% of the most cited papers in their respective scientific fields. It is a measure of high quality output of research institutions (SCImago Lab, 2011; Bornmann, Moya-Anegón and Leydesdorff, 2012; Bornmann and Moya-Anegón, 2014a; Bornmann et al., 2014b). Size-dependent indicator.
  9. Scientific Leadership (L): Leadership indicates the amount of an institution’s output as main contributor, that is, the amount of papers in which the corresponding author belongs to the institution (Moya-Anegón, 2012; Moya-Anegón et al., 2013). Size-dependent indicator.
  10. Open Access (OA): percentage of documents published in Open Access journals or indexed in the Unpaywall database. Size-independent indicator. Added in 2019 edition.
  11. Scientific Talent Pool (STP): total number of different authors from an institution in the total publication output of that institution during a particular period of time. Size-dependent indicator.
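The NI calculation described in item 1 can be sketched as follows: each paper's citation count is divided by the world average for comparable papers, and the ratios are averaged. The field baselines below are hypothetical illustration values, and the real methodology also conditions on publication year and document type:

```python
def normalized_impact(papers, world_baseline):
    """Item-oriented field-normalized citation score average (sketch).

    papers: list of (field, citations) tuples for one institution.
    world_baseline: {field: world-average citations per paper}.
    A result of 1.0 means the institution is cited at world average.
    """
    ratios = [cites / world_baseline[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

papers = [("Medicine", 12), ("Physics", 4)]       # hypothetical papers
baseline = {"Medicine": 10.0, "Physics": 5.0}     # hypothetical averages
# 12/10 = 1.2 and 4/5 = 0.8 average out to exactly world level.
print(normalized_impact(papers, baseline))  # 1.0
```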


Innovation:

  1. Innovative Knowledge (IK): scientific publication output from an institution cited in patents. Based on PATSTAT (http://www.epo.org) (Moya-Anegón and Chinchilla-Rodríguez, 2015). Size-dependent.
  2. Technological Impact (TI): percentage of the scientific publication output cited in patents. This percentage is calculated considering the total output in the areas cited in patents, which are the following: Agricultural and Biological Sciences; Biochemistry, Genetics and Molecular Biology; Chemical Engineering; Chemistry; Computer Science; Earth and Planetary Sciences; Energy; Engineering; Environmental Science; Health Professions; Immunology and Microbiology; Materials Science; Mathematics; Medicine; Multidisciplinary; Neuroscience; Nursing; Pharmacology, Toxicology and Pharmaceutics; Physics and Astronomy; Social Sciences; Veterinary. Based on PATSTAT (http://www.epo.org) (Moya-Anegón and Chinchilla-Rodríguez, 2015). Size-independent.
  3. Patents (PT): number of patent applications (simple families). Based on PATSTAT (http://www.epo.org). Size-dependent.

Societal impact:

  1. Altmetrics (AM): this indicator has two components:
    • PlumX Metrics (weight: 70%): number of documents that have at least one mention in PlumX Metrics (https://plumanalytics.com). We consider mentions on Twitter, Facebook, blogs, news and comments (Reddit, Slideshare, Vimeo or YouTube).
    • Mendeley (weight: 30%): number of documents that have at least one reader in Mendeley (https://www.mendeley.com).
    This indicator is size-dependent. Added in 2019 edition.
  2. Web Size (WS): number of pages associated with the institution’s URL according to Google (https://www.google.com) (Aguillo et al., 2010). Size-dependent.
  3. Authority Score (AScore): it can be considered the evolution of the old Inbound Links indicator. It is a composite metric developed by Semrush to measure the overall quality and SEO performance of a website, based mainly on three aspects: Link Power, Organic Traffic and Spam Factors. Its usefulness lies in the possibility of benchmarking domains, without establishing scores on an absolute scale of values. Size-independent. Added in 2024 edition.
  4. Sustainable Development Goals (SDG): number of documents related to the Sustainable Development Goals defined by the United Nations. Size-dependent. Added in 2024 edition.
  5. Female Scientific Talent Pool (FemSTP): number of different female authors of scientific papers from an institution. Size-dependent. Added in 2024 edition.
  6. Impact in public policy - Overton (OV): number of documents of the institution that have been cited in policy documents according to the Overton database. Size-dependent. Added in 2024 edition.
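The two-component Altmetrics (AM) indicator in item 1 can be sketched with its stated 70/30 weights. How the two document counts are actually combined is not specified, so the weighted sum below is an assumption:

```python
def altmetrics_score(docs_with_plumx_mention, docs_with_mendeley_reader):
    """Combine the two AM components with the stated 70%/30% weights.

    Inputs are document counts: documents with at least one PlumX
    mention, and documents with at least one Mendeley reader.
    """
    return 0.7 * docs_with_plumx_mention + 0.3 * docs_with_mendeley_reader

# Hypothetical counts for one institution.
print(altmetrics_score(400, 900))  # 550.0
```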

Bibliography