
CU-Boulder in Rankings & Ratings

CU-Boulder over time -- Excel files of rankings

About the rankings -- sections of this document

There are numerous rankings of U.S. and world institutions of higher learning. Scholars in the U.S. and Europe have criticized the methodological shortcomings and other limitations of these rankings and warn that they oversimplify the complex nature of assessing the relative merits of educational programs and institutions. See, for example, "The State of the Rankings," published in 2010 in Inside Higher Ed, and Global University Rankings and Their Impact, a 2011 report by the European University Association. Nevertheless, rankings are widely used and influence perceptions of prospective students and their parents about colleges and universities.

We provide here brief descriptions of some of the main rankings, links to rankings pages, information on rankings given to CU-Boulder, and links to other resources for information about rankings. We encourage users to bear in mind that rankings depend strongly on the indicators chosen to represent institutional quality and on the weights, or relative importance, assigned to those indicators. Despite their proliferation and popularity, rankings are controversial. For example:

  • They often rely on small numbers of selected indicators.
  • They may have intended or unintended emphases, e.g., in favor of research activity rather than teaching excellence.
  • They may rely on weak, incomplete, and/or non-comparable data sources.
  • Indicators of quality or success vary greatly across programs or disciplines (e.g., liberal arts and natural sciences), and these differences may not be taken into account by a ranking methodology.
  • Reputational rankings may rely on the opinions of a small number of raters characterized by widely varying levels of expertise and personal bias.
  • What's "best" for one student will not be "best" for another.
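The sensitivity of rankings to the weights assigned to indicators can be seen in a small sketch. The schools, indicators, and scores below are entirely hypothetical, invented only to show that the same data can produce different orderings under different weightings:

```python
# Hypothetical illustration: the same three schools, scored on two
# indicators, trade places depending on the weights chosen.
# All names and numbers are invented for this example.
scores = {
    "School A": {"research": 0.9, "teaching": 0.5},
    "School B": {"research": 0.6, "teaching": 0.9},
    "School C": {"research": 0.7, "teaching": 0.7},
}

def rank(weights):
    """Order schools by the weighted sum of their indicator scores."""
    totals = {
        name: sum(weights[ind] * val for ind, val in inds.items())
        for name, inds in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# A research-heavy weighting puts School A first...
research_heavy = rank({"research": 0.8, "teaching": 0.2})
# ...while a teaching-heavy weighting puts School B first.
teaching_heavy = rank({"research": 0.2, "teaching": 0.8})
```

Neither ordering is more "correct" than the other; the choice of weights is a value judgment made by the ranking's authors.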
Rankings of U.S. undergraduate institutions and programs
  • U.S. News & World Report publishes annual rankings of over 1,400 schools based on statistical information provided by colleges and reputational rankings provided by college administrators and high school counselors. The methodology used to rank institutions is described in "How U.S. News Calculates the College Rankings" and in a tab of the Excel file on CU-Boulder US News ratings.
  • Forbes' annual list of America's Best Colleges ranks over 600 schools based on the quality of the education they provide, the experiences of the students, and how much they achieve. Only those schools categorized by The Carnegie Foundation as doctorate-granting universities, master's colleges and universities, or baccalaureate colleges are included in this sample of schools. Rankings are compiled in conjunction with the Center for College Affordability and Productivity (CCAP); the methodology uses five general categories of indicators.
  • The Princeton Review's annual Best Colleges publication includes 62 ranking lists they describe as "top 20" lists based on their survey of students attending the colleges included in the publication. The publication also includes eight college ratings scores that rely on school-reported data and/or data from the student surveys. Among the eight rating categories are academics, admissions selectivity, financial aid, and fire safety.
  • The Princeton Review's annual Best Value Colleges publication uses institutional data and student opinion surveys to rank 50 public and 50 private institutions based on undergraduate academics, costs, and financial aid (see "Best Value Colleges Methodology").
  • The Princeton Review also ranks the top 25 undergraduate entrepreneur programs based on survey data provided by more than 2,000 institutions. Rankings and information about the methodology used to calculate them are published by Entrepreneur.
  • Kiplinger ranks the top 100 public colleges and universities and the top 200 private colleges and universities based on academic quality and affordability. Its rankings rely on data from more than 500 public four-year schools and data from more than 600 private institutions. Data (provided by Peterson's/Nelnet) include factors such as admission rate, students per faculty, graduation rate, cost, and financial aid.
  • Fiske Guide to Colleges rates more than 300 colleges annually on the strength of their academics, extracurricular activities, and social life. It also lists colleges and universities that qualify as "Best Buys" based on the quality of the academic offerings in relation to the cost of attendance. Fiske also offers more specialized lists based on various criteria, e.g., "Top Conservative Colleges," "Top Women’s Colleges," and "Top Nonconformist Colleges."
Rankings of U.S. graduate programs
  • The U.S. News & World Report annual Best Grad Schools publication includes rankings of graduate programs in business, education, engineering, law, medicine, science, library and information studies, social sciences and humanities, health, public affairs, and fine arts. Not all programs are ranked every year. The rankings methodology varies somewhat for different overall programs or schools, e.g., Engineering Program Rankings Methodology vs. Business School Rankings Methodology. "Specialty rankings" of, for example, specific disciplines in the sciences (biological sciences, chemistry, computer science, earth sciences, mathematics, physics, and statistics) rely solely on reputational surveys of programs (see, e.g., Science Rankings Methodology).
  • The Princeton Review ranks the top 25 graduate entrepreneur programs based on survey data provided by more than 2,000 institutions. Rankings and information about the methodology used to calculate them are published by Entrepreneur.
  • Approximately every 10 years, beginning in 1982, the National Research Council (NRC) conducts a survey and compiles a report on U.S. research-doctorate programs. The most recent NRC report (2010) did not provide exact ranks for any of the 222 participating institutions; rather, it used a scale system that provided statistical ranges for each of two types of rankings. "R Rankings" were based on regression analyses of various survey results in which academics reviewed the reputations of actual programs; "S Rankings" were based on how various programs' characteristics measured against criteria that academics rated as key determinants of quality for such programs.
Rankings of U.S. and world research universities

Research is typically the most important factor in ranking higher education institutions as a whole. On the webpage for its annual The Top American Research Universities reports, The Center for Measuring University Performance notes that it is generally accepted that "research matters more than anything else in defining the best institutions." And in its 2011 report on Global University Rankings and Their Impact, the European University Association concluded that its "report confirms that most international rankings focus predominantly on indicators related to the research function of universities." These rankings, therefore, may be of particular interest to graduate students investigating institutional research reputations and opportunities in their professional areas. Among the most popular of these rankings are the following:

  • The Center for Measuring University Performance at Arizona State University (formerly U of Florida) is a research enterprise focused on management and incentive and reward systems in major research universities and on methods for measuring and improving university performance. Its annual report, The Top American Research Universities, provides information on American research universities' performance. The report ranks institutions based on the number of times they rank in the top 25 on nine performance variables, including research dollars, post-docs, faculty honors, and number of PhDs granted. The assessment includes only those institutions that receive at least a certain minimum amount of federal research monies (for the 2010 report, $40 million in fiscal year 2008).
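The top-25 counting scheme described above can be sketched as follows. The institutions, variables, and top-25 memberships here are invented for illustration; the actual report uses nine variables and real federal data:

```python
# Hypothetical sketch of a top-25 counting scheme: an institution's
# score is the number of performance variables on which it places in
# the national top 25. All data below are invented for illustration.
from collections import Counter

# For each variable, the (made-up) set of institutions in its top 25.
top25_by_variable = {
    "research_dollars": {"Univ A", "Univ B"},
    "postdocs": {"Univ A", "Univ C"},
    "faculty_honors": {"Univ B", "Univ C"},
    "phds_granted": {"Univ A", "Univ B"},
}

counts = Counter()
for members in top25_by_variable.values():
    counts.update(members)

# Rank institutions by how many top-25 lists they appear on.
ranking = [name for name, _ in counts.most_common()]
```

Note that this method rewards breadth: an institution in the top 25 on many variables outranks one that dominates a single variable.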
  • CU-Boulder in world/international rankings: ARWU, THE, QS/US News, Leiden -- all discussed below
  • The Academic Ranking of World Universities (ARWU) assesses more than 1,000 institutions and publishes rankings for the World Top 500 Universities, the World Top 100 Universities in five broad subject fields (natural sciences and mathematics, engineering/technology and computer sciences, life and agricultural sciences, clinical medicine and pharmacy, and social sciences), and the World Top 100 Universities in five subject fields or disciplines (mathematics, physics, chemistry, computer science, and economics/business). The ranking methodology relies on several indicators of academic or research performance, including alumni and staff winning Nobel Prizes and Fields Medals, highly cited researchers, papers published in Nature and Science, papers indexed in major citation indices, and the per capita academic performance of an institution.
  • The Times Higher Education (THE) World University Rankings ranks the top 200 world universities based on indicators of excellence in research and teaching. It also ranks the top 50 world universities in six broad subject areas -- engineering and technology; life sciences; clinical, pre-clinical and health; physical sciences; social sciences; and arts and humanities. The rankings methodology uses 13 criteria, including data from a worldwide Academic Reputation Survey. Data collection is carried out by Thomson Reuters and its Global Institutional Profiles Project.
    • Beginning in 2011, Times Higher Education (THE) has also published the World Reputation Rankings, a subsidiary of its World University Rankings. This ranking of the top 100 world universities, based on reputation for teaching and research, uses data from a global survey of academic opinion completed by more than 13,000 academics from 131 countries.
  • The Centre for Science and Technology Studies (CWTS) Leiden Ranking is based on data from the Web of Science bibliographic database produced by Thomson Reuters. The current (2013) ranking includes the 500 universities worldwide with the largest publication output in the Web of Science database.
    • The Leiden ranking reports three indicators of the scientific impact of a university and four indicators of scientific collaboration. Information about the methodology is available on the CWTS Leiden Ranking website.
    • In 2013, the Leiden ranking was also reported for five broad fields of science: biomedical and health sciences; life and earth sciences; mathematics and computer science; natural sciences and engineering; and social sciences and humanities. In 2014, Leiden reported rankings for seven broad fields of science: cognitive and health sciences; earth and environment sciences; life sciences; mathematics, computer science, and engineering; medical sciences; natural sciences; and social sciences.
  • The Center for World University Rankings (CWUR) rankings of the top 100 world universities were first published in July 2012. CWUR is based in Jeddah, Kingdom of Saudi Arabia (KSA). The ranking methodology relies on a combination of seven indicators of faculty quality, research quality, and alumni achievements. CWUR uses no data collected directly from universities.
  • U.S. News & World Report's Best Global Universities rankings were inaugurated in 2014. The rankings methodology uses ten indicators, including two reputational measures.
  • The rankings reported by these two organizations overlap a great deal.

The landscape of global university rankings continues to grow and now includes increasingly diverse and specialized rankings. For example:

Rankings of online programs
  • U.S. News & World Report in June 2011 announced its intention to rank online bachelor's and selected master's degree programs in the U.S. Reactions to this announcement are described in an article in the June 30, 2011 edition of the Chronicle of Higher Education. U.S. News began publishing these rankings in 2012.
  • Numerous organizations publish rankings of distance-learning MBA programs, including The Economist and countless directories, blogs, etc. The validity and legitimacy of many of these may be questionable.
Other resources

Interested readers are directed to the following sites for additional information about rankings (including material on rankings not mentioned in this posting, critiques of rankings, and a ranking audit system created in 2009):

  • The Association of American Universities Data Exchange (AAUDE) has a reference page of material dealing with international rankings.
  • The Top American Research Universities 2010 annual report of The Center for Measuring University Performance includes a selected bibliography of comments on and critiques of U.S. and global rankings (pp. 12-14).
  • A 2009 report from the Center for College Affordability and Productivity provides a useful overview of the history of college rankings, a summary of key criticisms of rankings, comments on the contemporary effects of rankings, and ideas for the reform and improvement of college rankings.
  • The Wikipedia entry on college and university rankings has extensive information on regional and national rankings. The entry also includes links to Wikipedia entries on law school rankings and MBA program rankings.
  • One website offers prospective graduate students a do-it-yourself ranking tool that allows them to rank programs based on personal priorities, including research productivity, diversity, and professional-development opportunities. Users assign their own weights to each factor to produce a ranking that best reflects their individual preferences. Users may also narrow their search by filtering on institutional factors such as program size, tuition, and availability of funding. The website uses data from the National Science Foundation, the National Research Council, and the National Center for Education Statistics.
  • To give users of rankings a tool to identify trustworthy rankings, the International Ranking Expert Group (IREG) Observatory on Academic Ranking and Excellence created the IREG ranking audit. The audit is based on The Berlin Principles on Ranking of Higher Education Institutions, a 2006 codification of good ranking practice developed to contribute to the improvement and evaluation of rankings.
    • The International Ranking Expert Group (IREG) emerged in 2002 as a joint initiative of the UNESCO European Centre for Higher Education (UNESCO-CEPES) in Bucharest and a group of international ranking experts concerned with the quality of academic ranking.
    • IREG was formally established in 2004 by UNESCO-CEPES and the Institute for Higher Education Policy in Washington, DC. 
    • The IREG Observatory on Academic Ranking and Excellence is a not-for-profit association of ranking organizations, universities, and other organizations interested in the improvement of the quality of international and national rankings of higher education institutions. The association has close to 20 member organizations from Asia, Europe, and the U.S.
Last update 11/14/2014, FC
PBA ref W:\pba\perfmeas\RatingsRankings.html


15 UCB, University of Colorado Boulder, Boulder, CO 80309-0015, (303)492-8631
  © Regents of the University of Colorado