College and university rankings
In higher education, college and university rankings are listings of educational institutions in an order determined by any combination of factors. Rankings can be based on subjectively perceived "quality," on some combination of empirical statistics, or on surveys of educators, scholars, students, prospective students, or others. Such rankings are often consulted by prospective students as they choose which schools they will apply to or which school they will attend. Among college and university rankings, there are rankings of undergraduate and graduate programs. Rankings are conducted by magazines and newspapers and in some instances by academic practitioners. For details on ranking of law programs, see Law School Rankings.
Rankings vary significantly from country to country. A Cornell University study found that the rankings in the United States significantly affected colleges' applications and admissions. In the United Kingdom, several newspapers publish league tables which rank universities.
Regional and national rankings
In alphabetical order.
Canada
Maclean's, a Canadian news magazine, publishes an annual ranking of Canadian universities known as the Maclean's University Rankings. Its criteria are based on a number of factors, including characteristics of the student body, classes, faculty, finances, the library, and reputation. The rankings are split into three categories: primarily undergraduate (schools that focus on undergraduate studies with few or no graduate programs), comprehensive (schools that focus on undergraduate studies but have a healthy selection of graduate programs), and medical doctoral (schools that have a very wide selection of graduate programs). As the most prominent ranking of Canadian universities, these rankings have received much scrutiny and criticism from universities, especially those that receive unfavourable rankings. For example, the University of Calgary produced a formal study examining the methodology of the ranking, illuminating the factors that determined the university's rank, and criticizing certain aspects of the methodology[1]. Even well-known universities, such as the University of Alberta and the University of Toronto, have expressed displeasure over the Maclean's ranking system. A notable difference between US rankings and Maclean's, however, is that Maclean's does not include privately funded universities in its rankings; the vast majority of Canadian universities, including the best known, are publicly funded. As of September 2006, many Canadian universities have refused to participate in the Maclean's survey, but Maclean's will continue to rank them using publicly available data.
European Union
The European Commission has also weighed in on the issue, compiling a list of the 22 European universities with the highest scientific impact[2], measured in terms of the impact of their scientific output. This ranking was compiled as part of the Third European Report on Science & Technology Indicators[3], prepared by the Directorate General for Science and Research of the European Commission in 2003 (updated 2004).
As an official document of the European Union (from the office of the EU commissioner for science and technology) that took several years of specialist effort to compile, it can be regarded as a highly reliable source; the full report, containing almost 500 pages of statistics, is available for free download from the EU website. Unlike the other rankings, it explicitly considers only the top European institutions, but ample comparison statistics with the rest of the world are provided in the full report.
In this ranking, the top two European universities are Oxford and Cambridge, as in the Jiao Tong and Times rankings. This ranking, however, places more emphasis on an institution's scientific quality than on its size or perceived prestige. Thus smaller technical universities, such as Eindhoven (Netherlands) and München (Germany), rank just behind Oxford and Cambridge. The report does not provide a direct comparison between European and US/world universities, although it does compute a composite scientific impact score measured against a world average.
Ireland
The Sunday Times compiles a league table of Irish universities[4] based on a mix of criteria, for example:
- Average points needed in the Leaving Certificate (end-of-secondary-school examination) for entry into an undergraduate course
- Completion rates, staff-student ratio and research efficiency
- Quality of accommodation and sports facilities
- Non-standard entry (usually mature students or students from deprived neighbourhoods)
UK
- See also: League tables of British universities
HESA (Higher Education Statistics Agency) oversees three yearly statistical returns (Financial, Student and Staff) which must be compiled by every HEI in the UK. These are then turned into usable statistics which make up a major part of HE rankings, e.g. student-staff ratio, number of academic staff with doctorates, and money spent on student services.
The Research Assessment Exercises (RAE) are attempts by the UK government to evaluate the quality of research undertaken by British universities. Each subject, called a unit of assessment, is given a rating by a peer review panel. The ratings are used in allocating the funding each university receives from the government. The last assessment was made in 2001. The RAE provides quality ratings for research across all disciplines. Panels use a standard scale to award a rating for each submission, ranging from 1 to 5*, according to how much of the work is judged to reach national or international levels of excellence. Higher education institutions (HEIs) which take part receive grants from one of the four higher education funding bodies in England, Scotland, Wales and Northern Ireland.
Standards of undergraduate teaching are assessed by the Quality Assurance Agency for Higher Education (QAA), an independent body established by the UK's universities and other higher education institutions in 1997. The QAA is under contract to the Higher Education Funding Council for England to assess quality for universities in England. This replaced a previous system of Teaching Quality Assessments (TQAs), which aimed to assess the administrative, policy and procedural framework within which teaching took place rather than teaching quality itself. The new QAA return has been criticized as inaccurate due to its reliance on student polls, and a number of universities (Warwick being the most prominent) have refused to take part in the survey.
USA
U.S. News & World Report College and University rankings
The best-known American college and university rankings have been compiled since 1983 by the magazine U.S. News & World Report, based on a combination of statistics provided by institutional researchers and surveys of university faculty and staff members. The college rankings were not published in 1984, but have been published every year since. The precise methodology used by the U.S. News rankings has changed many times, and not all of the data are available to the public, so peer review of the rankings is limited. As a result, many other rankings have arisen that challenge the results and methodology of the U.S. News ranking, as described in the section on other rankings of US universities below.
The U.S. News rankings, unlike some other such lists, create a strict hierarchy of colleges and universities in their "top tier," rather than ranking only groups or "tiers" of schools; the individual schools' order changes significantly every year the rankings are published. The most important factors in the rankings are:
- Peer assessment: a survey of the institution's reputation among presidents, provosts, and deans of admission of other institutions
- Retention: six-year graduation rate and first-year student retention rate
- Student selectivity: standardized test scores of admitted students, proportion of admitted students in upper percentiles of their high-school class, and proportion of applicants accepted
- Faculty resources: average class size, faculty salary, faculty degree level, student-faculty ratio, and proportion of full-time faculty
- Financial resources: per-student spending
- Graduation rate performance: difference between expected and actual graduation rate
- Alumni giving rate
All these factors are combined according to statistical weights determined by U.S. News. The weighting is often changed by U.S. News from year to year, and is not empirically determined (the National Opinion Research Center methodology review said that these weights "lack any defensible empirical or theoretical basis"). Critics have charged that U.S. News intentionally changes its methodology every year so that the rankings change and it can sell more magazines. The first four factors account for the great majority of the U.S. News ranking (80%, according to U.S. News's 2005 methodology), and the "reputational measure" (which surveys high-level administrators at similar institutions about their perceived quality ranking of each college and university) is especially important to the final ranking (accounting by itself for 25% of the ranking according to the 2005 methodology).[1]
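To illustrate how a weighted composite of this kind works, the following minimal sketch combines normalized factor scores using hypothetical weights; the actual U.S. News weights, normalization procedure and data are only partly public and change from year to year, so nothing below should be read as the magazine's real formula.

# A minimal sketch of a weighted-composite ranking of the kind described above.
# The factor scores and weights are hypothetical illustrations only.

# Each factor score is assumed to be already normalized to a 0-100 scale.
hypothetical_weights = {
    "peer_assessment": 0.25,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.05,
    "alumni_giving": 0.05,
}

schools = {
    "College A": {"peer_assessment": 90, "retention": 95, "faculty_resources": 85,
                  "student_selectivity": 92, "financial_resources": 88,
                  "graduation_rate_performance": 70, "alumni_giving": 60},
    "College B": {"peer_assessment": 80, "retention": 90, "faculty_resources": 92,
                  "student_selectivity": 85, "financial_resources": 95,
                  "graduation_rate_performance": 80, "alumni_giving": 40},
}

def composite(scores, weights):
    """Weighted sum of normalized factor scores."""
    return sum(weights[factor] * scores[factor] for factor in weights)

# Rank schools by descending composite score.
for name, scores in sorted(schools.items(),
                           key=lambda kv: composite(kv[1], hypothetical_weights),
                           reverse=True):
    print(name, round(composite(scores, hypothetical_weights), 1))

Because the final order depends entirely on the chosen weights, small changes to them can reorder schools without any change in the underlying data, which is the substance of the criticism above.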
A New York Times article reported that, given the U.S. News weighting methodology, "it's easy to guess who's going to end up on top: Harvard, Yale and Princeton round out the first three essentially every year. In fact, when asked how he knew his system was sound, Mel Elfin, the rankings' founder, often answered that he knew it because those three schools always landed on top. When a new lead statistician, Amy Graham, changed the formula in 1999 to what she considered more statistically valid, the California Institute of Technology jumped to first place. Ms. Graham soon left, and a slightly modified system pushed Princeton back to No. 1 the next year."[2] A San Francisco Chronicle article notes that almost all of the U.S. News factors are redundant and can be boiled down to one characteristic: the size of the college or university's endowment.[3]
Other statistical criticisms involve the different reporting standards used by different universities. For instance, for SAT scores, private schools tend to report the sum of each applicant's best verbal and best math scores across sittings, while public schools tend to report the best single-sitting total. For students who score above roughly 1300 on the SAT, the difference between the two metrics can be anywhere from 20 to 50 points in a university's reported scores. Factors that measure financial resources are also not uniform; for instance, a yearly federal grant may be treated as equivalent to a 5% cash flow from an endowment. Criticisms of U.S. News thus range from charges that its choices are political and arbitrary to charges of statistical inaccuracy.
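As a purely illustrative example of the gap these two reporting conventions can produce (the scores below are made up):

# Illustration of the two SAT reporting conventions described above,
# using hypothetical scores for a single applicant who sat the test twice.
sittings = [
    {"verbal": 650, "math": 700},  # first sitting, total 1350
    {"verbal": 710, "math": 660},  # second sitting, total 1370
]

# "Best verbal + best math" across sittings (reporting style common at private schools).
best_section_composite = (max(s["verbal"] for s in sittings)
                          + max(s["math"] for s in sittings))

# Best single-sitting total (reporting style common at public schools).
best_single_sitting = max(s["verbal"] + s["math"] for s in sittings)

print(best_section_composite)  # 710 + 700 = 1410
print(best_single_sitting)     # 1370; a 40-point gap from the same underlying scores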
Vanguard College Rankings of research-doctorate universities
The Vanguard College Rankings are a profile of the top colleges and universities in the United States. These rankings apply data compiled by the world-renowned National Research Council (NRC). Because the NRC studies are depictions of the 'scholarly quality of program faculty' in American universities, the Vanguard Rankings exhibit institutional qualities based upon faculty research achievements, faculty citation patterns, and reputation by field. The site is fee-based, but from its free sample listings, interested scholars can discover the cardinal rankings (by faculty quality) of the top 100 universities in the US, along with other information the examples portray, without subscribing.
Washington Monthly College rankings
The Washington Monthly's "College Rankings" began as a research report in 2005 and introduced its first official rankings in the September 2006 issue. It offers American university and college rankings [4] based upon the following criteria:
- a. "how well it performs as an engine of social mobility (ideally helping the poor to get rich rather than the very rich to get very, very rich)"
- b. "how well it does in fostering scientific and humanistic research"
- c. "how well it promotes an ethic of service to country" [5].
As can be seen, the Washington Monthly ranks universities according to very different criteria from other rankings. These criteria result in relatively higher rankings for public universities than in other ranking systems. Public universities bear responsibilities that private universities do not: at the undergraduate level especially, they must serve their states' mandates and hence often have less freedom to select only a narrow group of applicants.
The Top American Research Universities
A research-focused ranking of American universities is produced and published in The Top American Research Universities by TheCenter at the University of Florida. The list has been published since 2000 and attempts to better characterize the research performance of American universities.
The measurements used in this report rely heavily on objective data relevant to research, such as publications, citations, recognitions and funding. The information can also be found in publicly accessible materials, reducing the possibility of manipulation. The approach is arguably more scientific than that of popular magazines: the researchers come from scientific backgrounds, schools are grouped into tiers rather than given a single ordinal rank, the research method is consistent from year to year, any changes are explained in the publication itself, and references to other studies are cited as they would be in a scientific publication. The report is widely circulated and sought after within academia.
Among the statistics provided, the figures for total and federal research expenditures may or may not include grants won by the hospital-based faculty, depending on whether the hospital is legally owned by the school or is financially independent.
Other rankings of US universities
Other organizations which compile general annual US college and university rankings include the Fiske Guide to Colleges, the Princeton Review, and College Prowler. Many specialized rankings are available in guidebooks for undergraduate and graduate students, dealing with individual student interests, fields of study, and other concerns such as geographical location, financial aid, and affordability.
One commercial ranking service is Top Tier Educational Services.[5] It uses student-centered criteria; although the underlying study is fully refreshed every two years, the rankings are updated every quarter from new input data. The criteria combine subjective data, such as peer assessment and desirability, with objective data, such as SAT scores and GPA.
Such new ranking schemes measure what decision makers think rather than why they think it. They may or may not augment these reputational statistics with hard qualitative information. The authors discuss their ranking system and methodology with students but do not share their specific research tools or formulas. The problem with a ranking based on subjective opinions is that it is prone to personal bias, prejudice and bounded rationality. Public universities are also penalized because, besides an academic mission, they have a social mission: they simply cannot charge as much money, or be as selective, as private universities. Because the publisher is a commercial company, one can also ask whether there are hidden business motives behind such a ranking; it could, for example, charge more for a private university entrant.
Among the rankings dealing with individual fields of study is the Philosophical Gourmet Report or "Leiter Report" (after its founding author, Brian Leiter of the University of Texas at Austin), a ranking of philosophy departments. This report has been at least as controversial within its field as the general U.S. News rankings, attracting criticism from many different viewpoints. Notably, practitioners of continental philosophy, who perceive the Leiter report as unfair to their field, have compiled alternative rankings.
Avery et al. recently published a working paper for the National Bureau of Economic Research titled "A Revealed Preference Ranking of U.S. Colleges and Universities." Rather than ranking programs by traditional criteria, their analysis uses a statistical model based on applicant preferences. They based their data on the applications and outcomes of 3,240 high school students. The authors believe that their ranking is less subject to manipulation than conventional rankings (see criticism below).
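The core idea can be illustrated with a much-simplified sketch: treat each student's matriculation decision as a set of pairwise "wins" for the chosen college over the colleges that admitted the student but were declined, then estimate a latent desirability score from those wins. The data and the plain Bradley-Terry model below are illustrative assumptions, not the authors' actual statistical specification.

# Simplified revealed-preference sketch (not the Avery et al. model).
# Each tuple: (college the student enrolled at, other colleges that admitted the student).
from collections import defaultdict

matriculation_choices = [
    ("College A", ["College B", "College C"]),
    ("College A", ["College B"]),
    ("College B", ["College C"]),
    ("College B", ["College A"]),
    ("College C", ["College B"]),
]

colleges = {c for chosen, declined in matriculation_choices
            for c in [chosen] + declined}

wins = defaultdict(lambda: defaultdict(int))  # wins[i][j]: i chosen over j
for chosen, declined in matriculation_choices:
    for other in declined:
        wins[chosen][other] += 1

# Bradley-Terry strengths via the standard iterative (minorization-maximization) update.
strength = {c: 1.0 for c in colleges}
for _ in range(100):
    new_strength = {}
    for i in colleges:
        total_wins = sum(wins[i][j] for j in colleges if j != i)
        denom = sum((wins[i][j] + wins[j][i]) / (strength[i] + strength[j])
                    for j in colleges if j != i)
        new_strength[i] = total_wins / denom if denom else strength[i]
    scale = sum(new_strength.values())
    strength = {c: s / scale for c, s in new_strength.items()}

ranking = sorted(strength, key=strength.get, reverse=True)
print(ranking)  # colleges ordered by estimated desirability

Because the scores are inferred from where admitted students actually enroll, a college cannot improve its position by adjusting how it reports statistics, which is the sense in which such a ranking is harder to manipulate.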
International Rankings from Regional Organizations
Several regional organizations provide worldwide rankings, including:
The much-publicized Academic Ranking of World Universities [6], compiled by Shanghai Jiao Tong University, was a large-scale Chinese project to provide independent rankings of universities around the world on behalf of the Chinese government. The results have often been cited by The Economist in ranking the universities of the world [7]. As with all rankings, there are issues of methodology, and one of the primary criticisms is its bias towards the natural sciences over other subjects. This is evidenced by the inclusion of criteria such as the volume of articles published in Science or Nature (both journals devoted to the natural sciences), or the number of Nobel Prize and Fields Medal winners (awards that go predominantly to the natural sciences and mathematics). This results in some strange anomalies: for example, the London School of Economics (LSE), consistently ranked within the UK as among its top five universities,[6] [7] finds itself ranked by Shanghai Jiao Tong only among the 23rd to 33rd best universities in Britain.
The Times Higher Education Supplement, a British publication, in association with QS, annually publishes the Times Higher-QS World University Rankings, a list of 200 ranked universities from around the world. Compared with other rankings, many more non-American universities populate the upper tier of the THES list. The THES ranking has also faced criticism for the more subjective nature of its assessment criteria, which are largely based on a 'peer review' survey of 1000 academics in various fields. One Australian researcher has castigated the THES-QS ranking for arbitrarily placing his own Australian university far higher than he believes it deserves.[8]
In August 2006, the US magazine Newsweek published a ranking of the Top 100 Global Universities, using selected criteria from the two rankings above, with the additional criterion of library holdings (number of volumes). It aimed at 'taking into account openness and diversity, as well as distinction in research'.[8]
Since 2004, the Webometrics ranking of world universities has offered information about more than 3,000 universities according to their web presence (a computerized assessment of the size and sophistication of each institution's website). It is intended to reflect not only academic performance but also the whole range of university activities usually ignored by other rankings, including each institution's commitment to Open Access initiatives.
One refinement of the Webometrics approach is the G-Factor methodology, which counts the number of links only from other university websites. The G-Factor is an indicator of the popularity or importance of each university's website from the combined perspectives of the creators of many other university websites. It is therefore a kind of extensive and objective peer review of a university through its website - in social network theory terminology, the G-Factor measures the 'nodality' of each university's website in the 'network' of university websites.
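A minimal sketch of the counting step this describes might look like the following; the domains and links are hypothetical, and a real implementation would obtain link data by crawling or from a search engine's link index (the published G-Factor may count or weight links differently).

# Sketch of the G-Factor idea: for each university website, count how many
# *other* university websites link to it (in-degree in the network of
# university sites). All domains and link pairs below are hypothetical.
from collections import defaultdict

# (source_domain, target_domain) pairs: a page on source links to target.
inter_university_links = [
    ("uni-a.edu", "uni-b.edu"),
    ("uni-a.edu", "uni-c.edu"),
    ("uni-b.edu", "uni-c.edu"),
    ("uni-d.edu", "uni-c.edu"),
    ("uni-c.edu", "uni-a.edu"),
]

# Count each linking university at most once, and ignore self-links.
linking_sites = defaultdict(set)
for source, target in inter_university_links:
    if source != target:
        linking_sites[target].add(source)

g_factor = {site: len(sources) for site, sources in linking_sites.items()}
for site, score in sorted(g_factor.items(), key=lambda kv: -kv[1]):
    print(site, score)  # uni-c.edu is linked by the most other universities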
The European Commission's ranking of the 22 European universities with the highest scientific impact, described in the European Union section above, is also frequently cited alongside these worldwide rankings.
A university ranking based on the Google search engine is also provided by a Stanford student on his blog, Stanford ranking. The results are presented as an objective, peer-review-style assessment of universities across the United States. A total of 1,720 schools are ranked.
There are also rankings based on the number of Nobel Prizes associated with each university.[citation needed]
Criticism (United States)
American college and university ranking systems have drawn criticism from within and outside higher education in the United States.
21st Century
During the mid-2000s, some American educators began to question the impact of rankings on the college admissions process. The 22 February 2007 article by National Public Radio titled "Colleges Want to Cool Admissions Frenzy" notes:
- MIT Dean of Admissions Marilee Jones decided to rethink the chaos and stress associated with college admissions after going through the process with her own daughter, Nora. The frenzy over college admissions at America's most selective colleges has many causes: demographic shifts, with too many high schoolers chasing too few slots; cutbacks in funding; and the growing belief that college is required for success in any career. More pressure comes from a multibillion-dollar industry of college consultants, test prep courses, marketers and popular college-ranking systems by organizations such as the Princeton Review and U.S. News & World Report. But recently, a number of university presidents and deans have come together to rethink the admissions process, while conceding that change will be difficult. [9]
However, it was the 11 March 2007 Washington Post article "The Cost of Bucking College Rankings" by Dr. Michele Tolela Myers (the President of Sarah Lawrence College) that brought the issue into national discourse. Because Sarah Lawrence College dropped the SAT score submission requirement for its undergraduate applicants in 2003 [10] (thus joining the SAT-optional movement for undergraduate admission), SLC has no SAT data to send to U.S. News for its national survey. Of this decision, Myers states, "We are a writing-intensive school, and the information produced by SAT scores added little to our ability to predict how a student would do at our college; it did, however, do much to bias admission in favor of those who could afford expensive coaching sessions."[11] (At present, Sarah Lawrence is the only American college that completely disregards SAT scores in its admission process.[12]) In the same Washington Post article, Dr. Myers stated that U.S. News & World Report informed her that if no SAT scores were submitted, U.S. News would "make up a number" to use in its magazine. She further argues that if SLC were to stop sending any data to U.S. News & World Report, its ranking would be artificially decreased.[13][14] U.S. News & World Report issued a response on 12 March 2007 stating that its evaluation of Sarah Lawrence is under review.[15]
In addition, as reported on 21 March 2007 in TIME, "The heads of a dozen private colleges are waiting for the final draft of a letter they will probably sign and send within the next few weeks to their counterparts at 570 or so small to midsize schools asking whether they would be willing to pull out of the U.S. News survey, stop filling out part of it, stop advertising their ranking or, most important, help come up with more relevant data to provide as an alternative." [16] Of these recent events, a 29 March 2007 article in The Daily Princetonian notes, "Princeton — which the magazine ranked No. 1 for the seventh straight year — has continued to supply the magazine with relevant statistics. University Vice President and Secretary Bob Durkee '69 said that while the rankings inform students and their parents of relevant comparative data for different schools, they shouldn't be used to exclusively inform an applicant's choice of college." [17]
20th Century
The Association of American Law Schools, Stanford University and Reed College have been highly critical of the U.S. News & World Report rankings. In its 1998 report, "The Validity of The U.S. News and World Report Ranking of ABA Law Schools," the AALS argued that these rankings were not a reliable method of evaluating law schools.[18]
On 18 April 1997, Stanford University President Gerhard Casper issued a letter critical of the U.S. News college rankings, titled "An Alternative to the U.S. News and World Report College Survey."[19] At the same time, Stanford students formed FUNC, the "Forget U.S. News Coalition," to press college and university administrations to reconsider their cooperation with the U.S. News rankings; similar groups developed at other colleges and universities.[20]
Finally, Reed has gained notice for refusing to participate in the U.S. News annual survey because Reed "actively questions the methodology and usefulness of college rankings."[21] Rolling Stone, in its 16 October 1997 issue, argued that Reed's rankings were artificially decreased by U.S. News after the college stopped sending data.[22] Reed further claims that because it refuses to send data, U.S. News depends on the limited information available on the college's website to rank Reed, a practice which Reed claims is incomplete and has caused it to be ranked lower than it otherwise would be. Reed President Colin Diver wrote a piece in the November 2005 issue of The Atlantic Monthly defending the decision not to participate in the rankings.[23]
References
- ^ A review of US News ranking by NORC
- ^ Thompson, Nicholas (2003): "The Best, The Top, The Most;" The New York Times, August 3, 2003, Education Life Supplement, p. 24
- ^ Rojstaczer, Stuart. "College rankings are mostly about money", San Francisco Chronicle, September 3, 2001. Retrieved on 2006-12-11.
- ^ The Washington Monthly College Rankings
- ^ The Washington Monthly's Annual College Guide
- ^ http://education.guardian.co.uk/universityguide2005/table/0,,-5163901,00.html
- ^ http://www.timesonline.co.uk/section/0,,8403,00.html
- ^ The Top 100 Global Universities - 2006. Retrieved, August 15, 2006.
- ^ Colleges Want to Cool Admissions Frenzy. National Public Radio (22 February 2007).
- ^ Sarah Lawrence College Drops SAT Requirement, Saying a New Writing Test Misses the Point. The New York Times (13 November 2003).
- ^ Tolela Myers, Michele (11 March 2007). The Cost of Bucking College Rankings. The Washington Post.
- ^ U.S. News Statement on College Rankings. U.S. News and World Report (12 March 2007).
- ^ Tolela Myers, Michele (11 March 2007). The Cost of Bucking College Rankings. The Washington Post.
- ^ Would U.S. News Make Up Fake Data?. Inside Higher Ed (12 March 2007).
- ^ U.S. News Statement on College Rankings. U.S. News and World Report (12 March 2007).
- ^ The College Rankings Result. TIME (21 March 2007).
- ^ Weidmann, Maxwell (29 March 2007). Universities oppose college rankings. The Daily Princetonian.
- ^ The Validity of The U.S. News and World Report Ranking of ABA Law Schools. Association of American Law Schools (18 February 1998).
- ^ Casper, Gerhard (18 April 1997). An Alternative to the U.S. News and World Report College Survey. Stanford University.
- ^ Stanford Fourth in US News Rankings. Stanford University (22 September 2006).
- ^ College Rankings. Reed College Admission Office.
- ^ http://web.reed.edu/reed_magazine/nov1997/news/3.html
- ^ Diver, Colin (November 2005). Is There Life After Rankings?. The Atlantic Monthly.
See also
- Law School Rankings
- MBA program rankings
- Universities in the United States
- Liberal arts college
- U.S. News & World Report
- College Prowler
- Princeton Review
External links
- Paked - THES World University Rankings 2006
- THES World University Rankings 2006 - pdf version
- TIMES Higher Education - official website
- Webometrics Ranking of World Universities 2007
- Education, & Social Science Library - College and University Rankings
- The Times Higher Education Supplement (London)
- Rank Your College - A spoof of the US News rankings