(From the book Admission Matters (2005),
written by Sally P. Springer and Marion R. Franck)
The first rankings, published in 1983, were based on surveys of college administrators.
“I am extremely skeptical that the quality of a university-- any more than the quality of a magazine-- can be measured statistically. However, even if it can, the producers of the U.S. News rankings remain far from discovering the method.” -- Gerhard Casper, former president of Stanford University.
“As the president of a university that is among the top-ranked universities, I hope I have the standing to persuade you that much about these rankings-- particularly their specious formulas and spurious precision-- is utterly misleading. I wish I could forgo this letter since, after all, the rankings are only another newspaper story. Alas, alumni, foreign newspapers, and many others do not bring a sense of perspective to the matter.” (10)
How it's calculated: one quarter (25%) of a school's ranking comes from the reputation ratings it receives in a poll of college presidents, provosts, and admissions deans. They are asked to rate the academic quality of undergraduate programs at schools with the same mission as their own (e.g., research universities together and liberal arts colleges together) on a 1-5 scale from 'marginal' to 'distinguished,' with the option to respond 'don't know.' Many of those who receive the questionnaire acknowledge that they don't have the kind of detailed information about other colleges that would be needed to respond meaningfully.
The other factors that go into the ranking include: retention and graduation rate (20%), faculty resources (20%), student selectivity (15%), financial resources (10%), alumni giving (5%), and graduation rate performance (5%).
All of this information is collected and put into a formula that assigns weightings to the different kinds of data and then computes an overall 'ranking.' Each year the magazine slightly modifies the formula it uses, ostensibly to improve its usefulness as a tool to assess educational quality but also to sell the rankings as “new and improved.”
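The weighted-sum computation described above can be sketched in a few lines. This is a minimal illustration assuming the weights quoted in the book; the per-factor scores and the exact way U.S. News normalizes and combines its data are not public here, so the score profile and the simple linear combination below are assumptions, not the magazine's actual methodology.

```python
# Weights as described in the text (they sum to 100%).
# NOTE: the combination method and example scores are illustrative
# assumptions, not the magazine's real formula.
weights = {
    "reputation": 0.25,
    "retention_and_graduation": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "alumni_giving": 0.05,
    "graduation_rate_performance": 0.05,
}

def overall_score(scores):
    """Combine per-factor scores (here on a 0-100 scale) into one weighted total."""
    return sum(weights[factor] * scores[factor] for factor in weights)

# A hypothetical college scoring 80 on every factor gets a weighted total of 80.
example = {factor: 80.0 for factor in weights}
print(round(overall_score(example), 1))
```

The sketch also makes the critics' point concrete: nudging any weight from one year to the next reshuffles the totals, and therefore the rankings, even if no college actually changed.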
Between 1998 and 2000, Caltech moved from #9 to #1 and back down to a less prominent position, while Johns Hopkins moved from #22 to #10 to #15, and Columbia from #9 to #15 to #11. Critics of the rankings argue that meaningful changes in college quality are not possible over a period as short as one year, and that the formula changes are primarily designed to keep interest in the rankings high and sell more magazines.
- The most important criticism of the rankings is that they are not based on any direct measures of educational quality or student satisfaction.
- For the last few years, Indiana University has attempted to measure quality and satisfaction by asking students direct questions about their educational experiences and how they spend their time.
The myth of “I'll make more money if I graduate from an elite college.” “Students may have a better sense of their potential ability than college admissions committees. To cite one prominent example, Steven Spielberg was rejected by the USC and UCLA film schools.” (19) -- Stacey Dale and Alan Krueger, researchers who studied the long-term effects of attending different kinds of colleges.
They wanted to test their hypothesis that the kind of college where students received their undergraduate education wasn't a big factor at all, and that what mattered was the personal qualities the students had. So Dale and Krueger compared income figures for individuals who were accepted by elite colleges and actually attended those colleges with the incomes of people who were accepted by elite colleges but who chose to attend a less selective college. The results showed no difference in income between the two groups. The data even suggested that simply having applied to an elite college, regardless of whether a student was accepted, was the critical factor in predicting later income. Students who had the self-confidence and motivation to envision themselves as competitive at an elite college showed the enhanced economic benefit normally associated with having actually attended such a college.
Their research also had sufficient data to show that people who went to a selective college were no more likely to obtain an advanced degree than those who were admitted to a selective college but chose to attend a less selective school... students admitted to a selective college but who chose to attend a less selective one seemed to fare just as well when it came to graduate or professional school admission as those who actually attended the more selective college.
“We didn't find any evidence that suggests that the selectivity of a student's undergraduate college was related to the quality of the graduate school they attended.” -- Stacey Dale.
“Do not choose a college by the numbers. Most of those numbers are about resources and reputations and not actual quality of performance. Base your choice on your own needs and aspirations and which college can best meet them. As Albert Einstein reminded us, 'Not everything that counts can be counted, and not everything that can be counted counts.'” -- David Davenport, former president of Pepperdine University.