The methodology behind Bloomberg Businessweek's rankings of the world's best business schools
For prospective MBA students, choosing the right business school can be a nerve-racking experience. Choose wisely, and the next two years will be intellectually invigorating, with job opportunities practically limitless at graduation. Choose poorly, and you're doomed to b-school purgatory. To help MBA candidates make wise choices, Bloomberg Businessweek has for more than two decades ranked full-time MBA programs by how well they satisfy their two main constituencies—students and corporate recruiters—as well as by the research output of faculty members.

This year the methodology hasn't changed, but the number of ranked programs has nearly doubled. In the past, schools that fell below the top 30 for U.S. programs and the top 10 for international programs were placed in an unranked "second tier." This year, for the first time, those schools were assigned ranks, increasing the number of ranked schools from 40 to 75.

To begin the ranking process, we sent a 50-question survey to 17,941 MBA graduates from the Class of 2010 at 101 schools in North America, Europe, and Asia. We received 9,827 responses, for a response rate of 55 percent. In 2008, Harvard Business School and the University of Pennsylvania's Wharton School declined to provide student contact information for our survey; this year all 101 schools helped us contact grads, either by supplying e-mail addresses or by distributing the survey invitations to students on our behalf.

The Web-based survey asks graduates to rate their programs on teaching quality, the effectiveness of career services, and other aspects of the b-school experience, using a scale of 1 to 10. The Class of 2010 survey results count for 50 percent of each school's total student satisfaction score. Our 2008 survey, which polled 16,704 graduates, and our 2006 survey, which polled 16,565, each count for an additional 25 percent.
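The survey-weighting arithmetic above can be sketched as follows. This is a minimal illustration of the 50/25/25 blend, not Businessweek's actual code, and the scores are made-up examples:

```python
def student_satisfaction(score_2010, score_2008, score_2006):
    """Blend three survey cycles: the 2010 survey counts for 50 percent,
    the 2008 and 2006 surveys for 25 percent each."""
    return 0.50 * score_2010 + 0.25 * score_2008 + 0.25 * score_2006

# Hypothetical school whose grads rated it 8.4 in 2010, 8.0 in 2008, and 7.6 in 2006:
print(round(student_satisfaction(8.4, 8.0, 7.6), 2))  # weighted toward the newest survey
```

Because the newest survey carries double weight, recent gains (or slips) in student satisfaction move a school's score twice as much as older ones.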
Using six years' worth of survey data encompassing 26,389 individual responses ensures that short-term problems and improvements won't skew the results. Next, we asked David M. Rindskopf and Alan L. Gross, professors of educational psychology at the City University of New York Graduate Center, to analyze the data and verify its credibility, ensuring that the results were not marred by any attempt to influence student responses or otherwise affect the outcome.

The second stage of the ranking process involves a survey of corporate MBA recruiters. This year we surveyed 514 recruiters and received 215 responses, for a response rate of 42 percent. Recruiters were asked to rank their top 20 schools according to the perceived quality of grads and their company's experience with MBAs past and present. Companies could rate only schools at which they have actively recruited in recent years, on or off campus. With the survey completed, we first calculated each school's point total, awarding 20 points for every No. 1 ranking, 19 points for every No. 2 ranking, and so on. Using each school's point total—along with information on the schools where each recruiter hires and the number of MBAs it hires—we calculated a recruiter score. The 2010 score was then combined with scores from the 2008 and 2006 recruiter surveys, which together total 680 responses. (The 2010 survey contributes 50 percent; the 2008 and 2006 polls each contribute 25 percent.)

At this stage, 26 schools with poor response rates on one or both 2010 surveys were eliminated from ranking consideration, leaving 75 schools eligible to be ranked. Four of the eliminated schools had poor response rates on the student survey only: Drexel University, Texas Christian University, the University of Connecticut, and the University of Iowa. Fourteen schools had inadequate response rates on the recruiter survey only: Baruch College, E.M. Lyon, Erasmus University, Fordham University, Pepperdine University, the University of South Carolina, Temple University, Hong Kong University of Science and Technology, the University of Tennessee at Knoxville, the University of Arizona, the University of British Columbia, the University of California at Irvine, the University of Florida, and the University of Miami. Eight schools had poor response rates in both surveys: American University, Florida International University, Grenoble Ecole de Management, Syracuse University, the University of California at Davis, University College Dublin, the University of Alberta, and the University of Missouri at Columbia.

Finally, we calculated each school's intellectual-capital rating, tallying the number of articles published by each school's faculty in 20 publications, from the Journal of Accounting Research to the Harvard Business Review, and awarding extra points if a professor's book was reviewed in The New York Times, The Wall Street Journal, or Bloomberg Businessweek. When the tally was complete, the scores were adjusted for faculty size.

With all three pieces of the ranking in place, we used simple math to calculate the final ranking: the three combined student surveys contribute 45 percent of the final result, as do the three combined recruiter polls, with intellectual capital contributing the remaining 10 percent. In addition to the overall rank, we calculated each school's position in the three parts of our methodology: the combined student surveys, the combined recruiter surveys, and the intellectual-capital score.

We also awarded letter grades, based on the 2010 student and recruiter surveys, that highlight each school's performance in specific categories, such as teaching quality. Because these grades are based on responses to one or more survey questions, it's possible for highly ranked schools to have low grades, and for low-ranked schools to get high grades.
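The recruiter point scheme and the final 45/45/10 weighting described above can be sketched like this. It is an illustrative reconstruction under the stated weights, not the actual ranking code, and the function names are hypothetical:

```python
def recruiter_points(ranks):
    """Total points from recruiter ballots: a No. 1 ranking earns 20 points,
    a No. 2 ranking earns 19, and so on down to 1 point for No. 20."""
    return sum(21 - rank for rank in ranks if 1 <= rank <= 20)

def final_score(student_score, recruiter_score, intellectual_capital):
    """Combine the three components of the ranking: the combined student
    surveys and combined recruiter polls contribute 45 percent each,
    intellectual capital the remaining 10 percent."""
    return 0.45 * student_score + 0.45 * recruiter_score + 0.10 * intellectual_capital
```

Under this scheme, a school ranked No. 1 by one recruiter and No. 3 by another would earn 20 + 18 = 38 points before the recruiter score is adjusted for hiring volume.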
The top 20 percent in each category earned an A+, the next 25 percent an A, the next 35 percent a B, and the bottom 20 percent a C. No Ds or Fs were awarded.
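The grade bands translate directly into percentile cutoffs. A minimal sketch, assuming each school's standing in a category is expressed as a percentile from 0 to 100 with 100 being best:

```python
def letter_grade(percentile):
    """Map a school's percentile standing in a category to a grade:
    top 20 percent A+, next 25 percent A, next 35 percent B, bottom 20 percent C."""
    if percentile >= 80:
        return "A+"
    if percentile >= 55:
        return "A"
    if percentile >= 20:
        return "B"
    return "C"  # no Ds or Fs are awarded
```

For example, a school at the 60th percentile in teaching quality would receive an A, even if its overall rank is modest.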