How We Ranked the B-Schools

(Corrects the intellectual capital methodology in paragraph 11.)

For more than two decades, this publication has been ranking full-time MBA programs, using essentially the same methodology. There have been slight modifications through the years, but for the most part, Bloomberg Businessweek judges business schools on how well they serve their two main constituencies: students and corporate recruiters.

To begin the ranking process, we sent our survey to 18,640 members of the MBA class of 2012 at 114 schools in North America, Europe, and Asia. We received 10,439 responses, a response rate of 56 percent. The Web-based survey asked students to rate their programs on teaching quality, career services, and other aspects of the B-school experience, using a 10-point scale.

The results of the 2012 survey were then combined with those from 2010, when we received 9,820 responses, and 2008, when 7,264 students answered the poll, for total responses of 27,523. We then created a weighted average of the three student surveys, with 2012 counting for 50 percent and 2010 and 2008 contributing 25 percent each.
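For readers who want to follow the arithmetic, here is a minimal sketch of that weighting in Python. The school score is hypothetical, and the real calculation operates on the full per-question survey data rather than a single score per school.

```python
# Minimal sketch of the 50/25/25 weighting of the three student surveys.
# The score per school is hypothetical; the real calculation runs on the
# full per-question survey data.

SURVEY_WEIGHTS = {2012: 0.50, 2010: 0.25, 2008: 0.25}

def weighted_student_score(scores_by_year):
    """Combine three survey-year scores into a single weighted average."""
    return sum(SURVEY_WEIGHTS[year] * score
               for year, score in scores_by_year.items())

# A school scoring 8.1, 7.6, and 7.9 (10-point scale) in 2012, 2010, 2008:
print(weighted_student_score({2012: 8.1, 2010: 7.6, 2008: 7.9}))  # 7.925
```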

Using our three most recent student surveys helps ensure that short-term issues, such as a new B-school building or an unpopular dean, don't skew the results one way or the other. As an added precaution, we asked David M. Rindskopf and Alan L. Gross, professors of educational psychology at the City University of New York Graduate Center, to analyze the data. By running statistical tests on the survey responses, Rindskopf and Gross were able to determine whether there had been any attempts to influence the outcome, helping to guarantee the poll's integrity.

The second stage of the ranking process involves a survey of employers. This year, we surveyed 566 corporate recruiters and received 206 responses for a response rate of 36 percent.

The employer survey asks recruiters to rate the top 20 schools they’re familiar with (at which they have actively recruited, on- or off-campus) on the perceived quality of grads and the company’s experience with MBAs past and present. Each No. 1 rating earns a school 20 points, each No. 2 rating gets it 19 points, and so on. Using each school’s point total—along with the information on where the company recruits and how many MBAs it hires—we calculate a recruiter score. We then combine the 2012 recruiter score with those from 2010, when we had 215 responses, and 2008, when 242 employers answered our poll, for total responses of 663. We then create a weighted average of the three employer surveys, with 2012 counting for 50 percent and 2010 and 2008 each contributing 25 percent.
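The point scale translates directly into code. The sketch below shows the raw point tally only; the recruiting-location and hiring-volume data that feed into the full recruiter score are omitted, and the ratings shown are hypothetical.

```python
# Sketch of the recruiter point scale: a No. 1 rating earns a school
# 20 points, No. 2 earns 19, and so on down to 1 point for No. 20.
# The full recruiter score also uses recruiting locations and hiring
# volumes, which are omitted here.

def rating_points(position):
    """Points from one recruiter rating (1 = best, 20 = lowest rated)."""
    if not 1 <= position <= 20:
        raise ValueError("recruiters rate only their top 20 schools")
    return 21 - position

def raw_point_total(ratings):
    """Total points a school earns across all recruiters who rated it."""
    return sum(rating_points(p) for p in ratings)

# A school rated No. 1 by two recruiters and No. 5 by a third:
print(raw_point_total([1, 1, 5]))  # 20 + 20 + 16 = 56
```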

For the first time in 2012, we modified the way the employer survey results are tabulated. In calculating the recruiter score for each of the three surveys, we first calculated the average number of times each school was mentioned as a recruiting target for 2008 to 2012, as well as the median of those averages for all schools in the ranking. When a school's average was at or above the median, the three-year recruiter score was not changed. But when the average fell below the median, the score was adjusted downward to reflect the fact that fewer companies recruited there. By making this change, which we first incorporated in our ranking of undergraduate business programs in March, schools with only a handful of uniformly positive mentions no longer have an advantage over schools with many mentions.
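The exact form of the downward adjustment isn't spelled out above. The sketch below assumes a simple proportional penalty, scaling a below-median school's score by its share of the median mention count; treat it as one plausible reading rather than the actual formula.

```python
# Illustrative sketch of the below-median adjustment. The precise formula
# isn't described here; this version ASSUMES a proportional penalty that
# scales a school's three-year recruiter score by its share of the median
# mention count. All numbers are hypothetical.

from statistics import median

def adjusted_scores(recruiter_scores, avg_mentions):
    """Map school -> adjusted score, penalizing below-median mention counts."""
    med = median(avg_mentions.values())
    adjusted = {}
    for school, score in recruiter_scores.items():
        if avg_mentions[school] >= med:
            adjusted[school] = score  # at or above the median: unchanged
        else:
            adjusted[school] = score * avg_mentions[school] / med  # scaled down
    return adjusted

# "Beta" has few (but glowing) mentions, so its high score is scaled down:
print(adjusted_scores({"Alpha": 60.0, "Beta": 70.0, "Gamma": 55.0},
                      {"Alpha": 12, "Beta": 4, "Gamma": 10}))
```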

While the change had little impact on schools at the top of the 2012 list, it resulted in much lower ranks for some schools further down the list. The methodology change was the reason behind the fall of Southern Methodist University’s Cox School of Business from No. 12 to No. 29 and the drop by the University of Georgia’s Terry College of Business from No. 36 to No. 52.

At this stage of the ranking process, we eliminated 32 schools with poor response rates on one or more ranking surveys. Four schools had an inadequate response rate on our 2012 student survey: China Europe International Business School, DePaul, Pepperdine, and the University of California, Davis.

Sixteen schools had an inadequate response rate on our 2012 employer survey: American, Baylor, Case Western, William and Mary, Cranfield, EDHEC, HEC Montreal, Melbourne, Strathclyde, Tulane, Alberta, Arizona, UC San Diego, Cambridge, Connecticut, and Pittsburgh. Twelve schools had inadequate response rates on both surveys: Asian Institute of Technology, Baruch, Concordia, E.M. Lyon, Florida International, Grenoble Ecole de Management, Hofstra, Rouen Business School, University College Dublin, Arkansas, British Columbia, and University of Miami.

The final stage of the ranking process involves calculating the intellectual capital rating for the remaining 82 schools. We tally the number of articles published by each school's faculty in 20 publications, ranging from the Journal of Accounting Research to the Harvard Business Review, and award extra points if a professor's book was reviewed in the New York Times, the Wall Street Journal, or Bloomberg Businessweek. When the tally is complete, the scores are adjusted for faculty size.
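The point values and the form of the faculty-size adjustment aren't specified above, so the sketch below assumes one point per journal article, a flat five-point bonus per reviewed book, and a simple per-faculty-member normalization.

```python
# Sketch of the intellectual capital tally. The article says journal
# articles are counted, book reviews earn extra points, and totals are
# adjusted for faculty size; the point values and the division by faculty
# size below are ASSUMPTIONS for illustration only.

ARTICLE_POINTS = 1      # assumed: one point per article in the 20 journals
BOOK_REVIEW_BONUS = 5   # assumed flat bonus per book reviewed in the NYT,
                        # the WSJ, or Bloomberg Businessweek

def intellectual_capital(articles, reviewed_books, faculty_size):
    """Raw publication tally normalized by the size of the faculty."""
    raw = articles * ARTICLE_POINTS + reviewed_books * BOOK_REVIEW_BONUS
    return raw / faculty_size

# 120 articles and 3 reviewed books from a 90-member faculty:
print(intellectual_capital(120, 3, 90))  # 1.5
```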

With all three pieces in place, calculating the final ranking is a matter of simple arithmetic. The combined student surveys contribute 45 percent, as do the combined employer surveys, with the intellectual capital score contributing the final 10 percent. To help readers understand each school's relative position in the ranking, and how far apart one school is from another, we also supply the ranking index number on which the final rank is based. It represents the sum of all ranking elements and is shown on a 100-point scale.
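In code, the final combination is a single weighted sum. The sketch below assumes each of the three components has already been normalized to a common 100-point scale, which the index implies but the article does not detail.

```python
# Sketch of the final 45/45/10 combination. It assumes the three
# components have each been normalized to a 100-point scale first.

COMPONENT_WEIGHTS = {
    "students": 0.45,              # combined student surveys
    "employers": 0.45,             # combined employer surveys
    "intellectual_capital": 0.10,  # publication score
}

def ranking_index(components):
    """Weighted sum of the three normalized component scores (0-100)."""
    return sum(COMPONENT_WEIGHTS[name] * score
               for name, score in components.items())

# A hypothetical school scoring 90, 85, and 70 on the three components:
print(ranking_index({"students": 90, "employers": 85,
                     "intellectual_capital": 70}))  # 85.75
```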

In addition to the overall rank, we calculate each school's relative position in the three parts of our methodology: the weighted average of the student surveys, the weighted average of the employer surveys, and the intellectual capital score. We also award letter grades, based on the 2012 student survey, that highlight each school's performance in specific categories such as teaching quality. Because each letter grade is based on one or more questions from the survey, a school might have a high overall rank but a low grade in one or more areas, or vice versa. The top 20 percent in each category earn A+s, the next 25 percent get As, the following 35 percent get Bs, and the bottom 20 percent receive Cs. No Ds or Fs are awarded.
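Those percentile cutoffs can be expressed as a short grading routine. The sketch below sorts schools by a single category score and applies the 20/25/35/20 split; how ties at the boundaries are handled is an assumption.

```python
# Sketch of the letter-grade cutoffs: top 20% A+, next 25% A, next 35% B,
# bottom 20% C. How ties at the boundaries are broken is an assumption.

def assign_grades(category_scores):
    """category_scores: school -> score on one 2012 survey category."""
    ordered = sorted(category_scores, key=category_scores.get, reverse=True)
    n = len(ordered)
    grades = {}
    for rank, school in enumerate(ordered, start=1):
        pct = rank / n             # fraction of schools at or above this one
        if pct <= 0.20:
            grades[school] = "A+"
        elif pct <= 0.45:          # 20% + 25%
            grades[school] = "A"
        elif pct <= 0.80:          # + 35%
            grades[school] = "B"
        else:
            grades[school] = "C"   # bottom 20%; no Ds or Fs
    return grades

# Five hypothetical schools, so the cutoffs fall after ranks 1, 2, and 4:
print(assign_grades({"Alpha": 9.1, "Beta": 8.7, "Gamma": 8.2,
                     "Delta": 7.5, "Epsilon": 6.9}))
```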

Lavelle is an associate editor for Bloomberg Businessweek.
