This year, we sent a 50-question survey to 16,565 Class of 2006 MBA graduates at 100 schools in North America, Europe, and Asia. Overall, we received 9,298 responses, the same 56% response rate as in 2004. Nearly every school helped us contact grads, though Harvard Business School and the University of Pennsylvania's Wharton School declined to provide student contact information. Using publicly available sources, we were able to reach nearly 39% of the Class of 2006 at those two schools; the resulting response rates were significantly lower than the overall rate, but high enough to make the findings statistically valid.
On the Web-based student survey, grads were asked to rate everything from teaching quality to the effectiveness of career services at their schools on a scale of 1 to 10. The Class of 2006 survey results count for 50% of a school's total student-satisfaction score; an additional 25% comes from the responses of 11,518 graduates in the 2002 poll, and 25% from the 10,074 graduates polled in 2004. Drawing on six years' worth of data ensures that short-term improvements or problems don't sway the results. To keep unreliable data out of the ranking, 19 schools with low response rates were removed from consideration.
Next we asked David M. Rindskopf and Alan L. Gross, professors of educational psychology at the City University of New York Graduate Center, to analyze the data. The idea was to ensure that the results were not skewed by any attempts to influence student responses or otherwise affect the outcome. They tested the responses for irregular patterns, verifying the credibility of the data and the integrity of the poll. Once the student poll data was certified, the scores received a 45% weight in the overall ranking.
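The student-poll arithmetic described above can be sketched in a few lines of Python. The 50/25/25 blend of the three polls and the 45% weight come from the text; the poll averages in the example are made up for illustration only.

```python
def student_satisfaction_score(poll_2006, poll_2004, poll_2002):
    """Blend three biennial graduate polls; the newest counts double."""
    return 0.50 * poll_2006 + 0.25 * poll_2004 + 0.25 * poll_2002

# Hypothetical average ratings on the 1-10 scale:
score = student_satisfaction_score(8.2, 7.9, 8.0)  # -> 8.075

# The certified student-poll score carries 45% of the overall ranking.
contribution_to_ranking = 0.45 * score
```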
We also invited corporate MBA recruiters to fill out an online survey similar to the student survey. Of the 426 companies surveyed, 223 answered, for a response rate of 52%, up from 49% in 2004. The companies ranged from those that hire a few MBAs each year to those that hire hundreds. Except in rare instances, each company was limited to one completed survey to ensure the results would not be distorted.
Recruiters were asked to rate their top 20 schools according to the quality of a B-school's grads and their company's experience with its MBAs, past and present. Companies could rate only schools at which they had actively recruited, on campus or off, in recent years. Each school's total score was divided by the number of responding companies that recruited from the school. Because differences among schools tend to be greater in the corporate survey than in the student poll, recruiter opinion can have a bigger impact on the overall ranking. To add depth and breadth, we made one improvement to the recruiter survey this year: instead of basing each school's recruiter score on a single survey, as in prior years, we combined the three most recent polls, as we do with the student surveys. The 2006 recruiter survey counts for 50% of the recruiter score, while the 2004 and 2002 surveys contribute 25% each. Combined, the three recruiter polls accounted for 45% of the final ranking. Another 23 schools with poor response rates in the recruiter survey were eliminated at this stage.
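The two recruiter-score steps above (dividing a school's total points by the number of responding recruiters, then blending the three polls 50/25/25) can be sketched as follows. All ratings and poll averages here are hypothetical.

```python
def school_recruiter_average(ratings):
    """Divide a school's total recruiter points by the number of
    responding companies that recruited there."""
    return sum(ratings) / len(ratings)

def recruiter_score(avg_2006, avg_2004, avg_2002):
    """Combine the three most recent recruiter polls; the newest counts double."""
    return 0.50 * avg_2006 + 0.25 * avg_2004 + 0.25 * avg_2002

# Hypothetical ratings from four responding recruiters:
avg = school_recruiter_average([9, 7, 8, 10])  # -> 8.5
combined = recruiter_score(avg, 8.0, 7.5)
```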
Finally, we calculated each school's intellectual-capital rating by tallying faculty members' articles in 20 academic publications, from the Journal of Accounting Research to the Harvard Business Review. We also searched The New York Times, The Wall Street Journal, and BusinessWeek, adding points if a professor's book was reviewed there. The scores were then adjusted for faculty size. The final intellectual-capital score accounts for 10% of a school's final grade.
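Putting the three components together, the final grade weights from the article (45% student poll, 45% recruiter poll, 10% intellectual capital) amount to a simple weighted sum. The component scores in the example are invented for illustration.

```python
def overall_ranking_score(student, recruiter, intellectual_capital):
    """Final grade: 45% student poll + 45% recruiter poll
    + 10% intellectual capital."""
    return 0.45 * student + 0.45 * recruiter + 0.10 * intellectual_capital

# Hypothetical component scores on a common scale:
final = overall_ranking_score(8.0, 8.0, 7.0)  # -> 7.9
```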