Rankings & Profiles
FAQ: Full-Time MBA Rankings
(Corrects the time frame of faculty research reviewed for the intellectual capital score.)
Answers to your burning questions about Bloomberg Businessweek’s rankings of full-time MBA programs
When is the MBA ranking published?
The MBA ranking is published in late October or early November in even-numbered years.
How does Bloomberg Businessweek determine who is eligible for rankings?
We look at a number of statistics, including but not limited to: the age of the MBA program, enrollment, test scores, acceptance rates, and the number of international and minority students. A program must be accredited to be considered for ranking.
If a school has never been ranked before, how can it be considered for ranking?
The school’s representative should send a note to email@example.com in January of the ranking year. We’ll request some information about the program and determine eligibility based on it. The information requested is outlined above (see: “How does Bloomberg Businessweek determine who is eligible for rankings?”). Please do not send requests for inclusion before Jan. 1 of the ranking year.
What sources of data does Bloomberg Businessweek use to rank MBA programs?
There are three main sources of data: a student survey, a survey of corporate recruiters, and an intellectual capital rating.
When does each of the surveys get distributed? How long are they available for completion?
The student survey is distributed three weeks before graduation, usually in early May, and is available for three months. The recruiter survey is distributed in early July and is available for two months. The intellectual capital rating is not based on a survey.
A third survey, conducted by Businessweek.com, asks schools for statistical information about their programs, and is used to create online statistical profiles. That survey is distributed in early summer and is available for about three months.
How is the student survey conducted?
The student survey is conducted online. Using e-mail addresses supplied by the programs, Bloomberg Businessweek (with the help of Cambria Consulting) contacts students and directs them to a survey site where they can complete the survey. Bloomberg Businessweek will send out several reminders to ensure an adequate response rate.
The survey consists of about 45 questions that ask students to rate their programs on teaching quality, career services, alumni network, and recruiting efforts, among other things. Using the average answer for each of the questions and each question’s standard deviation, we calculate a student survey score for each school.
How is the recruiter survey conducted?
The recruiter survey is also conducted online. Starting with e-mail addresses supplied by the programs, Bloomberg Businessweek creates a list of companies recruiting from the programs and identifies a single high-level recruiting contact at each company. This means that not every recruiter supplied by every school will be contacted. In some cases, such as when a company maintains a separate recruiting organization for Europe, more than one recruiting contact at the company will be asked to complete the survey. Then, with the help of Cambria Consulting, Bloomberg Businessweek contacts the company representatives and directs them to a survey site where they can complete the survey. Bloomberg Businessweek will send out several reminders or call recruiters to ensure an adequate response rate.
Every company tells us how many MBAs it hired in the previous two years and which schools it actively recruits from, and it ranks up to 20 top schools.
To calculate each school’s recruiter score, we first use the rankings to determine each school’s recruiter points, awarding 20 points for every No. 1 ranking, 19 points for every No. 2 mention, and so on. We then calculate a numerator, which consists of the sum of each school’s points from each specific recruiter multiplied by the number of MBAs hired by that specific recruiter. We then calculate a denominator, which is the sum of the number of times each school is identified as a recruiting location multiplied by the number of MBAs hired by each recruiter who mentions it. Finally, for each school, we divide its numerator by its denominator.
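The arithmetic above amounts to a hires-weighted average of each school's recruiter points. A minimal sketch, using hypothetical data (school names and figures are illustrative, not drawn from any actual survey):

```python
def recruiter_score(mentions):
    """Compute a school's recruiter score from its mentions.

    mentions: list of (rank_given, hires) pairs, one pair per recruiter
    that ranked the school. rank_given is 1..20, where a No. 1 ranking
    is worth 20 points, No. 2 is worth 19 points, and so on.
    """
    # Numerator: each recruiter's points for this school, weighted by
    # how many MBAs that recruiter hired.
    numerator = sum((21 - rank) * hires for rank, hires in mentions)
    # Denominator: each mention counts once, weighted by the same hires.
    denominator = sum(hires for _rank, hires in mentions)
    return numerator / denominator


# Hypothetical example: one recruiter hiring 100 MBAs ranks the school
# No. 1 (20 points); another hiring 50 ranks it No. 5 (16 points).
score = recruiter_score([(1, 100), (5, 50)])
```

The weighting means a No. 1 ranking from a recruiter who hires many MBAs moves the score far more than the same ranking from a recruiter who hires few.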
In 2012, for the first time, we modified the way the employer survey results are tabulated. First, we calculated the average number of times a school was mentioned by employers over the last three rankings, and found the median of those averages across all schools. Where a school’s average number of mentions was greater than or equal to the median, the recruiter score was unadjusted. Where the average number of mentions was less than the median, the recruiter score was multiplied by the ratio of the school’s average mentions to the median. By doing so, schools with a few uniformly glowing mentions no longer have an advantage over schools with many mentions.
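The adjustment described above can be sketched in a few lines. The data here is hypothetical; only the scale-down rule for below-median schools comes from the text:

```python
import statistics


def adjusted_scores(raw_scores, avg_mentions):
    """Apply the 2012 mention-count adjustment to recruiter scores.

    raw_scores:   dict mapping school -> unadjusted recruiter score
    avg_mentions: dict mapping school -> average number of employer
                  mentions over the last three rankings
    """
    med = statistics.median(avg_mentions.values())
    out = {}
    for school, score in raw_scores.items():
        m = avg_mentions[school]
        # At or above the median: score stands. Below: scale it down
        # by the ratio of the school's mentions to the median.
        out[school] = score if m >= med else score * (m / med)
    return out


# Hypothetical example: school B has few (but glowing) mentions,
# so its score is halved; A and C are unchanged.
result = adjusted_scores({"A": 18.0, "B": 19.5, "C": 17.0},
                         {"A": 30, "B": 10, "C": 20})
```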
How is the intellectual capital score determined?
Bloomberg Businessweek scours 20 top academic journals for articles published by each school’s faculty, reviewing all editions published in the previous four years. The journals are The Harvard Business Review, Journal of Marketing, Operations Research, Information Systems Research, Journal of Finance, American Economic Review, Journal of Accounting Research, Journal of Financial Economics, Management Science, Academy of Management Review, Journal of Marketing Research, Strategic Management Journal, Accounting Review, Academy of Management Journal, Production & Operations Management, Journal of Business Ethics, Journal of Consumer Research, Review of Financial Studies, Administrative Science Quarterly and Marketing Science. Extended articles receive three points; short articles receive one point.
How are the various factors weighted?
The three most recent student surveys are first combined for a total student score that counts toward 45% of the final ranking. (The current survey counts for 50% of the total student score. The two previous surveys count for 25% each.) The three most recent recruiter surveys are combined for a total recruiter score that contributes another 45%. (The current survey counts for 50% of the total recruiter score. The two previous surveys count for 25% each.) The intellectual capital rating contributes 10% to the final ranking.
Do schools ever get dropped from the rankings? Why?
Yes. If a response rate falls below the minimum threshold, a school will be dropped. If one of the initial requirements for consideration is not met—for example, if a school loses its accreditation or its enrollment falls below our threshold—then a school will be dropped.
Is there a minimum response rate for the student survey? How are the response rate and minimum response rate calculated?
The response rate for each school is calculated by dividing the number of replies by the total number of surveys sent. The minimum response rate is determined after a review of all school response rates with a goal of eliminating outliers.
What alternatives are there for schools that do not want to supply student e-mail addresses?
Schools can ask students to “opt out” of the survey, then supply Bloomberg Businessweek with the e-mail addresses for those who remain. A second alternative, known as the “opt-in” method, is permitted but strongly discouraged: schools send students an e-mail about the survey and give Bloomberg Businessweek a list of e-mail addresses for those who permit the release of that information.
How is the response rate calculated for schools that choose the “opt-out” method?
If a school chooses the opt-out method, the response rate is calculated using the number of e-mail addresses supplied to Bloomberg Businessweek. If 500 students are given the choice to opt out and 100 do, the school would supply 400 e-mail addresses. If we survey those 400 students and 200 complete the survey, the response rate is 50%.
How is the response rate calculated for schools that choose the “opt-in” method?
To calculate a response rate for schools choosing the “opt-in” method, Bloomberg Businessweek divides the number of responses by the number of students who received the “opt-in” e-mail. If 1,000 students receive the “opt-in” message and 500 opt in, 250 survey responses would give that school a response rate of 25%, not 50%. Schools with response rates that fall below the minimum will not be ranked.
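The two response-rate calculations above differ only in the denominator, which is why the opt-in method produces a lower rate for the same number of completed surveys. A sketch using the figures from the examples above:

```python
def response_rate_opt_out(addresses_supplied, responses):
    """Opt-out method: denominator is the e-mail addresses actually
    supplied to Bloomberg Businessweek (students who did not opt out)."""
    return responses / addresses_supplied


def response_rate_opt_in(students_emailed, responses):
    """Opt-in method: denominator is every student who received the
    opt-in message, not just those who opted in."""
    return responses / students_emailed


# From the text: 400 addresses supplied, 200 responses -> 50%.
opt_out_rate = response_rate_opt_out(400, 200)
# From the text: 1,000 students e-mailed, 250 responses -> 25%, not 50%.
opt_in_rate = response_rate_opt_in(1000, 250)
```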
What do you do when schools refuse to provide e-mail addresses for the student survey and decline to use the alternatives available to them?
At Bloomberg Businessweek’s discretion, we may attempt to obtain student e-mail addresses by other legal means. These means include, but are not limited to, sending e-mails to individual students and asking them to forward the message to classmates, and taking out ads in student newspapers directing students to the survey site. If Bloomberg Businessweek is unable to obtain sufficient e-mail addresses and an adequate response rate, such schools will not be ranked.
For the student survey, how do you “fill in” historical data for schools that have never been surveyed before?
Bloomberg Businessweek employs the services of statisticians David M. Rindskopf and Alan L. Gross, professors of educational psychology at City University of New York Graduate Center. Using statistical regression equations and the survey results from schools with complete data, Gross and Rindskopf are able to provide estimates of survey results from previous years.
How do you prevent cheating?
Statisticians David M. Rindskopf and Alan L. Gross, professors of educational psychology at City University of New York Graduate Center, use a series of statistical analyses to test the responses for patterns that have a low probability of occurring if the students are answering the questions honestly. Questionable responses that might be the result of coaching by school officials or other forms of cheating are discarded, and may be grounds for elimination from the ranking.
Why do you only rank 30 U.S. MBA programs and 10 international programs in the magazine?
Space constraints prevent us from listing more than 30 U.S. programs and 10 international programs in the magazine. The complete lists are available online. Prior to 2010, Bloomberg Businessweek ranked no more than 40 programs, placing the remaining programs in an unranked “second tier.” But the magazine now ranks all programs with adequate response rates in all surveys.
In the table that accompanies the ranking story, where do the letter grades come from?
In the MBA table, we typically include several letter grades for each ranked program. These include “recruiter grades” that represent the surveyed recruiters’ assessment of the relative skills of the program graduates in specific areas, such as communication, teamwork, and analytical skills. Each grade is based on one or more questions in the recruiter survey pertaining to the designated skills. Unlike the recruiter score used to determine the final ranking, which is a composite of three separate surveys, the letter grades are based only on the most recent recruiter survey.
The table also includes several “MBA grades” that represent the assessment of that program’s students about various aspects of their program, such as teaching quality and career services. Each of those grades is based on one or more questions in the student survey pertaining to the designated qualities. Unlike each school’s student survey score that is used to determine the final ranking, which is a composite of three separate surveys, the letter grades are based only on the most recent student survey.
In the case of both sets of grades, the top 20% in each category earn A+s, the next 25% receive As, the next 35% receive Bs, and the bottom 20% get Cs. No Ds or Fs are awarded. The questions used for the recruiter and MBA grades do not represent either survey in its entirety. Therefore it is possible for a highly ranked program to receive one or more low letter grades, and a poorly ranked program to receive one or more high letter grades.
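The percentile cutoffs above can be sketched as a simple ranking pass. School names and scores here are hypothetical; only the 20/25/35/20 splits come from the text:

```python
def letter_grades(scores):
    """Assign letter grades from per-category scores.

    scores: dict mapping school -> score in one category.
    Top 20% -> A+, next 25% -> A, next 35% -> B, bottom 20% -> C.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    grades = {}
    for i, school in enumerate(ranked):
        pct = (i + 1) / n  # fraction of schools at or above this one
        if pct <= 0.20:
            grades[school] = "A+"
        elif pct <= 0.45:   # 20% + 25%
            grades[school] = "A"
        elif pct <= 0.80:   # + 35%
            grades[school] = "B"
        else:               # bottom 20%
            grades[school] = "C"
    return grades


# Hypothetical example with five schools.
result = letter_grades({"V": 95, "W": 90, "X": 85, "Y": 80, "Z": 75})
```

With five schools the boundaries fall exactly on the cutoffs: one A+, one A, two Bs, one C.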
What role, if any, do schools play in the surveys, beyond providing e-mail addresses or distributing the surveys on Bloomberg Businessweek’s behalf?
The schools have no other role in the rankings. However, they do complete surveys of their own to provide statistical data. That data is then used to create an online profile that appears on Businessweek.com.
Do the schools have any input into the content of the surveys? Is the student survey ever provided to schools?
The surveys are prepared by Bloomberg Businessweek. Schools, while they may provide input from time to time, do not decide which questions to ask or how to ask them. This is necessary to maintain the integrity and independence of the ranking process. To prevent schools from coaching students on how to answer the survey, the survey is not made available to schools, and is substantively rewritten each year.
Are schools permitted to communicate with their students about the student survey?
Bloomberg Businessweek cannot prevent schools from communicating with their students. However, they should not coach students either directly or through the media—such as student newspapers—on how to answer the survey. Nor should they make any statements that emphasize the importance of a high ranking or in any other way attempt to prevent students from answering the survey honestly. Any evidence of coaching will be taken very seriously by Bloomberg Businessweek and may be grounds for eliminating a school from the rankings.
Is the data collected from the schools for the online statistical profiles used in the ranking?
No. That data is used only to create the online statistical profiles that appear on Businessweek.com; it plays no part in the ranking itself.
What happens if a school doesn’t fill out the survey for the statistical profile by the deadline?
The profile will not be created. If the school only partially completes the survey, the unanswered questions will appear as “NA” in the profile.
How do you find students to interview?
In addition to traditional reporting methods such as campus visits, Bloomberg Businessweek will contact students directly via phone or e-mail, but only if they indicate on the survey that they are willing to be interviewed for a story.