
Executive MBA Rankings


Our Executive MBA rankings are based on surveys of EMBA graduates and program directors. The methodology combines results from three graduate polls conducted over five years to measure long-term student satisfaction.

When is the Executive MBA ranking published?

The Executive MBA ranking is published in November in odd-numbered years at the same time as the Part-Time MBA and Executive Education rankings.

How does BusinessWeek determine who is eligible for rankings?

We look at a number of different statistics, including but not limited to: age of the EMBA program, enrollment, test scores, acceptance rates, and number of international and minority students. A program must be accredited to be considered for ranking.

If a program has never been ranked before, how can it be considered for ranking?

The school's representative should send a note to geoff_gloeckler@businessweek.com in January of the ranking year. We'll request some information about your program and determine eligibility based on that. The information requested is outlined above (see: "How does BusinessWeek determine who is eligible for rankings?"). Please do not send requests for inclusion before Jan. 1 of the ranking year.

What sources of data does BusinessWeek use to rank EMBA programs?

There are two main sources of data: a survey of EMBA graduates and a survey of EMBA program directors.

When do each of the surveys get distributed? How long are they available for completion?

The survey of EMBA graduates is distributed in May and is open for about three months. The survey of EMBA program directors is distributed in August and typically remains open for about two weeks. A third survey, conducted by BusinessWeek.com, asks schools for statistical information about their programs and is used to create online statistical profiles. That survey, which is distributed in early summer and is open for about three months, is not used in the ranking.

How is the survey of EMBA graduates conducted?

The survey of EMBA graduates is conducted online. Using e-mail addresses supplied by the programs, BusinessWeek (with the help of Cambria Consulting) contacts students and directs them to a survey site where they can complete the survey. BusinessWeek will send out several reminders to ensure an adequate response rate. The survey consists of about 50 questions that ask students to rate their programs on teaching quality, career services, curriculum, and the caliber of their classmates, among other things. Using the average answer for each of the questions and each question's standard deviation, we calculate a student survey score for each school.
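
BusinessWeek does not publish the exact formula, but a score built from per-question averages and standard deviations typically amounts to a standardized (z-score) aggregate. Below is a minimal sketch of that idea in Python; the data layout, the equal weighting of questions, and the z-score form itself are assumptions, not the published method:

    from statistics import mean, stdev

    def student_survey_scores(school_averages):
        """Hypothetical z-score aggregate. `school_averages` maps each school
        to a dict of {question_id: that school's average answer}. Each question
        is standardized using the mean and standard deviation across all
        schools, and the per-question z-scores are averaged into one score."""
        questions = next(iter(school_averages.values())).keys()
        stats = {q: (mean(s[q] for s in school_averages.values()),
                     stdev(s[q] for s in school_averages.values()))
                 for q in questions}
        return {school: mean((avgs[q] - stats[q][0]) / stats[q][1]
                             for q in questions)
                for school, avgs in school_averages.items()}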

How is the survey of EMBA program directors conducted?

The survey of EMBA program directors is conducted via e-mail. The magazine (with the help of Cambria Consulting) contacts the directors of the eligible programs and asks them to list their top 10 programs among those with which they are familiar. No. 1-ranked programs are awarded 10 points, No. 2-ranked programs nine, and so on down to one point for No. 10. The sum of those points is the program's director poll score.
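
The tally itself is a simple Borda-style count. The sketch below reproduces it; the ballot format and school names are illustrative assumptions:

    from collections import Counter

    def director_poll_scores(ballots):
        """Tally director ballots: each ballot is an ordered list of up to
        10 program names, best first. Rank 1 earns 10 points, rank 2 earns
        9, and so on down to 1 point for rank 10."""
        scores = Counter()
        for ballot in ballots:
            for rank, program in enumerate(ballot[:10], start=1):
                scores[program] += 11 - rank
        return scores

    # Two illustrative (hypothetical) ballots, truncated for brevity:
    ballots = [["Wharton", "Kellogg", "Chicago"],
               ["Kellogg", "Wharton", "Duke"]]
    print(director_poll_scores(ballots).most_common())
    # [('Wharton', 19), ('Kellogg', 19), ('Chicago', 8), ('Duke', 8)]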

How are the various factors weighted?

The three most recent surveys of EMBA graduates are first combined for a total student score that counts toward 65% of the final ranking. (The current survey counts for 50% of the total student score. The two previous surveys count for 25% each.) The survey of EMBA program directors contributes 35%.
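
Put as arithmetic, the final score is a weighted sum. A sketch of that calculation, assuming all input scores are already normalized to a comparable scale:

    def final_score(current, previous, oldest, director):
        """Blend the three most recent graduate-survey scores 50/25/25 into
        a student score, then weight it 65/35 against the director poll."""
        student = 0.50 * current + 0.25 * previous + 0.25 * oldest
        return 0.65 * student + 0.35 * director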

Do schools ever get dropped from the rankings? If so, why?

Yes. Schools with a low response rate in the graduate survey, too few responses, or both will not be ranked. Also, if one of the initial requirements for consideration is not met—for example, if a school loses its accreditation or its enrollment falls below our threshold—then a school will be dropped.

Is there a minimum response rate for the survey of EMBA graduates? How are the response rate and minimum response rate calculated?

The response rate for each program is calculated by dividing the number of replies by the total number of surveys sent. The minimum response rate is determined after a review of all school response rates, and is typically about 20%.

What alternatives are there for schools that do not want to supply student e-mail addresses?

Schools can ask students to "opt out" of the survey, then supply BusinessWeek with the e-mail addresses for those who remain. A second alternative, known as the "opt-in" method, is permitted but strongly discouraged: schools send students an e-mail about the survey and give BusinessWeek a list of e-mail addresses for those who permit the release of that information.

How is the response rate calculated for programs that choose the "opt-out" method?

If a school chooses the "opt-out" method, the response rate is calculated using the number of e-mail addresses supplied to BusinessWeek. If 500 students are given the opportunity to opt out and 100 do, the school would supply 400 e-mail addresses. If we survey those 400 students and 200 complete the survey, the response rate is 50%.

How is the response rate calculated for schools that choose the "opt-in" method?

To calculate a response rate for schools choosing the "opt-in" method, BusinessWeek divides the number of responses by the number of students who received the "opt-in" e-mail.

If 1,000 students receive the "opt-in" message and 500 opt in, 250 survey responses would give that school a response rate of 25%, not 50%. Schools with response rates that fall below the minimum will not be ranked.
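
Both worked examples come down to the same division with different denominators: the addresses actually supplied for opt-out schools, versus everyone who received the message for opt-in schools. A sketch reproducing the two examples:

    def opt_out_rate(responses, addresses_supplied):
        """Opt-out denominator: only the addresses actually supplied."""
        return responses / addresses_supplied

    def opt_in_rate(responses, emails_received):
        """Opt-in denominator: everyone who received the opt-in message,
        not just the students who chose to opt in."""
        return responses / emails_received

    print(opt_out_rate(200, 400))    # 0.5  -> the 50% opt-out example above
    print(opt_in_rate(250, 1000))    # 0.25 -> the 25% opt-in example above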

What do you do when schools refuse to provide e-mail addresses for the student survey and decline to use the other alternatives available to them?

We attempt to obtain student e-mail addresses using other legal means. These means include, but are not limited to, sending an e-mail to individual students and asking them to forward it to their friends, and taking out ads in student newspapers directing students to the survey site. If BusinessWeek is unable to obtain sufficient e-mail addresses and an adequate response rate, such schools will not be ranked.

For the student survey, how do you "fill in" historical data for schools that have never been surveyed before?

BusinessWeek employs the services of statisticians David M. Rindskopf and Alan L. Gross, professors of educational psychology at City University of New York Graduate Center.

Using statistical regression equations and the survey results from schools with complete data, Gross and Rindskopf are able to provide estimates of survey results from previous years.
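
The exact models are not published. One standard approach consistent with that description is to regress prior-year scores on current-year scores for schools that have both, then apply the fitted line to newly surveyed schools; the sketch below assumes that simple linear form:

    import numpy as np

    def impute_prior_scores(current, prior):
        """Fit prior-year score = a * current-year score + b on schools with
        complete data, then predict prior-year scores for new schools.

        current -- {school: current-survey score} for all schools
        prior   -- {school: prior-survey score} for previously surveyed schools
        """
        known = [s for s in current if s in prior]
        x = np.array([current[s] for s in known])
        y = np.array([prior[s] for s in known])
        a, b = np.polyfit(x, y, 1)  # ordinary least squares, degree-1 fit
        return {s: a * current[s] + b for s in current if s not in prior}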

How do you prevent cheating?

Statisticians David M. Rindskopf and Alan L. Gross, professors of educational psychology at the City University of New York Graduate Center, use a series of statistical analyses to test the responses for patterns that have a low probability of occurring if the students are answering the questions honestly. Questionable responses that might be the result of coaching by school officials or other forms of cheating are discarded and may be grounds for eliminating a school from the ranking.

Why do you only rank 25 EMBA programs in the magazine?

Space constraints prevent us from listing more than 25 programs.

In the table that accompanies the ranking story, where do the letter grades come from?

In the EMBA table, we typically include three letter grades for each ranked program (teaching, curriculum, and support), although this varies from ranking to ranking. They represent the EMBA graduates' assessment of those aspects of their program. Each grade is based on one or more questions in the student survey pertaining to teaching, curriculum, or support. The top 20% of programs in each category earn A+s, the next 25% receive As, the next 35% receive Bs, and the bottom 20% get Cs; no Ds or Fs are awarded. The questions used for the letter grades do not represent the survey in its entirety, so it is possible for a highly ranked program to receive one or more low letter grades, and for a poorly ranked program to receive one or more high letter grades.
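
A sketch of that bucketing, assuming each school has a single numeric score per category where higher is better (the rounding at bucket boundaries is an assumption):

    def letter_grades(scores):
        """Assign grades by rank: top 20% A+, next 25% A, next 35% B,
        bottom 20% C. `scores` maps school -> category score (higher is better)."""
        ranked = sorted(scores, key=scores.get, reverse=True)
        n = len(ranked)
        cutoffs = [(round(0.20 * n), "A+"), (round(0.45 * n), "A"),
                   (round(0.80 * n), "B"), (n, "C")]
        return {school: next(g for cut, g in cutoffs if i < cut)
                for i, school in enumerate(ranked)}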

What role, if any, do schools play in the surveys, beyond providing e-mail addresses?

The schools have no other role in the rankings. However, they do complete surveys of their own to provide statistical data. That data is then used to create an online profile for each program that appears on BusinessWeek.com.

Do the schools have any input into the content of the surveys? Is the student survey ever provided to schools?

The surveys are prepared by BusinessWeek. Schools, while they may provide input from time to time, do not decide which questions to ask or how to ask them. This is necessary to maintain the integrity and independence of the ranking process. To prevent schools from coaching students on how to answer the survey, the survey is not made available to schools.

Are schools permitted to communicate with their students about the student survey?

BusinessWeek cannot prevent schools from communicating with their students. However, they should not coach students either directly or through the media—such as student newspapers—on how to answer the survey. Nor should they make any statements that emphasize the importance of a high ranking or in any other way attempt to prevent students from answering the survey honestly. Any evidence of coaching will be taken very seriously by BusinessWeek and may be grounds for eliminating a school from the rankings.

Is the data collected from the schools for the online statistical profiles used in the ranking?

No.

What happens if a school doesn't fill out the survey for the statistical profile by the deadline?

The profile will not be created. If a school only partially completes the survey, any unanswered questions will be filled in with "NA."

How do you find students to interview?

In addition to traditional reporting methods such as campus visits, BusinessWeek will contact students directly via phone or e-mail, but only if they indicate on the survey that they are willing to be interviewed for a story.

