Rankings & Profiles
FAQ: Part-Time Rankings
This update revises the questions on the number of programs listed in the magazine and on schools that do not supply student email addresses, and deletes the questions on response rates, methodology changes, and methods used to prevent cheating.
Answers to frequently asked questions about Bloomberg Businessweek’s rankings of part-time MBA programs
When is the part-time MBA ranking published?
The part-time MBA ranking is published in November in odd-numbered years, at the same time as the Executive MBA and Executive Education rankings.
How does Bloomberg Businessweek determine who is eligible for rankings?
For our first ranking of part-time MBA programs, we extended an invitation to hundreds of programs accredited by AACSB in each of the six geographic regions of the U.S. We asked for information on five quality measures: completion rates, GMAT scores, work experience, selectivity, and tenured faculty. In each region, we calculated averages for each measure and identified the schools that were above average on each measure. Schools that were above average on three of the five measures—or two, as long as one of the two was GMAT scores—were allowed to participate.
Starting with the 2009 ranking, schools are permitted to nominate themselves. They will be deemed eligible if the information they supply with their nomination is comparable with that of the previously ranked programs in their region.
If a school has never been ranked before, how can it be considered for ranking?
The school’s representative should send a note to Geoff Gloeckler in January of the ranking year. We’ll request information about the program and determine eligibility based on that. The information requested is outlined above (see “How does Bloomberg Businessweek determine who is eligible for rankings?”). Please do not send requests for inclusion before Jan. 1 of the ranking year.
What sources of data does Bloomberg Businessweek use to rank part-time MBA programs?
Our methodology has three elements, all derived from two main sources of data: a student survey and a school survey. The three elements are:
1. The student survey: This comprises approximately 50 equally weighted questions measuring every aspect of student satisfaction with the MBA experience—from teaching to course content to career outcomes—as well as additional questions about the person completing the survey.
2. Goals Measure: Through a series of questions in the student survey, we determine the percentage of respondents in three distinct categories who say their MBA program was “completely” important in achieving their goals. The categories are “career advancers” who are seeking career advancement with their current employer; “job switchers” who are seeking career advancement with a new employer in the same industry; and “career changers” who want to change industries, functional areas, or both.
3. Academic Quality: This consists of six equally weighted measures: average GMAT scores for part-time MBA students, average work experience for part-time MBA students, the percentage of all teachers in the part-time MBA program who are tenured faculty, average class size in core business classes, total number of business electives available to part-time MBA students, and the completion rate for students in the part-time MBA program. This information is supplied to Bloomberg Businessweek by the schools themselves, when they complete the school survey.
When do the surveys get distributed? How long are they available for completion?
The student survey is distributed in mid-May and is available for completion through mid-August. The school survey is distributed by email in August and schools have approximately one month to complete it.
How is the student survey conducted?
The student survey is conducted online. Using e-mail addresses supplied by the programs, Bloomberg Businessweek (with the help of Cambria Consulting) contacts students and directs them to a site where they can complete the survey. Bloomberg Businessweek sends out several reminders to ensure an adequate response rate. When the survey is closed, Bloomberg Businessweek calculates the average answer for each question as well as each question’s standard deviation, which are then used to calculate an overall student survey score for each school.
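The article does not spell out how the per-question averages and standard deviations combine into an overall score. One plausible reading, shown here purely as an assumption, is to standardize each school’s mean answer per question against the cross-school mean and standard deviation, then average the resulting z-scores:

```python
from statistics import mean, stdev

def survey_scores(school_means):
    """school_means: {school: [mean answer per question]} (illustrative data).
    Standardizes each question across schools, then averages the z-scores per
    school. This is one plausible reading of 'averages and standard deviations
    used to calculate an overall score' -- not the published formula."""
    n_questions = len(next(iter(school_means.values())))
    scores = {s: 0.0 for s in school_means}
    for q in range(n_questions):
        answers = [means[q] for means in school_means.values()]
        mu, sigma = mean(answers), stdev(answers)
        for school, means in school_means.items():
            scores[school] += (means[q] - mu) / sigma if sigma else 0.0
    return {s: total / n_questions for s, total in scores.items()}
```

Standardizing per question keeps any single question with unusually spread-out answers from dominating the total, which is the usual reason to bring standard deviations into a score like this.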
How do you calculate the academic quality ranking?
Bloomberg Businessweek uses six equally weighted measures using data supplied by the schools: average GMAT scores for part-time MBA students, average work experience for part-time MBA students, the percentage of all teachers in the part-time MBA program who are tenured faculty, average class size in core business classes, total number of business electives available to part-time MBA students, and the completion rate for students in the part-time MBA program. The data are supplied to Bloomberg Businessweek by the schools when they complete the school survey.
For each measure, we split the data set into quintiles, awarding five points for schools in the top quintile, four points for those in the second, three points for those in the third, etc. Schools that fail to report data in a category are placed in the lowest quintile and receive one point. Each school’s academic quality score consists of the sum of the points earned in each of the six categories. The highest score is 30; the lowest is 6.
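The quintile scoring above can be sketched as follows. The school names and figures are illustrative, and one assumption is flagged in the code: the article does not say whether measures where smaller is better (class size, say) are inverted before scoring, so higher is treated as better throughout.

```python
def quintile_points(values_by_school):
    """Map each school's value on one measure to 1-5 points by quintile.
    Schools missing data get 1 point (lowest quintile), per the methodology.
    Assumes higher values are better; a measure like class size would need
    to be negated first."""
    reported = {s: v for s, v in values_by_school.items() if v is not None}
    ranked = sorted(reported, key=reported.get, reverse=True)  # best first
    n = len(ranked)
    points = {}
    for i, school in enumerate(ranked):
        quintile = i * 5 // n            # 0 = top quintile
        points[school] = 5 - quintile    # 5 points down to 1
    for school in values_by_school:
        points.setdefault(school, 1)     # unreported data -> lowest quintile
    return points

def academic_quality_score(measures):
    """measures: a list of {school: value} dicts, one per measure (six in the
    ranking). Sums the quintile points; the range is 6 (all bottom) to 30
    (all top), as stated above."""
    schools = measures[0].keys()
    totals = {s: 0 for s in schools}
    for m in measures:
        pts = quintile_points(m)
        for s in schools:
            totals[s] += pts[s]
    return totals
```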
How are the various factors weighted?
The student survey counts for 40% of the final ranking, the goals measure for 30%, and the academic quality ranking for 30%. Each of the six component parts of the academic quality ranking contributes 5% to the final ranking.
How do you create the six regional rankings?
For each of the three measures—the student survey, the goals measure, and academic quality—we first rank every school, regardless of location. So the school that fares the best in the student survey receives a No. 1 student survey rank, followed by No. 2, No. 3, and so on. The goals measure and academic quality measure are treated the same way. We then combine each school’s three individual rankings into an overall score, which is then used to create a national ranking. Finally, we place each of the schools in ranked order in the appropriate regional ranking.
Northeast includes Connecticut, Maine, Massachusetts, New Hampshire, New York, Rhode Island, and Vermont.
Mid-Atlantic includes Delaware, Maryland, New Jersey, Pennsylvania, Virginia, Washington, D.C., and West Virginia.
Midwest includes Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin.
South includes Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, and Tennessee.
Southwest includes Arizona, Colorado, New Mexico, Oklahoma, and Texas.
West includes Alaska, California, Hawaii, Idaho, Montana, Nevada, Oregon, Utah, Washington, and Wyoming.
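Using the weights given earlier (student survey 40%, goals measure 30%, academic quality 30%), the combination and regional-filtering steps might look like this. The exact formula for turning three ranks into an overall score is not published, so the weighted-rank mechanics here are an assumption:

```python
def national_ranking(schools):
    """schools: {name: (survey_rank, goals_rank, quality_rank)}, where 1 is
    best on each measure. Combines the three ranks using the stated weights
    (0.4 / 0.3 / 0.3) into an overall score; lower is better. The weighting
    of ranks, rather than raw scores, is assumed."""
    def overall(ranks):
        survey, goals, quality = ranks
        return 0.4 * survey + 0.3 * goals + 0.3 * quality
    return sorted(schools, key=lambda s: overall(schools[s]))

def regional_ranking(national_order, state_of, region_states):
    """Filter the national order down to one region, preserving rank order.
    region_states would be one of the six state sets listed above."""
    return [s for s in national_order if state_of[s] in region_states]
```

Because the regional lists are just the national ranking filtered by state, a school’s position relative to schools in other regions never changes its position within its own region.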
Do schools ever get dropped from the rankings? Why?
Yes. If the response rate on the student survey falls below the minimum threshold, the school will be dropped. If one of the initial requirements for consideration is no longer met (for example, if a school loses its accreditation), the school will also be dropped.
Is there a minimum response rate for the student survey? How are the response rate and minimum response rate calculated?
The response rate for each school is calculated by dividing the number of replies by the total number of surveys sent. The minimum response rate is determined after a review of all school responses with a goal of eliminating outliers.
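The arithmetic is straightforward; a sketch, with the threshold value left as an input because the article does not publish the minimum:

```python
def response_rate(replies, surveys_sent):
    """Replies divided by surveys sent, as described above."""
    return replies / surveys_sent

def eligible_schools(counts, minimum):
    """counts: {school: (replies, surveys_sent)}. Keeps schools at or above
    the minimum response rate. The actual minimum is set each cycle after a
    review of all schools' rates, so 'minimum' here is a placeholder input."""
    return {s for s, (r, n) in counts.items() if response_rate(r, n) >= minimum}
```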
What do you do when schools refuse to provide e-mail addresses for the student survey?
Schools that cannot supply email addresses for their part-time MBA students are ineligible to participate in the ranking.
Why do you rank only 25 part-time MBA programs in the magazine?
Space constraints prevent us from listing more than 25 programs, but a complete list of all ranked programs is available on this Web site.
In the table that accompanies the ranking story, where do the letter grades come from?
Each of the letter grades is based on a portion of the student survey. The “teaching quality” grade, for example, would be based on all the questions in the survey pertaining to teaching quality. In all cases, the schools that score in the top 20% in each category earn A+s. The next 25% receive As, the next 35% receive Bs, and the bottom 20% get Cs. There are no Ds or Fs awarded.
In future rankings, Bloomberg Businessweek will incorporate multiple student surveys into the methodology, but the letter grades will always be based on the most recent survey.
Since each grade is based on only a portion of the survey, it’s possible for a poorly ranked school to have one or more high letter grades, and for a highly ranked school to have one or more low letter grades.
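The grade bands above (top 20% A+, next 25% A, next 35% B, bottom 20% C) can be sketched as a percentile assignment. How ties and small cohorts are handled at band boundaries is an assumption; the article gives only the percentages:

```python
def letter_grades(scores):
    """scores: {school: category score from one portion of the survey},
    higher is better. Assigns grades by percentile band: top 20% A+, next
    25% A, next 35% B, bottom 20% C. No Ds or Fs, per the methodology.
    Boundary handling for ties is assumed, not specified in the source."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    grades = {}
    for i, school in enumerate(ranked):
        frac = i / n  # fraction of schools ranked above this one
        if frac < 0.20:
            grades[school] = "A+"
        elif frac < 0.45:   # 20% + 25%
            grades[school] = "A"
        elif frac < 0.80:   # + 35%
            grades[school] = "B"
        else:
            grades[school] = "C"
    return grades
```

Running this independently for each category is what allows the mixed outcomes described above: a school ranked poorly overall can still land in the top 20% of a single category and earn an A+ there.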
What role, if any, do schools play in the student survey, beyond providing e-mail addresses?
The schools have no other role in the survey.
Do the schools have any input into the content of the surveys? Is the student survey ever provided to schools?
The surveys are prepared by Bloomberg Businessweek. Schools, while they may provide input from time to time, do not decide which questions to ask or how to ask them. This is necessary to maintain the integrity and independence of the ranking process. To prevent schools from coaching students on how to answer the survey, it is not made available to schools and is substantively rewritten each year.
Are schools permitted to communicate with students about the student survey?
Bloomberg Businessweek cannot prevent schools from communicating with their students. But they should not coach students either directly or through the media—such as student newspapers—on how to answer the survey. Nor should they make any statements that emphasize the importance of a high ranking or in any other way attempt to prevent students from answering the survey honestly. Any evidence of coaching will be taken seriously by Bloomberg Businessweek and may be grounds for eliminating a school from the rankings.
Is data collected from the schools for the online statistical profiles used in the ranking?
No. While some of the data collected from schools for their online profiles is the same as that used for the ranking, Bloomberg Businessweek collects the data needed for the ranking separately, in an email survey distributed to schools in August. The information collected in this email survey includes: average GMAT scores for part-time MBA students, average work experience for part-time MBA students, the percentage of all teachers in the part-time MBA program who are tenured faculty, average class size in core business classes, total number of business electives available to part-time MBA students, and the completion rate for students in the part-time MBA program.
What happens if a school doesn’t fill out the survey for the statistical profile by the deadline?
The profile will not be created. If the school only partially completes the survey, the questions not answered will be filled with NAs. Bloomberg Businessweek reserves the right not to publish profiles that are substantially incomplete.
How do you find students to interview?
In addition to traditional reporting methods such as campus visits, Bloomberg Businessweek will contact students directly via phone or e-mail, but only if they indicate on the student survey that they are willing to be interviewed for a story.