
FAQ: Undergrad Business Program Ranking


Answers to your burning questions about Bloomberg Businessweek's annual undergraduate business program rankings

When is the Undergraduate Business Program ranking published?

How does Bloomberg Businessweek determine who is eligible for rankings?

If a school has never been ranked before, how does it get considered for ranking?

What sources of data does Bloomberg Businessweek use to rank programs?

When does each of the surveys get distributed? How long are they available for completion?

How is the student survey conducted?

How is the employer survey conducted?

How do you calculate starting salaries for undergraduates?

How do you determine which schools give graduates a better chance of getting into a top-ranked MBA program?

How do you measure educational quality?

How is each of the factors weighted?

Do schools ever get dropped from the rankings? If so, why?

Is there a minimum response rate for the student survey? How are the response rate and minimum response rate calculated?

What alternatives are there for schools that do not want to supply student e-mail addresses?

How is the response rate calculated for schools that choose the “opt-out” method?

How is the response rate calculated for schools that choose the “opt-in” method?

What do you do when schools refuse to provide e-mail addresses for the student survey and decline to use the other alternatives available to them?

How do you prevent cheating?

Why do you rank fewer undergraduate business programs in the magazine than you do online?

In the table that accompanies the ranking story, where do the letter grades come from?

What is the Index Number supplied in the table?

Why do you make a distinction between public and private schools?

Why do you make a distinction between two-year and four-year programs?

What role, if any, do schools play in the surveys, beyond providing e-mail addresses for the student survey or distributing that survey on Bloomberg Businessweek's behalf?

Do the schools have any input into the content of the surveys? Is the student survey ever provided to schools?

Are schools permitted to communicate with their students about the student survey?

Are the data collected from the schools for the online statistical profiles used in the ranking?

What happens if a school doesn’t fill out the survey for the statistical profile by the deadline?

How do you find students to interview?

When is the Undergraduate Business Program ranking published?

Every year in March.

How does Bloomberg Businessweek determine who is eligible for rankings?

To be eligible, schools must have an accredited undergraduate business degree program that meets our criteria for program size, age, test scores, grade point averages for business majors, and number of full-time tenured faculty, among other things.

If a school has never been ranked before, how does it get considered for ranking?

Complete our online submission form. Bloomberg Businessweek will consider your answers to those questions to determine eligibility.

What sources of data does Bloomberg Businessweek use to rank programs?

There are five sources for the undergraduate ranking: a student survey, a recruiter survey, median starting salaries for graduates, the number of graduates admitted to all ranked MBA programs, and an academic quality measure that consists of SAT/ACT test scores for business majors, full-time faculty-student ratios in the business program, average class size in core business classes, the percentage of business majors with internships, and the number of hours students spend preparing for class each week. The test scores, faculty-student ratio, and class size information come from a survey to be completed by participating schools; the internship and hours of preparation data come from the student survey.

When does each of the surveys get distributed? How long are they available for completion?

The student survey is distributed in November and remains live for approximately three months; the recruiter survey is distributed in December and remains live for two months. The school survey is distributed in January, and the schools have approximately six weeks to complete it. The details of survey distribution may change in future years.

How is the student survey conducted?

It’s conducted online. Using e-mail addresses supplied by the programs, Bloomberg Businessweek and Cambria Consulting contact students and direct them to a site where they can complete the survey. Bloomberg Businessweek sends out several reminders to ensure an adequate response rate.

The survey consists of about 50 questions that ask students to rate their programs on teaching quality, career services, alumni network, and recruiting efforts, among other things. Using the average answer for each of the questions and each question’s standard deviation, we calculate a student survey score for each school.

Next, for each school, we combine that score with those from the two previous rankings to calculate each school’s overall student survey score. (For schools that did not participate in one or more previous surveys, Bloomberg Businessweek employs the services of statisticians David Rindskopf and Alan Gross, professors of educational psychology at City University of New York Graduate Center. They calculate estimates of the missing scores using statistical regressions and the survey results from schools with complete data.) The most recent year’s survey counts for 50% of the school’s overall student survey score; the two previous years count for 25% each.
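As a rough illustration, the 50/25/25 weighting amounts to the following. This is a minimal sketch: the per-question scoring and the regression imputation of missing years are not shown, and the function name is ours, not Bloomberg Businessweek's.

```python
def overall_student_score(current_year, prev_year, two_years_ago):
    """Combine three years of student survey scores: the most recent
    year counts for 50%, the two previous years for 25% each."""
    return 0.5 * current_year + 0.25 * prev_year + 0.25 * two_years_ago
```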

How is the employer survey conducted?

The employer survey, like the student survey, is conducted online. Starting with e-mail addresses supplied by the programs, Bloomberg Businessweek creates a list of companies recruiting from the programs and identifies a single high-level recruiting contact at each company. This means that not every recruiter supplied by every school will be contacted. Then, with the help of Cambria Consulting, Bloomberg Businessweek contacts the recruiters and directs them to a site where they can complete the survey. Bloomberg Businessweek sends out several reminders or calls recruiters to ensure an adequate response rate.

Every company tells us how many undergraduate business majors it hired in the previous two years and which schools it actively recruits from, and it ranks up to 20 top schools.

To calculate each school's recruiter score, we first use the rankings to determine each school's recruiter points, awarding 20 points for every No. 1 ranking, 19 points for every No. 2 ranking, and so on. We then calculate a numerator: the sum, across recruiters, of the points each recruiter awarded the school multiplied by the number of undergraduate business majors that recruiter hired. We then calculate a denominator: the sum, across all recruiters who identify the school as a recruiting location, of the number of undergraduate business majors each of those recruiters hired. Finally, for each school, we divide the numerator by the denominator.

In 2012, for the first time, we modified the way the employer survey results are tabulated. First, we calculated the three-year average for the number of times a school was mentioned by employers and found the median for all the schools. Where a school's average number of mentions was greater than or equal to the median value, the recruiter score was left unadjusted. Where the average number of mentions was less than the median value, the recruiter score was multiplied by the ratio of the average number of mentions to the median number of mentions. As a result, schools with a few highly positive mentions no longer have an advantage over schools with many mentions.
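Read together, the two paragraphs above describe a calculation along these lines. This is our hedged reconstruction, not Bloomberg Businessweek's actual code; the function names and data structures are illustrative.

```python
def recruiter_score(ranks, mentions, hires):
    """ranks: {recruiter: position (1-20) this school received from that
    recruiter}; mentions: recruiters listing the school as a recruiting
    location; hires: {recruiter: business majors hired in the prior two years}."""
    # 20 points for a No. 1 ranking, 19 for No. 2, ..., 1 for No. 20.
    numerator = sum((21 - pos) * hires[r] for r, pos in ranks.items())
    # Every mention counts in the denominator, weighted by that
    # recruiter's hiring volume.
    denominator = sum(hires[r] for r in mentions)
    return numerator / denominator

def adjust_for_mentions(score, avg_mentions, median_mentions):
    """The 2012 adjustment: schools mentioned less often than the median
    have their score scaled down proportionally."""
    if avg_mentions >= median_mentions:
        return score
    return score * (avg_mentions / median_mentions)
```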

How do you calculate starting salaries for undergraduates?

We ask schools to supply median starting salaries, excluding signing bonuses and other compensation, for students who are employed at graduation. We use the median rather than the average so that low salaries for graduates who enter nonprofit or other low-paying occupations do not skew the figure downward.

How do you determine which schools give graduates a better chance of getting into a top-ranked MBA program?

In the student survey used for Bloomberg Businessweek's MBA rankings, we ask students to indicate where they received their undergraduate degree and their undergraduate major. We also ask them which MBA program they're attending.

To determine which undergraduate programs give students a better chance of getting into a top-ranked MBA program, we first examine the data from our most recent MBA student surveys. We isolate those students attending the 35 MBA programs that received a top ranking from Bloomberg Businessweek at any time since 1994 who also received undergraduate business degrees from one of the programs we're ranking. We then determine how many survey respondents from each undergraduate program are enrolled in those MBA programs as a group, and adjust that count for the size of each undergraduate program's graduating class. The result is the basis for the "feeder school" measure used in the ranking.
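In other words, the measure is a per-capita count. The exact size adjustment is not spelled out in the FAQ, so simple division is an assumption in this sketch, and the function name is ours.

```python
def feeder_measure(respondents_in_top_mba_programs, graduating_class_size):
    # Raw respondent counts are adjusted for the size of each
    # undergraduate program's graduating class.
    return respondents_in_top_mba_programs / graduating_class_size
```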

How do you measure educational quality?

Bloomberg Businessweek uses five equally weighted sets of data: average SAT/ACT scores for business majors, the full-time faculty/student ratio in the business program, average class size in core business classes, the percentage of business majors with internships, and the number of hours students spend preparing for class each week. Test scores, the faculty/student ratio, and class size information are provided by the schools; the internship data and class preparation information are derived from the student survey.

For each measure, we split the data set into quintiles, awarding five points to schools in the top quintile, four points to those in the second, three to those in the third, two to those in the fourth, and one to those in the fifth. Schools that fail to report data in a category are placed in the lowest quintile and receive one point. Each school's academic quality score is the sum of the points earned in the five categories, so the highest possible score is 25 and the lowest is 5.
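A minimal sketch of the quintile scoring for a single measure, assuming higher values are better; tie handling and the exact quintile boundaries are our assumptions.

```python
def quintile_points(values):
    """Map each school's value for one measure to 1-5 points by quintile.
    Missing data (None) falls into the bottom quintile and earns 1 point."""
    reported = sorted(v for v in values if v is not None)
    n = len(reported)

    def points(v):
        if v is None:
            return 1
        pct = reported.index(v) / n      # 0.0 = lowest, near 1.0 = highest
        return min(int(pct * 5), 4) + 1  # quintiles map to 1..5 points

    return [points(v) for v in values]

# A school's academic quality score is the sum of its points across the
# five measures, so scores range from 5 to 25.
```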

How is each of the factors weighted?

The combined student survey score counts for 30% of the final ranking. The recruiter survey score counts for 20%. Starting salaries count for 10%. The feeder school measure counts for 10%. And the quality measure counts for 30%.

Do schools ever get dropped from the rankings? If so, why?

Yes. If a school's response rate for the student or recruiter survey falls below the minimum threshold, the school will be dropped. If one of the initial requirements for rankings consideration is no longer met (for example, if a school loses its accreditation or its enrollment falls below our threshold), the school will be dropped. Schools may also be eliminated from the ranking if Bloomberg Businessweek determines that school officials improperly attempted to influence the outcome by coaching students on how to complete the survey.

Is there a minimum response rate for the student survey? How are the response rate and minimum response rate calculated?

The response rate for each school is calculated by dividing the number of replies by the total number of surveys sent. The minimum response rate is determined after a review of all school response rates with a goal of eliminating outliers.

What alternatives are there for schools that do not want to supply student e-mail addresses?

Schools have two alternatives. They can give students the opportunity to “opt out” of the survey, then supply Bloomberg Businessweek with the e-mail addresses for those who remain. A second alternative, known as the “opt-in” method, is permitted but strongly discouraged: schools send students an e-mail about the survey and give Bloomberg Businessweek a list of e-mail addresses for those who permit the release of that information.

In extremely limited cases, such as where state law prohibits the release of student e-mail addresses, Bloomberg Businessweek will, at its discretion, permit schools to distribute the survey to students on its behalf. Schools that use this method must agree to distribute the survey and three reminders to students on a timetable set by Bloomberg Businessweek.

How is the response rate calculated for schools that choose the “opt-out” method?

The response rate is calculated using the number of e-mail addresses supplied to Bloomberg Businessweek. If 500 students are given the opportunity to opt out and 100 do, the school would supply 400 e-mail addresses; if we survey those 400 students and 200 complete the survey, the response rate is 50%.

How is the response rate calculated for schools that choose the “opt-in” method?

Bloomberg Businessweek divides the number of responses by the number of students who received the “opt-in” e-mail. If 1,000 students receive the message and 500 opt in, 250 survey responses would give that school a response rate of 25%, not 50%. Schools with response rates that fall below the minimum will not be ranked.

What do you do when schools refuse to provide e-mail addresses for the student survey and decline to use the other alternatives available to them?

We attempt to obtain student e-mail addresses using other legal means. These means include, but are not limited to, sending an e-mail to individual students and asking them to forward it to their friends, and taking out ads in student newspapers directing students to the survey site. If Bloomberg Businessweek is unable to obtain sufficient e-mail addresses and an adequate response rate, such schools will not be ranked.

How do you prevent cheating?

Statisticians David Rindskopf and Alan Gross, professors of educational psychology at City University of New York Graduate Center, use a series of statistical analyses to test the responses for patterns that have a low probability of occurring if the students are answering the questions honestly. Questionable responses that might be the result of coaching by school officials or other forms of cheating are discarded, and may be grounds for elimination from the ranking.

Why do you rank fewer undergraduate business programs in the magazine than you do online?

Space constraints prevent us from publishing the full list in the magazine. The full rankings are online at BusinessWeek.com.

In the table that accompanies the ranking story, where do the letter grades come from?

In some tables, either in the magazine or online, Bloomberg Businessweek includes several letter grades for each ranked program. They represent students' assessments of particular aspects of their programs. Each grade is based on one or more questions in the student survey; for example, the "teaching quality" grade is based on student answers to questions concerning teaching quality in business courses and teaching quality in nonbusiness courses.

The top 20% in each category earn A+s, the next 25% receive As, the next 35% receive Bs, and the bottom 20% get Cs. No Ds or Fs are awarded.

The questions used for the letter grades do not represent the survey in its entirety, so it is possible for a highly ranked program to receive one or more low letter grades, and for a poorly ranked program to receive one or more high letter grades.

The letter grade for "teaching quality" and the academic quality score used in the ranking are two different things: the former represents student perceptions of teaching quality, while the latter comprises five equally weighted measures of program quality (see "How do you measure educational quality?"). It is possible for a school to receive a high academic quality rank and a low grade in teaching quality, or a low academic quality rank and a high grade in teaching quality.
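The percentage cutoffs translate directly into code. This is a sketch; how ties at the boundaries are handled is our assumption.

```python
def letter_grade(percentile):
    """percentile: a school's position within one category, from 0.0
    (worst) to 1.0 (best)."""
    if percentile >= 0.80:  # top 20%
        return "A+"
    if percentile >= 0.55:  # next 25%
        return "A"
    if percentile >= 0.20:  # next 35%
        return "B"
    return "C"              # bottom 20%; no Ds or Fs are awarded
```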

What is the Index Number supplied in the table?

This is the number on which the ranking is based. It allows students, parents, and schools to determine relative differences between ranked programs. For example, the difference between a No. 1 ranked program with an index number of 100 and a No. 2 ranked program with an index number of 99 is negligible. However, the difference between two programs with index numbers of 94 and 89 is substantial. The index number is determined by adding the standardized scores for all five ranking measures–student survey, recruiter survey, starting salaries, MBA feeder school rank, and academic quality rank–for each school. That number is then converted into a 100-point scale.
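Combining this description with the weights given under "How is each of the factors weighted?", the index number can be sketched as follows. The FAQ does not fully specify the standardization, the role of the weights in the sum, or the conversion to a 100-point scale, so z-scores, weighted summation, and min-max rescaling are all assumptions here.

```python
from statistics import mean, pstdev

# Weights from "How is each of the factors weighted?"
WEIGHTS = {"student": 0.30, "recruiter": 0.20, "salary": 0.10,
           "feeder": 0.10, "quality": 0.30}

def index_numbers(raw):
    """raw: {measure: [one value per school]} -> 100-point index numbers."""
    # Standardize each measure across schools (z-scores are an assumption).
    z = {m: [(v - mean(vals)) / pstdev(vals) for v in vals]
         for m, vals in raw.items()}
    n_schools = len(next(iter(raw.values())))
    totals = [sum(WEIGHTS[m] * z[m][i] for m in WEIGHTS)
              for i in range(n_schools)]
    # Rescale so the No. 1 school's index is 100; min-max scaling is one
    # plausible reading of "converted into a 100-point scale".
    lo, hi = min(totals), max(totals)
    return [100 * (t - lo) / (hi - lo) for t in totals]
```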

Why do you make a distinction between public and private schools?

Public schools typically operate under rules that require them to accept a large percentage of applicants. Because our methodology uses data on test scores, among other things, we identify public and private schools so that readers can take that into consideration.

Why do you make a distinction between two-year and four-year programs?

Because our methodology relies in part on class size data that may put some four-year programs at a disadvantage, we supply this information so that readers can take that into consideration.

What role, if any, do schools play in the surveys, beyond providing e-mail addresses for the student survey or distributing that survey on Bloomberg Businessweek's behalf?

The schools have no other role. However, they do complete surveys of their own to provide statistical data. Those data are then used to create an online profile for each program that appears on BusinessWeek.com.

Do the schools have any input into the content of the surveys? Is the student survey ever provided to schools?

The surveys are prepared by Bloomberg Businessweek. The schools, while they may provide input from time to time, do not decide which questions to ask or how to ask them. This is necessary to maintain the integrity and independence of the ranking process. To prevent schools from coaching students or recruiters on how to answer the surveys, neither survey is made available to the schools.

Are schools permitted to communicate with their students about the student survey?

Bloomberg Businessweek cannot prevent schools from communicating with their students. However, they should not coach students, either directly or through the media, such as student newspapers, on how to answer the survey. Nor should they make any statements that emphasize the importance of a high ranking or in any other way attempt to prevent students from answering the survey honestly. Any evidence of coaching will be taken seriously by Bloomberg Businessweek and may be grounds for eliminating a school from the rankings.

Are the data collected from the schools for the online statistical profiles used in the ranking?

No. While some of the questions in the survey that schools complete for their online statistical profiles are similar to those used to calculate each school's Academic Quality score, the data used for the Academic Quality score do not come from that survey. The data used for the ranking come from a separate e-mail survey of approximately five questions that is distributed in February to schools participating in the ranking.

What happens if a school doesn’t fill out the survey for the statistical profile by the deadline?

The profile will not be created. If the survey is partially completed, those questions left unanswered will be filled in with NAs.

How do you find students to interview?

In addition to traditional reporting methods such as campus visits, Bloomberg Businessweek will contact students directly via phone or e-mail, but only if they indicate on the survey that they are willing to be interviewed for a story.

