If you plotted the height and weight of hundreds of men on a graph, you’d find that the taller men generally weighed more. You could even figure out an average weight for each height. But you couldn’t take one man’s height and depend on the chart average to accurately tell you how much he weighs. Yet that’s what the GRE comparison tool recently published by ETS tries to do with GRE and GMAT scores.
This GRE comparison tool is not as precise as it may appear, and using it is not as straightforward as presented. The comparison tool is about averages. Admission decisions are about individuals.
ETS built its GMAT prediction grid from a statistical analysis of the scores of 525 people who took both tests, and it acknowledges a standard error of prediction of 67.4 points. This is an extremely large error, and it raises concern about the fairness of using the GRE to predict individual GMAT scores in the admission process.
A standard error of prediction of 67.4 points means there is only a 17.6 percent chance that the predicted GMAT score will match the score the test taker would actually earn on the GMAT exam, or come within 10 points of it. A far greater share of predicted scores will be much higher or much lower than the test taker's actual score.
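The 17.6 percent figure can be checked under a simple normal model of the prediction error, a sketch assuming the error is normally distributed with the 67.4-point standard deviation cited above (GMAT Total scores are reported in 10-point increments, so landing "within 10 points" of the prediction corresponds to a band of about plus or minus 15 points on the underlying scale):

```python
from math import erf, sqrt

SE_PRED = 67.4  # ETS's reported standard error of prediction, in GMAT Total points

def prob_within(band, se):
    """P(|prediction error| <= band), assuming a normal error distribution."""
    return erf(band / (se * sqrt(2)))

# GMAT Total scores come in 10-point steps, so a reported score that is
# "exactly the same or 10 points higher or lower" than the prediction
# covers roughly a +/-15-point band of the continuous scale.
p = prob_within(15, SE_PRED)
print(f"{p:.1%}")  # roughly 17.6%
```

The result matches the article's figure, which suggests this is how the percentage was derived; the normal assumption is ours, not ETS's stated method.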
As a specific example, for a GRE verbal score of 660 and quantitative score of 670, the tool would predict a GMAT Total score of 650. In this case, 1 in 4 people with this predicted score would actually earn 600 or below if they were to take the GMAT exam. In addition to prediction error, there is also measurement error in both the verbal and quantitative GRE scores, so the chance that this individual would actually score close to 650 is slim.
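The "1 in 4" claim follows from the same normal error model, again a sketch under our assumption of normally distributed prediction error with the 67.4-point standard deviation (since scores fall in 10-point steps, "600 or below" covers everything under about 605 on the continuous scale):

```python
from math import erf, sqrt

SE_PRED = 67.4  # ETS's reported standard error of prediction

def normal_cdf(x, mean, sd):
    """P(score <= x) for a normal distribution with the given mean and sd."""
    return 0.5 * (1 + erf((x - mean) / (sd * sqrt(2))))

# Predicted GMAT Total of 650; chance the actual score is 600 or below.
# Reported scores come in 10-point steps, so "600 or below" corresponds
# to everything under ~605 on the continuous scale.
p = normal_cdf(605, 650, SE_PRED)
print(f"{p:.2f}")  # roughly 0.25, i.e. about 1 in 4
```

Note that this accounts only for the prediction error; the additional measurement error in the GRE scores themselves would spread the outcomes further still.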
If a person’s true performance on the GMAT exam is 650, the chances are more than 90 percent that his score will be in the 610-700 range each time he takes the GMAT test.
Decades of peer-reviewed research show that the GMAT exam is a valid predictor of academic performance in the core curriculum of graduate business schools; no such research exists for the GRE in business study. Using predicted GMAT scores alongside actual ones unfairly penalizes both sets of test takers, because applicants with valid GMAT scores could be displaced by applicants with predicted scores that are much too high.
The GMAT exam was developed more than 50 years ago because a consortium of business schools saw the need for a fair, reliable, and valid measure of the specific academic skills needed in graduate business school. With the globalization of both demand and delivery of graduate management education, the importance of a single standard measure of academic skill has only grown.
The GRE and GMAT exams are different tests, measuring different content. No conversion table will ever make them equivalent.
The best way to predict a GMAT score is to use the GMAT exam.