Frank L. Schmidt

*April 29, 1944, †August 21, 2021

The Biography below is reprinted from Frank’s APA Foundation Gold Medal Award Citation.

Frank L. Schmidt was born in Louisville, Kentucky, in 1944, the oldest of six children in a farm family. In high school he had strong interests in biology and evolutionary theory, and he had a semester’s advanced placement in biology when he entered Bellarmine College in 1962. As a sophomore, Schmidt became interested in applied psychology, especially psychological measurement. He graduated cum laude in psychology and was awarded a Woodrow Wilson graduate fellowship. He received his doctoral degree in industrial/organizational (I/O) psychology from Purdue University in 1970. His interest in the complexities of data analysis began with his dissertation, which showed via computer simulation that regression weights estimated on sample sizes typical of psychological research performed more poorly than simple equal weights in predicting dependent variables. This led him to suspect that psychologists were routinely reading more into their data than was in fact there. This interest continued and eventually led to his work in the development of meta-analysis methods and his critiques of statistical significance testing.

During Schmidt’s first year on the faculty at Michigan State University, he and Jack Hunter began a research collaboration that lasted until Hunter’s death in 2002. Over the years they jointly received three major awards for their work. Schmidt was promoted and given tenure after only three years at Michigan State, but a desire for real-world applications of psychology led him to take a research position at the Personnel Research and Development Center at the U.S. Civil Service Commission in 1974. As the largest practitioner of personnel selection in the world, the Civil Service Commission (now known as the Office of Personnel Management) provided many interesting applied and basic research problems, along with time free of academic duties to focus on them. A great deal of Schmidt’s research was conducted during this period. In 1985 he accepted a chaired professorship in the Tippie College of Business at the University of Iowa, where he served as professor emeritus after retiring in 2012.

Schmidt published a total of 198 journal articles and book chapters*, over 100 of them in top-tier journals such as Psychological Bulletin, Journal of Applied Psychology, Organizational Behavior and Human Decision Processes, and Personnel Psychology. His work has been cited over 25,000 times, and his h-index is 67 (that is, 67 articles with at least 67 citations each) according to the Publish or Perish software program; according to the Web of Science database, his work has been cited about 10,000 times. The impact of Schmidt’s many research contributions is reflected in the fact that he received the highest research awards from the American Psychological Association, the Association for Psychological Science, the Society for Industrial/Organizational Psychology, the Society for Human Resource Management, and both the Research Methods Division and the Human Resources Division of the Academy of Management. He also received the Ingram Olkin Award from the Society for Research Synthesis Methodology for his contributions to meta-analysis methods. All of these awards are for lifetime contributions to the application of psychology.

Two of Schmidt’s research contributions stand out in terms of their impact on applications of psychology: validity generalization and meta-analysis. An important research focus in I/O psychology is employment decisions—decisions about whom to hire, promote, transfer, and so forth. Selection procedures used to help make these decisions include cognitive ability and aptitude tests, job knowledge tests, interviews, personality measures, and others. The numerous research studies conducted during the 20th century on the validity of such procedures produced contradictory findings. This led to the belief that the validity of employment methods was “situationally specific” and that no general conclusions about successful prediction of employee performance were possible. This conclusion meant that a new validity study would have to be conducted in every setting.

Schmidt, working with others, developed data analysis methods that looked across multiple studies and tested whether these conflicting research findings were real or artifactual (i.e., due to various statistical and measurement artifacts such as sampling error, measurement error, and range restriction). Application of these methods, called validity generalization (VG) methods, revealed in almost every case that the conflicts were artifactual rather than real and that tests, interviews, assessment centers, and other employment tools were consistently valid across jobs, occupations, and organizations. A particularly striking finding was that reliable measures of general cognitive ability (i.e., general intelligence) predicted performance on virtually all jobs.
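The core of the VG logic described above can be sketched in a few lines: compare the observed variance of validity coefficients across studies with the variance that sampling error alone would produce. The sketch below is an illustration only, using the standard "bare-bones" formulas associated with Hunter and Schmidt's approach; the function name and the example data are hypothetical, not taken from any particular study.

```python
def bare_bones_meta(rs, ns):
    """Bare-bones meta-analysis sketch (illustrative).

    rs: observed validity correlations, one per study
    ns: corresponding study sample sizes
    Returns (mean validity, observed variance, sampling-error
    variance, residual variance).
    """
    total_n = sum(ns)
    # Sample-size-weighted mean validity across studies
    r_bar = sum(n * r for r, n in zip(rs, ns)) / total_n
    # Weighted observed variance of validities across studies
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / total_n
    # Variance expected from sampling error alone, given the
    # average study sample size
    n_bar = total_n / len(ns)
    var_err = (1 - r_bar ** 2) ** 2 / (n_bar - 1)
    # Residual variance: what remains after removing sampling error
    var_res = max(var_obs - var_err, 0.0)
    return r_bar, var_obs, var_err, var_res
```

When the residual variance is at or near zero, the apparent conflicts among studies are attributable to sampling error, and the validity estimate generalizes across settings; this is the pattern Schmidt and his colleagues found in almost every case.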

Over the years, hundreds of such VG studies have been conducted and published, changing the way researchers and practitioners think about selection methods in what amounts to a paradigm shift in employment selection. These findings changed employment selection practices in many industries (e.g., petroleum and insurance), corporations (e.g., Ford Motor Co.), the U.S. government (e.g., the Office of Personnel Management and the U.S. Job Service), state governments, the U.S. military, and other organizations.

Schmidt’s research, through the meta-analytic methods he developed with the late Jack Hunter, has had a major impact on areas of research well beyond employment selection. Just as studies of selection methods had produced conflicting findings, so too did studies in many other areas, making general conclusions seem equally out of reach there. Schmidt and his associates adapted the VG methods for use in any research area and explicated these methods in a series of three books on meta-analysis published in 1982, 1990, and 2004 (another edition is now in press).

In most applications of these methods, the results have shown that the research studies were not nearly as conflicting and contradictory as they first appeared to be and that general conclusions were indeed possible. These methods have been applied in numerous areas, resulting in major changes in conclusions from the research within each area. The number of applications outside of personnel selection now far exceeds the number in personnel selection. The methods of meta-analysis developed by Schmidt and his associates are more complete than other available meta-analysis methods in that they correct for biases in data created by measurement error, range restriction, and other study artifacts. For this reason, this approach is referred to as psychometric meta-analysis.
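One artifact correction that distinguishes psychometric meta-analysis is the classic correction for attenuation due to measurement error: an observed correlation is divided by the square root of the product of the two measures' reliabilities. The sketch below illustrates that textbook formula only; the function name and the reliability values are hypothetical examples, not figures from Schmidt's work.

```python
import math

def disattenuate(r_observed, rxx, ryy):
    """Correct an observed correlation for measurement error.

    rxx, ryy: reliability estimates for the predictor and the
    criterion measures. Returns the estimated correlation
    between true scores (classic correction for attenuation).
    """
    return r_observed / math.sqrt(rxx * ryy)

# Hypothetical example: an observed validity of .30 with predictor
# reliability .80 and criterion reliability .60
r_true = disattenuate(0.30, 0.80, 0.60)
```

Because measurement error and range restriction both bias observed correlations downward, applying such corrections across a set of studies typically yields validity estimates that are larger and more consistent than the uncorrected ones.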

*as of 2013