What Makes A Law Student Succeed Or Fail? A Longitudinal Study Correlating Law Student Applicant Data And Law School Outcomes
University of Colorado Law School
July 6, 2015
Despite the rise of “big data” empiricism, law school admission remains heavily impressionistic: decisions rest on anecdotes about recent students, idiosyncratic preferences for certain majors or jobs, or, above all, the Law School Admission Test (LSAT). Yet no predictors are well-validated; studies of the LSAT and other factors fail to control for college quality, major, work experience, and the like. The lack of evidence about what actually predicts law school success is especially striking after the 2010s downturn left schools competing for fewer applicants and left potential students less sure of law school as a path to future success. We aim to fill this gap with a two-school, 1,400-student longitudinal study covering 2005–2012. After coding non-digitized applicant data, we used multivariate regression analysis to predict law school grades (“LGPA”) from many variables: LSAT; college grades (“UGPA”), quality, and major; UGPA trajectory; employment duration and type (legal, scientific, military, teaching, etc.); college leadership; prior graduate degree; criminal or disciplinary record; and variable interactions (e.g., high-LSAT/low-UGPA or vice versa).
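As a rough illustration of this kind of multivariate regression, the sketch below fits an ordinary least squares model on synthetic data; the variable names, effect sizes, and interaction term are hypothetical assumptions for exposition, not the study's actual dataset or specification.

```python
import numpy as np

# Sketch of a multivariate regression predicting law school GPA (LGPA)
# from applicant variables. All data and coefficients are synthetic
# assumptions for illustration only.
rng = np.random.default_rng(0)
n = 1400  # cohort size matching the study's scale

lsat = rng.normal(158, 6, n)        # LSAT score
ugpa = rng.normal(3.4, 0.3, n)      # undergraduate GPA
stem = rng.integers(0, 2, n)        # 1 = STEM or EAF major
record = rng.binomial(1, 0.05, n)   # 1 = criminal/disciplinary record

# Center the continuous predictors before forming the interaction term
# so the interaction is not collinear with the main effects.
lsat_c, ugpa_c = lsat - lsat.mean(), ugpa - ugpa.mean()

# Design matrix: intercept, main effects, and an LSAT x UGPA interaction
X = np.column_stack([np.ones(n), lsat_c, ugpa_c, stem, record,
                     lsat_c * ugpa_c])

# Assumed "true" coefficients used only to simulate outcomes
beta = np.array([3.0, 0.02, 0.8, 0.07, -0.15, 0.0])
lgpa = X @ beta + rng.normal(0, 0.3, n)

# Ordinary least squares fit
coef, *_ = np.linalg.lstsq(X, lgpa, rcond=None)

# A non-LSAT coefficient can be restated in "LSAT-point equivalents" by
# dividing it by the LSAT coefficient, which is how findings of the form
# "akin to X extra LSAT points" can be expressed.
stem_in_lsat_points = coef[3] / coef[1]
print(f"STEM/EAF effect ~= {stem_in_lsat_points:.1f} LSAT points")
```

With 1,400 observations, the fitted coefficients closely recover the simulated ones; the same design-matrix pattern extends to the study's other predictors (employment type, leadership, prior degree, and so on) as additional columns.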
Our results include not only new findings about how to balance LSAT and UGPA, but also the first findings that college quality, major, work experience, and other traits are significant predictors: (1) controlling for other variables, the LSAT predicts more weakly, and UGPA more powerfully, than commonly assumed – and a high-LSAT/low-UGPA profile may predict worse than the opposite; (2) a STEM (science, technology, engineering, math) or EAF (economics, accounting, finance) major is a significant plus, akin to 3½-4 extra LSAT points; (3) several years’ work experience is a significant plus, with teaching especially positive and military the weakest; (4) a criminal or disciplinary record is a significant minus, akin to 7½ fewer LSAT points; and (5) long-noted gender disparities seem to have abated, but racial disparities persist. Some predictors were interestingly nonlinear: college quality has decreasing returns; UGPA has increasing returns; a rising UGPA is a plus only for law students straight out of college; and 4-9 years of work is a “sweet spot,” with neither 1-3 nor 10+ years’ work experience significant. Some groups, such as applicants with military or science work experience, show high LGPA variance, indicating a mix of high and low performers that requires close scrutiny. Many traditionally valued traits had no predictive value: typical pre-law majors (political science, history, etc.); legal or public-sector work; and college leadership.
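The nonlinear patterns described above can be captured by adding quadratic terms to the regression: a negative squared-term coefficient indicates decreasing returns, a positive one increasing returns. The sketch below demonstrates this on synthetic data with a hypothetical 0-1 college quality index; none of the numbers are the study's own.

```python
import numpy as np

# Sketch of detecting nonlinear predictor effects via quadratic terms.
# Data, scales, and coefficients are illustrative assumptions only.
rng = np.random.default_rng(1)
n = 1400

quality = rng.uniform(0.0, 1.0, n)  # college quality (hypothetical 0-1 index)
ugpa_c = rng.normal(0.0, 0.3, n)    # UGPA, centered at the pool mean

# Simulated outcome: concave in quality (decreasing returns),
# convex in UGPA (increasing returns)
lgpa = (3.0 + 0.6 * quality - 0.3 * quality**2
        + 0.1 * ugpa_c + 0.4 * ugpa_c**2
        + rng.normal(0, 0.3, n))

# Each predictor enters twice: once linearly, once squared
X = np.column_stack([np.ones(n), quality, quality**2, ugpa_c, ugpa_c**2])
coef, *_ = np.linalg.lstsq(X, lgpa, rcond=None)

print(f"quality^2 coefficient: {coef[2]:+.2f} (negative => decreasing returns)")
print(f"UGPA^2 coefficient:    {coef[4]:+.2f} (positive => increasing returns)")
```

The sign of each squared term, not its raw size, carries the qualitative finding; in practice one would also test whether the quadratic term significantly improves fit over the linear-only model.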
These findings can help identify applicants likely to outperform what overvalued predictors like the LSAT suggest. A key caveat is that statistical models cannot capture certain key traits that are difficult to code: some applicants who project to earn weak grades retain appealing lawyering or leadership potential, and many will over- or under-perform any projection. Thus, admissions will always be both art and science – but perhaps with a bit more science.
Introduction
The Need For Better Law School Decision-Making
The modern legal education crisis – years of rising tuition and legal-sector retrenchment yielding declining law school applications – has put a premium on a question that always should have mattered to law schools and their students: What qualities predict law student success? This concern has grown as the downturn has left schools competing for far fewer applicants: applications are at a 30-year low, down 38% in two years alone, forcing schools to shrink, become less selective, or both. Part of the decline may be cyclical, but there are also long-term, structural causes: the obsolescence of the large-firm model, especially as clients began demanding experienced lawyers rather than higher-profit-margin junior lawyers; the rise of a legal process outsourcing industry as digitization allows offsite work; and cheaper competition, as technology streamlines high-markup, labor-intensive tasks – from simple software for creating routine documents to replacing multi-lawyer document review with “predictive coding,” in which “machine algorithms partially replac[e] humans altogether in the search for relevant information.”
With fewer applicants overall, all schools have been forced to admit students with lower numerical predictors. Especially in a diminished pool, discerning who is likely to outperform their numbers is imperative. Elite schools want to keep admitting those who pass bar exams at high rates and display the talent to land elite jobs; non-elite schools want those who, despite low grades or LSAT scores, can still perform competent legal work and pass a bar exam. Applicants’ interests are similar. Those with strong LSAT/grade profiles do not always win admission to top schools, and ideally those who are truly stronger should win those coveted seats; those with weak LSAT/grade profiles may not win admission to a reputable (or any) school, yet it is a loss for society and the profession if the stronger low-numbers candidates lack good (or any) admission offers. More broadly, the value of students getting the admission offers they deserve extends beyond this era of declining law school applications. Even if applications rise, schools and students should still want to know who projects to succeed or fail based on factors other than the obvious, such as the LSAT, and factors of unclear import, such as college major. Even if the tide rises or some schools can stand pat, innovative schools gain an advantage by better projecting which prospects are more (or less) promising than they first appear.