W. Jake Thompson

  • Assistant Director of Psychometrics

Biography

Jake Thompson, Ph.D., has a research focus on the implementation of diagnostic classification models in applied settings.

Thompson is the Assistant Director of Psychometrics at Accessible Teaching, Learning, and Assessment Systems (ATLAS), a research center at the University of Kansas with more than 150 staff and more than $20M in annual revenues.

Thompson leads a team of psychometricians to conduct research studies that support the delivery of operational assessments and implements a research agenda for the application and evaluation of diagnostic classification models. He is the lead psychometrician for the Dynamic Learning Maps® (DLM®) Alternate Assessment System, which administers statewide alternate assessments to students with the most significant cognitive disabilities in 21 states.

Thompson is the PI on an IES-funded grant to develop software to help researchers estimate and evaluate diagnostic models in practice (R305D210045). He has also collaborated on other CGSA- and OSEP-funded grants to implement diagnostic models to understand student learning and provide actionable feedback to educators (S368A170009, S368A220019). Thompson’s research interests also include Bayesian statistics and data visualization methods for effective communication. He has co-authored more than 60 journal articles, book chapters, technical reports, conference presentations, and software packages.

Education

Ph.D. in Research, Evaluation, Measurement, and Statistics, University of Kansas

Selected Publications

Publications

Thompson, W. J., Nash, B., Clark, A. K., & Hoover, J. C. (2023). Using simulated retests to estimate the reliability of diagnostic assessment systems. Journal of Educational Measurement. [Preprint]

Kobrin, J. L., Karvonen, M., Clark, A. K., & Thompson, W. J. (2022). Developing and refining a model for measuring implementation fidelity for an instructionally embedded assessment system. Practical Assessment, Research, and Evaluation, 27(1), Article 24.

Thompson, W. J. (2022). Gibbs sampler. In B. B. Frey (Ed.), The SAGE encyclopedia of research design (2nd ed., pp. 621–622). SAGE.

Thompson, W. J. & Nash, B. (2022). A diagnostic framework for the empirical evaluation of learning maps. Frontiers in Education, 6, 714736.

Karvonen, M., Kingston, N. M., Wehmeyer, M. L., & Thompson, W. J. (2020). New approaches to designing and administering inclusive assessments. Oxford research encyclopedia of education (pp. 1–23). Oxford University Press.

Thompson, W. J., Clark, A. K., & Nash, B. (2019). Measuring the reliability of diagnostic mastery classifications at multiple levels of reporting. Applied Measurement in Education, 32(4), 298–309. [Preprint]

Thompson, W. J. (2018). Construct irrelevance. In B. B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement, and evaluation (pp. 375–376). SAGE.

Thompson, W. J. (2018). Evaluating model estimation processes for diagnostic classification models (Publication No. 10785604) [Doctoral dissertation, University of Kansas]. ProQuest Dissertations and Theses Global.

Chrysikou, E. G. & Thompson, W. J. (2016). Assessing cognitive and affective empathy through the Interpersonal Reactivity Index: An argument against a two-factor model. Assessment, 23(6), 769–777.

Technical Reports

Thompson, W. J. & Hoover, J. C. (2021). Using propensity scores to evaluate changes in cross-year performance distributions (Technical Report No. 21-01). University of Kansas; Accessible Teaching, Learning, and Assessment Systems. [PDF]

Thompson, W. J. (2020). Reliability for the Dynamic Learning Maps assessments: A comparison of methods (Technical Report No. 20-03). University of Kansas; Accessible Teaching, Learning, and Assessment Systems. [PDF]

Thompson, W. J. (2019). Bayesian psychometrics for diagnostic assessments: A proof of concept (Research Report No. 19-01). University of Kansas; Accessible Teaching, Learning, and Assessment Systems.

Clark, A. K., Thompson, W. J., & Karvonen, M. (2019). Instructionally embedded assessment: Patterns of use and outcomes (Technical Report No. 19-01). University of Kansas; Accessible Teaching, Learning, and Assessment Systems. [PDF]

Thompson, W. J. (2018). Assessing model fit for the Dynamic Learning Maps alternate assessment using Bayesian estimation (Technical Report No. 18-01). University of Kansas; Accessible Teaching, Learning, and Assessment Systems. [PDF]

R Packages

Thompson, W. J., Pablo, N., & Hoover, J. (2023). ratlas: ATLAS formatting functions and templates. R package version 0.0.0.9000.

Thompson, W. J. (2023). taylor: Lyrics and song data for Taylor Swift’s discography. R package version 2.0.1.9000.

Thompson, W. J. (2023). measr: Bayesian psychometric measurement using ‘Stan’. R package version 0.2.1.9000.

Hoover, J. & Thompson, W. J. (2023). dcm2: Calculating the M2 model fit statistic for diagnostic classification models. R package version 1.0.2.

Hoover, J. & Thompson, W. J. (2023). tdcmStan: Automating the creation of Stan code for TDCMs. R package version 2.0.0.

Selected Presentations

Thompson, W. J., Nash, B., & Hoover, J. C. (2023, September 6–7). Using diagnostic models to evaluate student learning hierarchies in a large-scale assessment [Conference session]. Frontier Research in Educational Measurement, Oslo, Norway.

Thompson, W. J. & Clark, A. K. (2023, April 12–15). A simulated retest method for estimating classification reliability. In Y. Bao, M. Madison, & Q. Pan (Chair), Diagnostic measurement: Operational and implementational issues [Symposium]. National Council on Measurement in Education Annual Meeting, Chicago, IL. [Slides]

Thompson, W. J. (2023, March 28–30). Applied diagnostic classification modeling with the R package measr [Paper presentation]. National Council on Measurement in Education Annual Meeting, Virtual Sessions. [Slides]

Clark, A. K., Thompson, W. J., & Kobrin, J. (2022, April 22–25). Visualizing validity evidence: Considering strength of evidence following disrupted administration [Paper presentation]. National Council on Measurement in Education Annual Meeting, San Diego, CA.

Hoover, J. C. & Thompson, W. J. (2022, April 22–25). Modifying the M2 statistic to handle missing data [Paper presentation]. National Council on Measurement in Education Annual Meeting, San Diego, CA.

Kobrin, J., Thompson, W. J., Wang, W., & Hoover, J. C. (2022, April 22–25). Development and evaluation of a composite item-fit statistic for diagnostic classification models [Paper presentation]. National Council on Measurement in Education Annual Meeting, San Diego, CA.

Hoover, J. C., Thompson, W. J., Nash, B., & Kobrin, J. (2021, June 8–11). The I-SMART project: Empirical map validation [Paper presentation]. National Council on Measurement in Education Annual Meeting, Virtual Conference. [PDF]

Thompson, W. J., Clark, A. K., & Nash, B. (2021, June 8–11). Technical evidence for diagnostic assessments. In W. J. Thompson (Chair), Diagnostic assessments: Moving from theory to practice [Symposium]. National Council on Measurement in Education Annual Meeting, Virtual Conference. [PDF / Slides]

Thompson, W. J. & Pablo, N. (2020, January 29–30). Branding and packaging reports with R Markdown [Conference session]. rstudio::conf(2020), San Francisco, CA.

Thompson, W. J. & Nash, B. (2019, April 4–8). Empirical methods for evaluating maps: Illustrations and results. In M. Karvonen (Chair), Beyond learning progressions: Maps as assessment architecture [Symposium]. National Council on Measurement in Education Annual Meeting, Toronto, Canada. [PDF / Slides]

Brussow, J. A., Skorupski, W. P., & Thompson, W. J. (2018, April 12–16). A hierarchical IRT model for identifying group-level aberrant growth [Paper presentation]. National Council on Measurement in Education Annual Meeting, New York, NY.

Nash, B., Clark, A. K., & Thompson, W. J. (2018, April 12–16). Using simulation to evaluate retest reliability of assessment results [Paper presentation]. National Council on Measurement in Education Annual Meeting, New York, NY. [PDF]

Thompson, W. J., Clark, A. K., & Nash, B. (2018, April 12–16). Measuring the reliability of student mastery classifications at multiple levels of reporting [Paper presentation]. National Council on Measurement in Education Annual Meeting, New York, NY. [PDF / Slides]

Nash, B. & Thompson, W. J. (2017, April 26–30). Evaluating an initialization tool for student placement into a map-based assessment [Paper presentation]. National Council on Measurement in Education Annual Meeting, San Antonio, TX. [PDF]

Awards & Honors

  • AERA Division H Outstanding Publication Award for Advances in Methodology (2023)
  • Educational Measurement: Issues and Practice Cover Showcase Winner (2023)
  • AERA Inclusion and Accessibility in Educational Assessment SIG Annual Award (2022)
  • Educational Measurement: Issues and Practice Cover Showcase Winner (2020)
  • Educational Measurement: Issues and Practice Cover Showcase Winner (2017)
  • Educational Measurement: Issues and Practice Cover Showcase Top 10 (2016)
  • Chancellor’s Doctoral Fellowship (2014–2018)

Grants & Other Funded Activity

Currently Funded Projects

Principal Investigator: Improving Software and Methods for Estimating and Evaluating Diagnostic Classification Models (2021–2023). USED, Institute of Education Sciences; $225,000.

Co-Principal Investigator: Dynamic Learning Maps (DLM) Alternate Assessment System. Ongoing state contracts. PI: Meagan Karvonen.

Other Personnel (Psychometrician): Pathways for Instructionally Embedded Assessment (PIE) (2022–2026). USED, Office of Elementary and Secondary Education, Office of School Support and Accountability; $2,500,000. PI: Brooke Nash.

Previously Funded Projects

Other Personnel (Psychometrician): Innovations in Science Map, Assessment, and Reporting Technology (I-SMART) (2016–2020). USED, Office of Elementary and Secondary Education. PI: Meagan Karvonen.

Unfunded Projects

Principal Investigator: Improving Software and Methods for Estimating and Evaluating Diagnostic Classification Models (2020–2022). USED, Institute of Education Sciences; $225,000.

Memberships

Professional Affiliations

  • American Educational Research Association
  • American Statistical Association
  • National Council on Measurement in Education

Reviewer

  • Behaviormetrika
  • International Journal of Research in Education and Science
  • Journal of Educational Measurement
  • Journal of Open Source Software
  • National Council on Measurement in Education annual meeting