Quantitative Research Methods, Surveys, Psychological Measurement, Latent Variable Modeling, Structural Equation Modeling, Confirmatory Factor Analysis, Item Response Theory, Cognitive Interviews, Focus Groups, Causal Inference, Mixed Effects Regression, Network Analysis, Mixture Modeling, Multivariate Regression, Ethnography, Machine Learning, Growth Models, ANOVA
R, SPSS, Mplus, Shiny, Excel, Stata, Qualtrics, SurveyMonkey, Google Analytics, Figma, MTurk, UserTesting, Python, HLM, G*Power, GitHub
User Experience Researcher at Microsoft, Contract Position (2022 – Present)
- I am a quantitative research analyst designing and validating a scale to evaluate multiple dimensions of user experience for Microsoft web products. Depending on the research question, I employ both qualitative and quantitative methods to elicit meaningful, actionable data. I collaborate with a multidisciplinary team of designers and engineers and incorporate user perspectives into our surveys.
Research Associate at Arizona State University Mixed Effects Modeling Research Lab (2020 – Present)
- I co-created a method to derive model-specific cutoffs for factor models, modernizing how we measure psychological constructs and validate surveys. To make this work accessible to applied researchers, I built a simple, point-and-click Shiny app and an R package. Drawing on user feedback, I wrote error messages that suggested the most likely fix, reducing monthly bug reports to zero.
Data Consultant at New Tech Network (2020 – Present)
- I used R and Selenium to scrape data from the web, turning a 25-hour weekly task into a semi-automated 5-hour one and earning an extension of my contract. I sent out surveys, compiled and analyzed survey data, distributed survey reports, and communicated with representatives from 49 schools.
Psychometric Consultant on John Templeton Foundation Grant #61187 (2017 – 2021)
- I co-created an iterative mixed methods approach to item construction and item-level validation called the Response Process Evaluation method. It resembles a cognitive interview but is more efficient, more cost-effective, and easily scalable to hundreds of participants. I used this method to collect mixed methods data from over 1,000 participants in the United States and India to develop and validate a scale.
Graduate Student Researcher on IES Grant #R305A160157 (2016 – 2019)
- I used factor analysis (in Mplus) to test the internal structure of a scale and employed mixture models to evaluate whether the construct would be better conceptualized as categorical. I also helped collect and clean data and worked on methods to detect deceptive and/or unusual response patterns.
Research Analyst at International Baccalaureate (2012 – 2015)
- I developed research projects and analyzed data from a large internal database to co-author white papers and journal articles. I created QA surveys distributed to 1,000+ people weekly, and prepared and presented reports to stakeholders. I also wrote RFPs and evaluated submissions.
Master of Arts (2020): Education
University of California, Santa Barbara
Master of Arts (2017): Research Methods and Statistics
University of Denver
Graduate Certificate (2012): Measurement, Statistics and Evaluation
University of Maryland, College Park
Advisor: Gregory R. Hancock
Bachelor of Arts (2009): Communication
University of Delaware
Teaching Assistant at University of California, Santa Barbara (2019 – 2021)
ED214A: Introduction to Statistics (2x)
ED214B: Inferential Statistics (2x)
ED214C: Linear Models
SOC108: Introduction to Research
UCSB-Smithsonian Scholars Program: Introduction to Data Science
Teaching Assistant at University of Maryland, College Park (2010 – 2012)
EDMS645: Quantitative Methods
EDMS610: Classroom Assessment
McNeish, D. & Wolf, M. G. (accepted). Dynamic fit index cutoffs for one-factor models. Behavior Research Methods.
Boness, C. L., Helle, A. C., Miller, M. B., Wolf, M. G., & Sher, K. J. (accepted). Who opts in to alcohol feedback and how does that impact behavior? A pilot trial. Journal of Studies on Alcohol and Drugs.
Wolf, M. G., Ihm, E., Maul, A., & Taves, A. (2022). Survey item validation. In S. Engler & M. Stausberg (Eds.), Handbook of Research Methods in the Study of Religion (2nd ed.). Routledge.
McNeish, D. & Wolf, M. G. (2021). Dynamic fit index cutoffs for confirmatory factor analysis models. Psychological Methods. https://doi.org/10.1037/met0000425
Clairmont, A., Wolf, M. G., & Maul, A. (2021). The prevention and detection of deception in self-report survey data. In U. Luhanga & G. Harbaugh (Eds.), Basic Elements of Survey Research in Education: Addressing the Problems Your Advisor Never Told You About. Charlotte, NC: Information Age Publishing.
McNeish, D., & Wolf, M.G. (2020). Thinking twice about sum scores. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01398-0
Luo, Y. & Wolf, M. G. (2019). Item parameter recovery for the two parameter testlet model with different estimation methods. Psychological Test and Assessment Modeling, 61(1), 65-89.
Ghafoori, B., Wolf, M. G., Nylund-Gibson, K., & Felix, E. D. (2019). A naturalistic study exploring mental health outcomes following trauma-focused treatment among diverse survivors of crime and violence. Journal of Affective Disorders, 245, 617–625. https://doi.org/10.1016/j.jad.2018.11.060
Raines, T. C., Gordon, M., Harrell-Williams, L. M., Diliberto, R. A., & Parke, E. M. (2017). Adaptive skills and academic achievement in Latino students. Journal of Applied School Psychology, 245–260. https://doi.org/10.1080/15377903.2017.1292974
Gordon, M., VanderKamp, E. & Halic, O. (2015). Research brief: International Baccalaureate programmes in Title I schools in the United States: Accessibility, participation and university enrollment. https://www.ibo.org/globalassets/publications/ib-research/title-1-schools-research.pdf
Bergeron, L. & Gordon, M. (2015). Establishing a STEM pipeline: Trends in male and female enrollment and performance in higher level STEM courses. International Journal of Science and Mathematics Education, 1–18. http://dx.doi.org/10.1007/s10763-015-9693-7
Gordon, M., & Bergeron, L. (2014). The use of multilevel modeling and the level two residual file to explore the relationship between Middle Years Programme student performance and Diploma Programme student performance. Social Science Research, 50, 147-163. https://doi.org/10.1016/j.ssresearch.2014.11.004
Wolf, M. G. & Denison, A. (under review). Survey uses may influence survey responses.
Wolf, M. G. & McNeish, D. (under review). dynamic: An R package for deriving dynamic fit index cutoffs for factor analysis.
Packages and Applications
Wolf, M. G. & McNeish, D. (2020). Dynamic Model Fit (version 0.1.0) [Software]. Available from www.dynamicfit.app
Wolf, M. G. & McNeish, D. (2020). dynamic: Model fit cutoffs. R package version 0.1.0. https://cran.r-project.org/web/packages/dynamic/index.html
- American Educational Research Association Division D Program Committee Graduate Student Representative, 2018 – 2019
- Expert Advisory Board member at the Center for Mind and Culture, 2019 – Present
- Course Director: Psychological Network Analysis, 2019
- Research Methods and Statistics Student Association President, 2015 – 2016
- Interdisciplinary Journal of Problem-Based Learning
- Behavior Research Methods
Honors and Awards
- Block Grant Dissertation Award, University of California, Santa Barbara (2020)
- Department of Education Excellence Award for Research (2019)
- Grad Slam Finalist (Top 9 out of 79) (2019)
- Block Grant Fellowship Award, University of California, Santa Barbara (2018)
- Education Travel Grant, University of California, Santa Barbara (2018–2019)
- New Tech Network $10,000 Research Grant, Napa, CA (2016)
- Block Grant Fellowship Award, University of California, Santa Barbara (2016)
- University of Denver Graduate Student Travel Grant (2016)
- University of Denver Scholarship Award (2015)
- Dean’s Fellowship, University of Maryland, College Park (2010–2011)
Structural Equation Modeling, Constructing Measures, Analyzing and Validating Measures, Item Response Theory, Psychological Network Analysis, Psychometrics, Bayesian Statistics, Mixture Modeling, Multi-Level Modeling, Causal Inference, Meta-Analysis, Empirical Research Methods, Program Evaluation, Applied Sampling, Survey Design and Analysis, Philosophy of Measurement, Introduction to SAS, Introduction to Simulation, Multivariate Data Analysis, Applied Measurement, Factor Analysis, Quantitative Research Methods I & II, Applied Multiple Regression Analysis, Classroom Assessment & Evaluation, Introduction to Qualitative Research, Ethnography, Anthropology of Education, Social Psychology