Bruce Randel, Ph.D.
Century Analytics, Inc.
Centennial, CO
Education
Ph.D. Developmental Psychology, University of Michigan, Ann Arbor, MI
M.A. Developmental Psychology, San Francisco State University, San Francisco, CA
Expertise
Research Methods: experimental design, causal inference, cluster-randomized trials, quasi-experimental designs, What Works Clearinghouse standards, data collection, implementation fidelity, and technical reporting
Quantitative Data Analyses: hierarchical linear models, propensity score matching, statistical power analyses, missing data imputation, classical test theory, item response theory, factor analysis, discriminant analysis, and structural equation modeling
Psychometrics: test development, field test and item analysis, item calibration, inter-rater reliability, equating, differential item functioning, test alignment, and early childhood assessment
Professional Experience
2011 – Present President and Chief Consultant
Century Analytics, Inc., Centennial, Colorado
Small business providing technical consulting and analytic services in research methods and statistics. Expertise includes impact evaluation, research design, statistics, data analysis, and reporting with an emphasis on cluster randomized trials, quasi-experimental designs, hierarchical linear models, and psychometrics. Psychometric services include instrument development, reliability and validity studies, alignment studies, inter-rater reliability analyses, scaling and equating, and technical reporting. Services also include detailed point-by-point recommendations to help improve research plans, proposals, or technical reports.
2005 – 2011 Principal Researcher
Mid-continent Research for Education and Learning (McREL), Denver, Colorado
Key responsibilities included serving as Principal Investigator for a cluster randomized trial of a formative assessment training program and Co-Principal Investigator for a cluster randomized trial of a vocabulary intervention. Projects also included validation studies, analysis of student achievement data, and psychometric analyses. Served as internal and external consultant and technical reviewer for research methods, statistical analysis, and measurement and psychometrics. Responsibilities also included interpreting findings and preparing written reports and presentations for a variety of audiences.
2001 – 2005 Research Scientist/Senior Research Scientist
CTB/McGraw-Hill, Monterey, California
Successfully managed research efforts for large custom state assessment contracts, early childhood diagnostic assessments, and standardized tests of achievement and aptitude within the K-12 education environment. Responsible for sampling, data collection, test design and item selection, field test and item analysis, vertical scaling, inter-rater reliability, test linking, differential item functioning, and special research studies. Provided technical expertise to state education department personnel, state government officials, and test development staff regarding test design, item selection, scaling and equating, reliability and validity, and accountability.
Selected Project Experience
Education Innovation and Research (EIR) Technical Assistance Liaison, Century Analytics: Nov 2018 – Dec 2026
Contract to Abt Associates
Serve as technical assistance liaison and provide direct technical assistance to EIR local evaluators throughout their study periods. Review and provide feedback to evaluators on designs, research methods, statistical analyses, fidelity of implementation studies, and technical reporting. Communicate to local evaluators the scientific evidence criteria for EIR and the WWC and help ensure local evaluations meet these criteria.
Statistical Adjustment Model Technical Assistance Center (SAMTAC): Oct 2021 – Sept 2023
Subcontract to Plus Alpha Research and Consulting
Funded by the U.S. Department of Labor to provide technical assistance to states developing their local statistical adjustment models (SAMs). Work with states to develop and implement strategies for using their data and available statistical software to build statistical models. Develop reference and training materials to support technical assistance. Identify, organize, and facilitate peer-to-peer learning between states and others in the workforce development community.
Evaluation of Curriculum Associates i-Ready Instruction impact on student growth in reading and math: Sept 2021 – March 2022
Subcontract to HumRRO
In collaboration with HumRRO, evaluated the impact of Curriculum Associates i-Ready instruction on achievement in reading and mathematics for students in demographic subgroups (e.g., race/ethnicity, FRL) in grades K-2, 3-5, and 6-8. Study design included two-stage matching (many-to-one school matching and coarsened exact student matching) to identify a set of comparison schools and students similar to treatment schools and students. Estimated impacts on student growth using a difference-in-differences analytic model.
Investing in Innovations (i3) Technical Assistance Liaison, Century Analytics: July 2012 – Dec 2021
Contract to Abt Associates
Serve as technical assistance liaison and provide direct technical assistance to fifteen i3 local evaluators throughout their study periods. Review and provide feedback to local evaluators on project plans, evaluation designs and research methods, statistical analyses, and fidelity of implementation studies. Communicate to local evaluators the scientific evidence criteria of the National Evaluation of i3 and the WWC and help ensure local evaluations meet these criteria.
Hawaiian Language Immersion Program Assessment: March 2015 – December 2022
Contract to University of Hawai’i at Mānoa
Provided consultation and technical services regarding test design, analytic approaches, item calibration and scaling, reliability and validity, and technical documentation. Served as lead psychometrician on multiple alignment studies of standards and test items. Conducted classical test theory analyses, item response theory analyses, and test score equating for operational tests in Hawaiian Language Arts, mathematics, and science in grades 3 through 8.
Curriculum Associates Efficacy Advisory Committee (EAC): Aug 2018 – Dec 2022
Contract to Curriculum Associates
Provide consultation and expertise on evaluation research and projects related to Curriculum Associates’ product line by reviewing research designs and reports, presenting material through webinars on subject areas of expertise, and participating in in-person and virtual EAC meetings.
REL Central Technical Working Group, Century Analytics: May 2012 – December 2021
Contract to Marzano Research Laboratory
Provide technical assistance and consultation for the development and implementation of research projects to meet IES and What Works Clearinghouse standards. Review and provide technical feedback on major deliverables prior to submission to the U.S. Department of Education Institute of Education Sciences.
ASPIRE Project, National Science Foundation ITEST program (#1759320): Sept 2018 – June 2021
Contract to Challenger Center
Evaluate the project’s progress toward meeting its goals and objectives by creating an evaluation plan to track activities and progress, reporting on progress, and providing a critical review of the project’s designs and activities that is sufficiently independent and rigorous to influence the project’s activities and improve the quality of its findings.
Impact Evaluation of Curriculum Associates i-Ready Instruction: August 2019 – Jan 2020
Subcontract to HumRRO
In collaboration with HumRRO, evaluated the impact of the Curriculum Associates i-Ready Instruction program in elementary and middle school grades for mathematics and reading using a student-level stratified matching design. Impacts were estimated using hierarchical linear models. Provided technical consultation on methodology, conducted propensity score matching, established baseline equivalence, estimated program impacts, and co-authored the technical report.
Impact Evaluation of Exact Path: July 2018 – Jan 2019
Contract to Edmentum
Evaluated the impact of Edmentum’s Exact Path program using rigorous research methodology. Designed a student-level quasi-experimental study, worked with program staff to develop eligibility criteria for implementation inclusion, conducted propensity score matching, established baseline equivalence, estimated program impacts, and delivered technical report.
Impact Evaluation of Study Island: Jan 2019 – May 2019
Contract to Edmentum
Evaluated the impact of Edmentum’s Study Island program using rigorous research methodology. Designed a student-level quasi-experimental study, worked with program staff to develop eligibility criteria for implementation inclusion, conducted propensity score matching, established baseline equivalence, estimated program impacts, and delivered technical report.
Impact Evaluation of Blended Core Mathematics: August 2018 – March 2019
Subcontract to HumRRO
In collaboration with HumRRO, evaluated the impact of Curriculum Associates’ Blended Core Mathematics program using a school-level quasi-experimental design. Provided technical consultation on methodology, conducted propensity score matching, established baseline equivalence, estimated program impacts, and co-authored the technical report.
Supporting Educator Effectiveness and Development Technical Assistance: Oct 2015 – December 2018
Contract to Plus Alpha Research & Evaluation and IMPAQ
Provide technical assistance to the three Supporting Educator Effectiveness and Development (SEED) grantees and their evaluators on the design and implementation of their evaluations. Review and provide feedback on project plans, evaluation designs, statistical analyses, and reporting to help ensure evaluations can produce rigorous evidence meeting the WWC standards.
Randomized Controlled Trial of Teacher Mentoring: Aug 2012 – March 2017
Contract to Marzano Research Laboratory
Co-author for this cluster randomized trial of a two-year intervention using retired mentors to support early career teachers. The study was a 5-year effort as part of the Regional Educational Laboratory Central and included 77 teachers at 11 Title I elementary schools. Outcomes included student achievement, teacher evaluations, and teacher retention.
Student progress and teacher assessment in a competency-based education system: Aug 2014 – February 2017
Contract to Marzano Research Laboratory
Co-principal investigator of this study examining the academic progress of elementary and middle school students enrolled in competency-based education and teachers’ assessments of student competencies. Funded as part of the Regional Educational Laboratory Central. Analyses estimated student competency scores from teacher ratings, which were then used to predict student scores and proficiency on the state achievement test.
REL Technical Quality Assurance Reviewer: Jan 2012 – December 2016
Contract to Chesapeake Research Associates, LLC
Provided technical quality assurance reviews and served as senior adviser to the Regional Educational Laboratories for the Northeast, Mid-Atlantic, and Southwest Regions funded by the U.S. Department of Education Institute of Education Sciences. Reviewed research proposals, analysis plans, and technical reports for relevance, utility, rigor, and technical quality. Provided feedback and technical advice to address weaknesses.
Alignment study, Century Analytics: Jan 2015 – April 2015
Contract to CTB/McGraw-Hill
Principal investigator of this study of the alignment of CTB’s LAS Links Español language proficiency assessment to the WIDA SALSA Standards. The study examined whether the LAS Links Español assesses Spanish language proficiency at the same or higher levels of language complexity and cognitive demand as the WIDA SALSA standards.
Early Mathematics Assessment: September 2012 – August 2013
Subcontract to Seneca Consulting, LLC, funded by the Jefferson County School District
Provided technical consultation and analytic services to develop mathematics assessments for kindergarten, grade 1, and grade 2. Tasks included planning and consulting on test blueprint development, item development, and field test administration. Tasks also included conducting item review, item selection, analysis and scaling of test forms, and technical reporting.
Impact Analysis for Correctional Education Association College of the Air Study: April 2012 – June 2012
Contract to RMC Research Corporation, Denver
Provided analytic consulting and data analysis services for a national study examining the impact of the Correctional Education Association College of the Air program on prison inmates’ academic achievement and postsecondary credit acquisition. Provided consultation for data cleaning and data management, developed data analysis plans, conducted multilevel impact analyses and multilevel analyses of mediation and moderation, and produced written reports of analyses and study findings.
Accuplacer Alignment study: Nov 2011 – March 2012
Subcontract to Magnolia Consulting, LLC funded by Peterson’s
Principal investigator of this study to examine the alignment of Peterson’s online academic skills course (OASC) with the College Board Accuplacer™ placement test and with the Common Core standards in reading, language, and math at grade 12.
Student Growth Percentiles: June 2011 – July 2011
Subcontract to Seneca Consulting, LLC funded by the Jefferson County School District
Co-Principal investigator of this study to generate information about the technical aspects of constructing student growth percentiles for the Colorado Student Assessment Program reading scores at 3rd grade using kindergarten through 2nd grade scores on the Basic Early Assessment of Reading (BEAR).
Analysis and Recommendations for Georgia CLASS Keys: March 2011 – June 2011
Subcontract to University of West Georgia funded by the Georgia DoE
Co-Principal investigator of this secondary data analysis study to provide evidence supporting the identification of a set of elements from the Georgia teacher evaluation system to use as a teacher effectiveness measure in support of Georgia’s Race to the Top grant.
An Efficacy Trial of Robust Vocabulary Instruction: July 2008 – June 2011
Funded by the U.S. Department of Education National Center for Education Research, Helen Apthorp, PI, Grant #R305A080627
Co-Principal Investigator for this cluster randomized trial of a supplemental vocabulary program (Elements of Reading: Vocabulary) involving 48 high-poverty elementary schools in Florida. Key responsibilities included design and statistical analysis, power analyses, psychometric analyses, and report writing.
Randomized Controlled Trial of Classroom Assessment for Student Learning: Jan 2006 – April 2011
Funded by the U.S. Department of Education, Contract #ED-06-CO-0023
Principal investigator for this cluster randomized trial of a professional development program in classroom assessment among 67 elementary schools. The study was a 5-year effort to provide unbiased estimates of the impact of the program for enhancing teachers’ knowledge and skill in classroom assessment and for raising student motivation and achievement.
Validation study of Gesell Developmental Observation: June 2008 – November 2010
Funded by the Gesell Institute of Human Development
Principal Investigator for this project to provide technical consulting and statistical analysis to the Gesell Institute of Human Development regarding the development of new instruments for and the validation of the Gesell Developmental Observation (GDO). The GDO is an early childhood (pre-K to K) developmental screening assessment based on a standard procedure for direct observation of a child’s growth and development.
Comprehensive Assessment Program Review: August 2009 – March 2010
Funded by the U.S. Department of Labor, Job Corps
Project lead for a review of assessments and development of recommendations to inform Job Corps’ development of a comprehensive assessment strategy. This project involved working closely with Job Corps to determine the scope and purpose of the assessment process, collecting and reviewing technical information and supporting materials, and analyzing off-the-shelf assessments for their alignment with Job Corps’ standards. The report included recommendations regarding the adoption or development of assessments.
Technical Assistance to the Nebraska Department of Education: June 2008 – Dec 2009
Funded by the U.S. Department of Education, Contract #ED-06-CO-0023
Principal Investigator for a project providing technical assistance to the Nebraska Department of Education Student Achievement Coordinator to monitor, report, and respond to the achievement gaps of English Language Learners, free and reduced lunch eligible, highly mobile, and special education students. Technical assistance included recommendations for analytic approaches to identify districts with high and low achievement gaps, data analysis, and technical reporting of achievement gaps while adhering to local non-disclosure legislation.
High School Dropout and Completion, Central Region: November 2006 – August 2008
Funded by the U.S. Department of Education, Contract #ED-06-CO-0023
Principal investigator for this study of dropout and high school graduation in the central region of the US (Colorado, Kansas, Missouri, Nebraska, North Dakota, South Dakota, and Wyoming). This study reported disaggregated dropout and high school graduation rates for the region’s states using NCES data sources and methods.
Arizona Instrument to Measure Standards: April 2004 – Oct 2005
Funded by the Arizona Department of Education
Lead psychometrician responsible for all aspects of reliability, validity, test design, and test scaling for the statewide NCLB tests in Reading, Writing, and Mathematics for Grades 3-8 and high school. Provided technical expertise and guidance to department of education staff, teachers, and government officials. Project work also included sample selection for special studies, field test analysis, differential item functioning, and inter-rater reliability studies. This project required the development of a vertical scale for Grades 3-8. The vertical scale was established by horizontally linking the 1-parameter logistic model AIMS test to a 3-parameter logistic model nationally standardized test via simultaneous calibration of an anchor test.
Fox in a Box: An Adventure in Literacy: June 2002 – April 2004
Lead psychometrician for test design and development for an update of this diagnostic literacy assessment. Conducted field tests to inform item selection and test construction. Conducted validation studies to document the reliability and validity of the multiple test batteries. Evidence of concurrent validity was provided via a linking study to a standardized test of reading comprehension. Evidence regarding predictive validity was provided via results from discriminant analysis and children’s performance on a standardized test of reading comprehension.
Service, Training, and Honors
What Works Clearinghouse version 4.1 group design certification, March 2020
NSF Discovery K-12 scale-up panel grant proposal reviewer, February 2012
Investing in Innovations Fund (i3) program grant proposal reviewer, October 2011
What Works Clearinghouse single-case design standards training, April 2011
Institute of Education Sciences, What Works Clearinghouse peer reviewer, April 2010 – 2015
What Works Clearinghouse single-case design standards certification, April 2011
IES Research Training Institute: Cluster Randomized Trials, Vanderbilt University, June 29, 2007
Dissertation Fellowship, Rackham School of Graduate Studies, University of Michigan, 2000-2001
Dissertation Grant, Psychology Department, University of Michigan, 2000
NICHD Predoctoral Training Fellowship, University of Michigan, 1996-1997 and 1998-1999
Publications
Yamaguchi, R., & Randel, B. (2021). Evaluation of an On-line Parenting and Divorce Course: Effects on Parent Knowledge and Skills among Court-mandated Parents of Divorce. Journal of Divorce & Remarriage, 62(6), 431-449. DOI: 10.1080/10502556.2021.1921442.
DeCesare, D., McClelland, A., & Randel, B. (2017). Impacts of the Retired Mentors for New Teachers program (REL 2017–225). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Central. Retrieved from http://ies.ed.gov/ncee/edlabs.
Brodersen, R. M., & Randel, B. (2017). Measuring student progress and teachers’ assessment of student knowledge in a competency-based education system (REL 2017–238). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Central. Retrieved from http://ies.ed.gov/ncee/edlabs.
Randel, B., Apthorp, H., Beesley, A. D., Clark, T.F., & Wang, X. (2016). Impacts of Professional Development in Classroom Assessment on Teacher and Student Outcomes. Journal of Educational Research. DOI: 10.1080/00220671.2014.992581.
French, B. F., Holmes, W. F., Randel, B., Hand, B., & Gotch, C. M. (2016). Measurement invariance techniques to enhance measurement sensitivity. International Journal of Quantitative Research in Education, 3(1/2), 79-93.
Guddemi, M., Sambrook, A., Wells, S., Randel, B., Fite, K., Selva, G., & Gagnon, K. (2014). Arnold Gesell’s developmental assessment revalidation substantiates child-oriented curriculum. SAGE Open, April-June: 1–18.
Meyer, S. J., & Randel, B. (2013). The impact of an Associate’s degree program for incarcerated students: A randomized trial of the Correctional Education Association’s College of the Air program. Community College Review, 41(3), 223-248.
Randel, B., & Clark, T. (2013). Measuring Classroom Assessment Practice. In J. H. McMillan (Ed.), Sage Handbook of Research on Classroom Assessment. Thousand Oaks, CA: Sage.
Apthorp, H., Randel, B., Cherasaro, T., Clark, T., McKeown, M., Beck, I. L. (2012). Effects of a Supplemental Vocabulary Program on Word Knowledge and Passage Comprehension. Journal of Research on Educational Effectiveness, 5, 160-188.
Randel, B., Beesley, A. D., Apthorp, H., Clark, T.F., Wang, X., Cicchinelli, L. F., & Williams, J. M. (2011). Classroom Assessment for Student Learning: The impact on elementary school mathematics in the Central Region. (REL 2011-4005). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Schneider, M. C., & Randel, B. (2010). Research on characteristics of effective professional development programs for enhancing educators’ skills in formative assessment. In H. L. Andrade & G. J. Cizek (Eds.), Handbook of Formative Assessment. New York: Routledge.
Randel, B., Moore, L., & Blair, P. (2008). High school dropout and graduation rates in the Central Region. (Issues & Answers Report, REL 2008–No. 040). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Central.
Fried, J. & Randel, B. (2005). Appropriate assessment of young children. CTB/McGraw-Hill position paper. Monterey, CA: CTB/McGraw-Hill.
Randel, B., Stevenson, H.W., & Witruk, E. (2000). Attitudes, beliefs, and mathematics achievement of German and Japanese high school students. International Journal of Behavioral Development, 24(2), 190-198.
Stevenson, H.W., Hofer, B. & Randel, B. (2000). Mathematics achievement and attitudes about math in China and the West. Journal of Psychology in Chinese Societies, 1, 1-16.
Stevenson, H. W., Hofer, B., & Randel, B. (2000). Middle childhood: Education and schooling. In A. E. Kazdin (Ed.) Encyclopedia of Psychology. New York: Oxford University Press.
Technical Reports
Randel, B., Swain, M., Dvorak, R. N., Spratto, E., & Prendez, J. Y. (2020). Impact Evaluation of Mathematics i-Ready for Striving Learners Using 2018–19 Data. Final Report. No. 053. Alexandria, VA: HumRRO.
Randel, B., Swain, M., Dvorak, R. N., Spratto, E., & Prendez, J. Y. (2020). Impact Evaluation of Reading i-Ready for Striving Learners Using 2018–19 Data. Final Report. No. 053. Alexandria, VA: HumRRO.
Swain, M., Randel, B., Dvorak, R. N. (2020). Impact Evaluation of Mathematics “i-Ready Instruction” for Elementary Grades Using 2018-19 Data. Final Report. No. 106. Alexandria, VA: HumRRO.
Swain, M., Randel, B., Dvorak, R. N. (2020). Impact Evaluation of Reading “i-Ready Instruction” for Elementary Grades Using 2018-19 Data. Final Report. No. 107. Alexandria, VA: HumRRO.
Swain, M., Randel, B., Dvorak, R. N. (2019). Impact Evaluation of Reading “i-Ready Instruction” for Middle School Grades Using 2018-19 Data. Final Report. No. 108. Alexandria, VA: HumRRO.
Swain, M., Randel, B., Dvorak, R. N. (2019). Impact Evaluation of Mathematics “i-Ready Instruction” for Middle School Grades Using 2018-19 Data. Final Report. No. 109. Alexandria, VA: HumRRO.
Swain, M., Randel, B., Dvorak, R. N. (2019). An Impact Evaluation of the Blended Core Mathematics Program for Elementary Grades. Final Report. No. 029. Alexandria, VA: HumRRO.
Randel, B. (2019). Impacts of Edmentum’s Exact Path on Student Language Arts Achievement. Centennial, CO: Century Analytics, Inc.
Randel, B. (2018). Impacts of Edmentum’s Exact Path on Student Reading Achievement. Centennial, CO: Century Analytics, Inc.
Randel, B. (2018). Impacts of Edmentum’s Exact Path on Student Mathematics Achievement. Centennial, CO: Century Analytics, Inc.
Randel, B. (2017). Kaiapuni Assessment of Educational Outcomes (KĀʻEO) Science Assessment alignment report. Centennial, CO: Century Analytics, Inc.
Randel, B. & Englert, K. (2017). Kaiapuni Alignment Studies Technical Report. Centennial, CO: Century Analytics, Inc.
Englert, K. & Randel, B. (2017). Alignment of the Kaiapuni Academic Content Standards to the Common Core State Standards: 3rd and 4th Grade Mathematics and Language Arts. Golden, CO: Seneca Consulting.
Englert, K. & Randel, B. (2017). Kaiapuni Assessment of Educational Outcomes (KĀʻEO): Technical Manual. Golden, CO: Seneca Consulting.
Randel, B. (2015). Alignment of LAS Links Español (Form B) to the WIDA Standards Summary & Technical Report. Centennial, CO: Century Analytics, Inc.
Englert, K., & Randel, B. (2013). Jefferson County Public School District Early Mathematics Assessment Technical Manual. Golden, CO: Seneca Consulting.
Englert, K., & Randel, B. (2012). Technical report on student growth percentiles for 2012 test scores. Golden, CO: Seneca Consulting.
Englert, K., & Randel, B. (2011). Technical report on student growth percentiles. Golden, CO: Seneca Consulting.
Haynes, L., Randel, B., Allen, J., Englert, K., Cherasaro, T., & Michaels, H. (2011). Analysis and recommendations for CLASS Keys℠ power elements. Atlanta: Georgia Department of Education.
Clark, T. F., Randel, B., & Allen, J. (2010). McREL’s approach to assessment. Denver, CO: McREL.
Randel, B. (2010). Student achievement: A gap analysis of targeted subgroups’ performance across Nebraska. Denver, CO: McREL.
Randel, B. (2010). Georgia CRCT erasure study and Gainesville City Schools. Denver, CO: McREL.
Clark, T. F., Englert, K., Frazee, D., Shebby, S., & Randel, B. (2009). A McREL report prepared for Stupski Foundation’s Learning System: Assessment. Denver, CO: McREL.
Apthorp, H. S., Bodrova, E., Randel, B., Clark, T., & Alsop, R. (2007). A Kindergarten – Grade 3 literacy audit for Guilford County Schools. Denver, CO: McREL.
Randel, B. (2006). North Dakota Reading First: Year 4 Evaluation Report. Denver, CO: McREL.
CTB/McGraw-Hill (2005). Arizona Instrument to Measure Standards Technical Report. Monterey, CA: Author.
CTB/McGraw-Hill (2005). Arizona linking study: Technical Report for Linking TerraNova to SAT/9. Monterey, CA: Author.
CTB/McGraw-Hill (2004). Fox in a Box: Technical Report 2 Grades K-3. Monterey, CA: Author.
CTB/McGraw-Hill (2003). InView Technical Report. Monterey, CA: Author.
Presentations
Borman, G., Randel, B., Zoblotsky, T., & Gargani, J. (2013, May). Exploratory and confirmatory contrasts. Investing in Innovations Project Directors Meeting, Washington, DC.
French, B. F., Finch, H., Randel, B., & Hand, B. (2013, May). Measurement invariance techniques to enhance measurement sensitivity. Annual Meeting of the National Council on Measurement in Education, San Francisco, CA.
Clark, T. F. & Randel, B. (2011, April). Three measures of formative assessment. Annual Meeting of the American Educational Research Association, New Orleans, LA.
Beesley, A. D., Randel, B., & Clark, T. F. (2011, April). Classroom Assessment for Student Learning: Impact on elementary school mathematics. Annual Meeting of the American Educational Research Association, New Orleans, LA.
Kim, D.I., Kim, J. N., & Randel, B. (2011, April). Estimating Probability of Being Placed Into a Performance Level. A paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Randel, B., Beesley, A. D., Wang, X., & Clark, T. F. (2011, March). Student and Teacher Impacts of Professional Development in Classroom Assessment for Student Learning. A poster presented at the 2011 conference of the Society for Research on Educational Effectiveness, Washington, DC.
Randel, B., Beesley, A. D., Wang, X., & Clark, T. F. (2010, June). Reliability and validity of a classroom artifact instrument as a measure of teacher assessment practice. A poster presented at the 2010 Institute of Education Sciences Research Conference, National Harbor, MD.
Clark, T. F., & Randel, B. (2009, April). Effective Professional Development in Classroom Assessment. In J. R. Flaitz (Chair), Formative Assessment in the Classroom: The Impact of Teacher Education and Professional Development. A symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.
French, B. F., Randel, B., & Finch, W. H. (2009, April). Measurement Sensitivity for Randomized Control Trials: When Differential Item Functioning Is a Good Thing. A poster presented at the annual meeting of the American Educational Research Association, San Diego, CA.
Clemons, T., Clark, T. F., Randel, B., & Apthorp, H. (2008). Teacher knowledge of classroom assessment practices. A poster presented at the 2008 Institute of Education Sciences Research Conference, Washington, DC.
Clark, T. F., & Randel, B. (2008, March). Is the achievement gap narrowing? Longitudinal analysis of racial differences in reading achievement. In L. D. Robertson (Chair), Literacy assessment and longitudinal studies: Issues and outcomes. A symposium conducted at the annual meeting of the American Educational Research Association, New York, NY.
Randel, B. (2008, March). Discussant in T. Siskind (Chair), Professional development programs in formative assessment: Do changes in teacher practice improve student achievement? Invited symposium conducted at the annual meeting of the National Council on Measurement in Education, New York, NY.
Randel, B., & Clark, T. F. (2007, November). Multilevel longitudinal modeling as a method for evaluating Reading First. In F. Newman (Chair), Applications of multilevel longitudinal analysis. A symposium conducted at the annual meeting of the American Evaluation Association, Baltimore, MD.
Randel, B., Moore, L., & Blair, P. (2007, June). Dropout and completion rates in the Central Region: Issues and Challenges. A poster presented at the 2007 Institute of Education Sciences Research Conference, Washington, DC.
Randel, B., Barrett, M., & Choi, S. (2006, April). Establishing DIF on single-prompt writing assessment: In search of appropriate external criteria. In B. G. Rogers (Chair), Differential item/person functioning. A symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA.
Barrett, M., Choi, S., & Randel, B. (2006, April). Calibrating a single-prompt writing test: An investigation of Rasch polytomous model behavior. In G. E. Stone (Chair), The Rasch model applied to polytomous data. A symposium conducted at the annual meeting of the American Educational Research Association, San Francisco, CA.
Randel, B., Blair, P., & Kim, D. I. (2005, October). Arizona linking study: Linking TerraNova to SAT/9. Plenary Session at the annual meeting of the Arizona Educational Research Organization, Phoenix, AZ.
Sotaridona, L., Barrett, M., Choi, S., Um, K., Randel, B., & Kim, D.I. (2005, June). Incorporating person fit analysis into the assessment framework using IRT. A paper presented at the annual National Conference on Large-Scale Assessment, San Antonio, TX.
Randel, B., Chen, L.S., & Kim, D.I. (2004, April). Linking a diagnostic literacy assessment to a standardized test of reading. In M. Pitoniak (Chair), Joining cognition and assessment. A symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.
Randel, B., & Fried, J. (2004, April). Proficiency in reading comprehension: Implications of Reading First at Grade 3. In M. Chambliss (Chair), Multiple perspectives on reading comprehension. A symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.
Kim, D.I., Lee, G., Tomkowicz, J., & Randel, B. (2004, April). Linking a short selected-response test to a test with mixed-response items using five linking procedures. In S. Tay-Lim (Chair), Equating. A symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.
Randel, B. (2003, April). Comparing upward and horizontal educational attainment: A discriminant function analysis. A paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Randel, B. (1999, April). Attitudes, beliefs, and mathematics achievement of German and Japanese high school students. A poster presented at the Biennial meeting of the Society for Research in Child Development, Albuquerque, NM.
Randel, B. (1998, June). Cultural values support engagement in school. In C. Kinney (Chair), Getting on Track: Cross-national research on student engagement in school. A symposium conducted at the Convention for the Society for the Psychological Study of Social Issues, Ann Arbor, MI.
Randel, B. (1998, February). Culture as moderator of motivation and achievement. A poster presented at the Seventh Biennial meeting of the Society for Research on Adolescence, San Diego, CA.