Improving instructional design with better analysis of assessment data

Kristen L. Murphy, Thomas A. Holme

Abstract


As more instructors articulate learning objectives for their students within a single course, and as academic staff collaborate to articulate learning outcomes for entire programs, a robust means of assessing student performance against these objectives becomes increasingly important. The Examinations Institute of the American Chemical Society (ACS) Division of Chemical Education has recently published content maps that utilise a structure of subdiscipline-independent fundamental concepts narrowing down to content details specific to individual subdisciplines. This structure has been used to align test items and can be applied to assess student performance throughout a program. Learning objectives designed for a course can then be aligned to the framework and used to gauge student learning within a course or across a program. One key to making well-informed instructional decisions is to obtain as much valid information as possible from such assessment work. This paper describes how a rubric for assigning item complexity can be combined with scaled student performance data to gauge the achievement of learning objectives aligned to fundamental concepts in the general chemistry and organic chemistry content maps. It can be argued that information in these forms provides useful guidance for designing improved instruction.

 


Keywords


Testing and assessment, general chemistry, organic chemistry, chemical education research






DOI: http://dx.doi.org/10.5204/jld.v7i2.199

This work is licensed under a Creative Commons Attribution 4.0 License.



© Queensland University of Technology | ISSN: 1832-8342