How CCDA instrument analyzed science pre-service teachers’ prior knowledge?

  • Safira Permata Dewi Biology Education, Faculty of Teacher Training and Education, Universitas Sriwijaya, Indonesia
  • E Ermayanti Biology Education, Faculty of Teacher Training and Education, Universitas Sriwijaya, Indonesia
  • Lucia Maria Santoso Biology Education, Faculty of Teacher Training and Education, Universitas Sriwijaya, Indonesia
Keywords: Cell cognitive diagnostic, Discrimination index, Difficulty index, Validity, Reliability

Abstract

Learning can be made more effective by exploring pre-service science teachers' prior understanding of the cell concepts to be studied. The Cell Cognitive Diagnostic Assessment (CCDA) instrument can be used to explore pre-service science teachers' prior knowledge of cell concepts. This study aims to determine the effectiveness of the CCDA instrument that has been developed. The research sample (n = 163) consisted of pre-service science teachers from the Departments of Chemistry Education, Physics Education, and Biology Education, Faculty of Teacher Training and Education, Universitas Sriwijaya, Indonesia. The topics tested include the structure and function of cells, cell membranes and molecular transport, cell reproduction, and cell communication. The results showed that all the items developed were valid, with high reliability (0.86), a very good discrimination index (0.44), and a balance between the number of items classified as difficult and moderate. Although the results show that the instrument is valid and reliable, some items still need improvement before future use.
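The statistics reported above come from classical item analysis. As a minimal sketch of how such figures are typically computed (the response matrix below is hypothetical, not the study's data), the difficulty index is the proportion of examinees answering an item correctly, the discrimination index contrasts upper and lower scorers, and reliability for dichotomous items can be estimated with KR-20:

```python
# Hypothetical response matrix: rows = examinees, columns = items,
# 1 = correct answer, 0 = incorrect. Not the study's actual data.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 1, 0],
]

n_examinees = len(responses)
n_items = len(responses[0])

# Difficulty index p: proportion of examinees who answered the item correctly.
difficulty = [sum(row[j] for row in responses) / n_examinees
              for j in range(n_items)]

# Discrimination index D: proportion correct in the upper group minus the
# lower group, ranked by total score (27% groups are common in practice;
# halves keep this small example simple).
ranked = sorted(responses, key=sum, reverse=True)
k = n_examinees // 2
upper, lower = ranked[:k], ranked[-k:]
discrimination = [
    sum(row[j] for row in upper) / k - sum(row[j] for row in lower) / k
    for j in range(n_items)
]

# KR-20 reliability coefficient for dichotomous items.
totals = [sum(row) for row in responses]
mean_t = sum(totals) / n_examinees
var_t = sum((t - mean_t) ** 2 for t in totals) / n_examinees
sum_pq = sum(p * (1 - p) for p in difficulty)
kr20 = (n_items / (n_items - 1)) * (1 - sum_pq / var_t)

print("difficulty:", difficulty)
print("discrimination:", discrimination)
print("KR-20:", round(kr20, 2))
```

With a real data set of 163 examinees, the same formulas would yield the per-item difficulty and discrimination values and the overall reliability coefficient of the kind the abstract reports.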


Published
2021-04-25
How to Cite
Dewi, S. P., Ermayanti, E., & Santoso, L. M. (2021). How CCDA instrument analyzed science pre-service teachers’ prior knowledge? Biosfer: Jurnal Pendidikan Biologi, 14(1), 25-35. https://doi.org/10.21009/biosferjpb.18247