Validity and Reliability Analyses of School Achievement Test in Disaster Readiness and Risk Reduction for Senior High School

Nucleilee T. Mariño, Dennis G. Caballes

Abstract


This study, anchored in Item Response Theory, conducted an item analysis of the developed School Achievement Test (SAT) in Disaster Readiness and Risk Reduction for Senior High School learners. Employing a descriptive-comparative research design, the study involved twenty-four Grade 12 learners of Batch 2020 in the General Academic Strand (GAS) who took the subject during the second semester of the previous school year, 2018-2019.

The researcher created a 120-item test based on the desired learning competencies of Disaster Readiness and Risk Reduction, a special core subject in the Senior High School curriculum. The items were subjected to validity and reliability analyses using several statistical formulas: the mean, standard deviation, standard error of the mean (SEM), the upper-lower formula, and split-half reliability.
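
The computations behind these statistics are not reproduced in this abstract; the Python sketch below shows one plausible way to obtain them, assuming a 24 x 120 matrix of dichotomously scored (0/1) responses, an odd/even split for the split-half estimate, and the Spearman-Brown correction. The function name, the odd/even split, and the use of the standard-error-of-measurement formula (SD x sqrt(1 - reliability)) are illustrative assumptions, not details taken from the study.

# Minimal sketch of the descriptive and reliability statistics named above,
# assuming a 24 x 120 matrix of dichotomously scored (0/1) item responses.
# Variable names and the odd/even split are illustrative assumptions.
import numpy as np

def descriptive_and_reliability(scores: np.ndarray):
    """scores: shape (n_examinees, n_items), entries 0 or 1."""
    totals = scores.sum(axis=1)                # each examinee's raw score
    mean = totals.mean()
    sd = totals.std(ddof=1)                    # sample standard deviation

    # Split-half reliability: correlate odd-item and even-item half scores,
    # then step the half-test correlation up with the Spearman-Brown formula.
    odd_half = scores[:, 0::2].sum(axis=1)
    even_half = scores[:, 1::2].sum(axis=1)
    r_half = np.corrcoef(odd_half, even_half)[0, 1]
    reliability = 2 * r_half / (1 + r_half)

    # Standard error of measurement derived from the reliability estimate
    # (an assumed formula; the study may have used the standard error of the mean).
    sem = sd * np.sqrt(1 - reliability)
    return mean, sd, reliability, sem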

As a result of the analyses, the test obtained a mean score of 72.58 and a standard error of measurement of 3.64, with 95% of test takers' scores expected to fall between 68.94 and 76.22. An acceptable reliability coefficient of 0.818 was also obtained from the reliability analysis. The item analysis revealed that 51 items have a difficulty index within the range of 0.3 to 0.7 and 69 items have a discrimination index greater than 0.2. In the end, 42 items should be retained, 27 items should be revised, and 51 items should be rejected.
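
For illustration, a short sketch of how items might be screened against the thresholds quoted above (difficulty between 0.3 and 0.7, discrimination above 0.2) using the upper-lower method is given below. The 27% upper/lower grouping and the retain/revise/reject rule (retain when both criteria are met, revise when only one is met, reject otherwise) are common textbook conventions assumed here; the study's exact decision rule may differ.

# Illustrative upper-lower item analysis using the thresholds quoted above.
# The 27% grouping and the retain/revise/reject rule are assumed conventions.
import numpy as np

def item_analysis(scores: np.ndarray, group_fraction: float = 0.27):
    """scores: shape (n_examinees, n_items), entries 0 or 1."""
    n = scores.shape[0]
    k = max(1, int(round(group_fraction * n)))
    order = np.argsort(scores.sum(axis=1))     # rank examinees by total score
    lower, upper = scores[order[:k]], scores[order[-k:]]

    # Difficulty: average proportion correct across the upper and lower groups.
    difficulty = (upper.mean(axis=0) + lower.mean(axis=0)) / 2
    # Discrimination: upper-group minus lower-group proportion correct.
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    decisions = []
    for p, d in zip(difficulty, discrimination):
        ok_p = 0.3 <= p <= 0.7
        ok_d = d > 0.2
        if ok_p and ok_d:
            decisions.append("retain")
        elif ok_p or ok_d:
            decisions.append("revise")
        else:
            decisions.append("reject")
    return difficulty, discrimination, decisions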

Based on the results and conclusions drawn from the study, it is suggested that a more comprehensive validation process, such as item-bias analysis, be applied to further validate the developed SAT. It is also proposed that policies and standards on test construction and validation be implemented to produce valid, reliable, and bias-free test items that can serve as an item bank.

Keywords


Disaster Readiness and Risk Reduction, School Achievement Test, Test Reliability, Test Validity


