P-22 Validation of an Educational Kinesiology Tutorial for Entry-Level DPT Students

Presenter Status

Graduate Student, Department of Physical Therapy

Second Presenter Status

Graduate Student, Department of Physical Therapy

Third Presenter Status

Graduate Student, Department of Physical Therapy

Fourth Presenter Status

Graduate Student, Department of Physical Therapy

Fifth Presenter Status

Department of Physical Therapy

Location

Buller Hallway

Start Date

October 31, 2014, 1:30 PM

End Date

October 31, 2014, 3:00 PM

Presentation Abstract

Introduction/Clinical Relevance: Few educational aids have been developed and validated in the field of kinesiology to help physical therapy students learn and understand kinesiology and biomechanics. The purpose of this study was to examine the validity and reliability of a newly developed kinesiology tutorial, as determined by a panel of kinesiology experts.

Methods: A rubric was developed to evaluate the tutorial against course objectives, the APTA’s Foundational Sciences Matrix, and CAPTE accreditation standards. The electronic tutorial included video clips and images of human movement activities. Items from the multiple-choice tutorial, along with the rubric, were distributed to content experts selected for their expertise in kinesiology concepts, as evidenced by their teaching of foundational kinesiology content to physical therapist or exercise science students. Each content expert reviewed an assigned set of questions against the rubric and returned the completed rubric, with constructive feedback, to the researchers. Cronbach’s alpha, the intraclass correlation coefficient (ICC), and the kappa statistic were used to analyze the data.

Results: Inter-rater reliability among reviewers was assessed with the ICC (overall ICC = .756, p < 0.001). Cronbach’s alpha was calculated to compare similar rubric questions across the five reviewers for consistency, and to compare questions addressing the objectives of a Pathokinesiology course with those in the CAPTE Problem Solving Skills section of the rubric (α = .656 to .921 across comparisons; overall α = .954, p < 0.001). The kappa statistic, evaluating agreement between reviewers’ scores, showed overall agreement of 62.7%.

Discussion: Validity and reliability scores for the tutorial were significant, indicating good content and construct validity and excellent inter-rater reliability. Further research investigating the effects of using this newly validated tool in the classroom is suggested.
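
For readers who want to run a similar reliability analysis on their own rubric data, the short Python sketch below computes an ICC, Cronbach’s alpha, and simple percent agreement using the pingouin library. The ratings table, the column names (item, reviewer, score), and the printed values are purely hypothetical placeholders; this is an illustrative sketch, not the study’s actual data or analysis code.

import pandas as pd
import pingouin as pg  # provides intraclass_corr() and cronbach_alpha()
from itertools import combinations

# Hypothetical long-format ratings: each reviewer scores each rubric item.
ratings = pd.DataFrame({
    "item":     [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "reviewer": ["A", "B", "C"] * 3,
    "score":    [4, 4, 5, 3, 3, 3, 5, 4, 5],  # placeholder rubric scores
})

# Inter-rater reliability: ICC across reviewers rating the same items.
icc = pg.intraclass_corr(data=ratings, targets="item",
                         raters="reviewer", ratings="score")
print(icc[["Type", "ICC", "pval"]])

# Internal consistency: Cronbach's alpha on the item-by-reviewer score matrix.
wide = ratings.pivot(index="item", columns="reviewer", values="score")
alpha, ci = pg.cronbach_alpha(data=wide)
print(f"Cronbach's alpha = {alpha:.3f}, 95% CI = {ci}")

# Simple percent agreement: share of items on which each reviewer pair gave
# identical scores, averaged over all reviewer pairs.
pairs = [(wide[a] == wide[b]).mean() for a, b in combinations(wide.columns, 2)]
print(f"Mean pairwise percent agreement = {100 * sum(pairs) / len(pairs):.1f}%")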
