First Advisor

Merchant, William

First Committee Member

Tsai, Chia Lin

Second Committee Member

Larkins, Randy

Third Committee Member

Bowen, Sandy

Degree Name

Doctor of Philosophy

Document Type

Dissertation

Date Created

8-2025

Department

College of Education and Behavioral Sciences, APCE Student Work, Applied Statistics and Research Methods

Abstract

The purpose of this dissertation study was to investigate how different delivery formats (written vs. sign language) of a course evaluation survey affect the response rate, psychometric properties, and experiences of deaf and hard of hearing (DHH) students completing a course evaluation of total effectiveness. The following research questions guided this study:

Q1 To what extent do the different delivery formats of the online course evaluation of total effectiveness affect the response rate of DHH college students in KSA? (Quantitative question)

Q2 Do psychometric characteristics of the online course evaluation of total effectiveness differ by the design format of the online evaluation survey? (Quantitative question)

Q3 How do deaf college students perceive online course evaluation of total effectiveness with sign language formats? (Qualitative question)

A sequential explanatory mixed-methods design was used. In the first phase, quantitative data were gathered and analyzed. Ninety deaf college students were surveyed (experimental group = 48; control group = 42). The overall response rate was very low (12.5%). The experimental group attained a marginally higher response rate (RR = 13.41%) than the control group (RR = 11.73%), but a chi-square test indicated that this difference between the experimental and control groups was not statistically significant (p = 0.573). Such a low response rate is not uncommon for online surveys.
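As a rough illustration, a chi-square test of independence on response counts can be run as follows. The abstract does not report how many students were invited per group, so the invited-pool sizes in this contingency table are hypothetical values chosen only to demonstrate the test, not figures from the dissertation.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table: rows = experimental / control group,
# columns = responded / did not respond. The 48 and 42 respondents come
# from the abstract; the invited-pool size (358 per group) is an assumed
# illustrative value, not a reported figure.
table = [
    [48, 358 - 48],  # experimental group: responded, did not respond
    [42, 358 - 42],  # control group: responded, did not respond
]

chi2, p, dof, expected = chi2_contingency(table)  # Yates-corrected for 2x2
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")  # p > 0.05: not significant
```

With these assumed pool sizes the test, like the study's, fails to reach significance at the 0.05 level.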

Next, a confirmatory factor analysis (CFA) was performed on the data for the course survey evaluation (CSE) subscale (four items). The model fit results indicated that the data aligned very well with the course survey evaluation model. I also tested a one-factor model across groups for configural, metric, and scalar invariance. All groups demonstrated configural, metric, and scalar invariance, confirming that the psychometric properties of the online course survey did not vary across the delivery formats used to evaluate overall effectiveness. Internal consistency reliability (Cronbach’s alpha) and McDonald’s omega reliability were computed on the CFA data for the CSE instrument, and the results met acceptable standards.
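For reference, Cronbach's alpha can be computed directly from an item-score matrix. The sketch below uses simulated four-item Likert-style responses (the actual CSE data are not reproduced in this abstract), so the resulting value is illustrative only; McDonald's omega additionally requires the CFA factor loadings and is omitted here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated data: 90 respondents x 4 items driven by one latent factor,
# mimicking the four-item CSE subscale. Purely illustrative.
rng = np.random.default_rng(0)
latent = rng.normal(size=(90, 1))
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(90, 4))), 1, 5)

alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha = {alpha:.3f}")
```

Because all four simulated items share one latent factor, the computed alpha lands comfortably in the acceptable range.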

In the second phase, qualitative data were collected and analyzed to clarify and explain the statistical results by delving more deeply into participants' perspectives on the course survey evaluation and their response rates. Eight deaf college students participated. Data were collected through in-depth semi-structured interviews. From these interviews, five themes were identified: (a) needs for more support; (b) being authentic in communication; (c) problems with online surveys; (d) visual quality; and (e) recommendations for increasing the response rate. Participants believed these themes to be worthy of note for increasing deaf college students’ response rates. Finally, limitations, directions for future research, and implications were reported.

Abstract Format

html

Places

Greeley, Colorado

Extent

169 pages

Local Identifiers

Alqahtani_unco_0161D_11372.pdf

Rights Statement

Copyright is held by the author.

Digital Origin

Born digital
