Online graduate educational technology program: An illuminative evaluation

Andrew Topper, Ph.D. a,*, Sean Lancaster, Ph.D. b

a Educational Psychology, Grand Valley State University, Grand Rapids, MI 49504, United States
b Special Education, Grand Valley State University, Grand Rapids, MI 49504, United States
A R T I C L E  I N F O

Article history:
Received 7 January 2016
Accepted 11 October 2016
Available online 21 October 2016

Keywords:
Educational technology
Program development
Program evaluation
Graduate education
Online instruction
Higher education

A B S T R A C T
With continued growth in online courses and programs in higher education, a pressing need exists to evaluate their perceived quality and effectiveness. Evaluation criteria – course evaluations, student surveys and retention data – from previous online program evaluations were used in this study. An illuminative evaluation using descriptive and scientific analysis was undertaken for a graduate degree program in educational technology. Course and program-level data were analyzed to compare quality for two programs – an existing hybrid and a new online program. Analysis of student enrollments, course evaluations, survey results, retention, and time to completion reveals similar experiences reported from students in both programs. Results suggest that a majority of students were satisfied with their graduate experience and view those experiences as worthwhile. This illuminative evaluation provides evidence that online graduate programs are comparable and can satisfy stakeholders’ expectations while maintaining high levels of quality.
© 2016 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
1. Introduction
Allen and Seaman (2011) reveal that 77% of individuals surveyed in public universities agree with the statement “online education is critical to the long-term strategy of my institution” (p. 29). The same study reported online enrollment as 31.3% of total enrollment in those public universities. Regarding online education, Allen and Seaman found a steady enrollment of 34.4% in 2011 (p. 38) and a growth rate for online enrollment of 9.3%, with 32% of students taking at least one online course. Their definition of an “online course” is one having at least 80% of course content delivered online.
This growth in online courses and programs raises questions concerning effectiveness and student success when compared with traditional, on-campus offerings. Concerns about students’ persistence and success in online courses surfaced shortly after institutions started offering them (Simonson, Schlosse, & Orellana, 2011). Deka and McMurry (2006) offered a baseline definition of student success: “Two common indices for measuring success are class grade and retention rates” (p. 2).
Tallent-Runnels et al. (2006) examined early studies of online courses and found that most were descriptive and lacked sufficient rigor. The authors defined three types of courses: traditional, face-to-face; hybrid or blended, with some online activities; and online, with no face-to-face activities. Four themes that impact online instruction emerged from their review: course environment, learner outcomes, learner characteristics, and institutional administrative characteristics. Conclusions drawn from this research identify students’ preferences for the convenience and self-paced approach of online courses, especially among those with prior experience, and the critical role of interactions in student success. Other researchers have identified loss of social connectedness, often operationalized in the literature as social presence, as an additional drawback of online courses.
1.1. Student success in online courses
Initial online course research demonstrated mixed results, which likely reflects the multiple factors successful online learning depends on – i.e., institutional support, pedagogy, faculty objectives, content, student characteristics, etc. Hara and Kling (2000) evaluated student experiences in Internet-enabled (hybrid) courses using a qualitative case study approach and found no evidence of isolation, increased student anxiety and frustration. They identified adequate technical support, clear expectations, development of social presence and prompt feedback from faculty as indicators of success.

* Corresponding author.
E-mail addresses: toppera@gvsu.edu (A. Topper), lancasts@gvsu.edu (S. Lancaster).
http://dx.doi.org/10.1016/j.stueduc.2016.10.002
Studies in Educational Evaluation
More recent research comparing student course evaluations in online, hybrid and traditional formats (Topper, 2007) did not find statistically significant differences using measures of instructional quality. Richardson (2009) also found no significant difference in academic quality for online courses, with students adopting a different approach to learning and spending more time on it, and noted that appropriate training and support are required for faculty.
In contrast, Atchley, Wingerbach and Akers (2013) reported statistically significant differences in course completion and academic performance between 2004 and 2009 based on format and discipline. Mayes, Luebeck, Akarasriworn, and Korkmaz (2011) identified learners, faculty, medium, community and discourse, pedagogy, assessments and content as elements of successful online courses.
Lee and Choi (2011) reviewed research on online course dropout rates and identified three main categories that influence students’ decisions: student factors, course/program factors and environmental factors. Student factors include academic background, knowledge and skills, and psychological attributes, while course/program factors include design, institutional support and interactions. Environmental factors include work commitments and supportive learning environments. The authors also provide specific strategies for addressing these factors in their review.
Hart (2012), in her review of the literature on student persistence in online courses, provides a more nuanced interpretation of persistence, contrasting it with attrition – withdrawal from an online course – and identified factors that might contribute to persistence: satisfaction with online learning, a sense of belonging or community, motivation, peer and faculty support, time management and increased communication with instructors. Crawford-Ferre and Wiest (2012) identified course design, interactions and faculty preparation and support as necessary for effective online instructional practices.
While initial evidence regarding online courses indicates some areas of concern – retention, increased student anxiety, frustration, timely faculty communication and lack of social presence – more recent research suggests that student experiences in online courses are comparable with those in traditional and hybrid formats. Research examining experiences of students in online or distance programs is less prominent, but early results are promising.
1.2. Student success in online programs
Online graduate program evaluations are less prominent in the literature, as indicated by Horne and Sandmann (2012). Of over 150 published research articles they reviewed, only five met the authors’ criteria for inclusion in their literature review. The authors found that: “Program evaluation research is needed to test theoretical evaluation models or approaches to determine which are most useful and valuable in program planning and evaluation” (p. 575).
Martinez, Liu, Watson, and Bichelmeyer (2006) evaluated an online instructional design and technology master’s degree using faculty and administrator interviews, and student surveys collected in 2004. Their results indicate the online program was equivalent in terms of quality, admission and evaluation criteria, while faculty found it more difficult and time consuming to teach online. Mills’ (2007) evaluation of an online and on-campus nursing program from 1997 to 2003 included student admissions, outcome measures, course grades, time to completion, retention and graduation rates. The author found that online students took longer to complete their program but had a higher overall retention rate, and while online graduate program enrollment increased, on-campus enrollment steadily declined.
Muller (2008) interviewed undergraduate and graduate women enrolled in on-campus and online programs, focusing on learners’ persistence to completion, and found multiple barriers or factors that contribute to persistence: motivation, engagement in learning, communication and appreciation for the convenience of online programs.
Faculty responsibilities typically include course development, instruction, course structure, evaluation and assessment, among other factors (Crews, Wilkinson, Hemby, McCannon, & Wiedmaier, 2008). Faculty members are also responsible for timely communication with students, developing a sense of community or belonging, assessment, and structuring course materials in pedagogically appropriate and accessible forms.
McDonnell et al. (2011) evaluated an online teacher education program in severe disabilities at the University of Utah using pre- and post-test scores, IEP scores, performance within the program, average GPA in specialized courses, PRAXIS II composite scores, and student course evaluations. The authors report no significant differences for students in the online program compared with their on-campus cohorts on measures of learning.
Paul and Cochran (2013) describe institutional responsibilities including infrastructure (e.g., server space, reliable internet speeds, and learning management systems), tutorials for students and faculty, instructional technology support, and help desks among other factors associated with successful online programs. While essential for successful development and implementation of online programs, institutional factors were addressed prior to program implementation in the North Central Association (NCA) accreditation proposal and are not considered as part of this illuminative evaluation.
A case study by Czerkawksi (2013) described an online educational technology master’s degree, implemented in 2008 and focused on emerging technologies, highlighting the importance of pedagogical effectiveness for measuring program quality with attention to influences of university culture. The author recommends conducting a preliminary evaluation before a more comprehensive program evaluation.
Gazza and Hunker (2014) focused on factors that contribute to increased student retention in online programs – social presence, course/program quality and individual student characteristics. Their analysis of twenty-three articles exploring retention in online programs indicates that the issue is multidimensional, and they recommend specific strategies including holding virtual office hours, promptly replying to student inquiries, establishing clear criteria, soliciting feedback via course and program evaluations, offering mandatory online student orientation and facilitating student-to-student and student-to-faculty interaction.
The small number of online program evaluations published to date provides some optimism for the future. Comparisons with on-campus and hybrid programs measuring quality, grades, time to completion, retention and graduation rates all indicate similar results online. A variety of factors clearly are required for success, including student characteristics, faculty development and pedagogy, and institutional support.
Based on a review of the salient research on online graduate course and program evaluations, the following data were used in this study: at the course level – enrollments, student evaluations, perceptions of course quality, and retention rates; and at the program level – enrollments, retention rates, time to graduation and student perceptions of quality and value.
1.3. Purpose
The purpose of this illuminative evaluation was two-fold: (a) to examine data reflecting enrollments and quality of an online graduate degree program in educational technology, compared with a hybrid program, and determine the extent to which they are similar or different; and (b) to add to the small, but growing, body of evidence regarding the quality and effectiveness of online graduate courses and programs.
2. Research questions
1. Can enrollments in an online graduate program offset reductions in a hybrid program?
2. How do student experiences in online graduate courses and programs compare with other formats – hybrid?
3. How do the results of this illuminative evaluation compare with and contribute to previous research on online graduate courses and programs?
3. Background
A College of Education within a regional institution in the Midwest United States, primarily serving students in the local geographic area, serves as the context for this illuminative evaluation. The college provides undergraduate teacher education as well as master’s level programs, certificates and courses. It has the largest population of part-time graduate students in the university, but has seen a steady decline in enrollments over the past few years due to a variety of factors – changes in state requirements for teacher professional certification, elimination of graduate tuition reimbursement, changes in adult populations and other economic conditions.
The online master’s degree in educational technology was designed to serve full-time educators who reside outside the local area while maintaining high levels of instructional quality, allowing the university to operate in a competitive market. For students who currently do not have access to graduate education, including alumni, an online master’s degree provides opportunities for continued professional education in areas not served by existing on-campus offerings.
The institution is accredited by the NCA, a regional association comprised of higher educational institutions in north central states of the U.S. Within the NCA, the Higher Learning Commission (HLC, http://www.hlcommission.org/) is responsible for postsecondary education accreditation. Following internal approval for the online graduate program, an on-site HLC visit was required before the online/distance program could be offered, and the institution received approval in 2011.
Criteria evaluated by HLC focus on institutional context and commitment, curriculum and instruction, faculty and student support, and evaluation and assessment, using required standards established by the Western Cooperative for Educational Communications (Best Practices for Electronically Offered Degree and Certificate Programs, 2002). The institution requires that new graduate programs develop a formal evaluation plan targeted at sharing results with university stakeholders. The original evaluation plan focused on assessment data, for purposes of accreditation, and a more formal illuminative evaluation approach was adopted following program implementation.
3.1. Common program elements
Both master’s programs provide an emphasis on educational technology integration; have the same faculty members, courses, curricula, and assessments; and result in the same degree awarded – a master’s degree in educational technology. The 33-credit master’s degree includes six graduate courses in educational technology, two foundations of education courses, two elective graduate courses and a capstone experience. All courses are 3 credits and use synchronous and asynchronous computer-mediated communication tools in Blackboard along with other web-based tools and digital media. Data were analyzed across both graduate programs, focusing on course- as well as program-level experiences.
The college offered a master’s degree in educational technology in a traditional format until 2004, when it was converted to a hybrid format with some on-campus activities and a majority of the work completed online. After an analysis of interest, an online option was developed and approved in 2011.
Tenured and tenure-track faculty developed and teach most of the hybrid and online graduate courses, with a smaller number taught by adjunct or affiliate faculty. Faculty members also have experience and expertise related to educational technology integration and online/hybrid instruction. Instructors use tools, like synchronous chats, to develop a sense of community in their online courses, and an online advising portal is available for students as well.
3.2. Online program
No on-campus activities are required for the online program, and students who live outside the state pay in-state tuition for their graduate credits. Only students who reside outside the local geographic area are eligible to enroll in online courses.
4. Significance
The results of this program evaluation, when added to similar published studies, provide evidence of the quality and effectiveness of graduate programs in an expanding marketplace. Together with previous studies of this kind, the results reported here contribute to evidence of additional quality online graduate offerings, as well as suggested criteria for evaluating those offerings.
5. Theoretical framework

5.1. Illuminative evaluation
Traditional educational program evaluations have inherent limitations and are typically focused on measurement as opposed to description. Alternative evaluation approaches offer flexibility and a more naturalistic method for educational programs. The theoretical framework used is program evaluation, or implementation analysis (Ryan, 1999), and the methodology used is illuminative evaluation (Parlett & Hamilton, 1972).
The illuminative approach to evaluation provides a range of information and a degree of flexibility that cannot be duplicated by using any evaluation paradigm concentrated on effectiveness testing (Gordon, 1991, p. 373).
Parlett and Hamilton (1972) developed a program evaluation method focusing on details, such as goals and objectives, pedagogical approach, course content and overall philosophy. Illuminative evaluation involves observation, inquiry and explanation, with a dual focus on instructional systems as well as the learning milieu. The instructional system is defined by the original scope, goals and assumptions of the program developed, while the learning milieu is the realization of the program influenced by the actual assumptions, contributions and experiences of the stakeholders.
Illuminative evaluation has been used in a variety of domains, including government educational initiatives (Alderman, 2015; Miles, 1981), professional nursing education (Ellis & Nolan, 2005), workplace learning (Van Rensburg, 2008) and social work (Gordon, 1991). Illuminative evaluation allows researchers to focus on both intentional and unintentional consequences which typically occur in program implementation (Alderman, 2015).
Steps in illuminative evaluation are: (a) design the evaluation by identifying evaluation questions, data sets, and themes from the literature; (b) collect and analyze data, in this case using descriptive statistical analysis, student surveys and course evaluations, and program-level data; (c) interpret results by reviewing themes from the literature and making connections to them; and (d) prepare an evaluation report and disseminate findings to stakeholders.
This approach to program evaluation is rooted in social anthropology, has an explicit focus on the contexts where educational innovation operates, and is designed to develop a comprehensive examination of a complex social system. Illuminative evaluation was selected because it is formative, emphasizing interpretation and understanding, whereas traditional methods of evaluation are rooted in positivistic theories and prescriptive, focusing on effectiveness and causality. A copy of the program evaluation report has been shared with stakeholders to inform program evolution, and the illuminative evaluation process is ongoing.
6. Materials and methods

6.1. Participants & sample
A purposeful, convenience sample was drawn from students in an online graduate educational technology program between May 2011 and April 2014. As of May 1, 2014, 71 students were enrolled in the online program. A majority of these students were full-time educators enrolled part-time; 81% were female, primarily teaching in K-12 settings, and two-thirds were under the age of 35. Student demographics are similar to those of the hybrid graduate program.
6.2. Data sources
Student course evaluation (SCE) data were provided by the institution and supplemented with surveys from recent graduates of the program, as well as other program-specific data. Course data included (C1) enrollments, (C2) student evaluations, (C3) surveys, and (C4) retention rates. Program-specific data included (P1) enrollments, (P2) student surveys, (P3) time to graduation and (P4) retention rates. Where applicable, hybrid program and course data are used as a baseline for comparing students’ experiences. Analysis included use of descriptive statistics along with Kruskal–Wallis and Mann-Whitney U tests.
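The Kruskal–Wallis test named above can be run with standard tools. The sketch below is illustrative only: the rating data are hypothetical (not the study's), and SciPy is an assumed dependency.

```python
from scipy.stats import kruskal

# Hypothetical 5-point course-evaluation ratings for three course formats;
# illustrative only, not the study's data.
hybrid = [4, 3, 4, 5, 3, 4, 4]
online = [4, 4, 5, 4, 5, 4, 4]
other  = [5, 4, 5, 5, 4, 5, 4]

# Kruskal-Wallis tests whether the samples come from the same distribution
# without assuming normality, which suits ordinal rating data.
h_stat, p_value = kruskal(hybrid, online, other)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
```

A nonparametric omnibus test like this is the usual first step before the pairwise Mann-Whitney comparisons the study also reports.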
6.3. Ethical issues
A university institutional review board approved all aspects of this research, and participants were provided all required security and privacy protections for ethical educational research. All data collected and analyzed for this illuminative evaluation meet ethical standards for treatment of human subjects in research. The authors do not have any vested interests in the outcome of this evaluation.
7. Results

7.1. Course-level results
Comparisons of course-specific data for different formats are included in this section: fully online; hybrid – with some on-campus activities but the majority done online; and other – traditional weekly class sessions on campus with 3–4 online sessions, as well as weekend class sessions (Friday evening and all day Saturday) and condensed sections (8 class sessions over 2 weeks, 8:30 am–3:00 pm). Table 1 (below) summarizes student enrollments (C1) for courses offered between 2010 and 2014 by format.
Course enrollments (C1) have fluctuated since the online program was implemented, with online courses increasing initially and again in 2013–14, while hybrid enrollments have dropped since 2011–12 and other formats increased initially before dropping between 2012 and 2014. A downward trend in student course enrollments has been seen in other graduate courses in the college during the same time period.
7.1.1. Student course evaluations
SCE response rates for online courses have been reported by researchers (Stowell, Addison, & Smith, 2012) as significantly lower than those for face-to-face courses. Average response rates for SCE data (C2) collected for this illuminative evaluation, where multiple sections of some courses are offered each term, show small variations – see Table 2 below – indicating a lower rate for hybrid formats than online or other.
Using a multi-level model incorporating course, year/term and format provides an accurate representation of the effect format has on SCE scores by taking into account the variability associated with different courses. Data analysis conducted using a pairwise comparison shows three significant differences – see results in Table 3 below.
Post hoc analysis using pairwise comparisons was conducted only where the overall F test was statistically significant – see Table 4 below.
Results of the analysis reveal that mean SCE scores are generally lower for hybrid courses when compared with online and other formats, while mean scores for online and other formats are not statistically different.
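The Bonferroni logic behind these comparisons can be made concrete. The sketch below applies the corrected threshold from the Table 3 note (0.05 divided by three comparisons) to the pairwise p-values reported in Table 4; it assumes the three Table 4 rows correspond to the three items whose overall F test was significant, which Table 4 itself does not label.

```python
# Pairwise p-values from Table 4, one row per SCE item with a significant
# overall F test (assumed to be items 7, 13 and 17 from Table 3).
pairwise = {
    "Other vs Hybrid":  [0.0118, 0.0098, 0.0130],
    "Other vs Online":  [0.2995, 0.1060, 0.3578],
    "Hybrid vs Online": [0.0751, 0.2338, 0.0633],
}

# Bonferroni-corrected threshold for three comparisons (~0.0167).
alpha = 0.05 / 3

# Count how many items clear the corrected threshold for each pair.
significant = {pair: sum(p < alpha for p in ps) for pair, ps in pairwise.items()}
print(significant)
```

Under this reading, only the Other vs Hybrid comparisons survive the correction, which is consistent with hybrid courses showing the lowest mean SCE scores.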
7.1.2. Student survey results
Thirty-four online survey (C3) invitations via SurveyMonkey were emailed to all completers of the master’s programs. Thirty responses were received, for a response rate of 88%; these data included 13 completers of the online program and 17 completers of the hybrid program. Students in the hybrid program also enroll in other, traditional course formats, while those in the online program do not, so these data are analyzed using the two program formats – online versus hybrid/on-campus. Responses reveal that students in both programs rated course offerings as “sufficiently flexible to meet their needs,” with online program students preferring online courses (offerings with no face-to-face class sessions) while students in the hybrid program rated hybrid highest, followed by fully online courses.
As suggested by de Winter and Dodou (2010), a nonparametric Mann-Whitney-Wilcoxon (MWW) test was used to compare the two groups (see Table 5) – hybrid (H) and online (O) programs – on a 5-point Likert-type scale from Strongly Disagree (1) to Strongly Agree (5).
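The MWW test reduces to rank arithmetic over the pooled responses. Below is a minimal stdlib sketch of the U statistic with hypothetical Likert responses (not the study's survey data); a p-value would still require a normal approximation or an exact table.

```python
def mann_whitney_u(a, b):
    """U statistic for group a vs group b, using average ranks for ties."""
    # Pool the observations, remembering which original slot each came from.
    pooled = sorted((v, idx) for idx, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        # Find the block of tied values starting at position i.
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average 1-based rank of the tied block
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg_rank
        i = j + 1
    rank_sum_a = sum(ranks[:len(a)])          # ranks of group a's observations
    return rank_sum_a - len(a) * (len(a) + 1) / 2

# Hypothetical 5-point Likert responses (1 = Strongly Disagree ... 5 = Strongly
# Agree); illustrative only, not the study's data.
hybrid = [4, 5, 3, 4, 4, 2]
online = [5, 4, 4, 5, 3, 4]
print(mann_whitney_u(hybrid, online))
```

Averaging ranks across tied values is the standard adjustment for ordinal Likert data, where ties are common.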
Table 1
Graduate course enrollments 2010–2014.

Academic Year   Hybrid   Online   Other
2010–11         114      15       43
2011–12         128      52       56
2012–13         118      42       76
Data reveal that the two groups of students had no statistically significant differences on any of the Likert-type scale course items surveyed.
7.1.3. Course retention data
Students elect to leave online courses for a variety of reasons, as Willging and Johnson (2009) discovered. The authors began by defining elements of “retention” or student success in online courses and included two factors in their definition: withdrawals from courses after the term begins and students receiving failing grades (D or F) in a course.
Course retention data (C4) suggest that a significant percentage of students completed courses receiving a passing grade while a small number dropped out. When taken as a percentage of total course enrollments, retention data look even more positive. Table 6 represents student withdrawals and failing grades from 2011 to 2014 with percentages for all course formats.
Table 7 (below) provides student course retention data, absent failing grades, for all graduate courses in the college between 2011 and 2014. Course withdrawal or dropout data from hybrid courses offered during the same period are comparable with those from other graduate courses within the college, while online withdrawal or dropout rates are lower.
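The retention rates in Table 6 follow directly from the enrollment, withdrawal and failing-grade counts. The sketch below recomputes them from those counts and matches the published figures to within rounding.

```python
# Counts taken from Table 6; a student counts as retained if they neither
# withdrew nor received a failing grade.
formats = {
    "Hybrid": {"enrolled": 322, "withdrew": 10, "failed": 2},
    "Online": {"enrolled": 169, "withdrew": 2, "failed": 2},
    "Other":  {"enrolled": 129, "withdrew": 0, "failed": 0},
}

retention = {}
for name, f in formats.items():
    retained = f["enrolled"] - f["withdrew"] - f["failed"]
    retention[name] = 100 * retained / f["enrolled"]
    print(f"{name}: {retention[name]:.2f}% retained")
```

This reproduces the roughly 96% (hybrid), 98% (online) and 100% (other) rates reported in Table 6.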
7.2. Program-level results

Hybrid and online program-level data are analyzed in this section. As of May 1, 2014 there were 71 students in the online program. Table 8 (below) compares program enrollments (P1) for both master’s degree programs – hybrid and online – between 2011 and 2014.

Program enrollments have declined steadily since 2011–12, with both programs seeing reductions. Offering an online program has offset, to some extent, the overall drop in program enrollments, which is likely to continue in the future.

Table 2
SCE response rates 2011–2014.

Response Rates       Hybrid    Online    Other
Sections             14        6         7
Avg. Response Rate   61.82%    70.11%    69.61%
Table 3
Multi-level SCE model results.

        Mean, SD                                      Overall F Test
Item    Hybrid        Online        Other             Statistic, p-value*
7       4.08, 0.43    4.33, 0.39    4.58, 0.21        6.52, 0.0252
8       4.19, 0.49    4.40, 0.35    4.52, 0.32        2.98, 0.1158
9       4.11, 0.45    4.28, 0.29    4.61, 0.23        4.51, 0.0552
10      3.85, 0.65    4.19, 0.41    4.31, 0.35        3.02, 0.1131
11      3.84, 0.68    4.21, 0.38    4.50, 0.49        4.40, 0.0579
12      4.11, 0.47    4.18, 0.44    4.57, 0.29        3.47, 0.0899
13      3.96, 0.42    4.16, 0.34    4.49, 0.42        6.25, 0.0277
14      4.11, 0.48    4.17, 0.36    4.56, 0.27        3.70, 0.0802
15      4.16, 0.27    4.23, 0.35    4.49, 0.16        3.75, 0.0781
16      3.92, 0.66    4.06, 0.39    4.46, 0.25        2.54, 0.1477
17      4.02, 0.43    4.21, 0.30    4.44, 0.15        6.50, 0.0254

Note: DF = 7; 95% confidence with Bonferroni correction; p-value less than 0.05/3 = 0.0168 implies statistically significant differences.

Table 4
Course format pairwise comparisons**.

Other vs Hybrid   Other vs Online   Hybrid vs Online
0.0118            0.2995            0.0751
0.0098            0.1060            0.2338
0.0130            0.3578            0.0633

Note: *p-values less than or equal to 0.05 indicate statistically significant differences; Num DF = 2, Den DF = 7.
Table 5
Student survey results regarding online courses.

Survey item                                                        N            Mean          Std Dev       p
Courses in the Ed Tech program seem relevant for positions
in the ed tech field (e.g., teaching and/or technology support).   H=17, O=13   1.47, 1.69    .717, .751    0.36
The amount of coursework required seems appropriate                H=16, O=13   1.31, 1.77    .479, .832    0.18
Course offerings are sufficiently flexible to meet my needs        H=17, O=13   1.81, 2.00    .656, .707    0.53
I received honest and useful feedback on my class performance.     H=17, O=13   1.41, 1.38    .618, .506    0.96
Faculty use a variety of effective instructional practices         H=17, O=13   1.59, 1.69    1.00, 0.630   0.39

Note: p > 0.05; H = hybrid students, O = online students, Std Dev = standard deviation.
Table 6
Course retention rates.

                 Hybrid      Online      Other
Enrollment       322         169         129
Withdrawals      10: 3.1%    2: 1.18%    0
Failing Grades   2: 0.62%    2: 1.18%    0
Retention Rate   96.28%      97.64%      100%

Table 7
College-wide course retention data.

Academic Year   2011–12     2012–13     2013–14
Enrollment      854         1397        916
Withdrawals     31: 3.6%    45: 3.2%    31: 3.4%
7.2.1. Student survey results

Student survey (P2) results (Table 3) included some additional items that were administered to students in the online program. Thirteen (13) graduates of the online program (65%) responded to items regarding their experiences and perceptions of the value of those experiences in their educational technology program. Table 9 (below) reflects the results of survey items related to program quality where students Agree or Strongly Agree with the statement using the Likert-type scale.

Surveys also reveal that 77% have recommended the program to colleagues and 86% are satisfied, overall, with their experiences. Comparing results across both programs, for survey items related to program quality, there were no statistically significant differences observed. When these numbers are combined with responses for other items – recommending the program to colleagues and overall satisfaction – the results indicate that the majority of students in both programs regard their graduate experience as satisfying and worthwhile.
7.2.2. Time to graduation

A review of students’ average time to graduation (P3) over the past 5 years reveals that students are finishing the online program more quickly and in increasing numbers. Table 10 (below) represents the number of graduates from each program, by academic year, with average time to graduation (TTG) in terms of semesters since the online program was implemented in 2011.

Time to graduation data suggest, at least initially, that students in the online program move more quickly through the courses than their counterparts in the hybrid program. Given the small sample, it is too soon to determine whether this trend will continue, and future research should build on these results to confirm their validity over time.
7.2.3. Retention rates

Researchers examined retention rates (P4) – the percentage of students who continue to enroll in available courses or graduate – across both programs by identifying inactive students as those who have not enrolled in a graduate course within a 3-year period. Twenty-five students were identified as inactive in the hybrid program, with ten students inactive in the online program.

The authors also identified the number of students who graduated from both programs since 2011 – eight from the hybrid program and four from the online program. Finally, researchers calculated the retention rate from the number of inactive students relative to total program enrollment. Student retention data for both programs are summarized in Table 11.

As with the other program-level data, retention rates are high for both programs, with a slightly higher rate (83%) for the online program. Future work will evaluate graduation rates as well as revisit time to graduation to provide additional evidence of the quality of the online program.
8. Implications

This illuminative evaluation provides evidence indicating no significant differences in the experiences of students in an online graduate program compared with students in a hybrid program. We consider these results an intentional consequence of designing and implementing an online program following established guidelines and standards. Returning to the illuminative evaluation framework applied in this study, student satisfaction levels were an expected outcome, given previous work by other researchers. The quality of the educational experience, measured via student course evaluations, surveys and retention rates, is comparable across courses and programs.

One unintended consequence of the illuminative evaluation was the discovery that students in the hybrid program prefer online courses. The original expectation was that offering the online program would offset the drop in the hybrid program; this did occur, but at both the course and program level, enrollments continue to decline. The online program did extend and expand graduate offerings beyond the local geographic area. Continued research may result in elimination of the hybrid program based on reduced funding and a variety of other institutional factors.
Illuminative evaluation proved to be an effective framework for online graduate program evaluation and should be considered for future evaluations. These results have important implications for the small but growing research base on online graduate courses and programs. Certainly this study suggests that when developed and implemented effectively, online graduate offerings – courses and programs – are of similar quality as hybrid options. Graduate students report no significant differences in quality at the course and program level, using a variety of data sources.

Table 8
Program enrollments 2010–2015.

Academic Year   Hybrid   Online   Total
2010–2011       31       0        31
2011–2012       38       25       63
2012–2013       27       26       53
2013–2014       24       20       44

Table 9
Student survey results regarding online program.

Survey item                                                                                % Agree
Ed tech program supported my professional goals                                            85%
Courses in the program seemed relevant for positions in the educational technology field   85%
I feel well prepared to work in the educational technology teaching or related fields      85%
I have been prepared adequately to use available technologies in my work                   85%
The program fostered a sense of academic and intellectual curiosity                        100%
The program was worthwhile                                                                 85%
Faculty in my program are accessible to students                                           100%
The academic advising I receive is satisfactory                                            62%
Ed tech program supports my professional goals                                             85%

Table 10
Average time to graduation.

Academic Year   Hybrid   TTG     Online   TTG
2011–12         15       10.47   1a       20
2012–13         14       13.57   3        6.67
2013–14         8        12.75   5        8.2
Totals          37       10.97   9        9

a Started in the hybrid program but shifted to the online program after moving out of the U.S.

Table 11
Program retention data.

              Hybrid   Online   Total
Enrollments   104      70       174
Inactive      25       10       35
Graduated     8        4        12
8.1. Course-level analysis

Analysis of graduate course enrollments, student course evaluations, survey responses and retention data indicates that there are few, if any, significant differences between online, hybrid and other course formats. Enrollments (C1) in hybrid/other formats have declined and fluctuated while online courses have seen a slight increase over time. Study results confirm what others have observed regarding students’ similar perceptions of course quality reflected in SCE (C2) across different formats (Topper, 2007; Nguyen, 2015; Richardson, 2009): three significant differences for hybrid format courses with no other significant differences observed. Student survey results (C3) indicate no significant differences based on course format. Online course retention (C4) has been identified as a concern by Atchley et al. (2013) and others, but these data suggest that a majority of students, regardless of format, are likely to complete course requirements and receive passing grades – over 95%.
8.2. Program-level analysis
Program-level data provide additional evidence of the overall quality of the program, especially when compared with the hybrid program. Enrollments (P1) in the online program have offset steady declines in the hybrid/on-campus program. Student survey results (P2) indicate high levels of satisfaction with graduate experiences in the program, while time to graduation (P3) is slightly lower for the online program compared with the hybrid/on-campus program, indicating that online students are likely to finish the program sooner. Program retention rates (P4) have also been identified in the literature (Gazza & Hunker, 2014; Rovai, 2002) as an area of concern, but the data analyzed in this study reflect a slightly higher retention rate for students in the online program, with an admittedly small sample size. As with the course-level analysis, evidence gathered regarding similarities and differences across the programs indicates no significant differences.
9. Summary/Conclusions
Graduate course and program enrollments (research question #1) have continued to decline across the college, although the online program has offset these reductions. Enrollments are likely to continue to drop in the future, possibly leading to the elimination of competitive graduate programs in the state as more competition arises. Other graduate degree programs within the college have seen reductions in enrollments, and some will likely cease to operate in the near future. Offering an online program that draws students from outside the local geographic area appears to lessen the impact of continued declines in student enrollments, allowing the educational technology graduate programs to thrive.
Regarding research question #2, data suggest that students have similar experiences and perceptions of course and program quality, and are satisfied with their experiences overall. Students also indicated they would recommend the online program to their colleagues. All of these results, combined with program data, speak to the overall quality and effectiveness of the online graduate program in educational technology. The course-level results are consistent with existing research comparing online with hybrid or on-campus formats (Atchley et al., 2013; Topper, 2007; Stowell, Addison, & Smith, 2012; Young & Duncan, 2014).
This illuminative evaluation also demonstrates that, when developed and implemented following available standards and benchmarks, online graduate degree courses and programs can match the quality and effectiveness of existing hybrid programs. Previous studies by Martinez et al. (2006), McDonnell et al. (2011), and Mills (2007) reported similar results: no significant differences between online and on-campus graduate programs. The results reported here, added to the small but growing body of evidence, suggest that online graduate programs can be similar in quality and effectiveness to hybrid programs (research question #3).
There are clear limitations in this illuminative evaluation, including a small sample size in the online program, a homogeneous population of students – mostly female, younger, white, etc. – and only three years of data for analysis. We plan to continue gathering and analyzing data in the future to confirm the findings from this study. Also, any comparison between master's programs within a college may not generalize to other institutions or colleges, based on different student demographics, faculty expertise, program goals, course content, pedagogy, etc. Given these limitations, more work with a larger sample remains to support the claims made regarding overall online program quality. Hopefully, the conclusions here will provide additional evidence regarding quality online graduate courses and programs.

Appendix A.
Student course evaluation items related to course elements, teaching practices and goals/outcomes

Goals & Outcomes: I gained an understanding of major concepts in the field.
I developed skills or learned information and concepts that I can apply to my professional life.
I am able to think more critically or deeply about the issues and topics related to this course.

Teaching Practices [Instructor]: The instructor explained course objectives and expectations clearly.
The instructor presented course materials in a clear and meaningful way.
The course work allowed me to think critically, problem solve, and inquire more deeply into facts, concepts, and issues related to the course objectives.
The instructor returned papers and other assessments in a timely manner.

Course Elements [Assessments]: Assessments helped me think more deeply about the facts, issues, and concepts related to the course.
The course provided enough opportunities for me to demonstrate what I had learned.
Course discussions, lectures, activities, and/or readings prepared me for course assessments.
Course reading materials were helpful in understanding concepts/factors/issues relating to the goals of the course.

Likert scale {1 = Strongly Disagree, 3 = Neutral, 5 = Strongly Agree}
References

Alderman, L. (2015). Illuminative evaluation applied to government policy in higher education. Evaluation Journal of Australasia, 15(1), 4–14. Retrieved on 10.04.16.

Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United States. Sloan Consortium. Retrieved online on 02.05.14: http://eric.ed.gov/?id=ED529948.

Atchley, T. W., Wingenbach, G., & Akers, C. (2013). Comparison of course completion and student performance through online and traditional courses. The International Review of Research in Open and Distributed Learning, 14(4), 104–116.

Crawford-Ferre, H. G., & Wiest, L. R. (2012). Effective online instruction in higher education. The Quarterly Review of Distance Education, 13(1), 11–14.

Crews, T. B., Wilkinson, K., Hemby, K. V., McCannon, M., & Wiedmaier, C. (2008). Workload management strategies for online educators. Delta Pi Epsilon Journal, 50(3), 132–149.

Czerkawski, B. C. (2013). Strategies for integrating emerging technologies: Case study of an online educational technology master's program. Contemporary Educational Technology, 4(4), 309–321.

Deka, T. S., & McMurry, P. (2006). Student success in face-to-face and distance telecast environments: A matter of contact? The International Review of Research in Open and Distance Learning, 7(1), 1–15.

Ellis, L., & Nolan, M. (2005). Illuminating continuing professional education: Unpacking the black box. International Journal of Nursing Studies, 42(1), 97–106. http://dx.doi.org/10.1016/j.ijnurstu.2004.05.006.

Gazza, E. A., & Hunker, D. F. (2014). Facilitating student retention in online graduate nursing education programs: A review of the literature. Nurse Education Today, 34(7), 1125–1129. http://dx.doi.org/10.1016/j.nedt.2014.01.010.

Gordon, K. H. (1991). Improving practice through illuminative evaluation. Social Service Review, 65(3), 365–378.

Hara, N., & Kling, R. (2000). Student distress in web-based distance education. Information, Communication and Society, 3(4), 557–579. http://dx.doi.org/10.1080/13691180010002297.

Hart, C. (2012). Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning, 11(1), 19–42.

Horne, E. M., & Sandmann, L. R. (2012). Current trends in systematic program evaluation of online graduate nursing education: An integrative literature review. Journal of Nursing Education, 51(10), 570–576. http://dx.doi.org/10.3928/01484834-20120820-06.

Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implications for practice and future research. Educational Technology Research & Development, 59, 593–618. http://dx.doi.org/10.1007/s11423-010-9177-y.

Martinez, R., Liu, S., Watson, W., & Bichelmeyer, B. (2006). Evaluation of a web-based master's degree program: Lessons learned from an online instructional design and technology program. Quarterly Review of Distance Education, 7(3), 267–283, 345–346.

Mayes, R., Luebeck, J., Ku, H.-Y., Akarasriworn, C., & Korkmaz, O. (2011). Themes and strategies for transformative online instruction: A review of the literature and practice. Quarterly Review of Distance Education, 12(3), 151–166, 221–222.

McDonnell, J., Jameson, J. M., Riesen, T., Polychronis, S., Crockett, M. A., & Brown, B. E. (2011). A comparison of on-campus and distance teacher education programs in severe disabilities. Teacher Education and Special Education, 34(2), 106–118. http://dx.doi.org/10.1177/0888406410380424.

Miles, W. R. (1981). Illuminative evaluation for formative decision-making: A case study of teaching/learning styles in mathematics. Evaluation Review, 5(4), 479–499. http://dx.doi.org/10.1177/0193841x8100500403.

Mills, A. C. (2007). Evaluation of online and on-site options for master's degree and post-master's certificate programs. Nurse Educator, 32(2), 73–77. http://dx.doi.org/10.1097/01.NNE.0000264326.10297.e7.

Muller, T. (2008). Persistence of women in online degree-completion programs. International Review of Research in Open and Distance Learning, 9(2), 1–18.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. Journal of Online Learning and Teaching, 11(2), 309–319.

Parlett, M., & Hamilton, D. (1972). Evaluation as illumination: A new approach to the study of innovatory programs. Occasional paper 9. Scotland: Centre for Research in the Educational Sciences, University of Edinburgh. Retrieved online on 22.11.15: http://files.eric.ed.gov/fulltext/ED167634.pdf.

Paul, J. A., & Cochran, J. D. (2013). Key interactions for online programs between faculty, students, technologies, and educational institutions: A holistic framework. Quarterly Review of Distance Education, 14(1), 49–62.

Richardson, J. T. E. (2009). Face-to-face versus online tutoring support in humanities courses in distance education. Arts and Humanities in Higher Education, 8(1), 69–85. http://dx.doi.org/10.1177/1474022208098303.

Rovai, A. P. (2002). In search of higher persistence rates in distance education online programs. The Internet and Higher Education, 6(1), 1–16. http://dx.doi.org/10.1016/S1096-7516(02)00158-6.

Ryan, N. (1999). Rationality and implementation analysis. Journal of Management History, 5(1), 36–52.

Simonson, M., Schlosser, C., & Orellana, A. (2011). Distance education research: A review of the literature. Journal of Computing in Higher Education, 23, 124–142. http://dx.doi.org/10.1007/s12528-011-9045-8.

Stowell, J. R., Addison, W. E., & Smith, J. L. (2012). Comparison of online and classroom-based student evaluations of instruction. Assessment & Evaluation in Higher Education, 37(4), 465–473.

Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., et al. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93–135.

Topper, A. (2007). Are they the same? Comparing the instructional quality of online and face-to-face graduate education courses. Assessment & Evaluation in Higher Education, 32(6), 681–691.

Van Rensburg, E. (2008). An illuminative evaluation of the workplace learning component of Unisa's diploma in Animal Health. Retrieved online on 22.09.15: http://hdl.handle.net/10539/4964.

Willging, P. A., & Johnson, S. D. (2009). Factors that influence students' decision to drop out of online courses. Journal of Asynchronous Learning Networks, 13(3), 115–127.

Young, S., & Duncan, H. E. (2014). Online and face-to-face teaching: How do student ratings differ? Journal of Online Learning & Teaching, 10(1), 70–79.

de Winter, J. C. F., & Dodou, D. (2010). Five-point Likert items: t test versus Mann–Whitney–Wilcoxon. Practical Assessment, Research & Evaluation, 15(11), 1–16.