
Army Educational Outreach Program

College Qualified Leaders

2014 Annual Program Evaluation Report


U.S. Army Contacts

Jagadeesh Pamulapati, Ph.D.
Acting Executive Director, Strategic & Program Planning
Office of the Assistant Secretary of the Army (Acquisition, Logistics, and Technology)
(703) 617-0309
jagadeesh.pamulapati.civ@mail.mil

Andrea Simmons-Worthen
Army Educational Outreach Program Director on behalf of the Office of the Deputy Assistant Secretary of the Army for Research and Technology
(703) 617-0202
andrea.e.simmons.ctr@mail.mil

AEOP Cooperative Agreement Managers

Louie Lopez
AEOP Cooperative Agreement Manager
U.S. Army Research, Development, and Engineering Command (RDECOM)
(410) 278-9858
louie.r.lopez.civ@mail.mil

Jennifer Carroll
AEOP Deputy Cooperative Agreement Manager
U.S. Army Research, Development, and Engineering Command (RDECOM)
(410) 306-0009
jennifer.j.carroll2.civ@mail.mil

College Qualified Leaders Program Administrators

Artis Hicks
CQL Program Manager
American Society for Engineering Education
(202) 331-3558
a.hicks@asee.org

Tim Turner
Principal Investigator
American Society for Engineering Education
(202) 331-3514
t.turner@asee.org

Report CQL_03_05152015 has been prepared for the AEOP Cooperative Agreement and the U.S. Army by Virginia Tech under award W911NF-10-2-0076.

Virginia Tech Evaluation Contacts

Tanner Bateman
Senior Project Associate, AEOP CA
Virginia Tech
(703) 336-7922
tbateman@vt.edu

Donna Augustine Burnette
Program Director, AEOP CA
Virginia Tech
(540) 315-5807
donna.augustine@vt.edu

Eric Banilower
Senior Researcher
Horizon Research, Inc.
(919) 489-1725


Contents

Executive Summary ... 4

Introduction ... 11

Program Overview ... 11

Evidence-Based Program Change ... 14

FY14 Evaluation At-A-Glance ... 14

Study Sample... 18

Respondent Profiles ... 19

Actionable Program Evaluation ... 23

Outcomes Evaluation ... 49

Summary of Findings ... 67

Recommendations ... 70

Appendices ... AP-1

Appendix A FY14 CQL Evaluation Plan ... AP-2

Appendix B FY14 CQL Student Questionnaire and Data Summaries ... AP-5

Appendix C FY14 CQL Mentor Questionnaire and Data Summaries ... AP-44

Appendix D FY14 CQL Student Focus Group Protocol ... AP-78

Appendix E FY14 CQL Mentor Focus Group Protocol ... AP-80

Appendix F APR Template ... AP-82

Appendix G American Society for Engineering Education (ASEE) Evaluation Report Response ... AP-92


Executive Summary

The College Qualified Leaders (CQL) program, managed by the American Society for Engineering Education (ASEE), is an Army Educational Outreach Program (AEOP) that matches talented college students and recent graduates (herein referred to as apprentices) with practicing Army Scientists and Engineers (Army S&Es, herein referred to as mentors), creating a direct apprentice-mentor relationship that provides apprentice training that is unparalleled at most colleges. CQL allows alumni of Gains in the Education of Mathematics and Science (GEMS) and the Science and Engineering Apprentice Program (SEAP) to continue their relationship with the mentor and/or laboratory, and also allows new college students to enter the program. CQL offers apprentices summer, partial-year, or year-round research at an Army laboratory, depending on class schedules and school location. CQL apprentices receive firsthand research experience and exposure to Army research laboratories. CQL fosters participants' desire to pursue further training and careers in STEM while specifically highlighting and encouraging careers in Army research.

This report documents the evaluation of the FY14 CQL program. The evaluation addressed questions related to program strengths and challenges, benefits to participants, and overall effectiveness in meeting AEOP and program objectives. The assessment strategy for CQL included questionnaires for students and mentors, three focus groups with students and one with mentors, and an annual program report compiled by ASEE.

2014 CQL sites included the US Army Research Laboratory – Aberdeen Proving Ground (ARL-APG), the US Army Research Laboratory – Adelphi (ARL-A), the Walter Reed Army Institute of Research (WRAIR), the US Army Medical Research Institute for Infectious Diseases (USAMRIID), the US Army Aviation & Missile Research Development and Engineering Center – Redstone Arsenal (AMRDEC), the Engineering Research and Development Center Construction Engineering Research Laboratory (ERDC-CERL), the US Army Medical Research Institute of Chemical Defense (USAMRICD), the US Army Center for Environmental Health Research (USACEHR), the Defense Forensic Science Center (DFSC), and the Engineering Research and Development Center in Vicksburg, MS (ERDC-MS).

2014 CQL Fast Facts

Description STEM Apprenticeship Program – Summer or school year, at Army laboratories with Army S&E mentors

Participant Population College undergraduate and graduate students

No. of Applicants 550

No. of Students (Apprentices) 307

Placement Rate 56%

No. of Adults (Mentors) 288

No. of Army S&Es 288

No. of Army Research Laboratories 10

No. of Colleges/Universities 104

No. of HBCU/MSIs 13


Stipend Cost (paid by participating labs) $3,534,144

Administrative Cost to ASEE $129,319

Cost Per Student Participant $11,933

Summary of Findings

The FY14 evaluation of CQL collected data about participants; their perceptions of program processes, resources, and activities; and indicators of achievement in outcomes related to AEOP and program objectives. A summary of findings is provided in the following table.

2014 CQL Evaluation Findings Participant Profiles

CQL had limited success at serving students of historically underrepresented and underserved populations.

 CQL attracted some participation of female students—a population that is historically underrepresented in engineering fields. However, enrollment data suggest that participation of female students was limited: 75% of enrolled apprentices were male and 25% were female.

 CQL served some students from historically underrepresented and underserved race/ethnicity groups; however, that involvement was limited. The vast majority of enrolled apprentices identified themselves as “White” or “Asian”; only 8% identified themselves as being from an underrepresented or underserved minority group (5% Black or African American and 3% Hispanic or Latino).

CQL had limited success in recruiting past AEOP program participants.

 Questionnaire data indicate that the vast majority of responding apprentices had participated in CQL at least once (although it is unclear whether this count includes their current participation), and 30% had participated more than once. In addition, just over 30% of students had participated in SEAP at least once. However, the vast majority of responding apprentices (87% to 98%, depending on the program) had never participated in other AEOP programs.

Actionable Program Evaluation

CQL recruitment was largely the result of pre-existing relationships.

 Mentor questionnaire data indicate that recruitment of students was most commonly done through colleagues, personal acquaintances, and contact from the student.

 Apprentice questionnaire data indicate that apprentices most commonly learned about CQL from someone who works at an Army laboratory, teachers or professors, immediate family members, university resources, friends, mentors, or past CQL participants. In addition, apprentice focus group data support the idea that pre-existing relationships were instrumental in making students aware of CQL.

CQL apprentices were motivated to participate in CQL by a variety of factors.

 Apprentices were motivated to participate in CQL, according to questionnaire data, by an interest in STEM, the desire to expand laboratory and research skills, and the opportunity to learn in ways that are not possible in school. Other highly motivating factors included building a college application or résumé, earning a stipend or award while doing STEM, networking opportunities, and opportunities to use advanced laboratory technology. Focus group data also suggest that apprentices were motivated by the opportunity to gain job and research experience.

CQL engages apprentices in meaningful STEM learning.

 Most apprentices (67-93%) report learning about STEM topics, applications of STEM to real-life situations, STEM careers, and cutting-edge STEM research on most days or every day of their CQL experience.

 Most apprentices had opportunities to engage in a variety of STEM practices during their CQL experience. For example, 93% reported participating in hands-on STEM activities; 88% practicing using laboratory or field techniques, procedures, and tools; 81% working as part of a team; 77% carrying out an investigation; and 76% analyzing and interpreting data or information on most days or every day.

 Apprentices reported greater opportunities to learn about STEM and greater engagement in STEM practices in their CQL experience than they typically have in school.

 A clear majority of mentors report using strategies to help make learning activities relevant to apprentices, support the needs of diverse learners, develop apprentices’ collaboration and interpersonal skills, and engage apprentices in “authentic” STEM activities.

CQL promotes DoD STEM research and careers but can improve marketing of other AEOP opportunities.

 Most mentor interviewees and questionnaire respondents reported limited awareness of AEOP initiatives. Consequently, mentors did not consistently educate their apprentices about AEOPs or encourage apprentices to participate in them. The majority of responding mentors (61-89%) reported never having used AEOP informational resources, including the AEOP website, AEOP instructional supplies, AEOP brochures, and AEOP social media.

 Nearly all CQL participants reported learning about at least one STEM career, and about half (51%) reported learning about 4 or more. Similarly, 86% of students reported learning about at least one DoD STEM job, with 54% reporting they learned about 3 or more. Mentors and the CQL experience contributed the most to this impact.

The CQL experience is valued by apprentices and mentors, although program administration is an area for improvement.

 Responding apprentices reported satisfaction with their mentor and working experience during the CQL program. For example, over 90% of responding apprentices reported being at least “somewhat” satisfied with their mentor, the time they spent with their mentor, and the research experience overall.

 In an open-ended item on the questionnaire, almost all of the responding participants had something positive to say about the program. However, about 30% described frustration with administrative aspects of the program, including a lack of communication, payment problems, and delays in getting clearance and access that limited their ability to do meaningful work. Perhaps more notably, when asked how the program could be improved, the most common theme by far (86% of students responding to the question) was logistical issues, including payment, communication, and obtaining clearance and access. In addition, in focus groups, apprentices described difficulties associated with late notification of acceptance (e.g., having to decide on other job opportunities before being notified of CQL acceptance, having to find housing on short notice).

Outcomes Evaluation

CQL had positive impacts on apprentices’ STEM knowledge and competencies.

 A majority of apprentices reported large or extreme gains in their knowledge of what everyday research work is like in STEM, how professionals work on real problems in STEM, research conducted in a STEM topic or field, a STEM topic or field in depth, and the research processes, ethics, and rules for conduct in STEM. These impacts were identified across all apprentice groups.

 Many apprentices also reported impacts on their abilities to do STEM, including such things as carrying out procedures for an investigation and recording data accurately; supporting a proposed solution with relevant scientific, mathematical, and/or engineering knowledge; using mathematics or computers to analyze numeric data; reading technical or scientific texts, or using other media, to learn about the natural or designed worlds; deciding what type of data to collect in order to answer a question; identifying the limitations of data collected in an investigation; asking a question that can be answered with one or more investigations; designing procedures for investigations, including selecting methods and tools that are appropriate for the data to be collected; and using data or interpretations from other researchers or investigations to improve a solution.

CQL had positive impacts on apprentices’ 21st Century Skills.

 A large majority of apprentices reported large or extreme gains in the areas of making changes when things do not go as planned, building relationships with professionals in the field, learning to work independently, patience for the slow pace of research, sticking with a task until it is complete, and sense of being part of a learning community.

CQL positively impacted apprentices’ confidence and identity in STEM, as well as their interest in future STEM engagement.

 Many apprentices reported large or extreme gains on items related to STEM identity, including feeling prepared for more challenging STEM activities, building academic or professional credentials in STEM, confidence to do well in future STEM courses, feeling responsible for a STEM project or activity, confidence to contribute in STEM, feeling like part of a STEM community, and feeling like a STEM professional.

 Apprentices also reported positively on the likelihood that they would engage in additional STEM activities outside of school. A majority of apprentices indicated that as a result of CQL they were more likely to talk with friends or family about STEM, mentor or teach other students about STEM, work on a STEM project or experiment in a university or professional setting, receive an award or special recognition for STEM accomplishments, and look up STEM information at a library or on the internet.

CQL succeeded in raising apprentices’ education aspirations, but did not change their career aspirations.

 After participating in CQL, apprentices indicated being more likely to go further in their schooling than they would have before CQL, with the greatest change being in the proportion of apprentices who wanted to get a Ph.D. (19% before CQL, 35% after).

 Apprentices were asked to indicate what kind of work they expected to be doing at age 30, and the data were coded as STEM-related or non-STEM-related. Although the vast majority of apprentices indicated interest in a STEM-related career, there was not a statistically significant difference from before CQL to after.

CQL apprentices are largely unaware of AEOP initiatives, but apprentices show interest in future AEOP opportunities.

 Apprentices and mentors were largely unaware of other AEOP initiatives, but 73% of responding apprentices were at least somewhat interested in participating in CQL in the future, 54% in SMART, 40% in NDSEG, and 34% in URAP. Apprentices reported that their CQL participation and their mentors had the most impact on their awareness of AEOPs.

CQL apprentices have positive opinions about DoD researchers and research.

 The vast majority of apprentices agreed or strongly agreed that DoD researchers solve real-world problems (95%), DoD researchers advance science and engineering fields (95%), DoD research is valuable to society (94%), DoD researchers develop new, cutting-edge technologies (92%), and DoD researchers support non-defense-related advancements in science and technology (86%).

Recommendations

1. The CQL program has the goal of broadening the talent pool in STEM fields. Overall, the program has had limited success in attracting students from groups historically underrepresented and underserved in these fields. In addition, personal relationships continue to factor heavily into how students learn about and are recruited to CQL. The program may want to consider doing more to increase the number and diversity of students who participate in CQL, in particular by recruiting students more actively nationwide. Given that the program involves college students and includes a stipend to help with housing expenses, recruitment does not need to be limited to locations near CQL sites. By recruiting more actively, and by broadening recruitment efforts beyond local sites, the program is likely to receive more applications, including more from groups that are historically underrepresented and underserved. Mentor focus groups elicited some suggestions for changes to recruitment strategies, including a centralized CQL recruitment and application process (rather than a site-specific one) and more advertising with high schools (so that future college students are aware of the program) and with colleges, for example by working with college job placement services and posting fliers prominently where students will see them. In addition, the program may want to consider how students are recruited and subsequently selected to serve as apprentices. Although some mentors did not know how students were recruited, others reported that there were no targeted recruitment strategies for students from underrepresented and underserved groups. In order to meet the goal of serving more students from underrepresented or underserved groups, the program could develop guidance that balances selecting the strongest candidates (e.g., the best match between apprentice interest and mentor work), regardless of race or gender, with providing more opportunities for students from underrepresented and underserved groups to participate.


2. Similarly, efforts to recruit mentors should be considered. The number of apprentices who can participate in CQL is limited by the number of mentors available. In order to broaden participation and provide more opportunities to qualified candidates, the program needs to recruit more mentors. One potential factor impacting mentor participation – time – came out in a focus group; mentors noted that colleagues were not interested in serving as mentors because of the time it takes them to work with apprentices, which can detract from other responsibilities. In addition, on the questionnaire, some responding mentors suggested providing more support for mentors. As a result, it may be productive to consider what supports can be put in place to help mentors efficiently and effectively utilize their apprentices. For example, mentors may benefit from ideas for ways in which apprentices can productively contribute to ongoing research. In addition, potential mentors should be made aware of these supports as well as potential benefits to their project from involving apprentices in their work.

3. Given the goal of having students progress from other AEOP programs into CQL, and from CQL into other programs, the program may want to consider implementing marketing and recruitment efforts targeting past AEOP participants and to work with sites to increase both mentors’ and students’ exposure to AEOP. Apprentice questionnaire data indicate that few apprentices had previously participated in other AEOPs. Implementing marketing and recruitment efforts targeted at past AEOP participants may increase the number of participants in other AEOP programs who progress into CQL and may broaden CQL participation of students from underrepresented and underserved groups as several other AEOP programs specifically target these students. In addition, responding CQL mentors and apprentices tended to lack knowledge of AEOP programs beyond CQL. In focus groups, mentors indicated that they would be willing to educate students about other AEOP programs if they knew more about those programs themselves, suggesting that improving mentor awareness of programs would also improve student awareness. Alternatively, given that CQL participants are completing internships on active research, and potential mentors may already be hesitant to participate due to time considerations, the program may want to consider ways to educate apprentices about AEOP opportunities that do not rely on the mentor (e.g., presentations during an orientation; information provided during the student symposium). In addition, given the limited use of the AEOP website, print materials, and social media, the program should consider how these materials could be adjusted to provide students with more information and facilitate their enrollment in other AEOPs, or what alternative strategies may be more effective.

4. Efforts should be made to address administrative difficulties. Although participants were pleased with their experience, frustration with administrative and logistical aspects was quite evident in responses, and in some cases detracted from program goals. In particular, students reported difficulties due to late notification of acceptance, including missing out on participating in the past, and late payment. Students also reported negative impacts on their ability to do meaningful work because of delays in getting clearance and computer access. In addition, some students indicated that they, and their mentors, expended considerable time and effort to remedy these administrative issues. Although some students indicated that these issues would not keep them from participating again, other students indicated that they would not participate again, may work at the lab again but would do so through other channels, or were discouraged from participating in CQL or working for the DoD in the future. Given that one AEOP goal is to “broaden, deepen, and diversify the pool of STEM talent in support of our defense industry base,” efforts should be made to remedy these administrative issues so as not to detract from apprentices’ or mentors’ experience with the program. One suggestion that came out of apprentice questionnaire and focus group data is to begin the process for students to obtain clearance and computer access early, so that they have computer access when they begin the internship and can begin doing meaningful work.

5. Additional efforts should be undertaken to improve participation in evaluation activities, as the low response rates for both the student and mentor questionnaires raise questions about the representativeness of the results. Improved communication with the individual program sites about expectations for the evaluation may help. In addition, the evaluation instruments may need to be streamlined, as perceived response burden can affect participation. In particular, consideration should be given to better tailoring questionnaires to particular programs and to whether the parallel nature of the student and mentor questionnaires is necessary, with items being asked only of the most appropriate data source. Given that CQL apprentices are of career age, and given the significant investment that Army research installations make in each apprentice, it may prove important to conduct a CQL alumni study in the near future to establish the extent to which CQL apprentices subsequently become employed by the Army or DoD.


Introduction

The Army Educational Outreach Program (AEOP) vision is to offer a collaborative and cohesive portfolio of Army sponsored science, technology, engineering and mathematics (STEM) programs that effectively engage, inspire, and attract the next generation of STEM talent through K-college programs and expose them to Department of Defense (DoD) STEM careers. The consortium, formed by the Army Educational Outreach Program Cooperative Agreement (AEOP CA), supports the AEOP in this mission by engaging non-profit, industry, and academic partners with aligned interests, as well as a management structure that collectively markets the portfolio among members, leverages available resources, and provides expertise to ensure the programs provide the greatest return on investment in achieving the Army’s STEM goals and objectives. This report documents the evaluation study of one of the AEOP elements, the College Qualified Leaders (CQL) program. CQL is managed by the American Society for Engineering Education (ASEE). The evaluation study was performed by Virginia Tech, the Lead Organization (LO) in the AEOP CA consortium. Data analyses and reports were prepared in collaboration with Horizon Research, Inc.

Program Overview

The College Qualified Leaders (CQL) program, managed by the American Society for Engineering Education (ASEE), is an Army Educational Outreach Program (AEOP) that matches talented college students and recent graduates (herein referred to as apprentices) with practicing Army Scientists and Engineers (Army S&Es, herein referred to as mentors), creating a direct apprentice-mentor relationship that provides apprentice training that is unparalleled at most colleges. CQL allows alumni of Gains in the Education of Mathematics and Science (GEMS) and/or the Science and Engineering Apprentice Program (SEAP) to continue their relationship with the mentor and/or laboratory, and also allows new college students to enter the program. CQL offers apprentices summer, partial-year, or year-round research at an Army laboratory, depending on class schedules and school location. CQL apprentices receive firsthand research experience and exposure to Army research laboratories. CQL fosters participants' desire to pursue further training and careers in STEM while specifically highlighting and encouraging careers in Army research.

AEOP Goals

Goal 1: STEM Literate Citizenry.

 Broaden, deepen, and diversify the pool of STEM talent in support of our defense industry base.

Goal 2: STEM Savvy Educators.

 Support and empower educators with unique Army research and technology resources.

Goal 3: Sustainable Infrastructure.

 Develop and implement a cohesive, coordinated, and sustainable STEM education outreach infrastructure across the Army.


In 2014, CQL was guided by the following objectives:

1. To nurture interest and provide STEM research experience for college students and recent graduates contemplating further studies;

2. To provide opportunities for continued association with the DoD laboratories and STEM enrichment for previous SEAP, GEMS, and other AEOP participants as well as allow new college students the opportunity to engage with DoD laboratories;

3. To outreach to participants inclusive of youth from groups historically underrepresented and underserved in STEM;

4. To increase participant knowledge in targeted STEM areas and develop their research and laboratory skills as evidenced by mentor evaluation and the completion of a presentation of research;

5. To educate participants about careers in STEM fields with a particular focus on STEM careers in DoD laboratories;

6. To acquaint participants with the activities of DoD laboratories in a way that encourages a positive image and supportive attitude towards our defense community; and

7. To provide information to participants about opportunities for STEM enrichment and ways they can mentor younger STEM students through GEMS, eCYBERMISSION, and other AEOP opportunities.

Apprenticeships were completed at 10 Army research laboratories in 5 states, summarized in Table 1.

Table 1. 2014 CQL Sites

2014 CQL Site | Command | Location
US Army Research Laboratory – Aberdeen Proving Ground (ARL-APG) | RDECOM | Aberdeen, MD
US Army Research Laboratory – Adelphi (ARL-A) | RDECOM | Adelphi, MD
Walter Reed Army Institute of Research (WRAIR) | MRMC | Silver Spring, MD
US Army Medical Research Institute for Infectious Diseases (USAMRIID) | MRMC | Fort Detrick, MD
US Army Aviation & Missile Research Development and Engineering Center – Redstone Arsenal (AMRDEC) | RDECOM | Huntsville, AL
Engineer Research & Development Center Construction Engineering Research Laboratory (ERDC-CERL) | USACE | Champaign, IL
US Army Medical Research Institute of Chemical Defense (USAMRICD) | MRMC | Aberdeen, MD
US Army Center for Environmental Health Research (USACEHR) | MRMC | Fort Detrick, MD
Defense Forensic Science Center (DFSC) | USACIDC | Forest Park, GA
Engineer Research and Development Center – Vicksburg, MS (ERDC-MS) | USACE | Vicksburg, MS

Commands: “MRMC” is the Medical Research and Materiel Command, “RDECOM” is the Research, Development and Engineering Command, “USACE” is the U.S. Army Corps of Engineers, and “USACIDC” is the U.S. Army Criminal Investigation Command.

The 10 host sites received applications from substantially more qualified students than they had positions for the 2014 CQL program: 550 students applied and 307 enrolled, which represents a slightly larger enrollment from slightly fewer applicants compared to FY13 (588 students applied and 260 enrolled). Table 2 summarizes interest and final enrollment by site.


Table 2. 2014 CQL Site Applicant and Enrollment Numbers

2014 CQL Site | FY13 Applicants | FY13 Enrolled | FY14 Applicants | FY14 Enrolled
US Army Research Laboratory – Aberdeen Proving Ground (ARL-APG) | 133 | 59 | 161 | 79
US Army Research Laboratory – Adelphi (ARL-A) | 93 | 48 | 118 | 75
Walter Reed Army Institute of Research (WRAIR) | 184 | 97 | 94 | 76
US Army Medical Research Institute for Infectious Diseases (USAMRIID) | 32 | 14 | 40 | 18
US Army Aviation & Missile Research Development and Engineering Center – Redstone Arsenal (AMRDEC) | 32 | 2 | 69 | 16
Engineer Research & Development Center Construction Engineering Research Laboratory (ERDC-CERL) | 24 | 8 | 27 | 12
US Army Medical Research Institute of Chemical Defense (USAMRICD) | 22 | 9 | 20 | 9
US Army Center for Environmental Health Research (USACEHR) | 19 | 8 | 8 | 12
Defense Forensic Science Center (DFSC) | 11 | 11 | 13 | 8
Engineer Research and Development Center – Vicksburg, MS (ERDC-MS) | 4 | 4 | NA | 2
Total | 588 | 260 | 550 | 307

Twenty-one individuals applied at the US Army Criminal Investigation Laboratory (USACIL) but did not enroll there, as there was no CQL program at USACIL in 2014.

The total cost of the 2014 CQL program was $3,663,463. This includes administrative costs to ASEE of $129,319 and $3,534,144 for participant stipends (including the cost of required eye exams for apprentices in laser labs and work boots when required). The average cost per 2014 CQL participant, taken across all CQL sites, was $11,933. Table 3 summarizes these expenditures.

Table 3. 2014 CQL Program Costs

2014 CQL - Cost Per Participant
Total Participants 307
Total Cost $3,663,463
Cost Per Participant $11,933

2014 CQL - Cost Breakdown Per Participant
Average Administrative Cost to ASEE Per Participant $421
Average Participant Stipend (including eye exam and/or work boots if required) $11,512


Evidence-Based Program Change

Based on recommendations from the FY13 summative evaluation report, the AEOP identified three key priorities for programs in FY14: 1) Increase outreach to populations that are historically underserved and underrepresented in STEM; 2) Increase participants’ awareness of Army/DoD STEM careers; and 3) Increase participants’ awareness of other AEOP opportunities. ASEE initiated the following program changes/additions to the FY14 administration of the CQL program in light of the key AEOP priorities, the FY13 CQL evaluation study, and site visits conducted by ASEE and the LO.

I. Increase outreach to populations that are historically underserved and underrepresented in STEM.

a. ASEE wrote and implemented a 2014 Outreach Plan for CQL that included:

i. Outreach efforts at conferences/expos that serve diverse audiences:

1. Invent It. Build It. Career Expo at the Society of Women Engineers Conference
2. Hispanic Association of Colleges and Universities Conference
3. University of Maryland Career Fair
4. George Mason University Career Fair
5. Howard University Career Fair
6. Columbia University Career Fair

ii. Bi-weekly meetings with LPCs to identify new targets and strategies for outreach

II. Increase participants’ awareness of other AEOP opportunities.

a. ASEE emailed previous CQL participants with links to AEOP social media.

FY14 Evaluation At-A-Glance

Virginia Tech, in collaboration with ASEE, conducted a comprehensive evaluation study of the CQL program. The CQL logic model below presents a summary of the expected outputs and outcomes for the CQL program in relation to the AEOP and CQL-specific priorities. This logic model provided guidance for the overall CQL evaluation strategy.


CQL Logic Model

Inputs
 Army sponsorship
 ASEE providing oversight of site programming
 Operations conducted by 10 Army Labs
 307 students participating in CQL apprenticeships
 288 Army S&Es serving as CQL mentors
 Stipends for apprentices to support meals and travel
 Centralized branding and comprehensive marketing
 Centralized evaluation

Activities
 Apprentices engage in authentic STEM research experiences through hands-on summer, partial year, and year-round apprenticeships at Army labs
 Army S&Es supervise and mentor apprentices’ research
 Program activities that expose apprentices to AEOP programs and/or STEM careers in the Army or DoD

Outputs
 Number and diversity of student participants engaged in CQL
 Number and diversity of Army S&Es engaged in CQL
 Apprentices, Army S&Es, site coordinators, and ASEE contributing to evaluation

Outcomes (Short Term)
 Increased apprentice STEM competencies (confidence, knowledge, skills, and/or abilities to do STEM)
 Increased apprentice interest in future STEM engagement
 Increased apprentice awareness of and interest in other AEOP opportunities
 Increased apprentice awareness of and interest in STEM research and careers
 Increased apprentice awareness of and interest in Army/DoD STEM research and careers
 Implementation of evidence-based recommendations to improve CQL programs

Impact (Long Term)
 Increased apprentice participation in other AEOP opportunities and Army/DoD-sponsored scholarship/fellowship programs
 Increased apprentice pursuit of STEM degrees
 Increased apprentice pursuit of STEM careers
 Increased apprentice pursuit of Army/DoD STEM careers
 Continuous improvement and sustainability of CQL

The CQL evaluation study gathered information from apprentice and mentor participants about CQL processes, resources, activities, and their potential effects in order to address key evaluation questions related to program strengths and challenges, benefits to participants, and overall effectiveness in meeting AEOP and CQL program objectives.

Key Evaluation Questions

 What aspects of CQL motivate participation?

 What aspects of CQL structure and processes are working well?

 What aspects of CQL could be improved?

 Did participation in CQL:

o Increase apprentices’ STEM competencies?

o Increase apprentices’ interest in future STEM engagement?

o Increase apprentices’ awareness of and interest in other AEOP opportunities?


The assessment strategy for CQL included on-site focus groups with apprentices and mentors at 4 CQL sites, a post-program apprentice questionnaire, a post-program mentor questionnaire, and one Annual Program Report (APR) prepared by ASEE using data from all CQL sites. Tables 4-8 outline the information collected in apprentice and mentor questionnaires and focus groups, as well as information from the APR that is relevant to this evaluation report.

Table 4. 2014 Apprentice Questionnaires

Profile
 Demographics: Participant gender, grade level, and race/ethnicity
 Education Intentions: Degree level, confidence to achieve educational goals, field sought

AEOP Goal 1
 Capturing the Apprentice Experience: In-school vs. in-program experience; mentored research experience and products
 STEM Competencies: Gains in Knowledge of STEM, Science & Engineering Practices; contribution of AEOP
 Transferrable Competencies: Gains in 21st Century Skills
 STEM Identity: Gains in STEM identity, intentions to participate in STEM, and STEM-oriented education and career aspirations; contribution of AEOP
 AEOP Opportunities: Past participation, awareness of, and interest in participating in other AEOP programs; contribution of AEOP, impact of AEOP resources
 Army/DoD STEM: Exposure to Army/DoD STEM jobs, attitudes toward Army/DoD STEM research and careers, change in interest for STEM and Army/DoD STEM jobs; contribution of AEOP, impact of AEOP resources

AEOP Goals 2 and 3
 Mentor Capacity: Perceptions of mentor/teaching strategies (apprentices respond to a subset)
 Comprehensive Marketing Strategy: How apprentices learn about AEOP, motivating factors for participation, impact of AEOP resources on awareness of AEOPs and Army/DoD STEM research and careers

Satisfaction & Suggestions
 Benefits to participants, suggestions for improving programs, overall satisfaction

Table 5. 2014 Mentor Questionnaires

Profile
 Demographics: Participant gender, race/ethnicity, occupation, past participation

Satisfaction & Suggestions
 Awareness of CQL, motivating factors for participation, satisfaction with and suggestions for improving CQL programs, benefits to participants

AEOP Goal 1
 Capturing the Apprentice Experience: In-program experience
 STEM Competencies: Gains in their apprentices’ Knowledge of STEM, Science & Engineering Practices; contribution of AEOP
 Transferrable Competencies: Gains in their apprentices’ 21st Century Skills
 AEOP Opportunities: Past participation, awareness of other AEOP programs; efforts to expose apprentices to AEOPs, impact of AEOP resources on efforts; contribution of AEOP in changing apprentice AEOP metrics
 Army/DoD STEM: Attitudes toward Army/DoD STEM research and careers, efforts to expose apprentices to Army/DoD STEM research/careers, impact of AEOP resources on efforts; contribution of AEOP in changing apprentice Army/DoD career metrics

AEOP Goals 2 and 3
 Mentor Capacity: Perceptions of mentor/teaching strategies
 Comprehensive Marketing Strategy: How mentors learn about AEOP, usefulness of AEOP resources on awareness of AEOPs and Army/DoD STEM research and careers

Table 6. 2014 Apprentice Focus Groups

Profile
 Gender, race/ethnicity, grade level, past participation in CQL, past participation in other AEOP programs

Satisfaction & Suggestions
 Awareness of CQL, motivating factors for participation, satisfaction with and suggestions for improving CQL programs, benefits to participants

AEOP Goals 1 and 2 – Program Efforts
 Army STEM: AEOP Opportunities – Extent to which apprentices were exposed to other AEOP opportunities
 Army STEM: Army/DoD STEM Careers – Extent to which apprentices were exposed to STEM and Army/DoD STEM jobs

Table 7. 2014 Mentor Focus Groups

Profile
 Gender, race/ethnicity, occupation, organization, role in CQL, past participation in CQL, past participation in other AEOP programs

Satisfaction & Suggestions
 Perceived value of CQL, benefits to participants, suggestions for improving CQL programs

AEOP Goals 1 and 2 – Program Efforts
 Army STEM: AEOP Opportunities – Efforts to expose apprentices to AEOP opportunities
 Army STEM: Army/DoD STEM Careers – Efforts to expose apprentices to STEM and Army/DoD STEM jobs
 Mentor Capacity: Local Educators – Strategies used to increase diversity/support diversity in CQL

Table 8. 2014 Annual Program Report

Program
 Description of course content, activities, and academic level (high school or college)

AEOP Goals 1 and 2 – Program Efforts
 Underserved Populations: Mechanisms for marketing to and recruitment of apprentices from underserved populations
 Army STEM: Army/DoD STEM Careers – Career day exposure to Army STEM research and careers; participation of Army engineers and/or Army research facilities in career day activities
 Mentor Capacity: Local Educators – University faculty and apprentice involvement

Detailed information about methods and instrumentation, sampling and data collection, and analysis is described in Appendix A, the evaluation plan. The reader is strongly encouraged to review Appendix A to clarify how data are summarized, analyzed, and reported in this document. Findings of statistical and/or practical significance are noted in the report narrative, with tables and footnotes providing results from tests for significance. Questionnaires and respective data summaries are provided in Appendix B (apprentice) and Appendix C (mentor). Focus group protocols are provided in Appendix D (apprentices) and Appendix E (mentors); the APR template is located in Appendix F. Major trends in data and analyses are reported herein.

Study Sample

Apprentices from 9 of 10 CQL sites responded to questionnaires, as did mentors from 5 of the 10 sites. Table 9 shows the number of apprentice and mentor respondents by site.

Table 9. 2014 CQL Site Survey Respondent Numbers

2014 CQL Site | Apprentices: No. of Participants, No. of Survey Respondents | Mentors: No. of Participants, No. of Survey Respondents

US Army Research Laboratory – Aberdeen Proving Ground (ARL-APG) 79 37 54 0

US Army Research Laboratory – Adelphi (ARL-A) 75 36 109 0

Walter Reed Army Institute of Research (WRAIR) 76 26 59 3

US Army Medical Research Institute for Infectious Diseases (USAMRIID) 18 16 32 9

US Army Aviation & Missile Research Development and Engineering Center – Redstone Arsenal (AMRDEC) 16 7 9 0

Engineer Research & Development Center Construction Engineering Research Laboratory (ERDC-CERL) 12 7 9 3

US Army Medical Research Institute of Chemical Defense (USAMRICD) 9 6 4 2

US Army Center for Environmental Health Research (USACEHR) 12 3 3 2

Defense Forensic Science Center (DFSC) 8 1 6 0

Engineer Research & Development Center – Vicksburg, MS (ERDC-MS) 2 0 3 0

Total 307 139 288 19

Table 10 provides an analysis of apprentice and mentor participation in the CQL questionnaires, the response rate, and the margin of error at the 95% confidence level (a measure of how representative the sample is of the population). The margin of error for both the apprentice and mentor surveys is larger than generally considered acceptable, indicating that the samples may not be representative of their respective populations. Note that the apprentice response rate is higher than in 2013 (which had a response rate of 36%).

Table 10. 2014 CQL Questionnaire Participation

Participant Group | Respondents (Sample) | Total Participants (Population) | Participation Rate | Margin of Error @ 95% Confidence1

Apprentices 139 307 45% ±6.2%

Mentors 19 288 7% ±21.8%

A total of four apprentice focus groups were conducted at 4 of the 10 CQL sites. These focus groups included 17 apprentices, 11 female and 6 male. It should be noted that the gender proportion in the focus group sample (35% male) was not representative of the population of CQL apprentices at large (75% male), suggesting that females may have been oversampled in focus groups. Apprentices in focus groups ranged from college sophomores to recent graduates and graduate students. A total of four mentor focus groups were also conducted at the same 4 sites. Mentor focus groups included 13 mentors (7 female, 6 male). Mentors were predominantly STEM professionals, but also included an architect and a teacher. Focus groups were not intended to yield generalizable findings; rather, they were intended to provide additional evidence of, explanation for, or illustration of questionnaire data. They add to the overall narrative of CQL’s efforts and impact, and highlight areas for future exploration in programming and evaluation.

Respondent Profiles

Apprentice Demographics

Demographic information collected from questionnaire respondents is summarized in Table 11. More males (56%) than females (43%) completed the questionnaire. More apprentices responding to the questionnaire identified with the race/ethnicity category of White (55%) than any other single race/ethnicity category, though there was substantial representation of the category of Asian (21%). The majority of respondents (64%) were in the 2nd to 4th year of college. The APR included demographic data for a larger proportion of the enrolled participants (n = 185). Those data were similar to the questionnaire data for race/ethnicity and grade level; however, they differed considerably for gender (75% male, 25% female).

1 “Margin of error @ 95% confidence” means that 95% of the time, the true percentage of the population who would select an answer lies within the stated margin of error. For example, if 47% of the sample selects a response and the margin of error at 95% confidence is 5%, then had the question been asked of the entire population, there is a 95% likelihood that between 42% and 52% would have selected that answer. A 2-5% margin of error is generally considered acceptable at the 95% confidence level.
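The margins of error in Table 10 are consistent with the standard margin of error for a proportion, with a finite population correction, assuming maximum variability (p = 0.5). A minimal sketch of that calculation (the function name is illustrative, not from the report):

```python
import math

def margin_of_error(sample_n, population_n, z=1.96, p=0.5):
    """Margin of error (in percentage points) for a proportion at the 95%
    confidence level, with a finite population correction (FPC)."""
    se = math.sqrt(p * (1 - p) / sample_n)  # standard error of a proportion
    fpc = math.sqrt((population_n - sample_n) / (population_n - 1))
    return 100 * z * se * fpc

# Reproduces Table 10: apprentices (139 of 307) and mentors (19 of 288)
print(round(margin_of_error(139, 307), 1))  # 6.2
print(round(margin_of_error(19, 288), 1))   # 21.8
```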


FY14 evaluation and enrollment data reveal that CQL had limited success in engaging female students (43% of questionnaire respondents, 25% of enrollment survey respondents). The same data suggest CQL had limited success in reaching students from historically underrepresented and underserved race/ethnicity groups (13% of questionnaire respondents, 8% of enrollment survey respondents). This remains an area for growth, one that depends on other AEOPs (e.g., GEMS and/or SEAP) to prepare a diverse body of students and encourage them to pursue CQL as a more competitive apprenticeship. Growth in this area also depends on the success of the program administrator’s marketing and outreach in recruiting applicants, and on mentors initiating a balanced applicant selection process.

Table 11. 2014 CQL Apprentice Respondent Profile (n = 139)

Demographic Category Questionnaire Respondents

Respondent Gender

Male 78 56%

Female 60 43%

Choose not to report 1 1%

Respondent Race/Ethnicity

Asian 29 21%

Black or African American 9 6%

Hispanic or Latino 7 5%

Native American or Alaska Native 0 0%

Native Hawaiian or Other Pacific Islander 2 1%

White 77 55%

Other race or ethnicity, (specify):† 5 4%

Choose not to report 10 7%

Respondent Grade Level

College freshman 1 1%
College sophomore 27 19%
College junior 32 23%
College senior 30 22%
Graduate program 35 25%
Other, (specify) 8 6%

Choose not to report 6 4%

† Other race/ethnicity responses: “Bi-racial,” “Iranian,” “Middle Eastern,” “White-Asian,” and “Korean, White.”

Other grade-level responses: “Graduated” (n = 3), “Applying to Graduate Program,” “College Super Senior,” “Continued internship,” “I will have graduated at the end of this term and will take a class as a non-degree seeking student in the fall,” and “Research Technician at the WRAIR and NIH.”

Apprentices were asked how many times they participated in each of the AEOP programs. As can be seen in Chart 1, 30% of responding apprentices reported participating in CQL two times or more; 32% reported participating in SEAP at least once. Few apprentices (13% or less) reported participating in any of the other AEOP programs. Compared to 2013, a higher percentage of 2014 responding apprentices had previously participated in SEAP, but for all other AEOP programs, the percentage was lower for 2014.


Mentor Demographics

The 2014 Mentor Questionnaire collected more extensive demographic information on the mentors than in past years; FY14 data are summarized in Table 12. The number of male responding mentors was approximately equal to the number of female responding mentors (9 males vs. 10 females, or 47% vs. 53%). Nearly three-fourths of the responding mentors identified themselves as White (74%). All responding mentors were scientist, engineer, or mathematics professionals; the majority (74%) identified their primary area of research as biological science. Additional characteristics of the mentors are included in Appendix C.

[Chart 1: Student Participation in AEOP Programs (n = 107-108). Bar chart of apprentices’ reported past participation in each AEOP program: NDSEG, Camp Invention, JSHS, URAP, UNITE, JSS, HSAP, REAP, eCYBERMISSION, GEMS Near Peers, SMART, GEMS, SEAP, and CQL.]


Table 12. 2014 CQL Mentor Respondent Profile

Demographic Category Questionnaire Respondents

Respondent Gender (n = 19)

Female 10 53%

Male 9 47%

Respondent Race/Ethnicity (n = 19)

Asian 3 16%

Black or African American 0 0%

Hispanic or Latino 1 5%

Native American or Alaska Native 0 0%

Native Hawaiian or Other Pacific Islander 0 0%

White 14 74%

Other race or ethnicity, (specify): 0 0%

Choose not to report 1 5%

Respondent Occupation (n = 19)

Scientist, Engineer, or Mathematics professional 19 100%

Teacher 0 0%

Other school staff 0 0%

University educator 0 0%

Scientist, Engineer, or Mathematician in training (undergraduate or graduate student, etc.) 0 0%

Other, (specify): 0 0%

Respondent Primary Area of Research (n = 19)

Biological Science 14 74%

Physical science (physics, chemistry, astronomy, materials science) 2 11%

Engineering 2 11%

Medical, health, or behavioral science 1 5%

Earth, atmospheric, or oceanic science 0 0%

Agricultural science 0 0%

Environmental science 0 0%

Computer science 0 0%

Technology 0 0%

Mathematics or statistics 0 0%

Social science (psychology, sociology, anthropology, etc.) 0 0%


Actionable Program Evaluation

Actionable Program Evaluation is intended to provide assessment and evaluation of program processes, resources, and activities for the purpose of recommending improvements as the program moves forward. This section highlights information outlined in the Satisfaction & Suggestions sections of Tables 4-8.

A focus of the Actionable Program Evaluation is efforts toward the long-term goal of CQL and all of the AEOPs: to increase and diversify the future pool of talent capable of contributing to the nation’s scientific and technological progress. Thus, it is important to consider how CQL is marketed and how it ultimately recruits participants, the factors that motivate them to participate in CQL, participants’ perceptions of and satisfaction with activities, the value participants place on program activities, and the recommendations participants have for program improvement. The following sections report apprentice and mentor perceptions of current programmatic efforts and recommend evidence-based improvements to help CQL achieve outcomes related to AEOP program objectives.

Marketing and Recruiting Underrepresented and Underserved Populations

ASEE, the CQL program manager, reported marketing to and recruiting students for CQL in a variety of ways. ASEE marketed CQL at the following FY14 outreach events:

 Invent It. Build It. Career Expo at the Society of Women Engineers Conference
 Hispanic Association of Colleges and Universities Conference
 University of Maryland Career Fair
 George Mason University Career Fair
 Howard University Career Fair
 Columbia University Career Fair

The mentor questionnaire included an item asking how students were recruited for apprenticeships. As can be seen in Chart 2, mentors most often indicated recruiting their apprentices through a personal network such as workplace colleagues (32%), personal acquaintances (32%), and direct contact from the student (32%). Interestingly, 32% reported that they had no knowledge of how their apprentices were recruited.


In focus groups, mentors were asked what strategies had been used that year to recruit students from underrepresented and underserved populations. Most commonly mentors indicated that they recruited through university contacts, they were not involved in selecting apprentices, or there was no targeted recruitment strategy. One said:

We did not look for any specific gender or race or anything, we had a billet that we put out to several universities, and we were indiscriminate, as far as looking at their resumes, we didn’t take the brightest, I mean we just took the student that seemed to have the most interest in what we were doing. And we interviewed several people, and the student that we picked has worked well. We didn’t have any goal in mind for, you know, minority, you know, gender. (CQL mentor)

In order to understand which recruitment methods are most effective, the questionnaire asked apprentices to select all of the different ways they heard about CQL. Chart 3 summarizes apprentices’ responses. The most frequently mentioned source of information about CQL was someone who works at an Army laboratory (26%). Other sources mentioned relatively frequently were teachers or professors (23%); immediate family members (19%); school or university newsletter, email, or website (18%); friends (17%), CQL mentors (16%), and past CQL participants (16%). The “Other” category typically included references to finding out about CQL indirectly through interest in another program (e.g., a co-op job, the SMART program).

Chart 2: How Students Were Recruited for Apprenticeships (mentor-reported; respondents could select multiple options)
 Colleague(s) in my workplace: 32%
 Personal acquaintance(s) (friend, family, neighbor, etc.): 32%
 Student contacted mentor: 32%
 I do not know how student(s) was recruited for apprenticeship: 32%
 Applications from American Society for Engineering Education or…: 16%
 Communication(s) generated by a university or faculty…: 16%
 University faculty outside of my workplace: 16%
 Organization(s) serving underserved or underrepresented…: 11%
 Communication(s) generated by a K-12 school or teacher…: 5%
 Education conference(s) or event(s): 5%
 Informational materials sent to K-12 schools or Universities…: 5%
 K-12 school teacher(s) outside of my workplace: 5%
 Other, (specify): 5%
 Career fair(s): 0%
 STEM conference(s) or event(s): 0%


These data were analyzed by apprentice gender and race/ethnicity to determine if different groups of apprentices learned about CQL in a different manner. No meaningful differences were found in how apprentices learned about CQL by either factor. Taken together, these findings suggest that responding apprentices were most likely to learn about CQL through personal contacts or university media resources rather than other media sources.

Apprentice focus group data reflect the importance of personal contacts in making apprentices aware of CQL. Most apprentice focus group participants indicated that they learned about CQL through a pre-existing relationship with either a mentor or the site (e.g., they had worked at the site before; their parents work at the site). For example:

Both my parents work out here on the [site name] and there was an email sent around saying, “SEAP and CQL people…apply now”. So I applied. My mom was actually working to get me into her office but that fell through, really badly fell through at the last minute, so my dad stepped up and said, “hey, do you think you have a spot?” and they said, “yes we always want new people.” (CQL apprentice)

I knew [Mentor’s name], my supervisor. I’ve known his family for a long time. (CQL apprentice)

Chart 3: How Apprentices Heard About CQL (respondents could select multiple options)
 Someone who works at an Army laboratory: 26%
 Teacher or professor: 23%
 Immediate family member (mother, father, siblings): 19%
 School or university newsletter, email, or website: 18%
 Friend: 17%
 Mentor from CQL: 16%
 Past participant of CQL: 16%
 Army Educational Outreach Program (AEOP) website: 13%
 Someone who works with the Department of Defense: 9%
 Friend of the family: 7%
 Extended family member (grandparents, aunts, uncles, cousins): 4%
 Other, (specify): 4%
 Guidance counselor: 1%
 American Society for Engineering Education website: 1%
 News story or other media coverage: 0%
 Facebook, Twitter, Pinterest, or other social media: 0%


The reason I chose CQL is because the program I was under was unavailable for a period of time so then I chose to be under this program...I was a student contractor, undergraduate and graduate. When that ended, I needed a new program to work here. (CQL apprentice)

My previous mentor recommended me to my mentor here and he advised me to apply to CQL in order to intern for him. (CQL apprentice)

Friends who had previously participated in CQL, college professors, and neighbors were also cited as sources of information about the program.

Mentors were also asked how they learned about CQL (see Chart 4). Almost all of the responding mentors learned about CQL through work and/or Army/DoD personnel, indicating the source as a colleague (32%), the CQL site host/director (26%), workplace communications (21%), someone who works at an Army laboratory (16%), a supervisor/superior (16%), or someone who works with the Department of Defense (11%).

Chart 4: How Mentors Learned About CQL (respondents could select multiple options)
 A colleague: 32%
 CQL site host/director: 26%
 Workplace communications: 21%
 Someone who works at an Army laboratory: 16%
 A supervisor or superior: 16%
 Someone who works with the Department of Defense: 11%
 Past CQL participant: 5%
 School, university, or professional organization newsletter, email,…: 5%
 Army Educational Outreach Program (AEOP) website: 5%
 A student: 0%
 A news story or other media coverage: 0%
 STEM conference: 0%
 State or national educator conference: 0%
 Facebook, Twitter, Pinterest, or other social media: 0%
 American Society for Engineering Education website: 0%
 Other, (specify): 0%

To examine whether mentors are expanding their participation in AEOP programs, the questionnaire asked how many times they participated in each of the AEOP programs. Approximately half of the responding mentors (53%) reported participating in an AEOP program between one and three times (32% participated once, 0% participated twice, and 21% participated three times). Thirty-seven percent indicated participating 4 or more times (11% indicated never participating in any AEOP program, perhaps because they were not including their current participation in CQL when answering the question). Despite responding mentors’ continued participation in at least one AEOP program, for nearly half of the AEOP programs (6 of 14), including URAP and NDSEG in which their apprentices were eligible to participate, the majority indicated having never heard of the program.

Factors Motivating Apprentice Participation

Apprentice questionnaires and focus groups included questions to explore what motivated apprentices to participate in CQL. Specifically, the questionnaire asked how motivating a number of factors were in apprentices’ decision to participate. As can be seen in Table 13, more than 7 in 10 responding apprentices indicated that interest in STEM (81%), the desire to expand laboratory or research skills (81%), learning in ways that are not possible in school (80%), the desire to learn something new and interesting (76%), and building a college application or résumé (73%) were “very much” motivating to them. Earning a stipend or award while doing STEM (61%), networking opportunities (61%), the opportunity to use advanced laboratory techniques (59%), and exploring a unique work environment (53%) were each indicated by a majority of respondents as motivating them very much.

Table 13. Factors Motivating Apprentices “Very Much” to Participate in CQL (n = 136-137)

Item Questionnaire Respondents

Interest in science, technology, engineering, or mathematics (STEM) 81%

Desire to expand laboratory or research skills 81%

Learning in ways that are not possible in school 80%

Desire to learn something new or interesting 76%

Building college application or résumé 73%

Earning stipend or award while doing STEM 61%

Networking opportunities 61%

Opportunity to use advanced laboratory technology 59%

Exploring a unique work environment 53%

Interest in STEM careers with the Army 40%

The program mentor(s) 38%

Serving the community or country 36%

Having fun 31%

Teacher or professor encouragement 26%

Parent encouragement 22%

Opportunity to do something with friends 14%

An academic requirement or school grade 6%

In focus groups, apprentices were also asked why they chose to participate in CQL. The majority of apprentices indicated that they wanted to participate in order to gain job experience, a category that is not included on the questionnaire but may be related to some of the motivations commonly indicated there (e.g., desire to expand laboratory or research skills; building a college application or résumé). As two apprentices said when asked why they chose to participate in CQL:

Honestly, the experience, just being able to work in the lab, see how everything functions, and just getting all of that experience is what made me interested in it, because it will give me a leg up when searching for jobs. (CQL apprentice)

Well I did this program because I’ve actually been debating between going to med school or getting my masters or Ph.D. in a biotech laboratory related field, and I really wanted this experience to see what it would be like working in a lab every day, just to kind of give me a vision of what my career would be like. And these internships certainly give me great experience. (CQL apprentice)

For each item in Table 13, differences between females and males, as well as between minority and non-minority apprentices, were tested to identify whether different factors were more or less motivating for different apprentice groups. Overall, there were few significant differences. Males were somewhat more likely than females to indicate being motivated by an academic requirement or school grade2 (effect size,3 d = 0.41 standard deviations); females were somewhat more likely than males to be motivated by exploring a unique work environment4 (d = 0.46 standard deviations). Minority apprentices were much more likely than non-minority apprentices to be motivated by teacher or professor encouragement5 (d = 0.99 standard deviations).

The CQL Experience

The apprentice questionnaire included several items asking about the nature of apprentices’ experience in CQL, and how that experience compared to their STEM learning opportunities in school. When asked what field their CQL experience focused on, 50% of responding apprentices selected science, 37% engineering, 11% technology, and 3% mathematics. As can be seen in Chart 5, over half of the responding apprentices indicated that they had at least some input in their project, either through working with their mentor to design the project (18%), working with their mentor and other research team members to design the project (18%), choosing from project options suggested by the mentor (17%), or designing the project on their own (4%). The remaining apprentices reported being assigned a project by their mentor (44%) or not having a project at all (1%).

2 Two-tailed independent samples t-test, t(134) = 2.43, p = 0.017

3 Effect size calculated as Cohen’s d: the difference in means of the two groups divided by the pooled standard deviation. Effect sizes of about 0.20 are typically considered small, 0.50 medium, and 0.80 large. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.
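The pooled-standard-deviation form of Cohen’s d described in the footnote can be sketched as follows (the data below are made up for illustration, not the study’s raw responses):

```python
import statistics

def cohens_d(group1, group2):
    """Cohen's d: difference in group means divided by the pooled
    standard deviation (using n-1 sample variances)."""
    n1, n2 = len(group1), len(group2)
    v1 = statistics.variance(group1)
    v2 = statistics.variance(group2)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled_sd

# A mean difference of 1 point against a pooled SD of ~1.58
# yields a "medium" effect of about 0.63.
print(round(cohens_d([2, 3, 4, 5, 6], [1, 2, 3, 4, 5]), 2))  # 0.63
```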


Although most apprentices worked in close proximity to others during their experience (see Chart 6), they tended to work independently on their projects. For example, 31% reported working in a shared laboratory/space with others but on different projects, and 21% indicated working alone on a project closely connected to other projects in their group. Another 15% worked alone but met with others regularly to report on progress, and 14% worked alone (or alone with their research mentor). Only 19% indicated they worked with a group on the same project.

Chart 5: Apprentice Input on Design of Their Project (n = 124)
- I was assigned a project by my mentor: 44%
- I worked with my mentor to design a project: 18%
- I worked with my mentor and members of a research team to design a project: 18%
- I had a choice among various projects suggested by my mentor: 17%
- I designed the entire project on my own: 3%
- I did not have a project: 1%

Chart 6: Apprentice Participation in a Research Group (n = 124)
- I worked with others in a shared laboratory or other space, but we worked on different projects: 31%
- I worked alone on a project that was closely connected with projects of others in my group: 21%
- I worked with a group who all worked on the same project: 19%
- I worked alone on my project and I met with others regularly for general reporting or discussion: 15%
- I worked alone (or alone with my research mentor): 14%


Apprentices were also asked about the types of activities they engaged in during their experience. As can be seen in Chart 7, the vast majority of respondents indicated interacting with STEM professionals and applying STEM knowledge to real life situations on most days or every day of the experience. The majority of apprentices also reported learning about STEM topics, learning about cutting-edge STEM research, and communicating with other apprentices about STEM on most days or every day. Mentors were asked similar questions about the nature of their apprentices' experiences. Overall, their responses paint a similar picture of the CQL experience (responses to these items can be found in Appendix C).[6]

Because increasing the number of students who pursue STEM careers is one goal of the CQL program, the apprentice questionnaire also asked how many jobs/careers in STEM in general, and STEM jobs/careers in the DoD more specifically, apprentices learned about during their experience. As can be seen in Table 14, nearly all apprentices reported learning about at least one STEM job/career, and the majority (51%) reported learning about 4 or more. Similarly, 86% of apprentices reported learning about at least one DoD STEM job/career, with 54% reporting learning about 3 or more.
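The summary percentages in this paragraph are simple aggregations of the Table 14 distribution. A minimal sketch of those checks (percentages transcribed from Table 14; the dictionary and variable names are illustrative):

```python
# Distribution of how many STEM jobs/careers apprentices reported
# learning about during CQL (percentages from Table 14, n = 119)
stem = {"None": 9, "1": 10, "2": 13, "3": 16, "4": 3, "5 or more": 48}
dod = {"None": 14, "1": 19, "2": 13, "3": 11, "4": 5, "5 or more": 38}

at_least_one_stem = 100 - stem["None"]             # "nearly all" apprentices
four_or_more_stem = stem["4"] + stem["5 or more"]  # majority: 51%
at_least_one_dod = 100 - dod["None"]               # 86%
three_or_more_dod = dod["3"] + dod["4"] + dod["5 or more"]  # 54%
```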

Chart 7: Nature of Student Activities in CQL (n = 132-136); percent of apprentices reporting each activity on most days or every day:
- Learn about different STEM careers: 48%
- Learn about new science, technology, engineering, or mathematics (STEM) topics: 77%
- Learn about cutting-edge STEM research: 76%
- Communicate with other students about STEM: 67%
- Apply STEM knowledge to real life situations: 87%
- Interact with STEM professionals: 93%

[6] Because of the low response rates on both the student and mentor questionnaires, it is not possible to determine whether any


Table 14. Number of STEM Jobs/Careers Apprentices Learned about During CQL (n = 119)

              STEM Jobs/Careers    DoD STEM Jobs/Careers
None                 9%                    14%
1                   10%                    19%
2                   13%                    13%
3                   16%                    11%
4                    3%                     5%
5 or more           48%                    38%

Apprentices were also asked which resources impacted their awareness of DoD STEM careers. Participation in CQL (72%) and apprentices' mentors (69%) were the resources most often reported as being somewhat or very much responsible for this impact (see Chart 8). In contrast, the AEOP resources (website, social media, brochure, and instructional supplies) were not particularly impactful: for each of these, more than 65% of apprentices reported either not experiencing the resource or it having no impact on their awareness of DoD STEM careers.
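The 72% and 69% figures are the sum of the top two response categories ("somewhat" and "very much") for each resource. A minimal sketch of that aggregation (only two resources shown; percentages transcribed from Chart 8, and the dictionary name is illustrative):

```python
# Percent of apprentices selecting each response category, from Chart 8
chart8 = {
    "Participation in CQL": {"somewhat": 35, "very much": 37},
    "My mentor(s)":         {"somewhat": 27, "very much": 42},
}

# "Somewhat or very much" impact = sum of the top two categories
top_two = {item: r["somewhat"] + r["very much"] for item, r in chart8.items()}
```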

The questionnaire also asked apprentices how often they engaged in various STEM practices during CQL. Results indicate that apprentices were very actively engaged in doing STEM during the program (see Chart 9). For example, 93% of

Chart 8: Impact of Resources on Student Awareness of DoD STEM Careers (n = 122-124)

                                            Did not      No        A
                                            experience   impact    little   Somewhat   Very much
AEOP instructional supplies (Rite in
  the Rain notebook, Lab Coat, etc.)            66%        14%       18%        2%         0%
AEOP brochure                                   69%        15%       13%        3%         0%
AEOP social media                               71%        16%        8%        4%         0%
Army Educational Outreach Program
  (AEOP) website                                50%        16%       26%        5%         3%
Invited speakers or "career" events             37%        11%       20%       25%         8%
Participation in CQL                             9%         6%       14%       35%        37%
My mentor(s)                                    10%         3%       18%       27%        42%
