The policies and implementation of Medallion Learning's Calibrate ensure the use of best practices in online education in the areas of Content Development, Course Design, Learner Activity and Assessment, and Program Assessment. Calibrate, a Dynamic Learning Platform (DLP), has been designed to support the advantages offered by an integrated environment for content development and delivery, active learning, learner assessment, and program assessment.


Content Development: Content is developed by subject matter experts guided by the Medallion Learning publishing team. Material is thoughtfully and clearly organized into an instructional hierarchy, from basic objectives, to topics that cohesively group objectives, to units of associated topics. Opportunities for independent practice and validation of learning are provided at the unit, topic, and objective levels.

Course Design: The course interface features a course outline with indications of learner progress. Calibrate incorporates multiple media: text, video, interactive experiences, and opportunities for collaboration. Both instructional and technical support are provided throughout the course.

Learner Activity and Assessment: In Calibrate, each learner completes a pre-course assessment to inform the construction of an individual learning path. Learners then have frequent interactive opportunities to check their understanding of topics.
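As an illustrative sketch only (the topic names, the 0.8 mastery threshold, and the function itself are assumptions for this document, not Calibrate's published algorithm), a pre-assessment-driven learning path might be constructed like this:

```python
# Sketch: build an individual learning path from pre-assessment scores.
# The topic names and the 0.8 mastery threshold are illustrative assumptions.
MASTERY_THRESHOLD = 0.8

def build_learning_path(pre_scores, course_topics):
    """Return the ordered list of topics the learner still needs.

    pre_scores    -- dict mapping topic -> fraction correct on the pre-assessment
    course_topics -- topics in the order the course presents them
    """
    path = []
    for topic in course_topics:
        score = pre_scores.get(topic, 0.0)  # unassessed topics are included
        if score < MASTERY_THRESHOLD:
            path.append(topic)
    return path

topics = ["variables", "loops", "functions", "recursion"]
scores = {"variables": 0.95, "loops": 0.6, "functions": 0.85}
print(build_learning_path(scores, topics))  # ['loops', 'recursion']
```

Topics the learner has already mastered are skipped, while topics that were weak or not assessed remain on the path in course order.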

Assessments are available at the unit level and at the course level.

Program Assessment: Calibrate is designed to capture high-quality comprehensive learning data for use with both innovative and established learning analytics.

Learning analytics are used to assess and improve both course design and the reliability of learning validation.
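One reliability analytic of this kind can be sketched with classical test theory. The response matrix below is invented for illustration, and Cronbach's alpha is one standard reliability estimate a platform could automate:

```python
# Sketch: estimating assessment reliability with Cronbach's alpha.
# The response matrix is illustrative; 1 = correct, 0 = incorrect.

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(matrix):
    """matrix: rows = learners, columns = scored items."""
    k = len(matrix[0])                                   # number of items
    item_vars = [variance([row[i] for row in matrix]) for i in range(k)]
    total_var = variance([sum(row) for row in matrix])   # total-score variance
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

matrix = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
print(cronbach_alpha(matrix))  # 0.75
```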

Sound Technology: The Calibrate platform is built on a secure, scalable infrastructure. The authoring interface promotes the development of content in accordance with the Calibrate instructional model. The learner interface allows intuitive navigation of varied instructional material while maintaining clarity of the learning path. This DLP gathers a rich, high-quality range of data on how the learner uses the course materials and is flexibly configured to allow rapid implementation of automated or researched, data-informed updates.



The term “online education” spans a range of instructional designs, including online tools that enhance a classroom course, online tools that serve as the primary contact between a cohort of students and an instructor sharing a schedule of activities, and online tools and prepared materials that create a free-standing learning experience.

The latter, known as asynchronous Web-based education, offers benefits of flexible scheduling and self-paced learning that are particularly attractive in the context of professional development. With well-designed courses, this form of Web-based learning can be as effective as, if not more effective than, face-to-face education in conveying the same material. This conclusion is supported by meta-studies comparing online and face-to-face courses in a variety of settings (U.S. Department of Education Office of Planning, Evaluation, and Policy Development, 2010).

As a case in point, one study (Lovett, Meyer, & Thille, 2008) based on a statistics course offered through the Open Learning Initiative (OLI) found that students taking a free-standing version of the course online had test scores somewhat better than those of students taking the course in a traditional classroom. Moreover, they completed the course in half the number of weeks, and with half the hours on task, of the classroom students. It should be noted that the OLI course was designed to have a clear structure and to offer students timely, targeted feedback on their work.

A 2009 comparison of Web-based and live instruction in the setting of a professional development course also found that students in the Web-based course scored higher on the post-test than the participants in the face-to-face version. The pedagogy for the online course was based on constructivist theory (Pang, 2009).


While online learning is, at its core, learning, it presents opportunities and challenges of its own. Sound educational principles and practices must underpin, and be interpreted for, online course development and delivery (Terry Anderson). Asynchronous online learning requires the design and implementation of a full course in an LMS or LCMS, which gives designers the opportunity to plan and to review the course as a whole. One response to this opportunity has been the proliferation of rubrics to assist in that planning and review (Ko & Rossen, 2010).


Widely used rubrics classify areas of online education, articulate goals within those areas, and give indications of ways to meet those goals.

For example, the North American Council for Online Learning and the Southern Regional Education Board (SREB) use standards developed by SREB (North American Council for Online Learning, 2011; Southern Regional Education Board, 2006). These standards are representative of those developed by other organizations, such as the Quality Online Course Initiative (Illinois Online Network, 2010) and the Quality Matters™ Program.

SREB organizes the standards into the areas of Course Content, Instructional Design, Student Assessment, Technology, and Course Evaluation and Management.

Standards are provided for each area, along with indicators that the standards are met. As applicable to self-contained corporate instruction: (1) the standards and salient indicators are summarized here; (2) emphasis is on indicators relevant to asynchronous, free-standing online courses; and (3) methods by which Calibrate addresses the standards are indicated at the end of each area.

1. Course Content

The course provides online learners with engaging learning experiences that promote their mastery of content and are aligned with accepted content standards. Specifically:

• Goals and objectives are measurable and clearly state what the participants will know or be able to do at the end of the course.

• Content is aligned with accepted content standards for the field.

• Content and assignments have sufficient rigor to address those standards.

• A clear, complete course overview and syllabus are provided.

• Requirements are consistent with course goals and are clearly stated.

The Calibrate Content Development and Course Design components target these indicators specifically.

2. Instructional Design

The course uses learning activities that engage students in active learning and provides students with multiple learning paths based on student needs. This involves the following:

• The design reflects student needs, and incorporates varied ways to learn and multiple levels of mastery.

• The course is organized into units and lessons.

• Unit and lesson overviews are provided.

• Instruction includes opportunities for active learning.

• Instruction provides students with multiple learning paths, based on student needs.

• The design provides opportunities for student-student interaction and instructor-student interaction, including timely feedback.

The Calibrate Learner Activity and Assessment component provides for construction of an Individual Learning Path. Course design uses the “chunked” hierarchy advocated here. A variety of tools in multiple media are available to learners. Assessments and other interactive content provide opportunities for active learning. Calibrate supports student interaction, and this may be further promoted by an organization’s course manager.

3. Student Assessment

The course uses multiple strategies and activities to assess student readiness for and progress in course content and provides students with feedback on their progress. For example:

• Ongoing and frequent assessments are conducted to verify each student’s readiness for the next lesson.

• Students are continuously made aware of their progress in the class.

The Calibrate Course Design and Learner Activity and Assessment components address these indicators. Assessment strategies are limited to assessments with automatic scoring and open-ended activities that are self-assessed by reference to examples provided in the course materials.
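A minimal sketch of the automatic-scoring strategy follows; the item format, feedback strings, and function name are illustrative assumptions rather than Calibrate's actual data model:

```python
# Sketch: automatic scoring with immediate, targeted feedback, as required
# when no human grader is available in a free-standing course.
# The item structure and feedback text are illustrative assumptions.

def score_item(item, response):
    """Return (points, feedback) for a single selected-response item."""
    if response == item["key"]:
        return item["points"], "Correct."
    # per-distractor feedback gives the timely, specific guidance the
    # standards call for, even without an instructor in the loop
    return 0, item["feedback"].get(response, "Incorrect; review the topic.")

item = {
    "key": "b",
    "points": 1,
    "feedback": {"a": "Close: 'a' confuses mean with median.",
                 "c": "'c' ignores the outlier."},
}
print(score_item(item, "b"))  # (1, 'Correct.')
```

Unanticipated responses fall through to a generic prompt to review the topic, so every submission receives some feedback.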

4. Technology

The course takes full advantage of a variety of technology tools, has a user-friendly interface, and meets accessibility standards, as evidenced by the following:

• The course is easy to navigate.

• The course provider offers the students and course manager technical support.

• The course provider offers orientation and training.

The Calibrate Instructional Window is designed for ease of navigation. Course Design includes tutorials on use of the course and the dashboard. Medallion provides on-call technical support.

5. Course Evaluation and Management

The course is evaluated regularly for effectiveness and the findings are used as a basis for improvement. The content and delivery of the course are kept up to date, including:


• The course provider regularly uses multiple ways of assessing course effectiveness as a basis for improvement.

• The course is updated periodically to ensure timeliness.

Data-informed revisions enabled by learning analytics are a feature of the Program Assessment component, supported by Calibrate.

It’s important to note that indicators relating to the fostering of a community of learners have been de-emphasized in this section because development of community is not a core strength of this type of course. An organization using a free-standing course in an online learning environment that supports collaborative tools may certainly promote collaboration as part of the organization’s use of the course. Variety in assessment modes, an indicator scored in the full rubric, is also somewhat limited in a free-standing course, due to the need for automation of feedback.


With the proliferation of online education, organizations face the issue of selecting the learning technologies that best meet their needs. Some organizations develop their own criteria and process for learning technology selection. Others take advantage of available rubrics of learning technology features to select the implementation that suits their situation (Longsight, 2013). Researchers have noted that rubrics for learning technology selection are summative. Consequently, such rubrics must be organized and interpreted with emphasis on make-or-break features and on technical, pedagogical, and business use cases (Winer, Finklestein, Deutsch, & Masi, 2005).

The standards for learning technologies generally require that the platform support best practices in online pedagogy, while adding categories for hardware, software, licensing, and pricing. As an illustration, a framework might suggest rating (1) speed of system, (2) server requirements, (3) browser setup, (4) scalability, and (5) technical support (Longsight, 2013). On technical criteria such as these, Calibrate compares very favorably to the typical LCMS, or to learning technologies generally. The Calibrate Platform is a secure Web application deployed in the cloud, with no client hardware requirements beyond access to a browser.


Learning analytics are a particular growth area for online education. In online education, learners generate data about the learning process that can be recorded by the learning system. Institutions are using these data to identify and retain at-risk students and to improve learning outcomes (EDUCAUSE Brief, 2012).

Online assessment data can be collected and analyzed at a level of detail that is impractical in other learning environments, and methods of test analysis developed for large-scale standardized tests can be brought to bear on individual courses, tests, and even test items (Thissen & Wainer, 2001). Data on the frequency of use of course resources have predictive power for success in the course and can be used to identify at-risk students (Macfadyen & Dawson, 2010). Innovative methods may be applied to uncover associations between the use of particular instructional materials and course success. Learning analytics based on comprehensive, high-quality data generated by online learners enable teachers, learners, and stakeholders to make data-informed decisions about course design and delivery.
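Two of the classical item statistics behind such test analysis, difficulty (the p-value) and an upper-minus-lower discrimination index, can be sketched as follows; the response matrix is invented for illustration:

```python
# Sketch: classical item analysis on a scored response matrix.
# 1 = correct, 0 = incorrect; the data are illustrative.

def item_difficulty(responses, item):
    """Proportion of learners answering the item correctly (the p-value)."""
    return sum(row[item] for row in responses) / len(responses)

def item_discrimination(responses, item):
    """Upper-minus-lower index: p(top half) - p(bottom half) by total score."""
    ranked = sorted(responses, key=sum)        # rank learners by total score
    half = len(ranked) // 2
    lower, upper = ranked[:half], ranked[-half:]
    return item_difficulty(upper, item) - item_difficulty(lower, item)

matrix = [  # rows: learners; columns: items
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
print(item_difficulty(matrix, 0))      # 0.75
print(item_discrimination(matrix, 1))  # 1.0
```

Items with very low discrimination (strong and weak learners succeed at the same rate) are candidates for revision, which is the sense in which such analytics feed back into course improvement.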

Calibrate was designed from its inception to generate and capture high-quality data on learner actions. In standard courses, the data is used in automated analytics to monitor assessment item performance. The data is also used in multiple learning analytics, among them analysis of resource use by effective and ineffective learners for optimization of instructional tools, learner path construction, and assessment reliability. For custom courses, Calibrate has the flexibility to filter and report data to suit the needs of the custom application.
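The kind of event capture and filtered reporting described above can be sketched as below; the event schema and field names are assumptions for illustration, not Calibrate's actual format:

```python
# Sketch: filtering captured learner events for a custom resource-use report.
# The event fields ("learner", "type", "resource") are illustrative assumptions.
from collections import Counter

events = [
    {"learner": "a1", "type": "video_play",  "resource": "unit1/intro"},
    {"learner": "a1", "type": "quiz_submit", "resource": "unit1/check"},
    {"learner": "b2", "type": "video_play",  "resource": "unit1/intro"},
    {"learner": "b2", "type": "video_play",  "resource": "unit2/demo"},
]

def resource_use(events, event_type):
    """Count uses of each resource for one event type."""
    return Counter(e["resource"] for e in events if e["type"] == event_type)

print(dict(resource_use(events, "video_play")))
# {'unit1/intro': 2, 'unit2/demo': 1}
```

The same event stream can be re-filtered per learner, per event type, or per resource, which is what makes the reporting flexible for custom applications.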


The Calibrate instructional model and the Calibrate platform dovetail to offer a teaching and learning environment uniquely suited to the development and delivery of excellent e-learning. Content is current, varied, and clearly organized. The delivery promotes active learning across multiple learning styles. The learner has seamless access to multiple media, interactions, and assessments. Learning analytics in the Calibrate DLP enable data-driven optimization of instruction. Because Calibrate is implemented in the cloud, these features are available to any learner with Internet access.



EDUCAUSE Brief. (2012). EDUCAUSE Analytics Sprint. EDUCAUSE.

Illinois Online Network. (2010). QOCI Rubric. University of Illinois.

Ko, S., & Rossen, S. (2010). Teaching Online: A Practical Guide (3rd ed.). New York:
Longsight. (2013). Criteria for the Evaluation of Learning Management Systems.

Lovett, M., Meyer, O., & Thille, C. (2008, May). The Open Learning Initiative: Measuring the Effectiveness of the OLI Statistics Course in Accelerating Student Learning. Journal of Interactive Media in Education.

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54, 588-599.

North American Council for Online Learning. (2011). National Standards for Quality Online Courses. Vienna, VA: iNACOL.

Pang, K. (2009). Video-Driven Multimedia, Web-Based Training in the Corporate Sector: Pedagogical Equivalence and Component Effectiveness. International Review of Research in Open and Distance Learning, 10(3), 1.

Southern Regional Education Board. (2006). Standards for Quality Online Courses. Atlanta: SREB.

Thissen, D., & Wainer, H. (2001). Test Scoring. Mahwah: Lawrence Erlbaum Associates.

U.S. Department of Education Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, D.C.: U.S. Department of Education.

Winer, L., Finklestein, A. B., Deutsch, M. D., & Masi, A. C. (2005). The Matrix Transformed: Achieving Better Focus and Insight in Learning Management System Selection. EDUCAUSE.