Browsing by Subject "Web-based training"
Now showing 1 - 3 of 3
Item: The practitioner-driven system: an interactive qualitative analysis of the e-learning creation experience (2015-05)
Derr, David Roy; Resta, Paul E.; McCoy, Danny; Hughes, Joan E.; Patterson, Jeffery; Riegle-Crumb, Catherine

Contemporary e-learning research often addresses a singular instructional topic, learning strategy, or authoring tool. Technological advancements and evolving delivery methods are changing the e-learning practitioner experience more rapidly than ever before, and the need for a holistic illustration of the modern-day practitioner experience has never been greater. This Interactive Qualitative Analysis (IQA) of the e-learning creation experience consists of Affinity Production Interviews, individual interviews, and an online survey of e-learning practitioners working with adult audiences. The result of the study is an e-learning creation experience system driven by participants' stories. The system comprises twelve affinities, including leadership, policy, the instructional systems design process, the client relationship, and emotions, among others. Exercising the system reveals conditions that influence its ultimate outcome: e-learning success.

Item: The ability of the Kolbe A Index action modes to predict learners' attitudes and achievements within a Web-based training context (Texas A&M University, 2004-09-30)
Wongchai, Sasicha

The purpose of this study was to investigate the ability of the Kolbe A Index to predict learners' attitudes and achievements within a web-based training context. The Index is used to measure the conative capacities of individuals; it translates raw scores into four Action Modes: Fact Finder, Follow Thru, Quick Start, and Implementor. A web-based simulation of training on customer service excellence was created, containing four modules designed to match the respective learning style of each of the four Action Modes suggested by the Kolbe Corporation. The research questions were as follows:
To what extent do the four Kolbe A Index Action Modes predict (1) how well learners will like content formatted to match the learning styles of the four Modes, (2) how well learners will remember content formatted to match the learning styles of the four Modes, and (3) how well learners will remember the content regardless of its format? Three experts in applications of the Kolbe A Index validated the simulation. Five other experts, each with a Ph.D. in the social sciences, validated the evaluation of learners' attitudes and achievements. A pilot study was then conducted to collect data for a reliability analysis. Sixty graduates of an international program in economics in Thailand participated in the study. Data were collected entirely through the Internet and in English. Multiple linear regression analyses with backward elimination were performed to answer the research questions. Within the limitations and data analyses of this study, the Kolbe A Index Action Modes did not predict how well learners liked content formatted to match the learning styles of the four Modes, how well learners remembered content formatted to match the learning styles of the four Modes, or how well learners remembered the content regardless of its format. More research is needed to explore how the Kolbe A Index Action Modes can be used to predict learners' attitudes and achievements.

Item: Web-based training evaluation in the workplace: Practices, instructional architectures, and skills (2006-08)
Rodgers, Ida L.; Carter, Joyce L.; Dragga, Sam A.; Baehr, Craig

On August 19, 2005, I successfully defended my doctoral dissertation before my committee, which included Dr. Locke Carter (Chair), Dr. Sam Dragga (Department Chair), and Dr. Craig Baehr. In addition to my dissertation committee, Dr. David Roach from Communications Studies represented the university.
The research for the dissertation, "Web-based training evaluation in the workplace: Practices, instructional architectures, and skills," examined web-based training evaluation (WBTE) in workplaces. The problem to which this research responds is the scarcity of information about WBTE in the field of technical communication. As a result, practitioners may lack expertise that would make them valuable to WBT development teams and their companies, and academic program developers may lack courses or course components suited to the needs of students taking jobs with companies that create WBT. The problem stems from three factors: (a) training has largely shifted from the classroom to the Web, (b) technical communicators work on web-based training projects, and (c) evaluation is a necessary component of WBT projects. The problem affects both practitioners and academics because the field does not identify its members as WBT developers and evaluation experts. The study's multiple methods included a workplace site visit, an expert-panel validity review of survey items, a usability test, an online survey, a second expert-panel validity review of results, and researcher reflections to identify results that triangulate. The study also crossed many disciplinary boundaries, which presented challenges to validity; those challenges became a methods thread in the study as I sought to build a case for validity. Results of the study include information valuable to technical communication practitioners and to academics responsible for program development. Some results show that my participants are highly educated, come from widely varied fields, work on teams of three to five people, often perform the team role of project manager, and employ a wide variety of formative, summative, and reflective evaluation measures.
Results concerning the instructional architecture methods used were less clear, except to illustrate the technical communication maxim that form and content depend on context, audience, and purpose. The study results may affect practitioner self-study, program development, and research methods in our field because they illustrate the desirability of expanding our field's definition of itself to include WBT developers and evaluation experts. In addition to collecting data, the study models the three evaluation stages: formative, summative, and reflective (a term I adopted from the field of composition that applies to some evaluation and research methods). The study concludes with three practical products. The first is a set of suggestions for both practitioners and academic program developers for crossing disciplinary boundaries to achieve this expansion. The second is a list of types of online training or education, with suggested evaluation methods that apply to each type. The third is a research methods model that includes formative, summative, and reflective practices. This study also revealed many additional areas for research, including examination of evaluation methods appropriate and useful for various types of online education, particularly in technical communication; of WBT evaluation measures of results and how these affect public discourse; and of decision-making in WBT and online educational development processes, including rhetorical, ethical, and methodological considerations.