Eludamos. Journal for Computer Game Culture. 2010; 4 (2), p. 277-285

 

 


Exploring the Challenges in Designing and Implementing an Assessment Plan for a Virtual Engineering Materials Lab

Marilee J. Bresciani, Khaled Morsi, Allison Duncan, Mark Tucker, Mark Siprut, and Kris Stewart

 

Particularly in times of increasing student populations and decreasing resources, educators must find ways to creatively engage students in learning. One such method is utilizing cutting-edge educational technology that incorporates the ever-popular video game format into the classroom.

This article describes the process of creating, implementing, and assessing an innovative engineering materials learning tool. The game-based materials laboratory simulation MATERIALS-ISLE (Interactive Simulated Laboratory Experience), incorporated into the engineering curriculum at a large public university, is intended to facilitate the same learning previously taught in a traditional hands-on materials laboratory. Through this technological tool, researchers hope to extend an integral learning opportunity to students currently unable to access the physical materials lab courses, as well as to augment and reinforce the material taught to those currently enrolled in them. Throughout the article, the research team discusses the assessment methodology, describes several challenges it overcame, and offers recommendations for others interested in utilizing game-based technology in educational settings.

 

The Need for and Value of Game-Based Educational Tools

The combination of dwindling budgets and increasing numbers of students pursuing higher education has led leadership at many colleges and universities to decrease the instructional resources available to each student. In some cases, this includes limited availability of laboratory-based instructional time and tools. As educational resources are decreasing, the popularity of computer games is simultaneously increasing. Computer and video game sales in the United States have been reported to exceed eleven billion dollars in 2008 (Waggoner 2010). It has also been reported that the average college graduate will have played video games in excess of 10,000 hours (Prensky 2001). In addition to the recreational value of such games, recent research shows that games also have educational value. It has been reported that using computer games promotes inductive reasoning, thus improving thinking abilities (Camaioni et al. 1990). Also, since computer games can engage each of the main learning styles (e.g. visual, auditory, and kinesthetic) (Chang et al. 2006), they have the potential to teach material to all types of learners. The interactive three-dimensional environment certainly adds to the experience. Aziz and his colleagues designed and piloted game-based engineering labs through industrial plant emulator system experiments used to teach students the modeling of friction, inertia, stiffness, and backlash effects in machines (Aziz et al. 2010). Pharmatopia and the SWIFT project, which offer pharmaceutical and genetics laboratory experiences respectively in the online game Second Life (Linden Lab 2003), are other examples of interactive virtual laboratories.

Desiring to harness students' enthusiasm for video games and apply it to the field of materials science education, this research team is developing and implementing a game-based materials laboratory simulation, with possible future implementation in Second Life. Utilizing this interactive and immersive simulation, researchers believe students will master skills and knowledge traditionally acquired only in a physical materials laboratory.

Researchers at a large, public institution in the southwestern United States are developing a game-based immersive and interactive computer simulation, MATERIALS-ISLE, to simulate a traditional materials laboratory, and are testing its effectiveness using a sample of undergraduate engineering students. The sample under investigation includes both Mechanical Engineering students who are currently taking, or have previously completed, the traditional hands-on lab component of the Introduction to Engineering Materials service course, as well as students majoring in other areas of engineering (e.g. civil, aerospace) who take the service course but are not currently offered the hands-on materials laboratory component. MATERIALS-ISLE is intended to enhance the learning of both the engineering students currently deprived of a materials lab experience and those taking the traditional lab. For the former, it is expected to address this gap in learning opportunity. For the latter, MATERIALS-ISLE will be used as a practice tool to augment the student's experience in a physical lab and to offer additional experiments not afforded during the normally allotted lab time.

The intent of this simulation is to provide students with a realistic materials laboratory experience, thus narrowing the gap between theory and practice. Designers incorporated realistic experimental and physical details and experiences, and integrated unforeseen experimental roadblocks (UERs) into the simulation. In doing so, researchers aimed to expose students to subject matter not normally explained in textbooks and broaden their educational experience. In addition to the aspects described above, the simulation contains several unique features through which students can learn. One such feature, "Journey through the Lattice," allows the player/student to be teleported inside the atomic crystal lattice, where he/she roams amongst atoms while learning about crystallography.

Integral to the success of any new educational tool is evidence that it increases students' learning. To that end, an assessment plan for MATERIALS-ISLE has been designed and implemented. This assessment is intended to document the value-added learning of this laboratory simulation, and perhaps lead to its improvement. The purpose of this article is to describe the process, and the resulting challenges, involved in designing and implementing the quasi-experimental student learning assessment component of a game-based virtual lab.

 

Assessing Learning through Computer Games

1. Write Learning Outcomes for MATERIALS-ISLE

The first step in measuring the success of an educationally based computer simulation is to clearly state its educational objective. MATERIALS-ISLE was designed to meet or exceed learning outcomes identical to those gained through participation in a Mechanical Engineering (ME) lab course; therefore, each of the MATERIALS-ISLE learning outcomes directly corresponds to a learning outcome expected of students taking the ME course held in a traditional lab setting. For example, the course's learning outcome "[Demonstrate] an ability to apply knowledge of mathematics, science, and engineering" corresponds to three MATERIALS-ISLE-specific outcomes (a worked sketch of the underlying calculations follows the list):

· [Demonstrate] an ability to construct an engineering stress-engineering strain curve from load-displacement data.

· [Demonstrate] an ability to calculate important tensile properties of engineering materials.

· [Demonstrate] an ability to convert engineering stress-engineering strain to true stress-true strain.
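
As a concrete illustration of the first and third outcomes, the following minimal sketch converts hypothetical load-displacement data into engineering and true stress-strain values. It is not the simulation's code, and the specimen dimensions and readings are invented for illustration: engineering stress is load over initial cross-sectional area, engineering strain is displacement over initial gauge length, and, before necking, true stress and strain follow directly from the engineering values.

```python
# Illustrative only: converting load-displacement data to engineering and
# true stress-strain values. The specimen dimensions and data points below
# are invented for this example and are not from MATERIALS-ISLE.
import math

A0 = 78.5e-6  # initial cross-sectional area in m^2 (~10 mm diameter rod)
L0 = 0.050    # initial gauge length in m

# Hypothetical (load in N, displacement in m) pairs from a tensile test
data = [(0.0, 0.0), (15000.0, 0.00005), (30000.0, 0.00010), (33000.0, 0.00150)]

for load, displacement in data:
    eng_stress = load / A0          # sigma = F / A0
    eng_strain = displacement / L0  # epsilon = delta_L / L0
    # The conversions below hold only while deformation is uniform (pre-necking)
    true_stress = eng_stress * (1.0 + eng_strain)  # sigma_t = sigma * (1 + epsilon)
    true_strain = math.log(1.0 + eng_strain)       # epsilon_t = ln(1 + epsilon)
    print(f"{eng_stress / 1e6:8.1f} MPa  {eng_strain:8.5f}  "
          f"{true_stress / 1e6:8.1f} MPa  {true_strain:8.5f}")
```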

The research team experienced no challenges in articulating the learning outcomes for MATERIALS-ISLE, since the learning outcomes for the Mechanical Engineering physical lab course had already been clearly defined and agreed to by the engineering faculty who taught the lab course. The team designing and assessing MATERIALS-ISLE simply used the learning outcomes from the course syllabus.

 

2. Assess Students' Previous Knowledge of the Topic Directly and Indirectly

Following Astin's (1991) I-E-O (inputs-environment-outcomes) assessment model, the researchers desired to assess students' knowledge of the topic prior to their exposure to any of the lecture materials pertinent to the lab or to any lab experience. Prior knowledge was directly assessed through pre-test methods. Pre-tests comprised questions related to the content being taught in the lab. The researchers had little difficulty in designing the initial pre-test of knowledge for the quasi-experimental design, as instructors already knew what they expected students to learn in the traditional labs.

In addition, researchers explored students' attitudes toward gaming technology. This information proved helpful, as a student's lack of experience with computer-gaming technology may inhibit his/her attainment of learning outcomes using such technology. Surveys indirectly measured students' knowledge through a subjective assessment of what a student thinks he/she understands about the topic. Students' self-perceptions of their learning and comfort level with instructional methods may or may not be accurate, yet gathering these self-perceptions allows instructional designers to adjust language and approaches to topics, and to facilitate student awareness of learning.

 

3. Directly Assess Achievement of Learning Outcomes While Using MATERIALS-ISLE

The game-based simulation was designed with specific learning outcomes in mind. Advancement from one part of the simulation to another and/or successful completion of the interactive game-based simulation relied on players demonstrating achievement of the learning outcomes. Thus, students' mastery of each learning outcome is assessed directly while they play the simulation. Methods of direct assessment included gathering the simulation's background transaction logs to determine the number of attempts needed to (a) successfully complete the virtual experiments and (b) correctly answer the test questions posed by Super-Tech (the lab technician), as well as (c) evaluating a poster presentation outlining the methodology and results of the experiment. The background logs of the students' navigation of MATERIALS-ISLE provide instructors with information about where in the learning process students may be misunderstanding information or lacking the skill needed to successfully demonstrate their learning. Such data inform specific refinement of the instruction that precedes the virtual lab and refinement of MATERIALS-ISLE itself.
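
The article does not describe the logs' actual schema, so the following is only a minimal sketch of how such attempt counts might be extracted, assuming a hypothetical one-JSON-event-per-line format; every field name here is invented.

```python
# Hypothetical sketch only: the actual MATERIALS-ISLE log schema is not
# described in the article. Assumes one JSON event per line, e.g.
# {"student": "s042", "event": "experiment_attempt",
#  "experiment": "tensile_test", "success": false}
import json
from collections import defaultdict

def attempts_until_success(log_path):
    """Count the attempts each student needed per experiment,
    stopping at the first success."""
    attempts = defaultdict(int)  # (student, experiment) -> attempt count
    solved = set()               # pairs already completed successfully
    with open(log_path) as log:
        for line in log:
            event = json.loads(line)
            if event.get("event") != "experiment_attempt":
                continue
            key = (event["student"], event["experiment"])
            if key in solved:
                continue
            attempts[key] += 1
            if event["success"]:
                solved.add(key)
    return attempts

# Unusually high counts on one experiment point to a concept students are
# misunderstanding, informing refinement of the preceding lecture or of
# MATERIALS-ISLE itself.
```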

Attainment of learning outcomes is demonstrated through the students' ability to successfully complete the virtual laboratory experiments. Successfully conducting the experiments demonstrates knowledge of engineering materials and their distinct properties, because without this understanding the experiments would fail. MATERIALS-ISLE-specific outcomes are tested via the Super-Tech character as well. Super-Tech (the lab technician) is a virtual instructor, chosen by the student to be either male or female, who guides students through the interactive simulation and quizzes them. Students are unable to advance until correctly answering each of Super-Tech's questions. Finally, students demonstrate their knowledge by creating posters that summarize the property results of all materials tested throughout the simulation. Instructors evaluate the posters using a rubric to determine correct application of the material. The posters provide an opportunity for students to demonstrate how clearly they communicate their learning in writing, tables, plots, photographs, and/or micrographs. The posters are uploaded to and accessed through MATERIALS-ISLE so they can be viewed by other students, allowing students to informally evaluate one another's work and increase their own understanding.

For the most part, the challenges encountered here were not related to assessment. In other words, the research team could agree on what learning looked like and how it should be evaluated. Rather, the challenge that arose concerned the assessment of a quality poster presentation. The components that would make up a quality poster as it related to course content were not under dispute; what was under dispute was the quality of the poster's aesthetic design. Since the simulation teaches only the content contained in the poster, and not how to design a poster aesthetically, the researchers removed the evaluation of aesthetic design from the poster assessment and focused on the evaluation of the content itself. Additionally, the researchers opted to include a poster constructed to their standard as an example. Instructions on poster formatting and an assessment rubric are given to students in advance.
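
The article does not reproduce the rubric itself, so the following is only a minimal sketch of content-focused rubric scoring under invented criteria and point caps.

```python
# Hypothetical sketch of rubric-based poster scoring; the actual rubric's
# criteria and scale are not given in the article, so these are invented.
# Consistent with the text, only content criteria appear (no aesthetics).
RUBRIC = {
    "correct property results": 4,          # max points per criterion
    "methodology clearly described": 4,
    "plots and tables labeled with units": 4,
    "conclusions supported by data": 4,
}

def score_poster(ratings):
    """Sum instructor ratings, clamped to each criterion's maximum."""
    return sum(min(ratings.get(c, 0), cap) for c, cap in RUBRIC.items())

print(score_poster({"correct property results": 4,
                    "methodology clearly described": 3,
                    "plots and tables labeled with units": 4,
                    "conclusions supported by data": 2}))  # -> 13 of 16
```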

 

4. Compare Data from Pre-assessments with Data from Mid- & Post-assessments

Part way through the course, students are exposed to class-based lectures on the same topics they will cover in MATERIALS-ISLE. After students participate in these lectures, and before they are exposed to both the physical lab and MATERIALS-ISLE (if they are Mechanical Engineering students) or to MATERIALS-ISLE only (if they are not), students are tested a second time. This "post-lecture pre-lab" test is identical to the pre-test and is used to gauge the depth and breadth of the additional knowledge students have gained through the class lectures.

Once students have used MATERIALS-ISLE, they are asked to complete post-surveys and post-tests identical to the pre-surveys and pre-tests they completed previously. By comparing the results of the pre- and post-instruments, researchers are able to see where students' understanding of materials engineering has improved and/or their attitudes toward the subject have changed. For example, a student may have answered two pre-test questions correctly and four post-test questions correctly; this increased knowledge can be attributed to playing the simulation. Additionally, this student may have rated her/himself more favorably in the post-survey than in the pre-survey, for example moving from a self-rating of one (indicating "strongly disagree") to five (indicating "strongly agree") on his/her "ability to design the tensile test." This increased confidence is also attributed to utilizing the game-based simulation.

Beyond a general analysis, the researchers compared each student's pre-tests and surveys with his/her post-lecture pre-lab test and with his/her post-tests and surveys, noting each student's individual areas of improvement, as well as continued challenges. Comparing an individual's results allows the researchers to compare how well a student thinks he/she has mastered the subject (assessed using his/her self-ratings on the survey) with how well the student objectively demonstrated this knowledge by successfully completing each of the experiments in MATERIALS-ISLE and answering the test questions correctly.
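
As a minimal sketch of this per-student comparison across the three test administrations, the snippet below computes lecture and lab gains; the student IDs and scores are invented, and the team's actual analysis was conducted in SPSS.

```python
# Illustrative sketch of the per-student comparison across the three test
# administrations; student IDs and scores are invented, and the team's
# actual analysis was conducted in SPSS.
pre_test  = {"s001": 2, "s002": 5, "s003": 3}  # before lectures and labs
mid_test  = {"s001": 4, "s002": 6, "s003": 4}  # "post-lecture pre-lab"
post_test = {"s001": 7, "s002": 8, "s003": 5}  # after MATERIALS-ISLE

for sid in sorted(pre_test):
    lecture_gain = mid_test[sid] - pre_test[sid]  # attributed to lectures
    lab_gain = post_test[sid] - mid_test[sid]     # attributed to the lab/simulation
    print(f"{sid}: lecture gain {lecture_gain:+d}, lab gain {lab_gain:+d}")
```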

The challenges that occurred here were numerous. When instructors who were not involved in the research project were asked to participate in administering the assessment tools, they graciously complied. Participation from instructors who were not directly involved in this research did, however, pose coordination challenges for the researchers, such as making sure that instruments were administered at the appropriate times and that all questions were included in the expected order.

In order to administer each assessment tool at the appropriate time, the instructors had to coordinate with one another, as well as with the research team. For example, the instructor facilitating the Materials Laboratory course had to make sure that the post-lecture pre-lab test was administered by the research team before he exposed students to the relevant physical lab exercises. Similarly, the post-lab test and survey had to be administered at a specific time. Although all students were ultimately exposed to the simulation, so that no students were deprived of this learning opportunity, students were randomly assigned to two groups: "game-based simulation" and "no game-based simulation." In order to accurately assess the knowledge gained using MATERIALS-ISLE, the post-test and survey had to be administered before the students assigned to the "no game-based simulation" group were actually given access to MATERIALS-ISLE. This coordination proved challenging. It required generating special usernames for all students and coordinating with the computer science researchers to initially allow MATERIALS-ISLE access only to the students assigned to the "game-based simulation" group. Later, once the post-test and survey had been completed, the computer science researchers granted access to students in the "no game-based simulation" group. Giving out individualized usernames to more than 100 students was also challenging.
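
A minimal sketch of this assignment and gating logic follows; the article does not describe the computer science researchers' actual implementation, and all identifiers and helper names below are invented.

```python
# Hypothetical sketch of the random assignment and access gating; the
# article does not describe the actual implementation, and all names
# below are invented.
import random
import secrets

def assign_and_credential(student_ids, seed=2010):
    """Randomly split students into the two study groups and
    generate an individualized username for each."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    groups = {sid: "game" for sid in shuffled[:half]}
    groups.update({sid: "no_game" for sid in shuffled[half:]})
    usernames = {sid: f"isle-{secrets.token_hex(4)}" for sid in shuffled}
    return groups, usernames

def has_access(sid, groups, post_test_done):
    """Only the 'game' group has initial access; the 'no_game' group is
    unlocked after completing the post-test and survey."""
    return groups[sid] == "game" or post_test_done.get(sid, False)
```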

Some instructors administered paper surveys and tests during class. Others administered the tools via Blackboard to avoid using class time. While preserving class time for instruction is admirable, Blackboard-based administration posed two additional problems: lack of student participation and more challenging data analysis. Instructors encouraged student participation using a reward system: students were awarded points toward their class grade for completing the pre-, mid-, and post-tests and the pre- and post-surveys. This reward system appears to have been successful, as many students did complete the assessment items. The analysis challenge, however, was not as easily resolved. Instructors uploaded the assessment tools without leaving time for researchers to verify each tool's accuracy before it was administered. Thus, in some cases, survey questions were placed in reverse order or inadvertently omitted altogether. Although researchers could verify the order of questions before comparing data from the current semester to previous-semester data, this issue added time to the analysis and interpretation of results. Furthermore, formatting the data collected via Blackboard for analysis took longer than coding and inputting responses from paper documents.

Researchers recommend that all instructors use survey-specific technology (e.g. SurveyMonkey) to administer assessment tools. After facing the abovementioned challenges with other methods, we are currently using SurveyMonkey to administer all assessment tools, and we are pleased with the ease and accuracy of this method. Students are able to access and complete the assessments at their leisure, class time is used solely for instruction, and data can be easily uploaded to SPSS software, thus eliminating the time-consuming and error-prone manual data entry required when paper tests were used. Additionally, SurveyMonkey allows the research team to upload each test and survey once, resulting in more controlled consistency.

 

5. Review Findings and, If Necessary, Make Revisions

As with most projects in their early phases, the game-based simulation and/or the assessment tools may not be perfect initially. If you find that students are not achieving the original learning outcomes, one of two things is likely the issue: either the assessment tools are not accurately soliciting the demonstration of knowledge, or the game-based simulation is not properly teaching it. After the initial semester of baseline data collection, the instructors desired to change some of the survey questions. The PIs fully complied, realizing that it was better to refine the survey to more accurately evaluate student attitudes and perceptions, and thereby better assess their learning. Changing survey questions did, however, make it challenging to gather corresponding data over several semesters for long-term comparative purposes.

 

Additional Challenges Faced & Advice for Future Assessment Teams

Although the researchers are excited by the impact that MATERIALS-ISLE may have on student learning, several difficulties arose during its development and assessment. In addition to the difficulties described above, the following is a list of further assessment-related challenges that this research team encountered, as well as some tips we found helpful.

This research team included experts in the areas of materials engineering, computer science, digital arts and design, and educational assessment. Based on his/her specialty and area of expertise, each researcher's priorities varied. For example, the digital arts/design expert was most concerned with the aesthetics and design of MATERIALS-ISLE, while the educational expert was primarily interested in ensuring that students achieved each proposed learning outcome. Although knowledge in each of the four areas was required, the conflicting priorities among researchers initially led to gaps in communication and slowed progress in refining and utilizing MATERIALS-ISLE. The research team committed to weekly meetings, which significantly enhanced communication and solution finding; however, instructors who were not on the research team did not attend these meetings. The research team believes that a more thorough understanding of the research among those instructors would have benefited the overall effort, and that many of the abovementioned challenges (e.g. question order changes and administering assessment tools at inappropriate times) might have been avoided. The researchers recommend that instructors who are not part of the research team be invited and encouraged to attend at least three meetings over the course of the semester (beginning, middle, and end) to gain this more in-depth understanding. The purpose of these meetings is to reinforce the larger context of what the instructors are being asked to do and to increase their awareness of the challenges faced by the team (i.e. reinforcing the importance of, and explaining the logistical challenges associated with, the timely administration of the assessment tools).

The research team's desire to make the accompanying assessment tools replicable at other institutions of higher education informed their decision to utilize only quantitative assessment methods (e.g. tests and surveys). They recommend, however, that future researchers augment these methods with qualitative data from students by facilitating student focus groups and conducting faculty interviews. The responses elicited from such qualitative assessments will enrich the researchers' body of knowledge, as well as potentially uncover additional areas for improvement and the ways in which students prefer to have their learning and attitudes evaluated.

 

Conclusion

As students grow, evolve, and change, so must the methods used to facilitate their learning. As access to educational resources becomes scarcer, educators have a responsibility to creatively fill in the learning gaps and increase educational access for all students. Through innovation, this research team intended to do both. We wished to engage students through educational gaming simulations similar to the video games many students enjoy playing. In addition, we hoped to increase educational access by inviting all engineering students to play MATERIALS-ISLE, rather than limiting use to the Mechanical Engineering students, as the current enrollment policy in traditional materials laboratory courses does. Furthermore, we optimistically expect that all of the proposed learning outcomes for the traditional laboratory courses will be achieved or surpassed through the use of MATERIALS-ISLE. Utilizing the purposeful assessment process described above, we hope to show that our goals of engagement and access, as well as the intended learning outcomes, have been achieved.

 

Acknowledgments

The authors wish to thank the National Science Foundation (grant number 0837162) for its integral support, as well as the students, without whom this project would not have been possible.

 

Games Cited

Linden Lab (2003) Second Life. Linden Lab (PC).

 

References

Astin, A.W. (1991). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. New York: Macmillan.

Aziz, E., Esche, S., & Chassapis, C. (2010). An interactive game-based engineering laboratory. World Transactions on Engineering and Technology Education, Vol. 8 (2), p. 131-136.

Camaioni, L., Ercolani, A.P., Perrucchini, P., & Greenfield, P.M. (1990). Video games and cognitive ability: The transfer hypothesis. Italian Journal of Psychology, Vol. 17 (2), p. 331-348.

Chang, C., Kodman, D., Esche, S., & Chassapis, C. (2006, June). Immersive collaborative laboratory simulations using a gaming engine. American Society for Engineering Education Annual Conference and Exposition, Chicago, IL [Online]. Available at: http://soa.asee.org/paper/conference/paper-view.cfm?id=1742 [Accessed: 26 November 2009].

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, Vol. 9 (5), p. 1-2.

Waggoner, Z. (2010). Life in Morrowind: Identity, video games, and first-year composition. Currents in Electronic Literacy, 2010: Gaming Across the Curriculum. [Online]. Available at: http://currents.cwrl.utexas.edu/2010/waggoner_life-in-morrowind [Accessed: 10 August 2010].